1 00:00:06,000 --> 00:00:22,000 If you've spent any time online recently, you'll know that the Internet is a really wild place. It feels as if we're learning in startling ways, here 30 years on, that the norms that we have in the real world don't really carry over into our online spaces.
2 00:00:22,000 --> 00:00:24,000 But what if
3 00:00:24,000 --> 00:00:26,000 it didn't have to be that way?
4 00:00:26,000 --> 00:00:33,000 What would we need for society to be better online, and governed online in the same ways that it's governed in the real world?
5 00:00:33,000 --> 00:00:39,000 Hi, everyone. I'm Chris Freeland, and I'm a librarian at the Internet Archive. I want to welcome you to today's book talk.
6 00:00:39,000 --> 00:00:50,000 In Governable Spaces, author Nathan Schneider explores how we can transform digital spaces into more democratic and creative environments, inspired by governance legacies of the past.
7 00:00:50,000 --> 00:01:15,000 Joining Nathan for today's conversation will be Lily Irani, a professor at the University of California, San Diego, and faculty director of the UC San Diego Labor Center. As we start up here, I'd like to make sure that you have grabbed your copy of Governable Spaces. Duncan, who's working behind the scenes here today with Caitlin (hello, both of you, and thank you), is just going to share a link in the chat so that you can either download and read
8 00:01:15,000 --> 00:01:22,000 the free Open Access edition, or you can purchase it in print from a number of booksellers that Nathan has listed on his website.
9 00:01:22,000 --> 00:01:26,000 So before we dive in, let's take care of some logistics.
10 00:01:26,000 --> 00:01:46,000 Please use the chat to say hello, introduce yourself, and let us know where you're joining from today. Our chat is open for discussion. Please keep it relevant and respectful, and feel free to drop questions into the chat for our panelists to pick up during our Q&A session
as we round toward the end of our discussion today.
11 00:01:46,000 --> 00:02:10,000 One question that always comes up: yes, we are recording this session, and tomorrow participants will receive an email that contains a link to that recording, the links that we're sharing here in the chat, as well as the chat transcript. So you don't need to worry about furiously writing everything down. We're capturing all of it. It'll be shared on the Internet Archive tomorrow, and you'll get an email in your inbox.
12 00:02:10,000 --> 00:02:26,000 So now I would like to welcome Dave Hansen to the screen. Dave is the Executive Director of Authors Alliance, which is a cohost for these book talks, and Dave's going to help provide additional background for today's conversation. Over to you, Dave.
13 00:02:26,000 --> 00:02:51,000 Hey, Chris. Hi, everyone. I'm really glad to be back. We've had a hiatus over the summer with our book series, and it's really exciting to get started again. We have a great lineup for this fall. So thank you to the many people who are back. For those of you who haven't been with us, Authors Alliance and the Internet Archive
14 00:02:51,000 --> 00:03:04,000 cosponsor this book series, because we have a lot of things in common when it comes to trying to understand and address issues related to access to knowledge
15 00:03:04,000 --> 00:03:26,000 and the free exchange of ideas, especially on the Internet, which I think we're going to get into a bit today with Governable Spaces. You know, as I think about how authors exchange their ideas with the world, and how we can best support them in doing that, so much of it depends on
16 00:03:26,000 --> 00:03:32,000 having well-run spaces online for those types of interactions to happen.
17 00:03:32,000 --> 00:03:39,000 And for those of you who know Authors Alliance, you know we spend a lot of time thinking about law and policy, and actually,
18 00:03:39,000 --> 00:03:57,000 over the last few years we've found ourselves more and more involved in some of the legal disputes around governance of online spaces, and what kinds of legal rules and liability rules attach to online providers when they're hosting content from
19 00:03:57,000 --> 00:04:00,000 from authors and others.
20 00:04:00,000 --> 00:04:11,000 So I'm really excited for this talk today. I guess I should say a little bit about Authors Alliance for those of you who are new and haven't heard of us before.
21 00:04:11,000 --> 00:04:30,000 So Authors Alliance is a nonprofit that was formed about 10 years ago by authors who really have, as their main objective, to have their works be read. They want to benefit the world by seeing widespread dissemination of their works, and want to see the kind of maximum reach and impact
22 00:04:30,000 --> 00:04:38,000 of their writing. And so we have about 3,000 members,
23 00:04:38,000 --> 00:04:58,000 authors of all types, everyone from Nobel laureates to journalists to fan fiction writers, and kind of everything in between. And what really unites us is that we want to see the world become a better place through the addition of our creative outputs.
24 00:04:58,000 --> 00:05:16,000 Membership is free, so I would be remiss if I did not invite you to join. You're very welcome to join, and doing so gives a few benefits. One is a closer connection to us, so that you can help inform kind of the direction that we go, and
25 00:05:16,000 --> 00:05:25,000 help us provide a platform for the voice of authors who want to support the public interest.
26 00:05:25,000 --> 00:05:39,000 And you'll also hear more about some of the cool stuff that we're doing and some of the resources that we put out, particularly educational resources to help authors navigate some of the complex law and policy issues that they face, especially online.
27 00:05:39,000 --> 00:05:53,000 So with that, I'm excited to be able to introduce our guests today. So first is Lily Irani, who is an associate professor of Communication
28 00:05:53,000 --> 00:05:58,000 and Science Studies at the University of California, San Diego,
29 00:05:58,000 --> 00:06:03,000 where she is the faculty director of the UC San Diego Labor Center.
30 00:06:03,000 --> 00:06:07,000 She's the author of Chasing Innovation:
31 00:06:07,000 --> 00:06:14,000 Making Entrepreneurial Citizens in Modern India, published by Princeton University Press,
32 00:06:14,000 --> 00:06:20,000 and also Redacted, with Jesse Marx, published by Taller California.
33 00:06:20,000 --> 00:06:25,000 She organizes with Tech Workers San Diego
34 00:06:25,000 --> 00:06:31,000 and serves on the steering committee of the Transparent and Responsible Use of Surveillance Technology
35 00:06:31,000 --> 00:06:33,000 SD Coalition,
36 00:06:33,000 --> 00:06:37,000 and the board of the United Taxi Workers.
37 00:06:37,000 --> 00:06:43,000 She is co-founder of the data worker organizing project and activism tool
38 00:06:43,000 --> 00:06:57,000 Turkopticon. I think I said that right. I googled it, but I found no pronunciation guide. So Lily will be moderating, and in conversation with Nathan Schneider,
39 00:06:57,000 --> 00:07:07,000 who is himself an assistant professor of Media Studies at the University of Colorado Boulder, where he leads the Media Economies Design Lab
40 00:07:07,000 --> 00:07:12,000 and the MA program in Media and Public Engagement.
41 00:07:12,000 --> 00:07:14,000 He's the author of four books,
42 00:07:14,000 --> 00:07:20,000 the most recent of which we're going to hear about today: Governable Spaces: Democratic Design for Online Life,
43 00:07:20,000 --> 00:07:22,000 published by the University of California Press,
44 00:07:22,000 --> 00:07:27,000 and also author of Everything for Everyone:
45 00:07:27,000 --> 00:07:33,000 The Radical Tradition That Is Shaping the Next Economy, published by Bold Type Books
46 00:07:33,000 --> 00:07:34,000 in 2018.
47 00:07:34,000 --> 00:07:42,000 So I think I have said enough. I had longer bios that I could have pulled from, but I think you get the sense that both of these folks
48 00:07:42,000 --> 00:07:47,000 know what they're talking about, which is part of the point of some of these introductions.
49 00:07:47,000 --> 00:07:57,000 And if you've read anything, even just a little bit, of Governable Spaces, I think you'll appreciate that this is touching on a very important
50 00:07:57,000 --> 00:07:59,000 issue right now.
51 00:07:59,000 --> 00:08:05,000 So with that, I will turn it over to Lily and Nathan.
52 00:08:05,000 --> 00:08:10,000 Thanks.
53 00:08:10,000 --> 00:08:14,000 Oh my God! Hi! This is so exciting, to be here to celebrate
54 00:08:14,000 --> 00:08:20,000 your book. I've known Nathan for a long time. I think
55 00:08:20,000 --> 00:08:22,000 we met
56 00:08:22,000 --> 00:08:24,000 when you were a journalist
57 00:08:24,000 --> 00:08:31,000 and activist, and I was a grad student activist. And so we've been kind of on this learning path together with our different projects, like,
58 00:08:31,000 --> 00:08:40,000 just trying to build things with people, and then figure out the words to even talk about what are the problems that we're facing, and, like, how do we talk about it with other people?
59 00:08:40,000 --> 00:08:54,000 So yeah, like, this book just shares so much learning that you've done, and so much kind of reading and thinking you've done with other people, and so I'm really grateful for you overcoming writer's perfectionism and putting it out into the world.
60 00:08:54,000 --> 00:09:01,000 I thought maybe a place we could start the conversation is, you know, starting with,
61 00:09:01,000 --> 00:09:15,000 I don't know.
Many of us might have an experience of, you know, X, formerly Twitter, which, you know, has for me at least turned into a kind of hellscape of crypto spam and Elon Musk suing other capitalists for not advertising with him enough.
62 00:09:15,000 --> 00:09:33,000 But it's always been a place of what you call in the book implicit feudalism, like a, you know, really centralized form of governance that is convenient for us. And we've done a lot over the last decades to, like, advocate for slightly different kinds of governance, different content moderation policies.
63 00:09:33,000 --> 00:09:34,000 But
64 00:09:34,000 --> 00:09:36,000 it doesn't really feel like
65 00:09:36,000 --> 00:09:43,000 our culture of social media, or maybe I would say, like, being here at the border in the US, is moving in the right direction. Like,
66 00:09:43,000 --> 00:10:00,000 we're, you know, here in San Diego, on Kumeyaay land, like, we're leaning into strengthening the border, you know, like, anti-immigrant rhetoric as a kind of false solution to a lot of the issues we have in our kind of public spaces, like housing.
67 00:10:00,000 --> 00:10:05,000 So something strange and terrible is happening, I worry.
68 00:10:05,000 --> 00:10:09,000 And in your book you write, there's optimism in the rot.
69 00:10:09,000 --> 00:10:27,000 But also you warn that self-governance is not a solution; it's a practice for problem solving. So where do you find optimism in the possibilities of where we find ourselves? At least, you know, you and I both live in the United States, on settler land that's been colonized.
70 00:10:27,000 --> 00:10:30,000 Like, where do you find the possibilities for optimism?
71 00:10:30,000 --> 00:10:42,000 And where do you think self-governance might help?
72 00:10:42,000 --> 00:10:43,000 Glad to be with you.
73 00:10:43,000 --> 00:10:47,000 I've learned so much from your work, and
74 00:10:47,000 --> 00:10:55,000 from our relationship.
And thank you, Dave and Chris and Caitlin, and the Internet Archive and the Authors Alliance for
75 00:10:55,000 --> 00:10:57,000 for hosting this.
76 00:10:57,000 --> 00:11:02,000 These are organizations that are just so deeply resonant with,
77 00:11:02,000 --> 00:11:06,000 you know, the values I tried to bring into this work.
78 00:11:06,000 --> 00:11:08,000 You know, I think we have
79 00:11:08,000 --> 00:11:12,000 to... you know, to me, the hope is kind of in plain sight.
80 00:11:12,000 --> 00:11:15,000 And that is, the
81 00:11:15,000 --> 00:11:16,000 the kind of
82 00:11:16,000 --> 00:11:39,000 initial impulse that motivates this book was a series of conversations with my mother while I was having trouble managing a complicated email list where somebody was misbehaving on the Internet. And I started realizing: wow, I'm the admin of this email list, and I didn't really know how to address this
83 00:11:39,000 --> 00:11:45,000 problem democratically, and nothing in the tool set of the email list was helping me do that. It was all about
84 00:11:45,000 --> 00:11:53,000 how I could just make the problem go away by removing people, or, you know, otherwise asserting raw power.
85 00:11:53,000 --> 00:12:02,000 And meanwhile, I was having conversations with my mother about her garden club; she had just been elected president of the garden club for her neighborhood.
86 00:12:02,000 --> 00:12:03,000 And
87 00:12:03,000 --> 00:12:13,000 I was just struck by how that garden club had more sophisticated practices of governance than I had ever experienced in any online space.
88 00:12:13,000 --> 00:12:17,000 And so it was just kind of this
89 00:12:17,000 --> 00:12:21,000 shock, this shock of recognition, that
90 00:12:21,000 --> 00:12:24,000 maybe the ways of solving these
91 00:12:24,000 --> 00:12:26,000 problems with online space
92 00:12:26,000 --> 00:12:34,000 are right in front of us, and they are things that we have known before, both in,
93 00:12:34,000 --> 00:12:41,000 you know, the context of these colonizer societies, as well as in Indigenous communities that have preceded them.
94 00:12:41,000 --> 00:12:46,000 That we know how to self-govern, and actually we turn off
95 00:12:46,000 --> 00:12:49,000 a lot of that knowledge when we go online.
96 00:12:49,000 --> 00:12:55,000 And we've created tools that don't open doors for that knowledge, and that actually inhibit it.
97 00:12:55,000 --> 00:12:58,000 And this took me on a journey through the history of
98 00:12:58,000 --> 00:13:02,000 the development of online spaces, and how
99 00:13:02,000 --> 00:13:06,000 things came to be the way they are. But it
100 00:13:06,000 --> 00:13:13,000 brought me to a recognition of the stakes. These things that are hidden in plain sight, these garden clubs,
101 00:13:13,000 --> 00:13:28,000 are actually an essential fabric of democratic life. This is an insight that goes back to, like, Alexis de Tocqueville, C.L.R. James, W.E.B. Du Bois. So many thinkers on democracy
102 00:13:28,000 --> 00:13:36,000 in recent centuries have stressed that you can't have democracy at larger scales unless you're also developing it and practicing it
103 00:13:36,000 --> 00:13:40,000 at the scale of everyday life.
104 00:13:40,000 --> 00:13:43,000 And that, to me,
105 00:13:43,000 --> 00:13:48,000 you know, that's what connects this question of, like, how do we run
106 00:13:48,000 --> 00:13:53,000 an email list, to the question of what is going on in our democracy
107 00:13:53,000 --> 00:14:06,000 as a whole. Why are we having this resurgent authoritarianism, this really scary appeal of almost monarchist politics taking hold around the world?
108 00:14:06,000 --> 00:14:07,000 And I think, actually, that
109 00:14:07,000 --> 00:14:16,000 you know, that larger-scale kind of imaginary of how we run our governance, and the kind of everyday
110 00:14:16,000 --> 00:14:24,000 scale of how we live our online lives, are actually intimately connected.
111 00:14:24,000 --> 00:14:35,000 Nathan, I just realized that you'd also prepared kind of a small overview of the book, right? Do you want to actually go into that right now, or do you want to just keep the conversation
112 00:14:35,000 --> 00:14:38,000 going? We can definitely come back to picking up some of these threads.
113 00:14:38,000 --> 00:14:41,000 Yeah, I think we can keep the conversation going.
114 00:14:41,000 --> 00:14:42,000 Yeah.
115 00:14:42,000 --> 00:14:44,000 Lily? Okay.
116 00:14:44,000 --> 00:14:45,000 Well,
117 00:14:45,000 --> 00:14:46,000 so
118 00:14:46,000 --> 00:14:49,000 you talk about... okay, this is,
119 00:14:49,000 --> 00:15:03,000 you know, like, we talked about this before, but I'm actually just responding to what you said right now. So, like, when you talk about Tocqueville and the kinds of skills of democracy that we already have kind of in plain sight,
120 00:15:03,000 --> 00:15:05,000 you know, I also want to kind of
121 00:15:05,000 --> 00:15:09,000 draw attention to the fact that, like, you know, the garden club
122 00:15:09,000 --> 00:15:33,000 is also, you know, in a particular tradition, right? Like, I grew up in the suburbs of LA. The garden club can be, like, predicated on keeping certain people out of the neighborhood, using the police to do that. And in your book, you actually... like, you start with the garden club, but you actually kind of take us through examples from, like, transformative justice and, like, abolitionist movements. So, like,
123 00:15:33,000 --> 00:15:35,000 how...
124 00:15:35,000 --> 00:15:43,000 You know, like, what are the kinds of practices that we have in plain sight?
And what are the kinds of practices that might require us to learn some different muscles
125 00:15:43,000 --> 00:15:57,000 than the ones that we've grown up with? Like, me, growing up in the mall and on the commercial internet, and then working in Silicon Valley, like, there's a lot of democratic muscles I actually didn't have,
126 00:15:57,000 --> 00:16:07,000 in a kind of society that was optimized for me producing cool startup ideas for Sergey to mostly own.
127 00:16:07,000 --> 00:16:08,000 The...
128 00:16:08,000 --> 00:16:13,000 You know, it was actually in a conversation with you some months ago that I realized that,
129 00:16:13,000 --> 00:16:15,000 you know, what I
130 00:16:15,000 --> 00:16:19,000 do as a scholar, and before that as a journalist, is
131 00:16:19,000 --> 00:16:23,000 the study of negative space. Like, what are the things that we
132 00:16:23,000 --> 00:16:24,000 don't...
133 00:16:24,000 --> 00:16:26,000 you know, that aren't normal, that could be?
134 00:16:26,000 --> 00:16:31,000 And this book is really an exercise in that. And in order to
135 00:16:31,000 --> 00:16:35,000 find those places, you know, we have to see
136 00:16:35,000 --> 00:16:39,000 where are the cracks in the dominant mold.
137 00:16:39,000 --> 00:16:43,000 And I look at a bunch of different
138 00:16:43,000 --> 00:16:49,000 cracks out there that I think are deeply connected. For instance, in the middle of the book, after exploring this
139 00:16:49,000 --> 00:16:51,000 this history,
140 00:16:51,000 --> 00:16:55,000 you know, I look at two particular cracks:
141 00:16:55,000 --> 00:17:01,000 groups of people who probably wouldn't want to be in a room together, but who I think actually have
142 00:17:01,000 --> 00:17:04,000 some interesting things in common. Those are
143 00:17:04,000 --> 00:17:08,000 people who are exploring transformative justice, as you just mentioned,
144 00:17:08,000 --> 00:17:10,000 people who are trying to explore
145 00:17:10,000 --> 00:17:15,000 forms of addressing harm and conflict outside of
146 00:17:15,000 --> 00:17:17,000 violence and incarceration,
147 00:17:17,000 --> 00:17:27,000 and then also people trying to build new kinds of organizations with blockchains, in the kind of crypto subculture. Very different languages, very different,
148 00:17:27,000 --> 00:17:32,000 you know, cultures in all sorts of ways. But in both cases they are
149 00:17:32,000 --> 00:17:34,000 they're confronted with a problem.
150 00:17:34,000 --> 00:17:38,000 They have to find ways to solve...
151 00:17:38,000 --> 00:17:42,000 to do governance
152 00:17:42,000 --> 00:17:48,000 collectively, without relying on someone telling them what to do, without
153 00:17:48,000 --> 00:17:52,000 kind of an admin who's in charge.
154 00:17:52,000 --> 00:17:56,000 And this is what our online spaces are set up for: for, you know,
155 00:17:56,000 --> 00:17:59,000 whoever's running the server to be in charge.
156 00:17:59,000 --> 00:18:06,000 A kind of top-down model that I call implicit feudalism. This is the idea
157 00:18:06,000 --> 00:18:07,000 that
158 00:18:07,000 --> 00:18:16,000 on every online service, on every platform (and don't even think about, like, the big companies; think about the small-scale communities first),
159 00:18:16,000 --> 00:18:21,000 you know, think about your group chat. Somebody's in charge of that group chat.
160 00:18:21,000 --> 00:18:23,000 The system is set up for that.
161 00:18:23,000 --> 00:18:25,000 And...
162 00:18:25,000 --> 00:18:26,000 Slack.
163 00:18:26,000 --> 00:18:27,000 Yeah, absolutely.
164 00:18:27,000 --> 00:18:28,000 Like the Slack channel of my students. Yeah.
165 00:18:28,000 --> 00:18:29,000 Yeah.
166 00:18:29,000 --> 00:18:30,000 Pick your tool.
167 00:18:30,000 --> 00:18:37,000 And yet in these spaces, both, whether they're trying to manage a decentralized blockchain or
168 00:18:37,000 --> 00:18:41,000 address harm in neighborhoods,
169 00:18:41,000 --> 00:18:48,000 these are spaces where people are having to solve problems of collective action and collective governance that
170 00:18:48,000 --> 00:18:51,000 the current tools are not set up for.
171 00:18:51,000 --> 00:18:54,000 In the case of the transformative justice folks,
172 00:18:54,000 --> 00:19:00,000 in a lot of the documents out of those communities, they've actually just said, even though these are very online people:
173 00:19:00,000 --> 00:19:02,000 don't do these processes online.
174 00:19:02,000 --> 00:19:07,000 Our online spaces are not set up for it. And to me, that's really profound.
175 00:19:07,000 --> 00:19:09,000 You know, these are not, like,
176 00:19:09,000 --> 00:19:21,000 people who hate technology. These are people who are on Instagram all the time and all that. But they know that for these sensitive processes of addressing harm, we have not built our technology for that.
177 00:19:21,000 --> 00:19:27,000 And that poses a question to me: what would it look like if
178 00:19:27,000 --> 00:19:32,000 if that was the kind of use case that we were building tools around? How would our tools be different?
179 00:19:32,000 --> 00:19:50,000 In the context of blockchains, this is a space where people are trying to build something that breaks the implicit feudalist logic, without necessarily using those words, in the sense of trying to create tools that are co-governed among multiple participants. And from
180 00:19:50,000 --> 00:20:04,000 out of those technologies, actually, it's created the problem of self-governance in a way that we haven't seen online before. And as a result it's forced the development of a whole bunch of new
181 00:20:04,000 --> 00:20:12,000 governance mechanisms and technologies to solve the problem of collective decision making, to enable it to be a practice.
182 00:20:12,000 --> 00:20:14,000 And in some ways the
183 00:20:14,000 --> 00:20:19,000 you know, in both of these cases, they point to
184 00:20:19,000 --> 00:20:25,000 the recognition that, oh, this isn't something we've even tried to address yet.
185 00:20:25,000 --> 00:20:36,000 And as they enter into these challenges, either by refusing to engage online, or by actually having to build a whole new set of tools from scratch,
186 00:20:36,000 --> 00:20:38,000 it's a
187 00:20:38,000 --> 00:20:43,000 it's a reminder of how much that implicit feudalism has been dominant,
188 00:20:43,000 --> 00:20:56,000 and how much, actually, we haven't really explored the question of what it could look like to self-govern. And it's that design space, above all, that this book is an attempt to invite folks into.
189 00:20:56,000 --> 00:20:57,000 Yeah.
190 00:20:57,000 --> 00:21:00,000 So you're talking about,
191 00:21:00,000 --> 00:21:03,000 like, where and how do we, like, learn
192 00:21:03,000 --> 00:21:07,000 possible ways of self-governing. And, like, I love this,
193 00:21:07,000 --> 00:21:16,000 you know, finding the spaces, like, you know, finding the spaces where people are making things we don't know about. Then there's
194 00:21:16,000 --> 00:21:22,000 you're talking about, like, having the tools to actually support that. You know, like, the capitalist
195 00:21:22,000 --> 00:21:26,000 internet programming culture and machine
196 00:21:26,000 --> 00:21:30,000 doesn't have the incentives to, like, give people the time to build those tools.
197 00:21:30,000 --> 00:21:38,000 But then there's also... I wanted to ask you about time. You know, self-governance is really hard. And so one of the things that's
198 00:21:38,000 --> 00:21:42,000 been happening in San Diego:
199 00:21:42,000 --> 00:21:43,000 a bunch of
200 00:21:43,000 --> 00:21:50,000 students and I, and also some folks in the DSA here, have been working with the United Taxi Workers of San Diego over the last four years.
201 00:21:50,000 --> 00:22:12,000 Yeah, United Taxi Workers is a taxi worker center, and we've been helping them launch an app called Ride United that's backed by a cooperative of drivers, so that the app is accountable to driver wants and needs. But it's been really tricky to figure out, like, what drivers want to make decisions about, what drivers don't want to make decisions about, and then figuring out how to structure our time
202 00:22:12,000 --> 00:22:28,000 so that they actually have the time to do that, because they're working really long days. You know, they're, like, always on the road. You know, we've, like, done workshops in the middle of the taxi lot. You know, it took a year just to write the bylaws, you know, and a lot of the drivers who are active didn't have the time to actually spend doing the bylaws.
203 00:22:28,000 --> 00:22:29,000 So, like,
204 00:22:29,000 --> 00:22:30,000 how...
205 00:22:30,000 --> 00:22:37,000 How do we, you know, like, what are some of the things that are gonna be hard for us? Like, what should we expect? And, like, how do we
206 00:22:37,000 --> 00:22:43,000 figure out how to create the time that we need, and be kind of judicious about that?
207 00:22:43,000 --> 00:22:48,000 First of all, it's important to recognize that governance is always happening somewhere.
208 00:22:48,000 --> 00:22:49,000 Hmm, hmm.
209 00:22:49,000 --> 00:22:53,000 And if you're not doing it, someone else is doing it.
210 00:22:53,000 --> 00:22:55,000 And that doesn't mean that it's...
211 00:22:55,000 --> 00:22:56,000 Brings out my anti answer.
212 00:22:56,000 --> 00:22:59,000 It's easy for them. I mean, like,
213 00:22:59,000 --> 00:23:07,000 read an account of early startup culture, right? You know, or watch The Social Network, or whatever. I mean, like,
214 00:23:07,000 --> 00:23:11,000 these are complicated stories, no matter what.
215 00:23:11,000 --> 00:23:19,000 And, you know, doing governance at large scale: like, investors do that in the stock market, right? I mean,
216 00:23:19,000 --> 00:23:29,000 there is a voting structure, there are decision-making structures, there are boards, all these things. So, you know, in some respects I just want to hold that recognition that, like,
217 00:23:29,000 --> 00:23:33,000 this stuff is happening already. It's just that many of us are not
218 00:23:33,000 --> 00:23:35,000 cut in on the deal.
219 00:23:35,000 --> 00:23:36,000 Hmm.
220 00:23:36,000 --> 00:23:38,000 Right? So it's really a question not of, like,
221 00:23:38,000 --> 00:23:47,000 could we possibly self-govern, but actually, like, can we change the question of who gets to participate?
222 00:23:47,000 --> 00:23:53,000 And, you know, and also then the question becomes, you know, how do we design
223 00:23:53,000 --> 00:23:55,000 practices that are appropriate to
224 00:23:55,000 --> 00:24:00,000 our organizations. And, you know, that requires
225 00:24:00,000 --> 00:24:02,000 beginning to recognize that,
226 00:24:02,000 --> 00:24:15,000 you know, there's an attention economy here, and there are some things that we want to be involved in, some things that we don't. In the book I explore a bunch of what I call stacks, where communities
227 00:24:15,000 --> 00:24:35,000 have been trying to enlarge the space of self-governance. I focus on ones that I've been involved in (I see there are a few folks from these here in the room), like May First, which is a kind of server-hosting cooperative I've been part of for years, and Social.coop, which is a Mastodon server that a group of us co-govern.
228 00:24:35,000 --> 00:24:45,000 And in each case, you know: May First, I don't have to be too involved. I go to an annual meeting, put in my two cents. Social.coop, I'm more involved.
229 00:24:45,000 --> 00:24:48,000 Also think of things like my credit union, which,
230 00:24:48,000 --> 00:24:53,000 you know, I'm not involved in at all, really, but it has a stakeholder structure set up for that.
231 00:24:53,000 --> 00:24:56,000 And I think we have to recognize that there's going to be a topology
232 00:24:56,000 --> 00:25:01,000 of participation. There are things that we
233 00:25:01,000 --> 00:25:10,000 should really be involved in, and should put time into. And then there are things where we can design them to be essentially accountable to us without us having to participate every day.
234 00:25:10,000 --> 00:25:21,000 And that can change over the life of an organization. Maybe at the very beginning there's a lot of work involved, but maybe we should design that process so that at the end
235 00:25:21,000 --> 00:25:28,000 those drivers can focus on driving, you know, or whatever. Or if they want to be really involved, they can be.
236 00:25:28,000 --> 00:25:31,000 But this is a design space, I think, that is just...
237 00:25:31,000 --> 00:25:36,000 Again, in cases like in the crypto world,
238 00:25:36,000 --> 00:25:52,000 some of these early experiments have shown the need to really better design around attention economies. This is an area of research I've been really focusing on now; I'm just wrapping up a paper on precisely that topic, of how, what kinds of questions should you ask
239 00:25:52,000 --> 00:26:01,000 when designing, you know, a governance surface, a system for self-governance?
240 00:26:01,000 --> 00:26:04,000 And what is the appropriate level?
241 00:26:04,000 --> 00:26:08,000 These are old questions again. You know, the
242 00:26:08,000 --> 00:26:18,000 the rural electric cooperatives that provide electricity: there are forty-something just in Colorado, and hundreds around the United States, and, you know,
243 00:26:18,000 --> 00:26:23,000 more elsewhere. They've been dealing with problems of participation and attention too.
244 00:26:23,000 --> 00:26:34,000 You know, and they have different ways of addressing that. Here in Colorado there's, you know, a person dressed in a robot suit who comes to the annual meetings, you know, just part of the way in which
245 00:26:34,000 --> 00:26:41,000 we make an effort to make sure it's fun to come to your rural co-op meeting. Maybe in another context,
246 00:26:41,000 --> 00:26:44,000 you know, people have other ways of engaging participation.
247 00:26:44,000 --> 00:26:54,000 In other contexts we recognize, you know, maybe we need more, maybe we need less. But, you know, to me the question here is, first of all,
248 00:26:54,000 --> 00:27:00,000 to recognize that these are problems we can solve, and that we solve in other contexts of life.
249 00:27:00,000 --> 00:27:06,000 You know, we self-govern at local levels and regional levels and national levels
250 00:27:06,000 --> 00:27:19,000 all the time, and we figure out ways to balance those. I don't think it's too much to ask to also be able to find appropriate ways of doing that in our online lives and beyond.
251 00:27:19,000 --> 00:27:25,000 Yeah, and one of the things that also makes me think about is, you know, making sure people
252 00:27:25,000 --> 00:27:32,000 have the resources to do the participation, and that could also mean, like, childcare at the governance meetings.
253 00:27:32,000 --> 00:27:39,000 There's this Turkopticon project; this is a platform for Amazon Mechanical Turk, like,
254 00:27:39,000 --> 00:27:41,000 data-processing gig workers,
255 00:27:41,000 --> 00:27:43,000 to
256 00:27:43,000 --> 00:27:49,000 report on employers.
It basically lets workers write reports about how employers are treating them so that we can share it with other workers. 257 00:27:49,000 --> 00:27:51,000 And for ten years that was kind of running on. 258 00:27:51,000 --> 00:27:56,000 Me and, like, Six Silberman and some worker moderators, kind of running on fumes. 259 00:27:56,000 --> 00:28:03,000 We didn't really have time to self-govern, or the skills, frankly, either. But then, over the last five years. 260 00:28:03,000 --> 00:28:08,000 We got a grant actually to pay workers themselves. 261 00:28:08,000 --> 00:28:19,000 To figure out how they wanted to govern the system and make decisions both about the system and what kind of advocacy workers would do. And so, you know, Turk workers are being paid like 18 bucks an hour. 262 00:28:19,000 --> 00:28:26,000 Because if they weren't getting paid for that, then they would basically be having to do Amazon Turk work to make ends meet. 263 00:28:26,000 --> 00:28:32,000 So, like, fundraising can actually be a really important way of doing the social provisioning, of making sure that you don't have only, like. 264 00:28:32,000 --> 00:28:37,000 Middle-class, upper-middle-class people who have the time to be in these spaces. And you actually talk about. 265 00:28:37,000 --> 00:28:45,000 That a little bit, citing some feminist philosophers talking about social provisioning, which I really appreciated. 266 00:28:45,000 --> 00:28:46,000 In your book. 267 00:28:46,000 --> 00:28:48,000 So 268 00:28:48,000 --> 00:28:49,000 I guess, like. 269 00:28:49,000 --> 00:28:56,000 With that, I kind of wanted to talk about other kinds of resourcing as well. So, like, one of the challenges. 270 00:28:56,000 --> 00:29:00,000 That sometimes comes up with these projects that I've been involved in is. 271 00:29:00,000 --> 00:29:18,000 You know, actually finding the financial resources to give, you know, in my case, like, workers who are most directly impacted the time to be involved in the actual design of it.
Getting the financial resources for capital investment, you know; like, venture capitalists aren't going around looking for. 272 00:29:18,000 --> 00:29:22,000 Community-run projects that are gonna be, you know, community-run. 273 00:29:22,000 --> 00:29:27,000 You know, not meant to scale: community software. 274 00:29:27,000 --> 00:29:32,000 You know, in the case of the cooperative taxi app. 275 00:29:32,000 --> 00:29:45,000 We basically, like, cobbled the money together through a bunch of different grants. And, like, there's, you know, taking a tiny bit of the fees from the rides people take that is eventually supposed to help pay for the software bills. 276 00:29:45,000 --> 00:29:47,000 So how do we think about. 277 00:29:47,000 --> 00:29:53,000 Like, trying to do this cooperative democratic space design and building work in. 278 00:29:53,000 --> 00:29:54,000 A kind of, like. 279 00:29:54,000 --> 00:30:00,000 Policy space, and you talked about this in the last chapter of the book; you know, a policy space that is, like, deeply imperfect. 280 00:30:00,000 --> 00:30:04,000 For supporting it. Like, we could be struggling for policies that make it. 281 00:30:04,000 --> 00:30:11,000 Easier or incentivize these things. Like, I can see that as one branch, but then also, like, you know, the taxi app. 282 00:30:11,000 --> 00:30:28,000 The taxi app is, you know, operated by a private company that's run by a nice guy who is listening to the drivers, you know. But he's a private company; like, we don't own the software, because we couldn't get it together. So, like, how do we think about doing this work, like, imperfectly and transitionally? 283 00:30:28,000 --> 00:30:32,000 Yeah, no, I appreciate so much what you say about that social provisioning. 284 00:30:32,000 --> 00:30:40,000 We really do have to recognize that governance is an expense. In some respects it's something that should cost something. 285 00:30:40,000 --> 00:30:55,000 And it does, again, in our other contexts, right?
And whether you're talking about government or nonprofits or corporations, all of them expend money on governance. They pay people who do governance work. 286 00:30:55,000 --> 00:30:58,000 And we need to understand that that is. 287 00:30:58,000 --> 00:31:13,000 You know, an expectation, and should, you know, be a part of how we think about this challenge. You have to provision one way or another to support this labor. And if we had an economy that better honored. 288 00:31:13,000 --> 00:31:24,000 Self-governance and mutualist activity, I think that would be a very reasonable thing to ask. And in contexts like, for instance. 289 00:31:24,000 --> 00:31:28,000 Agricultural cooperatives, rural cooperatives, credit unions, where we have. 290 00:31:28,000 --> 00:31:31,000 Policy structures to support. 291 00:31:31,000 --> 00:31:32,000 Co-governance. 292 00:31:32,000 --> 00:31:55,000 You know, people get paid to participate in governance. This is in some respects a solved problem, in the contexts where we've decided to solve it historically. And, you know, at the end of this book: it starts out with our intimate, everyday, you know, communities that we inhabit online, but it ultimately does pose the question of. 293 00:31:55,000 --> 00:31:57,000 Policy, and it has to. 294 00:31:57,000 --> 00:32:00,000 Because we can't get to that work of. 295 00:32:00,000 --> 00:32:04,000 You know, self-governance, ultimately, unless there's real stake and ownership. 296 00:32:04,000 --> 00:32:27,000 And I think this requires creating forms of financing that are, you know, competitive with something like venture capital, that enable community ownership and in turn governance. And I really think that this is achievable. I've been working for many years with cooperatives and related kinds of organizations in tech.
297 00:32:27,000 --> 00:32:41,000 And over and over I see very reasonable ideas that, just because of policy structures, are not able to access capital for things that just should happen. You know, in the United States, for instance, we have. 298 00:32:41,000 --> 00:32:42,000 A situation where. 299 00:32:42,000 --> 00:32:49,000 Rural broadband is not available in many places where it needs to be. 300 00:32:49,000 --> 00:32:55,000 And this is a problem we solved before with rural electrification. As I described, we solved it with. 301 00:32:55,000 --> 00:33:05,000 Federal loan programs that are revenue-positive for the government, that don't cost anything, but that are set up to support collective ownership. 302 00:33:05,000 --> 00:33:12,000 We could be doing this for any kind of business. It might not be the same kinds of things that we get with. 303 00:33:12,000 --> 00:33:13,000 With 304 00:33:13,000 --> 00:33:23,000 Venture capital, but reasonable, proven businesses should be able to access capital. And right now they can't. 305 00:33:23,000 --> 00:33:25,000 You know, we have situations like, for instance, when. 306 00:33:25,000 --> 00:33:28,000 There was a time when the city of Austin enabled. 307 00:33:28,000 --> 00:33:38,000 The development of a nonprofit ride service. I used it. It worked just fine. It was much better for the drivers than Uber. It took. 308 00:33:38,000 --> 00:33:41,000 Tiny fees in comparison. 309 00:33:41,000 --> 00:33:44,000 And yet it just was not able to. 310 00:33:44,000 --> 00:33:55,000 Compete with companies that are able to pour billions and billions into controlling markets. And so we really need to. 311 00:33:55,000 --> 00:33:59,000 Ensure that there's the capacity to. 312 00:33:59,000 --> 00:34:19,000 Direct capital to shared ownership, to reasonable things that people wanna do. Again, I think this is more possible than it sounds, and in other sectors through our history we've done it, you know.
A few months ago I was down at CoBank, which is a cooperative bank for agricultural cooperatives, in Denver. 313 00:34:19,000 --> 00:34:26,000 And the CEO was talking about how much investment they're doing now in data centers. 314 00:34:26,000 --> 00:34:32,000 Co-ops, which you might not know are co-ops, like Land O'Lakes, are running farmer-owned AI, you know. 315 00:34:32,000 --> 00:34:35,000 They're already doing cooperative AI. 316 00:34:35,000 --> 00:34:43,000 Because they have access to capital through provisions developed, you know, a century ago. 317 00:34:43,000 --> 00:34:50,000 And now they're able to deploy that and be on the cutting edge right now. 318 00:34:50,000 --> 00:34:52,000 We need to enable that in any industry. 319 00:34:52,000 --> 00:34:55,000 And through that kind of. 320 00:34:55,000 --> 00:35:09,000 Policy design. And this really comes down to a way of rethinking how we approach solving problems with the Internet, right? Rather than hauling the CEOs into Congress and saying, you solve this problem from the top down. 321 00:35:09,000 --> 00:35:14,000 I think much more we need to ask ourselves, with every problem that we confront. 322 00:35:14,000 --> 00:35:36,000 Could we solve this problem with self-governance rather than with feudalism? You know, could we solve this problem by enabling the people closest to the problem to solve it, rather than turning once again to essentially reinforcing the power of a very few elite leaders of these companies? 323 00:35:36,000 --> 00:35:43,000 Maybe, one of the things you talk about in the book is, like, could we actually have policies that require the people closest to the problem. 324 00:35:43,000 --> 00:35:47,000 To actually be the ones charged with solving it? 325 00:35:47,000 --> 00:35:48,000 So. 326 00:35:48,000 --> 00:35:49,000 Like. 327 00:35:49,000 --> 00:35:59,000 In Canada.
There's a sociologist, Erik Olin Wright, and he has a great book called How to Be an Anticapitalist in the Twenty-First Century, and also Envisioning Real Utopias. And he talks about. 328 00:35:59,000 --> 00:36:01,000 Like, in the 1970s in Quebec. 329 00:36:01,000 --> 00:36:06,000 There was this big social movement, and they basically got. 330 00:36:06,000 --> 00:36:10,000 The government to pass a policy that funded childcare. 331 00:36:10,000 --> 00:36:11,000 For everybody. 332 00:36:11,000 --> 00:36:24,000 But the government would only pay for the childcare if it was paying a cooperatively governed childcare facility. People could pay for their feudally governed childcare if they wanted to, out of their private pocket. But the government. 333 00:36:24,000 --> 00:36:30,000 Subsidy would only go to the cooperatively owned. And so that's, like, a middle path between, like, making a law. 334 00:36:30,000 --> 00:36:32,000 That actually requires. 335 00:36:32,000 --> 00:36:40,000 More fair and dignified governance, and then actually, like, enabling the resources to flow to. 336 00:36:40,000 --> 00:36:44,000 People who wanna build those kinds of collectives or institutions. 337 00:36:44,000 --> 00:36:50,000 Which is promising. And I mention that because you said something about, you know, we need to. 338 00:36:50,000 --> 00:36:54,000 Have these, like, self-governed entities compete. 339 00:36:54,000 --> 00:37:05,000 With capital. We can't always compete with something that's got a big runway of venture capital, but we can have policies that resource them. So we can have, you know, in Austin: didn't the Austin co-op. 340 00:37:05,000 --> 00:37:11,000 Take off because the city actually kicked Uber and Lyft out for a while? 341 00:37:11,000 --> 00:37:23,000 And Minnesota, Minneapolis actually, well, they just passed a minimum wage law that negatively impacted Uber, and Uber and Lyft cried. And they left the city.
And so now the city is actually giving, like, $150,000. 342 00:37:23,000 --> 00:37:26,000 To form more, you know. 343 00:37:26,000 --> 00:37:32,000 To basically form, like, alternative rideshares. So I'm hoping that Minneapolis sticks to its guns. 344 00:37:32,000 --> 00:37:39,000 And, you know, but I think competing isn't always gonna even actually be a realistic thing. 345 00:37:39,000 --> 00:37:41,000 We can have policies that make it so you don't have to. 346 00:37:41,000 --> 00:37:59,000 Have competition with venture capital, right? 347 00:37:59,000 --> 00:38:05,000 Oh, like, literally, just. 348 00:38:05,000 --> 00:38:06,000 Good correction. 349 00:38:06,000 --> 00:38:07,000 Well, you can use different words here, and I think, you know, words matter immensely. But that story of Austin, I think, is a really fascinating one, because it wasn't that they kicked Uber and Lyft out. They said, Uber and Lyft, you have to follow the law, and you have to fingerprint your drivers for safety. And Uber and Lyft decided, we won't. 350 00:38:07,000 --> 00:38:09,000 Play by fair rules, right. 351 00:38:09,000 --> 00:38:10,000 And. 352 00:38:10,000 --> 00:38:11,000 And. 353 00:38:11,000 --> 00:38:16,000 Everything about, you know, the current structure of our economy is set up around. 354 00:38:16,000 --> 00:38:19,000 You know, is, was. 355 00:38:19,000 --> 00:38:23,000 Was created. I mean, venture capital was essentially. 356 00:38:23,000 --> 00:38:33,000 Non-scalable until 1979, when a series of changes to the tax code and rules around pension funds enabled. 357 00:38:33,000 --> 00:38:37,000 You know, the California pension funds to start investing in venture capital. 358 00:38:37,000 --> 00:38:41,000 And we have to recognize that the rules as they are today. 359 00:38:41,000 --> 00:38:48,000 Are created, and the ideas about what can. 360 00:38:48,000 --> 00:38:49,000 Work. 361 00:38:49,000 --> 00:39:01,000 You know, are crafted.
And I think we need to design those rules much more around enabling democratic problem-solving, and enabling shared ownership and shared benefit. 362 00:39:01,000 --> 00:39:08,000 In fact, like, that rural electrification loan program I keep mentioning was kind of indifferent to the structure. 363 00:39:08,000 --> 00:39:11,000 That came out. But actually. 364 00:39:11,000 --> 00:39:23,000 It was designed in a way to support communities coming together to solve these problems. And so it was really the cooperative that took hold in that context. 365 00:39:23,000 --> 00:39:30,000 You know, the case that you're describing with Montreal, you know, it's rooted in earlier work. 366 00:39:30,000 --> 00:39:31,000 In Italy. 367 00:39:31,000 --> 00:39:37,000 You know, that also really designed policy around. 368 00:39:37,000 --> 00:39:44,000 Community-governed solutions to healthcare problems, and to care challenges. And. 369 00:39:44,000 --> 00:39:48,000 You know, again, if we centered democratic problem-solving. 370 00:39:48,000 --> 00:39:58,000 As a way of approaching these questions: I don't think we have to, you know, kick out other models. Recognize that these other things are gonna be there. They're gonna show up. 371 00:39:58,000 --> 00:40:07,000 But we can design policy that recognizes. 372 00:40:07,000 --> 00:40:08,000 Yeah. 373 00:40:08,000 --> 00:40:11,000 You know, this needs to be a viable option. This needs to be possible. 374 00:40:11,000 --> 00:40:30,000 And right now, often, you know, that capacity to assemble and solve problems democratically is just not even there. And this is the case with this financial policy we've been talking about. But it's also the case just in the user experience designs of the technologies we use. You know, you mentioned Slack, for instance, like. 375 00:40:30,000 --> 00:40:43,000 Try to figure out.
You know, I'm in a number of communities on Slack that try to do, like, voting, and try to, like, make collective decisions together, and we can kind of hack it. But it always feels like we're working against the tool. 376 00:40:43,000 --> 00:40:51,000 You know, what would it look like to build tools where self-governance was kind of the first-class default assumption of what you're doing? 377 00:40:51,000 --> 00:41:00,000 And that's the, you know, the challenge to pose both for, you know, the practice of user experience design as well as for policy design. 378 00:41:00,000 --> 00:41:03,000 Yeah, there are a lot of really cool examples in the book of. 379 00:41:03,000 --> 00:41:07,000 Kind of technology designs and systems that. 380 00:41:07,000 --> 00:41:22,000 You know, instead of defaulting to assigning someone to be the administrator or moderator, default to the whole community needing to vote on, like, whether to kick somebody out or to take down a post. Like, there's a lot of examples of actual things people have built in here. 381 00:41:22,000 --> 00:41:28,000 That we can all go and look for, and see if we can incorporate some of those. 382 00:41:28,000 --> 00:41:31,000 There was one thing that I. 383 00:41:31,000 --> 00:41:38,000 Okay, so, like, I guess my last question is, like: you know, you're talking about the capacity for self-governance, like. 384 00:41:38,000 --> 00:41:39,000 Sometimes, you know. 385 00:41:39,000 --> 00:41:49,000 It's something that there's a lot of ways we do in our day-to-day life, where for certain kinds of systems, for certain kinds of. 386 00:41:49,000 --> 00:42:00,000 Products and industries, like, it's actually really difficult, because of things like tax codes and regulations, and, you know, infinite runways of venture capital to write laws in California, say. 387 00:42:00,000 --> 00:42:10,000 So, like.
What are some of, you know, as we try, for those of us who are convinced that we wanna incorporate more of this into our lives, like. 388 00:42:10,000 --> 00:42:26,000 Like, what are some of the things that we might have to learn, not just as skills, you know, about how to run stuff, but as kind of capacities in our bodies, emotionally, in terms of how we relate to each other, in order to be able to do this? And for me, one of the places this is coming from is. 389 00:42:26,000 --> 00:42:30,000 You know, I was, you know, trained in computer science. 390 00:42:30,000 --> 00:42:36,000 You know, intense long work grinds, working on projects, working to deadlines, like a real, like. 391 00:42:36,000 --> 00:42:46,000 Moving fast, the whole moving fast and breaking things. Like, we talk about it, but also it's, like, something in, for me, the culture of tech, where I see a lot of people. 392 00:42:46,000 --> 00:42:55,000 Get really excited and just, like, move fast; get frustrated taking time for slow discussions to teach each other the different things that we know. 393 00:42:55,000 --> 00:43:05,000 Like, what have you had to learn in order to be able to be part of this design work? What have you had to emotionally unlearn? 394 00:43:05,000 --> 00:43:14,000 Thanks. Yeah, I think this is such a hard question. And to me it's hard because there are so many different answers. 395 00:43:14,000 --> 00:43:23,000 And I think it's really important to hold that recognition that there is no one kind of self-governance. And I think one of the deep limits of. 396 00:43:23,000 --> 00:43:31,000 Cultures like, you know, for instance, the garden club, and its Robert's Rules of Order, and its kind of. 397 00:43:31,000 --> 00:43:32,000 You know, class.
398 00:43:32,000 --> 00:43:42,000 Identities and things like that, is that it's often come out of a culture that assumes there's one way of doing self-governance. And there are so many. 399 00:43:42,000 --> 00:43:50,000 You know, one project I've been working on over the years is this effort called governance archaeology: trying to gather and. 400 00:43:50,000 --> 00:44:01,000 Synthesize and find a broader repertoire of governance practices that have existed through human history and in diverse cultures. 401 00:44:01,000 --> 00:44:06,000 And this is, I think, an opportunity for us to, you know, explore. 402 00:44:06,000 --> 00:44:08,000 Explore those. 403 00:44:08,000 --> 00:44:11,000 You know, that depth of the traditions that we inherit. 404 00:44:11,000 --> 00:44:14,000 Different ways that we can. 405 00:44:14,000 --> 00:44:15,000 That we can 406 00:44:15,000 --> 00:44:33,000 Embody self-governance, whether that's through things like collective decision-making; these are one kind of thing we know how to do. Another kind of thing that we could develop learning on is practices like juries. You know, things that. 407 00:44:33,000 --> 00:44:41,000 Yeah, I mean, one of the most important experiences of my life, politically, was sitting on a criminal jury in New York City, in Brooklyn. 408 00:44:41,000 --> 00:44:44,000 And it just forced me into a kind of embodied. 409 00:44:44,000 --> 00:44:48,000 Conversation that I had never been in before. 410 00:44:48,000 --> 00:44:49,000 You know. 411 00:44:49,000 --> 00:44:55,000 Also just turning to our own cultural traditions, or coming into conversation. 412 00:44:55,000 --> 00:45:02,000 With the cultural traditions of others, and asking, you know, how would you solve this problem? How would my ancestors solve this problem? 413 00:45:02,000 --> 00:45:06,000 And then exploring: what could it look like to have.
414 00:45:06,000 --> 00:45:11,000 Technical systems capable of representing those, of. 415 00:45:11,000 --> 00:45:17,000 Of being spaces in which those practices are possible. A lot of, you know, we were talking about the. 416 00:45:17,000 --> 00:45:20,000 Transformative justice work. A lot of that comes out of. 417 00:45:20,000 --> 00:45:26,000 Ancestral practices, trying to reclaim ancestral practices for problem-solving. 418 00:45:26,000 --> 00:45:29,000 And I think that's. 419 00:45:29,000 --> 00:45:37,000 That's a lot of the work, you know: recognizing that these are not new challenges. Actually, the problem is that we've created. 420 00:45:37,000 --> 00:45:58,000 We've created these technical spaces that actively prevent us from continuing to inhabit, you know, the kinds of practices that our ancestors have used; and, you know, to continue exploring and building new ones, and experimenting and modifying and sharing those learnings out. 421 00:45:58,000 --> 00:46:00,000 And. 422 00:46:00,000 --> 00:46:03,000 You know, so often to me the lessons are. 423 00:46:03,000 --> 00:46:14,000 Not around how do we totally reinvent ourselves and reinvent the wheel, but how do we actually recognize how to reclaim things that we already know how to do. 424 00:46:14,000 --> 00:46:30,000 And how do we recognize that, you know, a lot of the answers are kind of around us already? They're just misplaced and excluded from the spaces that need them most. 425 00:46:30,000 --> 00:46:32,000 Yeah, let's remember some of that knowledge. 426 00:46:32,000 --> 00:46:35,000 Thank you for this. 427 00:46:35,000 --> 00:46:59,000 Well, thank you for this conversation. I was so entranced by it I kind of forgot I had a job to do, which was to ask you some questions from the audience that were coming in.
And we have some really, really good ones. One that I thought was really fascinating, especially at this time of year, where, you know, we are now. 428 00:46:59,000 --> 00:47:06,000 Bombarded with messaging about governance at the national level. There's an election happening, and you'd have to be, like. 429 00:47:06,000 --> 00:47:09,000 Living under a rock to not know this. 430 00:47:09,000 --> 00:47:22,000 And one of the major challenges, or what we see happening, is there's this huge effort to get people who are basically disengaged from the governance system, from. 431 00:47:22,000 --> 00:47:41,000 You know, caring at all, really, about voting, to try to motivate them to go vote. And so the question is about, like, in online spaces: what's the secret sauce, or what are the right ingredients, to really drive engagement with governance? 432 00:47:41,000 --> 00:47:44,000 And, I mean, I think it really begins with. 433 00:47:44,000 --> 00:47:48,000 Asking, like, governance for what, and for whom. And. 434 00:47:48,000 --> 00:47:50,000 And. 435 00:47:50,000 --> 00:47:51,000 Making sure that it is. 436 00:47:51,000 --> 00:47:57,000 That it's aligned. I mean, a lot of the problem with participation in government. 437 00:47:57,000 --> 00:48:01,000 Stuff is that people rightly recognize that. 438 00:48:01,000 --> 00:48:04,000 These decisions are not gonna serve me, no matter what. 439 00:48:04,000 --> 00:48:07,000 And the disengagement, I think, is earned. 440 00:48:07,000 --> 00:48:08,000 You know. 441 00:48:08,000 --> 00:48:14,000 And I've seen that a lot in online spaces, where I've had, like, you know, crypto projects coming and asking, like. 442 00:48:14,000 --> 00:48:18,000 Why is it that nobody participates in my. 443 00:48:18,000 --> 00:48:22,000 Token vote? And it's, like, it's obvious: because it doesn't affect them. 444 00:48:22,000 --> 00:48:31,000 Same for why.
A corrupt, you know, electric co-op doesn't get people showing up at its annual meetings, because those people know that. 445 00:48:31,000 --> 00:48:34,000 The board of directors doesn't actually listen to them. 446 00:48:34,000 --> 00:48:47,000 And so I think it's, first of all, asking: how much do you really need to ask of these people? How do you align the participation to actually fit into their lives and honor their busyness and the value of their time? 447 00:48:47,000 --> 00:48:50,000 But also to ensure that we're designing. 448 00:48:50,000 --> 00:48:58,000 Systems really around, you know, where what people put in, you know, is reciprocated. 449 00:48:58,000 --> 00:48:59,000 And that's. 450 00:48:59,000 --> 00:49:15,000 You know, I had some folks who are building a cooperative DAO called land. It's a massive cooperative, using tokens and things like this, and they were describing how they think of governance as part of their product design. 451 00:49:15,000 --> 00:49:23,000 You know, they think about the design of the governance experience the same way they think about designing a product that. 452 00:49:23,000 --> 00:49:29,000 Is worth people's time and use. And I think that's one approach that. 453 00:49:29,000 --> 00:49:31,000 In some respects honors. 454 00:49:31,000 --> 00:49:44,000 The, you know, the participants, more than, like, blaming participants, or, you know, blaming members or whatever, for not doing enough when it's actually not worth their time. 455 00:49:44,000 --> 00:49:50,000 May I jump in on this really briefly, too? Yeah, 100%, do it. Thanks, Nathan. So. 456 00:49:50,000 --> 00:49:52,000 I've also been in. 457 00:49:52,000 --> 00:50:05,000 Collectively governed projects like Turkopticon, where there are people who actually would vote in really different places on the political spectrum. But when we're actually working on something that, like, we all have.
458 00:50:05,000 --> 00:50:15,000 A stake in, and it's concrete, and it's not mediated through, like, charismatic political leaders, like wedge issues. 459 00:50:15,000 --> 00:50:24,000 We actually can do a lot together. And so something that really frustrates me is the way that, you know, I feel like the US government has, like, leaned into. 460 00:50:24,000 --> 00:50:29,000 Misinformation and social media as the characterization of the problem, and then. 461 00:50:29,000 --> 00:50:36,000 Attempts at policy solutions that are, like, laws that basically, implicitly, empower social media to have. 462 00:50:36,000 --> 00:50:38,000 An even heavier hand. 463 00:50:38,000 --> 00:50:43,000 In unilateral and authoritarian, like, content moderation. 464 00:50:43,000 --> 00:50:48,000 Because, you know, like, misinformation, I don't think, is as big a problem as. 465 00:50:48,000 --> 00:50:59,000 A kind of lack of spaces to collectively problem-solve, and devolve, like, resources and decision-making to the people we're actually doing the thing with. 466 00:50:59,000 --> 00:51:11,000 So, one more. I have a whole list; I'm not gonna be able to get to them all. But this is, I guess, a sort of softball question, but I think it would actually be really helpful for everyone on the talk. 467 00:51:11,000 --> 00:51:26,000 One person asks if you could share just a few bits of wisdom that you have about common misconceptions or pitfalls of self-governance, especially for those who are new to it. 468 00:51:26,000 --> 00:51:29,000 Yeah, thank you. 469 00:51:29,000 --> 00:51:30,000 You know, again. 470 00:51:30,000 --> 00:51:41,000 We've talked about one a lot, around participation, you know: don't mismatch your expectations of participation to the actual problem. 471 00:51:41,000 --> 00:51:43,000 Another is. 472 00:51:43,000 --> 00:51:45,000 The role of leadership.
473 00:51:45,000 --> 00:51:48,000 Having collective governance. 474 00:51:48,000 --> 00:51:53,000 Does not remove the need for leadership, and for people to hold vision and to. 475 00:51:53,000 --> 00:51:59,000 Play that kind of role. The real question, I think, at stake here is. 476 00:51:59,000 --> 00:52:01,000 Who are leaders accountable to? 477 00:52:01,000 --> 00:52:06,000 You know, when I talk with people who are building democratic organizations, a question I often ask them is, like. 478 00:52:06,000 --> 00:52:07,000 What. 479 00:52:07,000 --> 00:52:14,000 Who do you wanna be worried about when you're lying awake at night, you know, trying to work through a hard challenge, and. 480 00:52:14,000 --> 00:52:15,000 You know. 481 00:52:15,000 --> 00:52:20,000 You have to make a call that's gonna affect people in different ways. Who do you want. 482 00:52:20,000 --> 00:52:23,000 To ultimately be accountable to? 483 00:52:23,000 --> 00:52:25,000 That question, I think. 484 00:52:25,000 --> 00:52:33,000 Is more important than what I think is often an impulse, which is to imagine that. 485 00:52:33,000 --> 00:52:41,000 You can run communities as a kind of undifferentiated mass in which they make kind of spontaneous. 486 00:52:41,000 --> 00:52:52,000 Decisions and so forth together. I think there's still a need for, you know, people to hold positions of responsibility. 487 00:52:52,000 --> 00:52:57,000 And again, you know, it's like, a lot of the. 488 00:52:57,000 --> 00:53:10,000 Ways of addressing these things are kind of hidden in plain sight. They're in the things we're doing. It's about realigning some of the things we're doing, you know, getting the accountability flows right. 489 00:53:10,000 --> 00:53:12,000 Rather than imagining a. 490 00:53:12,000 --> 00:53:14,000 Kind of.
491 00:53:14,000 --> 00:53:21,000 Total reinvention of how we run our organizations and our communities. 492 00:53:21,000 --> 00:53:24,000 Chris, do I have time for one more? 493 00:53:24,000 --> 00:53:25,000 I think I have time for one more. So, there was an interesting question. 494 00:53:25,000 --> 00:53:27,000 Go for it. 495 00:53:27,000 --> 00:53:40,000 That came in actually at the beginning of the talk, and I've been pondering it the whole time, which is basically getting at differences between public and private spaces, and 496 00:53:40,000 --> 00:53:48,000 Like, is that actually a meaningful distinction in online spaces and governance? And to me, that's super interesting, because. 497 00:53:48,000 --> 00:54:06,000 Depending on what room you're in, from a legal standpoint, people fight about this all the time, 'cause it has a huge significance for what sort of legal structures apply to governance of that space. So I guess the question there is, like: how do you view that distinction, and how does that relate to self-governance? 498 00:54:06,000 --> 00:54:12,000 Lilly, do you have any initial thoughts? 499 00:54:12,000 --> 00:54:17,000 Yeah, I think it's a really interesting question, and it's a hard one. 500 00:54:17,000 --> 00:54:21,000 You know, in part because I think the. 501 00:54:21,000 --> 00:54:24,000 You know, if you take some of these ideas, they're. 502 00:54:24,000 --> 00:54:26,000 In some ways kind of. 503 00:54:26,000 --> 00:54:32,000 Moderate and humble, but at the same time they're kind of radical. Because if we were really. 504 00:54:32,000 --> 00:54:40,000 Self-governing our online spaces, we would be citizens of something that crosses borders, that. 505 00:54:40,000 --> 00:54:45,000 Is not really under the jurisdiction of our territorial governments. 506 00:54:45,000 --> 00:54:50,000 That kind of breaks a lot of the political categories that we're used to.
507 00:54:50,000 --> 00:54:51,000 And 508 00:54:51,000 --> 00:55:01,000 I think this is something that is profoundly needed. It's why we spend so much time in our online lives: because they enable us to connect with people 509 00:55:01,000 --> 00:55:05,000 not simply in the context of 510 00:55:05,000 --> 00:55:07,000 imagined lines in the sand. 511 00:55:07,000 --> 00:55:09,000 And 512 00:55:09,000 --> 00:55:10,000 this 513 00:55:10,000 --> 00:55:15,000 also breaks that distinction between private and public, between 514 00:55:15,000 --> 00:55:18,000 governmental and non-governmental. 515 00:55:18,000 --> 00:55:23,000 And I think that's good. I think 516 00:55:23,000 --> 00:55:26,000 this kind of journey opens the door toward 517 00:55:26,000 --> 00:55:29,000 this 518 00:55:29,000 --> 00:55:34,000 concept of non-exclusive sovereignties, of 519 00:55:34,000 --> 00:55:36,000 a world in which 520 00:55:36,000 --> 00:55:37,000 we have overlapping 521 00:55:37,000 --> 00:55:42,000 affinities and relationships. I end the book with a picture that comes from 522 00:55:42,000 --> 00:55:50,000 the website native-land.ca, which is a map of indigenous territories around the world. 523 00:55:50,000 --> 00:55:53,000 And they're overlapping. I'm speaking from 524 00:55:53,000 --> 00:56:10,000 the territories of the Ute, Cheyenne, and Arapaho peoples. The land was not exclusively held by one or another; it wasn't easy to distinguish between private and public in that context. 525 00:56:10,000 --> 00:56:15,000 And the practices of those peoples continue that kind of logic. 526 00:56:15,000 --> 00:56:21,000 And I think this is an invitation: if we can self-govern our online spaces, 527 00:56:21,000 --> 00:56:23,000 to be able to inhabit
528 00:56:23,000 --> 00:56:28,000 new kinds of political belongings that are actually very old, 529 00:56:28,000 --> 00:56:37,000 and to not have to rely on these made-up and militarized borders as the 530 00:56:37,000 --> 00:56:42,000 clearest dividing lines of our political and social lives. 531 00:56:42,000 --> 00:56:45,000 So in some ways this 532 00:56:45,000 --> 00:56:54,000 stuff is kind of small. We're talking about user experience design and, you know, tech stuff. But it's also about 533 00:56:54,000 --> 00:56:56,000 reorienting our imagination about 534 00:56:56,000 --> 00:56:59,000 what we belong to, 535 00:56:59,000 --> 00:57:05,000 and I don't think that some of the clean distinctions we're used to fit into that world. 536 00:57:05,000 --> 00:57:12,000 Can I just underscore how Nathan put it, to give a really concrete example? I live here in 537 00:57:12,000 --> 00:57:16,000 the border region, where there's the Kumeyaay Nations' land, 538 00:57:16,000 --> 00:57:37,000 and there's the United States' jurisdiction, and there's Mexico's jurisdiction, and we have water pollution that moves back and forth across the border, caused largely by American-owned companies that are located in Mexico. And Kumeyaay people actually know a lot about the ways of the land and how the water flows and works. 539 00:57:37,000 --> 00:57:39,000 So, you know, 540 00:57:39,000 --> 00:57:43,000 why does the solution to that need to move through DC 541 00:57:43,000 --> 00:57:46,000 at all? I feel like that's something a lot of people can sympathize with. 542 00:57:46,000 --> 00:57:53,000 And these questions are about overlapping sovereignties, or even also respecting older sovereignties that were kind of
ignored, where treaties were broken. It's the more profound possibility of 544 00:58:01,000 --> 00:58:04,000 the discussion that Nathan is opening up 545 00:58:04,000 --> 00:58:16,000 with a discussion of just our online lives, but it applies to much more. When I was reading the book, I was actually thinking about that, so I'm kind of surprised the conversation got here, and I'm really happy about that. 546 00:58:16,000 --> 00:58:31,000 So Chris has shown back up on the screen, which means it is time for us to wrap up. Thank you both. This was such a fantastic conversation, and everyone on the call, if you haven't read the book, definitely check it out. Thank you. 547 00:58:31,000 --> 00:58:37,000 Thank you so much for hosting and facilitating and making this space possible, and thank you to everyone who came. 548 00:58:37,000 --> 00:59:01,000 Yeah, thanks, everyone, for your time, and thanks, Nathan and Lily, for such a fantastic conversation. There's lots of love here in the chat. And like Dave, I got so wrapped up in the conversation that I forgot I had some jobs to do, but here I am, back to help bring this to a close. So, as Dave mentioned, read the book. 549 00:59:01,000 --> 00:59:18,000 Duncan's gonna drop a link into the chat. You can read it online, where it's available as an open access edition, and you can also buy it in print, if you prefer to read in print, from the booksellers that Nathan lists on his site. 550 00:59:18,000 --> 00:59:43,000 As we wind down here, if you need to bounce, go for it, but in our remaining time I want to tell you about a couple of upcoming events that I think you'll want to participate in.
So on October 10th we'll have our rescheduled book talk from earlier this year, which we weren't able to hold because of some absences, with author Barbara McQuade for her bestselling book Attack from Within. McQuade deals with disinformation 551 00:59:43,000 --> 00:59:54,000 and its effect on democracy, which will, of course, no doubt continue to be a hot topic in October as it is now, so I'm sure you won't want to miss that discussion. 552 00:59:54,000 --> 01:00:19,000 Then later in October we'll host two days of events for the Internet Archive's annual celebration. Up first, on October 22nd, is Doors Open, our annual behind-the-scenes tour and party at the Physical Archive. We're opening the doors to an often unseen place where you can tour the physical collections of books, film, music, and video in Richmond, California. You'll be able to see the life cycle of physical books, 553 01:00:19,000 --> 01:00:44,000 from donation to preservation, digitization, and access. You'll also get to see samples from generous donations and other acquisitions of books, microfiche, records, and all the wonderful stuff that the Internet Archive has in analog form as it's being digitized and made available in digital form. All of that will be on display. It's a really fun night, so if you're in the Bay Area on October 22nd, you definitely will not want to miss it. 554 01:00:44,000 --> 01:00:45,000 And then, 555 01:00:45,000 --> 01:01:10,000 the next day, on October 23rd, we'll host the Internet Archive's annual street party and celebration at our headquarters at 300 Funston in San Francisco. This year's gathering, The Memory Hole, explores the vital role that libraries play in protecting our digital heritage as corporate decision-makers increasingly control what stays online. MTV News? Cartoon Network? Anyone?
556 01:01:10,000 --> 01:01:19,000 Libraries like the Internet Archive stand as guardians of our shared culture, ensuring that it remains preserved and accessible for future generations. 557 01:01:19,000 --> 01:01:43,000 We have tickets available for in-person attendance, which runs 5 to 10 pm; it's a big party, and you'll want to come. Or you can sign on for the virtual livestream from the Great Room, which is at 7 pm Pacific, 10 pm Eastern. We know it's late, but it'll be archived for replay and also streamed on YouTube and other channels. So I do hope you'll join us for those events. They're going to be a lot of fun. 558 01:01:43,000 --> 01:02:07,000 So, to close down here, I would like to wrap, as we always do, with commitments and thank-yous. Today's session was recorded. The links we've shared, the conversation, and all the resources will be available on the Internet Archive tonight, and tomorrow all registrants will receive an email with a link to the recording. As for thank-yous: a big thank you, of course, to Nathan and to Lily for such a great conversation, 559 01:02:07,000 --> 01:02:21,000 to Dave Hansen and Authors Alliance for co-hosting, and, as always, to you, our audience, for showing up, being respectful, and giving great feedback, great questions, and your time and enthusiasm today. We certainly appreciate it. 560 01:02:21,000 --> 01:02:51,000 Final links are out in the chat. You can stay up to date on everything happening at and around the Internet Archive through our blog and our events calendar. And to sign off here, I do hope to see you at one of our upcoming events. Thank you all, and have a great day.