I'd like to ask if you could elaborate on some of the other research areas that the Internet Rules Lab at CU Boulder is pursuing. There are three major research areas, as I understand it: technology ethics, online governance, and online fandom, which we've talked a lot about. I wonder if you could unpack the technology ethics and online governance aspects of the research, how these things intersect for you, and what you see as the work and contribution of the lab. Yes. As I mentioned, my interest in fandom and online communities is one of the reasons I went to law school. When I came back to do my PhD, I was really interested in issues of governance in online communities. My dissertation was ostensibly about remixers, but it was about fandom. It was about platform policy, like copyright licenses, social norms about copyright, and how people learn about copyright in this community. When I was a graduate student, as someone with a lot of expertise in law, I ended up teaching the computing and society class, which was also the ethics class for computer science students. It was a required class for undergrads in computer science at Georgia Tech. When I was looking for academic jobs, I definitely pitched myself as someone who did technology ethics and policy, and also social computing and online community stuff. When I came to CU and had to make a class, the easiest class for me to teach was an ethics and policy class. This was in 2015, and we were having a renaissance of research ethics in the social computing community, if anyone remembers the Facebook emotional contagion study fiasco. That was some research done on Facebook that people got really upset about, and it started these conversations about research ethics for people who study platforms. One of the things I was really interested in was researchers using tweets without people's permission, which was just happening constantly.
We weren't talking about this. One of the reasons I found this so interesting is that I'd been participating in these fan spaces where a lot of this, again, is public. The thought of a researcher coming in and studying my LiveJournal posts from fandom was horrifying. So one of the things I started working on right away when I became a faculty member was the research ethics of public data. Then, when Brianna came to work with me, we were doing fandom stuff. I had this funding to work on research ethics for public data, so she led a project about that for fandom: she interviewed people about how they felt about researchers using their content, and also journalists and things like that. That's where we found out all this stuff about privacy norms, which was really interesting. The ethics work in my lab has really taken off because that's what my students mostly want to do. I've got Brianna doing fandom work, and then all of my other advisees are doing ethics. As for the governance component, I occasionally do things around copyright, and my PhD student who recently graduated, Erin, did work on content moderation, which I think is super important and a really big challenge right now. Again, this intersects with online communities and fandom as well. Brianna and I are also working on a paper about content moderation on Archive of Our Own, which is actually a really interesting model of content moderation compared to some other places. Fandom is an interesting part of my research because it's a domain. There are people who are fan studies scholars, and I guess I kind of am one, but I tend to study things that relate to my other work, with fandom as a domain.
So, online communities generally: how platforms work, how platforms are designed, the ethics stuff and the governance stuff and all that, which makes a lot of sense because I'm in an information science department and not a media [inaudible] [LAUGHTER] But on that last point, it's really interesting, because fandom for me, in my own intellectual study of it and also participation in it, has been an opening-up, expansive kind of force. It brings you into an encounter with lots of different kinds of ideas about identity and engagement online, and things like privacy and ethics, in perhaps a more active and immediate way than some of the other approaches that have been taken in intellectual history to those same questions. You're right in the thick of it immediately when you're a fan, when you're engaging online, in a way you wouldn't be if you were coming at it from the history of ethics or the history of social computing. So I think there is almost a catalyzing element to fandom. And I want to pick up on something you talked about in that last sequence, about content moderation, because I was just reading something in the last few weeks about how content moderation was going to be one of the big points of contention and conversation in this next evolution of social media. I wonder if you could elaborate a little on your initial thinking about that from a research perspective, and how you see content moderation operating differently on a platform like Archive of Our Own than it might on Facebook or something like that, and unpack that a little for us.
Yes, content moderation really is, I think, one of the biggest challenges online right now, and we've seen a lot of conversations about it: because of social media banning Donald Trump, because of Amazon Web Services kicking off Parler, conversations about whether Section 230 in the US needs to be revised. Meanwhile, platforms like Twitter are getting heavily criticized for both moderating content too much and moderating content too little [LAUGHTER]. So one of the issues is that people vastly disagree on what should be happening [LAUGHTER]. One of the interesting things in the context of fandom is content moderation around copyright. Right now, I'm working on a paper based on a survey of fans about their experiences with copyright takedowns, and one of the things I'm seeing in the context of fandom is this huge power imbalance between the platforms and the big copyright owners on one side and the fans on the other. This work has led me to think that one of the things we really need for content moderation is very strong appeals processes, ones that don't sound so scary that you feel like you have to hire a lawyer just to appeal a takedown [LAUGHTER]. We also need to do things to combat content flagging as a form of harassment, which I've seen fans talk about: people sending all these false copyright flags for someone's work to try to get it taken down. This is happening constantly on TikTok right now, not in fandom specifically, but with big content creators: people coordinating to send flags for someone's content to TikTok so that they get banned. All the work I do around ethics comes back to horrible people being horrible [LAUGHTER], and that's the thing about content moderation: how do you protect people from really upsetting or hateful content while also not accidentally silencing especially marginalized voices?
Can an algorithm tell the difference between hate speech and a person of color talking about hate speech? [LAUGHTER] So there's this complex interaction between algorithms, users flagging content, and then commercial content moderators, which is a whole other thing: people whose job is to look at the worst of humanity all day [LAUGHTER]. I think this is going to continue to be a really big topic.