Danielle Citron, Legal Scholar | 2019 MacArthur Fellow


Cyber harassment is fundamentally a civil rights problem, because it disproportionately impacts women and minorities, and it costs them crucial life opportunities.

I’m Danielle Citron, and I’m a lawyer, law professor, and civil rights advocate. I write about online abuse and invasions of sexual privacy, the harm that they inflict, and how law and society should respond to them.

Cyber harassment is the targeting of specific individuals with a course of conduct that causes severe emotional distress and often the fear of physical harm. And it impacts them in a way that takes away what we consider a crucial ability to make the most out of their lives in the 21st century, right, to get employment, keep a job, to engage with other people, and to go to school free from the fear of online abuse.
When I first started writing about the targeting of women and sexual minorities and minorities online in about 2007, the response was just, you know, this is part of online life, this is the nature of the internet, or this is the nature of humanity in a digital age. And the answer is absolutely not, right. This is not something that we should accept. We wouldn’t accept people walking down the street and being screeched at and threatened and humiliated and hurt. We shouldn’t treat it as an acceptable part of online life.

The abuse that women and marginalized people face online has a real cost to all of their important life opportunities, and it’s something that we’ve got to address to ensure that people have an equal chance to speak and make a living and work in a networked age.
I started to write about and call for legal reforms, and then I started working with companies and with federal and state lawmakers on devising a plan of attack and on how to get law enforcement to take these kinds of attacks seriously. And so my work has focused on legal reforms and also on social reforms, working with companies to ensure that this kind of abuse isn’t tolerated.
Right now our efforts are focused on what are called “deep fakes.” Deep fake technology is machine learning technology that lets you manipulate or fabricate audio and video to show people doing and saying things that they’ve actually never done or said. And the technology is advancing so rapidly that technologists expect that, within months, the state of the art will become so sophisticated that it will be impossible to distinguish fakery from what’s real.

And so the impact is not just on individuals; it also has an impact, of course, on the truth and, more broadly, on our trust in democratic institutions.