“Deepfakes are used as a weapon to silence women, degrade women, show power over women, reducing us to sex objects. This isn’t just a fun-and-games thing. This can destroy lives.”
Anita Sarkeesian
Based on the insights provided in the document, policymakers can adopt a multifaceted approach to preventing and responding to the challenges posed by non-consensual deepnudes and sexual deepfakes:
- Criminalize the Production and Distribution: The creation and distribution of non-consensual deepnudes and sexual deepfakes should be explicitly criminalized. This legal measure would affirm that these acts are forms of violence, simplify the legal process for survivors, and give law enforcement the tools to prosecute perpetrators effectively. It would also act as a deterrent and enable better attribution of liability, especially in cases where content is posted anonymously.
- Conduct Intersectional and Gender-Based Research: Since non-consensual deepnudes and sexual deepfakes are emerging issues, it is crucial to conduct comprehensive research to understand their impacts better. This research should focus on the gendered and intersectional dimensions of these acts to inform policy accurately and develop targeted interventions.
- Collaborate with Media Companies: Policymakers should work with social media and other online platforms to find ways to detect and remove non-consensual deepnudes and sexual deepfakes. This could involve supporting technological advancements in deepfake detection (a minimal illustrative sketch follows this list) and encouraging platforms to include clear prohibitions of such content in their terms of service.
- Fund Awareness and Prevention Campaigns: It is important to raise public awareness that non-consensual deepnudes and sexual deepfakes are forms of violence. Funding educational campaigns that inform people about the legal and social consequences of creating and sharing such content can help prevent its spread and foster a culture of consent and respect online.
- Support Survivors Through National Initiatives: Establishing national support systems for survivors of non-consensual deepnudes and sexual deepfakes can provide essential assistance. These should offer legal advice, psychological support, and guidance on how to remove non-consensual images from the internet. Such initiatives could be modeled on existing helplines and support services in other countries but tailored to meet local needs.
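To make the detection recommendation above more concrete, the sketch below shows, in minimal form, what an automated deepfake detector looks like in code: a binary image classifier that scores a face crop as real or synthetic. This is a hypothetical, illustrative example only, not the approach used by any platform or by the Deepfake Detection Challenge cited in the references; it assumes PyTorch is installed, and the class name, layer sizes, and dummy data are all stand-ins.

```python
# Minimal sketch of a deepfake detector as a binary image classifier.
# Assumptions: PyTorch is available; in practice a large, trained model and a
# labelled dataset of real vs. synthetic face crops would be required.
import torch
import torch.nn as nn


class DeepfakeClassifier(nn.Module):
    """Tiny CNN that scores a 224x224 face crop: low = likely real, high = likely synthetic."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # After two 2x pools, a 224x224 input becomes 32 channels of 56x56.
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 56 * 56, 1))

    def forward(self, x):
        # x: (batch, 3, 224, 224) normalised face crops; returns one raw logit per image.
        return self.head(self.features(x))


model = DeepfakeClassifier()
dummy_batch = torch.randn(4, 3, 224, 224)  # stand-in for preprocessed face crops
probabilities = torch.sigmoid(model(dummy_batch)).squeeze(1)
print(probabilities)  # per-image probability the content is synthetic (untrained, so ~0.5)
```

In a deployed system, a far larger trained model would replace this toy network, and its scores would only flag content for human review and removal under a platform's terms of service, in line with the collaboration recommendation above.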
These policy recommendations, if implemented, could significantly reduce the prevalence of non-consensual deepnudes and sexual deepfakes and provide support to those affected by such violations of privacy and consent.
What sources are used in the report?
The document cites various sources to support its discussion on non-consensual deepnudes and sexual deepfakes. Here are some of the references used:
- Paris, B. & Donovan, J. (2019) – “Deepfakes and Cheap Fakes: The Manipulation of Audio and Visual Evidence” from Data & Society.
- Schick, N. (2020) – An article on deepfakes jumping from porn to politics from Wired.
- Tibbetts, J. (2018) – Discusses deep learning in the life sciences, published in BioScience.
- Schimmele, C., Fonberg, J., & Schellenberg, G. (2021) – Canadian assessments of social media, published by Statistics Canada.
- Citron, D. K. (2019) – “Sexual Privacy,” published in The Yale Law Journal.
- Ajder, H., Patrini, G., & Cavalli, F. (2020) – A report on automating image abuse from Sensity.
- Ajder, H., Patrini, G., Cavalli, F., & Cullen, L. (2019) – A report on the state of deepfakes from Sensity.
- Alptraum, L. (2020) – Article discussing the impact of deepfake porn on adult performers, from Wired.
- Ayyub, R. (2018) – Article about being targeted with a deepfake to silence her, published by HuffPost.
- Eaton, A.A., Jacobs, H., & Ruvalcaba, Y. (2017) – Research report on nonconsensual porn victimization and perpetration by Cyber Civil Rights Initiative.
- Dickson, E.J. (2020) – Discusses TikTok stars turned into deepfake porn without consent, published by Rolling Stone.
- Amnesty International (2018) – Report on online violence and abuse against women.
- Harwell, D. (2018) – Article from The Washington Post on how fake-porn videos are being weaponized to harass women.
- Siekierski, B. J. (2019) – A report on deep fakes from the Library of Parliament.
- Khodayari, A. (2020) – Blog post discussing the regulation of deepfakes in Canada, from Tech Law McGill Blog.
- Gosse, C., & Burkell, J. (2020) – Study on the characterization of deepfakes in the media, from Critical Studies in Media Communication.
- United States Congress (2020) – Information about the H.R.5532 – Deep Fake Detection Prize Competition Act.
- Facebook AI (2020) – Results of the Deepfake Detection Challenge posted on Facebook AI Blog.
- Tseng, P. (2018) – McMillan Litigation and Intellectual Property Bulletin discussing the legal aspects of deepfakes.
These sources provide a comprehensive background on the technical, social, and legal aspects of non-consensual deepnudes and sexual deepfakes, helping to frame the policy recommendations made in the report.