Platforms & Deepfake Nudes - A Responsibility To Act
With Viraj Doshi (Platform Safety Lead, Snap), Malika Saada Saar (Senior Fellow, Human Rights Law & Tech, Brown University), Rebecca Portnoff (VP of Data Science, Thorn), & Miranda Wei (Fellow, Princeton CITP).
May 21 (Thursday) @ 5:30 PM
Cornell Tech, 2 W Loop Rd
All Tech Is Human & Cornell Tech's Security, Trust, & Safety Initiative (SETS) will host an event on May 21, 2026, from 5:30 to 8:30pm on Platforms & Deepfake Nudes.
This gathering will bring together a mixture of Trust & Safety professionals, researchers, academics, students, civil society orgs, & more for a discussion about Platforms & Deepfake Nudes: A Responsibility to Act, & in particular the TAKE IT DOWN Act. The TAKE IT DOWN Act creates new federal tools to fight digital sexual abuse, & its scope & implementation are a subject of ongoing discussion.
Agenda
5:30 to 6:00pm: Arrivals
6:00 to 7:00pm: Panel Conversation
7:00 to 8:00pm: Networking with light refreshments & bites
Please Note: We anticipate receiving more interest than we can accommodate in the space. In order to foster a dynamic & balanced group, we'll be selecting attendees based on a range of factors. Completing this form expresses your interest, but does not guarantee a spot. Thank you for your understanding!
Reach out to us at hello@alltechishuman.org for any questions.
===
PANEL
Malika Saada Saar (Senior Fellow, Human Rights Law & Tech, Brown University)
Rebecca Portnoff (VP of Data Science, Thorn)
Miranda Wei (Fellow, Princeton CITP)
Viraj Doshi (Platform Safety Lead, Snap)
Malika Saada Saar is a Senior Fellow at Brown University's Watson School of International & Public Affairs, where she teaches & advises on the governance of AI, democracy, & the rule of law. Most recently, she served as Global Head of Human Rights at YouTube, where she led company-wide efforts to embed human rights principles into the design, development, & deployment of AI systems. Earlier in her career, she founded & led Rights4Girls, a national advocacy organization that helped transform U.S. legal & policy responses to child sex trafficking, the criminalization of vulnerable girls, & systemic inequities in the justice system. Her leadership has been recognized widely: she was named one of Newsweek's 150 Women Who Shake the World, served on the Presidential Advisory Council on HIV/AIDS under President Barack Obama, & currently sits on the boards of the Watson School & the Peabody Awards.
Dr. Rebecca Portnoff has dedicated her career to defending children from sexual abuse. She is currently Vice President of Data Science at Thorn, owning strategy & vision for machine learning (ML)/AI across the organization. The ML/AI & algorithmic solutions her team builds have global impact, used across hundreds of law enforcement agencies, hotlines, & technology companies. She acts as an ecosystem leader to address emerging threats against children via novel research & cross-industry collaborations, bridging the gap between child safety experts & technologists.
Miranda Wei studies online abuse & societal factors in sociotechnical safety, especially concerning social media, gender, & interpersonal relationships. Wei holds a Ph.D. from the Paul G. Allen School of Computer Science & Engineering at the University of Washington. In fall 2026, they will start as an assistant professor at École Polytechnique Fédérale de Lausanne (EPFL) in Lausanne, Switzerland.
Viraj Doshi is the Platform Safety Lead at Snap, focusing on the safety & digital well-being of the Snapchat community. His work involves engaging with external audiences, advising on internal safety-related products & policies, & raising awareness of online risks through programs & initiatives. Prior to Snap, he worked at Meta leading proactive efforts on youth well-being & content moderation including the global outreach efforts for Meta's oversight board. Before joining the tech industry, Viraj worked in Washington D.C. & Mumbai, helping international political & advocacy organizations, as well as companies, develop grassroots & digital campaigns.
The TAKE IT DOWN Act is a 2025 U.S. federal law criminalizing the nonconsensual sharing of intimate images (NCII) & digital forgeries (deepfakes). It requires platforms to remove such content within 48 hours & creates penalties for creating & distributing these images, particularly those involving minors or threats, with enforcement by the Federal Trade Commission (FTC). The law aims to protect victims, but faces debate over its broad notice-and-removal rules potentially impacting privacy & encrypted services.
Key Provisions & Purpose:
Criminalizes NCII & Deepfakes: Makes publishing or threatening to publish nonconsensual intimate images (authentic or digitally altered) a federal crime.
Swift Takedown Mandate: Requires online platforms to remove reported NCII/forgeries within 48 hours.
FTC Enforcement: Grants the FTC broader power to regulate platform compliance.
Victim Protection: Aims to protect individuals, especially minors, from exploitation by deepfakes & unauthorized intimate content.
==
All Tech Is Human is a non-profit organization taking a whole-of-ecosystem approach to tackling thorny tech & society issues. Through our multistakeholder community-building, education, & career-related activities & resources, we aim to build a tech future aligned with the public interest.
Attend our in-person & virtual gatherings, join our Slack community of over 14k members, read our Responsible Tech Guide & issue-specific resources, take our Responsible AI courses, use our Responsible Tech Job Board, & more. See all of our links here. Be on the lookout for our Responsible Tech Summit on October 29th at The New York Times Center (in-person + livestream).