CRIMINAL PROCEDURE--FOURTH AMENDMENT--NINTH CIRCUIT HOLDS THAT OFFICER'S WARRANTLESS REVIEW OF IMAGES FLAGGED BY GOOGLE AS APPARENT CHILD SEXUAL ABUSE MATERIAL VIOLATED FOURTH AMENDMENT.--United States v. Wilson, 13 F.4th 961 (9th Cir. 2021).
The rise of digital media has unleashed a flood of Child Sexual Abuse Material (CSAM) across the internet, and with it, the horrible shame and vulnerability that haunt survivors of such abuse. (1) In 2008, President Bush signed the PROTECT Our Children Act of 2008 (2) (PROTECT Act) to enlist large technology companies in the fight against CSAM. (3) The law requires "electronic communication service provider[s and] remote computing service providers" (4) to notify the National Center for Missing and Exploited Children (NCMEC) when they discover "apparent violation[s]" of laws prohibiting CSAM. (5) Some electronic communication service providers have responded by actively screening content on their platforms for CSAM. (6) But as service providers have started to help law enforcement search for CSAM, courts have struggled to apply Fourth Amendment doctrines developed in physical search cases to digital contexts. (7) Recently, in United States v. Wilson, (8) the Ninth Circuit held that the government violated the defendant's Fourth Amendment rights when it viewed--without a warrant--images he had attached to an email that Google flagged as "apparent child pornography." (9) The Ninth Circuit correctly applied precedent in this case, but only because the government did not provide adequate evidence to demonstrate the accuracy of Google's CSAM screening process. (10) Because the government can easily supply such evidence in future cases, nullifying the Ninth Circuit's analysis in this one, Wilson reveals that current Fourth Amendment jurisprudence provides no meaningful protection against the government's ever-increasing power to conduct digital surveillance, and that the responsibility of protecting citizens' digital privacy rights falls to Congress.
In June 2015, defendant Luke Wilson attached four images containing CSAM to an email on his Gmail account. (11) Google's proprietary screening system--which scans uploaded images and checks for identical matches in a database of confirmed CSAM (12)--immediately flagged Wilson's attachments as "apparent child pornography." (13) Without any Google employee first reviewing the attachments, the system then sent an automated report, including the attachments, to NCMEC's CyberTipline. (14) The report classified each image as "A1," a standard classification in the tech industry for "content [that] contains a depiction of a prepubescent minor engaged in a sex act." (15) NCMEC forwarded the report to local law enforcement. (16)
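The mechanics of this screening matter to the Fourth Amendment analysis: the system flags a file only because its digital fingerprint is identical to that of an image a human previously confirmed as CSAM, so no person views the new upload before the report issues. A minimal sketch of such exact-match hashing, using a cryptographic hash as an illustration (the names and sample data here are hypothetical; Google's actual system is proprietary):

```python
import hashlib

# Hypothetical stand-in for a provider's database: hashes of images
# a human reviewer previously confirmed as CSAM. Real systems store
# only these fingerprints, never the images themselves.
KNOWN_HASHES = {
    hashlib.sha256(b"previously-confirmed-image-bytes").hexdigest(),
}

def flag_upload(upload: bytes) -> bool:
    """Flag an upload if it is byte-identical to a known image.

    A match proves only that the file duplicates one confirmed
    before; no employee views the new upload at this stage.
    """
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES

# An identical copy matches; any new or altered file does not.
assert flag_upload(b"previously-confirmed-image-bytes")
assert not flag_upload(b"some-other-image-bytes")
```

On this exact-match design, a positive hit carries high confidence that the flagged file duplicates confirmed material, yet the reporting provider has not itself "viewed" the image--the gap at the center of the private search dispute in Wilson.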
Agent Thompson, a member of San Diego's Internet Crimes Against Children Task Force, reviewed the report forwarded by the NCMEC. (17) Thompson inspected each of the images and confirmed that they were indeed CSAM. (18) Relying on Google's report and his personal observations, Thompson then applied for and obtained a search warrant for Wilson's email account. (19) When he searched Wilson's email account, "he discovered numerous email exchanges in which Wilson received and sent ... child pornography and in which Wilson offered to pay for the creation of child pornography." (20) Law enforcement subsequently obtained a...