
Kennedy School Review


A House Built on Sand: The Future of Privacy in the US

Privacy and Policing

When law professors or judges discuss a right to privacy, they mean something narrower: a Griswoldian right to privacy, named for the famous 1965 decision in Griswold v. Connecticut, in which the Supreme Court found[1] a right to privacy not explicitly mentioned in the Constitution. Together with Miranda v. Arizona (1966) and a handful of other Warren and Burger Court cases like Terry v. Ohio, these decisions form the modern American concept of where the state's ambition to investigate conflicts with an individual's right to resist its inquiries.

In essence, the mid-century caselaw affirmed that it is the state’s duty to respect an individual’s rights; it is not the individual’s duty to satisfy every curiosity of the state.

Concerns about privacy from the state's investigative tools, whether in the criminal[2] context or not, trace their origins back much further than the 1960s. Several Framers of the Constitution witnessed abuses of police power[3] firsthand, and those abuses were on their minds when crafting America's founding document. Today, we inhabit a world of aerial drones, GPS systems, infrared imagers, and body cameras: technology that would have been unimaginable in the drawing rooms of 18th-century Pennsylvania gentlemen.

In one of the cases discussed below, protections for people encountering police are significantly lessened; in the other, those protections are meaningfully enhanced. In a forthcoming law review article[4] co-authored with Illinois Assistant Attorney General Nancy Jack, from which this piece is adapted, we explore the Timmsen opinion in detail and suggest that a different outcome would have been correct.

The Illinois Supreme Court’s People v. Timmsen: A Problematic Precedent

In Timmsen, the defendant approached a roadblock and made a proper, legal, and unremarkable U-turn at normal speed. At trial, the prosecution conceded the defendant did nothing illegal. The Illinois Supreme Court nonetheless found that even though everything in the pile of things the defendant did was individually unremarkable and permissible, the pile taken as a whole was sufficient to allow a search of the defendant's vehicle; in other words, in this strange instance, enough "rights" become a "wrong."

Suppose a driver approaching the crest of a hill cannot yet see the roadblock beyond it. She is using Google Maps or a similar driving aid, which notifies her of a delay ahead and suggests an alternate route to avoid it and reach her destination sooner. Such a driver may unknowingly, while following all applicable laws and attempting to operate her vehicle expeditiously and safely, make herself suspicious and subject to a vehicle search under Timmsen by "avoiding" a roadblock.

In the near future, one can imagine an autonomous vehicle in which the passenger has no steering wheel but is instead driven around in response to voice commands or a touchscreen map. Such a vehicle might choose to avoid roadblocks to get an occupant to a destination sooner. Under current law, a driver who avoids a roadblock may have her vehicle searched. Is it so hard to imagine a world where a judge finds that a software "driver" is no different?

While Timmsen may appear, even to legal scholars, a narrow holding quarantined within an unusual fact pattern, I believe its implications are more far-reaching, more ominous, and worthy of future-proofing.

The Georgia Supreme Court’s Georgia v. Wilson: Privacy Preserved

Turning to Georgia, we see that in Wilson the police sought a very broad warrant to search the contents of the defendant's smartphone, which had been lawfully and properly obtained from him, along with other personal effects, during a traffic stop that resulted in the driver-defendant's custodial arrest. The appropriateness and lawfulness of the arrest were not in dispute; what was, however, was the appropriateness of a second warrant, which allowed a "digital search"[5] of all the files on the defendant's smartphone, even though the officers and prosecutors could not describe in particular[6] what they were looking for.

The Georgia trial court correctly suppressed the evidence found in the smartphone principally due to the overly broad warrant, and the Georgia Supreme Court upheld the lower court decision.

I believe Wilson to be not only correctly decided but, importantly, correct policymaking. To allow search warrants with no limiting language is bad policy: it invites police to look for opportunities to rummage through citizens' digital lives rather than for crimes to prevent and prosecute. Today, the right of a person to be "secure[7] in his papers" must extend to being secure in a Google Doc hosted in "the cloud"[8] or a tax return backed up in a faraway data center.[9]

One can envision a future in which it becomes essentially impossible, absent deep technological literacy and a nearly unlimited defense budget, to protect one's life, especially one's personal data and metadata, from intrusive surveillance.

It Isn’t Just About Autonomous Cars or Smart(er) Phones

While Timmsen and Wilson occur in different places and scenarios, and while in my view one is bad policy and the other good, they raise a shared concern that should not be viewed as obsolete or settled. When and how should the police be able to interrupt daily life and root around in things[10] we hold dear?

The best policies, like our Bill of Rights, not only protect the people against the state's current overreach but also anticipate the erosion of existing protections over time and the need for additional protections as new vectors of attack become available.[11] Good policy requires more than Timmsen offers, with its failure to consider even near-term distortions of its intentions, and more than Wilson's correct but rather narrow vision of privacy in a digital world.

We live in a world where technology changes how we interact with law enforcement in unanticipated ways. Take facial recognition for unlocking mobile devices: police officers will sometimes attempt to unlock a phone with the face or finger of an unconscious or uncooperative suspect. Face ID can now detect the difference between a picture of you and the "actual you," but it cannot reliably distinguish your typical face from your face under duress. That may be your face with an officer's hand around your throat, or your face after being hit over the head with a department-issued flashlight.

Simply because something is difficult to solve or cannot be decisively solved does not mean it should go unaddressed.

I recently published a commentary with the Hon. James A. Shapiro on the difficulty of describing “reasonable doubt” to jurors as both a concept and a standard. While “reasonable doubt” is not very complex to understand, it is famously difficult to apply. Similarly, ideas like “I should enjoy digital privacy” or “police shouldn’t do that” are easy to understand but admittedly challenging to legislate. We must stop avoiding these challenges and confront them head-on with an understanding that our most cherished freedoms, those that are fundamental to our society, are at stake.

To borrow a famous legal quip, American privacy is protected merely by a thin layer of common decency, public policy, legislative safeguard, and judicial concern: a four-walled house built on, and of, sand.


Photo credit: Yinan Chen via Wikimedia Commons

[1] Or “excavated,” to use the captivating metaphor of Samuel D. Warren, II.

[2] Griswold, Miranda, and Terry are criminal cases.

[3] Though English troops were a martial force, they also possessed police-like powers and enjoyed substantial latitude beyond their stated orders.

[4] 56 Loy. U. Chi. L. J. ___ (2024) (accepted, not yet paginated)

[5] This is my best “laypersonese” translation; the words “forensic examination” were used in the warrant language.

[6] “Particular” is used here because a search warrant in Georgia must particularly describe the thing expected to be found, see Dobbins v. State, 262 Ga. 161, 164 (1992).

[7] To better understand this term of art, see, e.g., United States v. Miller, 425 U.S. 435 (1976).

[8] "The cloud" can mean different things in different digital ecosystems, but it customarily means a mixture of software and infrastructure designed to store and transmit files to a variety of devices on an on-demand basis.

[9] Cf. United States v. Malquist, 791 F.2d 1399 (9th Cir. 1986).

[10] Even ethereal things, in the case of files our phones constantly create, back up, and retrieve wirelessly often without user direction.

[11] Eighteenth-century policemen didn't use infrared goggles to see through walls, for instance. See Kyllo v. United States, 533 U.S. 27, 29, 38 (2001) (Scalia, J.): "Thermal imagers detect infrared radiation, which virtually all objects emit but which is not visible to the naked eye … [t]he Agema Thermovision 210 might disclose, for example, at what hour each night the lady of the house takes her daily sauna and bath—a detail that many would consider 'intimate'; and a much more sophisticated system might detect nothing more intimate than the fact that someone left a closet light on."