Retroactive Loss of Privacy

Daniel Tunkelang
Jan 16, 2024 · 3 min read


We live in an age where we increasingly feel a loss of privacy. Much of our communication, media consumption, and shopping is mediated through platforms that track us and then mine or resell our data.

But at least we all know, or should know, that this is happening. We may not quite be consenting participants, but at least we are informed.

I want to explore something different: the retroactive loss of privacy.

A Personal Story

As a young undergraduate, I wrote a short story that I posted to a Usenet group. That was in 1990, long before there was a web, let alone web search. The story was mildly erotic, but quite tame even by 1990 standards. Suffice it to say that I decided to keep my day job as a budding computer scientist.

Fast forward to 1999. A couple of people contacted me out of the blue to enlist me in a venture that ultimately became Endeca. The co-founders did their due diligence on me, and one of them discovered my short story via the then recently launched Google. I had never posted it to the web, but porn sites were copying and pasting content from Usenet groups as a form of search engine optimization (SEO). Luckily, my youthful writing exercise did not cost me the opportunity of a lifetime. But it could have.

As danah boyd said about Usenet in her 2002 master’s thesis, “Faceted Id/entity: Managing Representation in a Digital World”:

Posters knew that they were posting to public forums and that anyone who had access could read their posts. Perhaps a little bit of hindsight makes it seem obvious that the Internet could one day be comprised of most people and that those posts would be permanently archived and reassembled with search engines. Perhaps those posters should have had that foresight, but many of them did not.

Count me among the posters who lacked that foresight. I was an avid science fiction reader, but I did not imagine a world in which everything ever published would be accessible by keyword search. Perhaps I should have read “As We May Think” by Vannevar Bush when I was younger.

Genetic Databases

People who use services like 23andMe or Ancestry.com surely realize that they are trusting those companies with extremely personal data. Indeed, they should recognize that the implications of what they share will depend on unforeseeable advances in our understanding of genetics.

But those of us who have not deposited our data into genetic databases are affected too. After all, it is not difficult to mine family trees by combining public data with data from a variety of private sources. Even if you choose to keep your own genetic data private, it is possible to infer a lot about you from the data that your relatives provide.

Again, this is not the sort of privacy issue that most of us worried about a few decades ago. At the time, the worst we might have feared was being associated with a notorious relative. Now, all of our genetic data seems up for grabs, simply because we disclosed the names of our immediate family.

Science Fiction

One of my favorite works of science fiction is “E for Effort”, written by T. L. Sherred in 1947. It is about the consequences of inventing a time viewer that can project images of any past time and place. I won’t spoil it for you.

But imagine how the sudden appearance of this invention would shatter our conception of privacy. It is not just that we would find ourselves living in a surveillance state. It is that the surveillance state would be retroactive, leaving us no opportunity to act differently, as we might have done had we known we would someday be observed.

I am not particularly concerned about future scientists inventing a time viewer. Still, I could not help thinking about “E for Effort” back in 1999, when the emergence of web search retroactively cost me my privacy.

And, while time viewers may remain science fiction, I expect that other technologies will have similar effects. We can already see how facial recognition links us to photos taken before such technology existed. Voice recognition will do the same for audio recordings. And generative AI allows us to be superficially cloned by anyone with enough of our digital scraps.

We cannot know what technologies the future will bring. But I am certain that they will lead us to lose even more of our privacy retroactively. Unfortunately, all that we can do about it today is brace for that eventuality.
