How to Poison the Data That Is Used to Exploit You

Patrick K. Lin
4 min read · Oct 4, 2022


This article was originally published in my LinkedIn newsletter, Tech Support.

Algorithms exploit us by using our data. We can turn that dependence to our advantage to demand change.

As you go about your day, you create data that is systematically collected by tech companies. Every email you send, every order you place online, every video you watch allows tech companies to develop a clearer and more complete picture of you and your preferences. This data is fed into machine-learning algorithms to target you with ads and recommendations. In fact, your data helped Google make nearly $210 billion in advertising revenue last year.

It has also become more and more challenging to cut big tech out of your life. Nearly one-third of the internet runs on Amazon’s cloud computing platform AWS. Google is responsible for more than 85 percent of global desktop search traffic. Meta owns Facebook, Messenger, Instagram, WhatsApp, Oculus, and many other companies and platforms consumers interact with. Almost half of all smartphone users use an Apple iPhone.

Even if you manage to live off the grid, your individual actions will not result in any meaningful setback for these tech giants. But what if millions of people realized how powerful their collective data is? Sure, tech companies have impressive algorithms that can predict our next purchase or pop culture obsession, but those same algorithms are nothing without reliable data — our data.

In a 2021 paper, PhD students at Northwestern University propose three ways the public can exploit big tech’s reliance on our data:

Data strikes, inspired by protests like labor strikes, involve withholding or deleting your data to reduce the amount of data tech companies have access to, particularly with respect to training data for their AI systems. For example, you can either leave a platform by deleting your account or install privacy tools.

Data poisoning involves contributing inaccurate or harmful data, causing the underlying AI model to perform poorly. While data strikes harm performance by reducing the amount of available data, data poisoning harms performance by providing meaningless data (a toy illustration follows this list). AdNauseam, for instance, is a browser extension that clicks on every single ad sent your way, which confuses Google’s ad-targeting algorithms.

Conscious data contribution means giving meaningful data to a competitor of the platform you want to protest, such as by switching to a different web browser or moving your photos to a different platform. For example, in May it was revealed that DuckDuckGo, the self-proclaimed “internet privacy company,” had made an exception for its business partner Microsoft, allowing some Microsoft advertising trackers through its browser’s tracker blocking. Many users switched to Mozilla Firefox, which does not share browsing data with Mozilla, or to Tor Browser, a modified version of Firefox with even more privacy-oriented features.
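To make the data-poisoning idea concrete, here is a toy sketch of one simple form of it: label flipping. This is purely illustrative, my own example rather than anything from the Northwestern paper, and it uses scikit-learn on synthetic data as a stand-in for a real ad-targeting system. The same classifier is trained on clean data and on data where a growing share of “users” contribute deliberately mislabeled examples, and its accuracy on a held-out test set drops accordingly.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for collected user data: 5,000 labeled examples.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
for poison_rate in [0.0, 0.1, 0.3, 0.5]:
    # "Poison" a fraction of the training labels by flipping them,
    # mimicking users who deliberately contribute meaningless data.
    y_poisoned = y_train.copy()
    n_poison = int(poison_rate * len(y_poisoned))
    flip = rng.choice(len(y_poisoned), size=n_poison, replace=False)
    y_poisoned[flip] = 1 - y_poisoned[flip]

    # Train on the (partly) poisoned labels, evaluate on clean test data.
    model = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"poisoned fraction: {poison_rate:.0%}  test accuracy: {acc:.3f}")
```

With no poisoned labels the model scores well above chance; as the flipped share approaches half the training set, its accuracy slides toward a coin flip. Real platforms train far more robust systems on far more data, but the dependence is the same: bad data in, bad model out.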

Many people already use these tactics to protect their online privacy. If you’ve ever used an ad blocker, you’ve already participated in data striking.

Unfortunately, your individual use of an ad blocker won’t do much to get tech giants to suddenly stop monitoring your online behavior, which brings us back to my earlier question: what if we collectively poisoned the data that is used to exploit us?

Through collective action, companies can be forced to change their data collection practices. The tricky part is getting enough people to act for the impact to be meaningful. However, there are already examples we can turn to. When WhatsApp announced new terms of service that would allow Facebook and its subsidiaries to store user data, millions of users deleted their accounts and moved to competitors like Signal and Telegram. (Note: Of the major messaging services, only Signal is truly private.)

As a result of the mass exodus, Facebook delayed its policy changes.

Collective action of this kind requires a great deal of coordination, including collaboration between technologists and policymakers. Computer scientists play a crucial role by developing simple privacy tools, which lower the barrier to participation in tactics like data strikes. Policymakers can also push for better data protection laws. After all, data strikes are much more effective when backed by strong data privacy laws, like the European Union’s General Data Protection Regulation (GDPR), which gives people the right to request the deletion of their data. Without such policies in place, it’s difficult to guarantee that a tech company will actually delete your data, even if you delete your account.

These tactics are a way to reclaim some agency and ownership over the use of our data. However, they are only a start, and a tough one at that. Collective action is hard, and sustained collective action is harder. Another obstacle is getting people to see themselves as part of a community when the action may be as brief as downloading a privacy tool or switching to a different search engine. Still, inspiring collective action around data privacy starts with a simple fact:

We create data just by going about our day. AI systems rely on this data. Reclaiming our data is how the public can start to level the playing field.

===

With the overturning of Roe v. Wade, data privacy is more important than ever. It can also be overwhelming to figure out which services and tools take your privacy seriously. PrivacyTools.io, established in 2015 after Edward Snowden’s revelations, is a helpful and user-friendly website that recommends privacy-respecting services and tools and offers a comprehensive privacy guide:

https://www.privacytools.io/

You can read more about other ways to protect your data privacy in a post-Roe future by checking out this report from the Surveillance Technology Oversight Project (S.T.O.P.): Pregnancy Panopticon.


Written by Patrick K. Lin

Patrick K. Lin is a New York City-based author focused on researching technology law and policy, artificial intelligence, surveillance, and data privacy.
