Your Data, Their Profit: The Hidden Economy of Surveillance Capitalism


How Big Tech tracks us all—and targets the most vulnerable

In today’s economy, privacy is a luxury good. If you use social media, browse the internet, or even walk down the street with a smartphone in your pocket, chances are you’re being surveilled. Not for safety. Not for convenience. But for profit.

This is the core of surveillance capitalism: the extraction, commodification, and sale of personal data—without meaningful consent. And while every user is affected, the burden falls heaviest on those already marginalized.

TikTok and the Data Extraction Playbook

Take TikTok. With over 150 million U.S. users, it’s more than a social platform—it’s a data goldmine. TikTok collects:

  • Precise location via IP and SIM data
  • Keystroke patterns and clipboard content
  • Biometric identifiers (like face and voiceprints)
  • Search history, interaction times, and device metadata


In 2024, the U.S. Justice Department investigated TikTok for collecting sensitive user data—including views on gun rights, abortion, and religion—and making that data accessible to ByteDance employees in China.

But this isn’t just about one app. Meta, Google, X (formerly Twitter), and thousands of third-party data brokers operate with similar tactics—aggregating intimate details of your life into digital profiles sold to advertisers, insurers, and sometimes government agencies.

Who’s Most at Risk?

Surveillance capitalism doesn’t operate in a vacuum. It amplifies existing systems of inequality. Here’s how:

1. Racial Profiling Goes Digital

Predictive policing software, often trained on biased historical data, has been shown to disproportionately target Black and Latino communities. In Los Angeles, an internal review of the LAPD's Operation LASER program found that Latino and African American residents made up 84% of the 233 individuals flagged as "chronic offenders," subjecting them to heightened surveillance and police contact and raising concerns that the program reinforced existing racial disparities in policing.

Facial recognition systems have been shown to misidentify darker-skinned faces, especially women's, at error rates up to 100 times higher than for lighter-skinned men. Yet tools like Clearview AI are being deployed in schools, public housing, and airports, spaces disproportionately used by low-income and minority residents.

2. Exploiting Economic Insecurity

Low-income users are more likely to rely on “free” apps and devices, which come at the hidden cost of data collection. A 2024 report found that free apps are four times more likely to harvest data than paid ones.

That data is then sold or used to shape manipulative content—like predatory payday loan ads, junk food marketing, and misinformation targeting immigrant communities.

3. Digital Redlining

Data brokers segment consumers by race, ZIP code, and inferred creditworthiness. The result is "digital redlining": people in majority-Black or Latino neighborhoods see fewer job ads, pay higher insurance rates, or are shut out of housing opportunities altogether.

A ProPublica investigation revealed that Facebook allowed advertisers to exclude users by “ethnic affinity” when posting housing and employment ads—a practice the company later had to halt under federal pressure.

More Than Privacy: A Civil Rights Crisis

Surveillance capitalism isn’t just a privacy issue. It’s a civil rights issue. When tech platforms collect data without consent and build tools that reinforce bias, they’re not just invading your digital life—they’re reshaping your physical one.

From discriminatory policing to health misinformation and economic exclusion, the impacts are most severe for people who’ve historically had the least power. In a system built to extract value, marginalized communities are treated as raw material—mined for data and discarded when convenient.

What We Need Now

  • Federal Data Privacy Law: The U.S. is one of the few developed nations without a comprehensive data privacy law. Europe's GDPR shows it can be done.
  • Ban on Biometric Surveillance: Especially in schools, housing, and law enforcement where abuses are rampant.
  • Algorithmic Transparency: Tech companies must be held accountable for how their systems make decisions—and who they harm.

Surveillance Is a Social Justice Issue

Surveillance capitalism relies on two things: your attention and your silence. But the more we expose its architecture of exploitation, the harder it becomes to justify.

We don’t have to accept a world where Black and brown communities are over-surveilled and underprotected. Where corporations manipulate vulnerable populations for clicks and conversions. Where our most intimate details are up for sale.

This is more than a tech story. It’s a fight for dignity, equity, and human rights.

Author: Evangeline

Help Keep Big Easy Magazine Alive

Hey guys!

Covid-19 is challenging the way we conduct business. As small businesses suffer economic losses, they aren’t able to spend money advertising.

Please donate today to help us sustain local independent journalism and allow us to continue to offer subscription-free coverage of progressive issues.

Thank you,
Scott Ploof
Publisher
Big Easy Magazine

