Are We the Unpaid Workforce of the AI Era?

IMAGE: Vintage American anti-capitalist poster, circa 1938. Source: r/antiwork

A few years ago, I received an invitation to walk in the Labor Day Parade. It got me thinking about labor law as an alternative path to meaningful data privacy regulation. Today, as we begin to think hard about how to approach AI regulation, I find myself wondering whether my fanciful idea about using collective bargaining to help regulate data collection and usage might not be as crazy as it seemed back then.

There is a popular meme used to explain the exchange of our data for free online services: “When something online is free, you are not the customer; you are the product.” This idea is generally adapted to fit the regulatory argument du jour. Another popular way to explain the exchange of our data for free services is to say, “The services we enjoy are not free; we are paying for them with a new form of currency, our data.”

There is some validity to each of these ideas, but both assume that users inevitably exist to serve the commercial interests of the data elite. I reject this notion. At the moment, most users are humans, and in America, being human comes with some “unalienable” rights.

We Are a Workforce!

We are neither digital products nor possessors of a fungible currency that represents the marginal value of our data; we are underpaid digital workers whose labor (behaviors) generates raw data which is used in the manufacture of digital products. These digital products, such as interactive advertisements, generate hundreds of billions of dollars of revenue for the organizations we work for.

Our Employers

Aside from the usual suspects, Google and its family of products: YouTube, Waze, Gmail, etc.; Facebook and its family of products: Instagram, WhatsApp, etc.; Twitter; and every other social network, search service, or digital product (website or app) we are offered free access to, our de facto employers now include every AI company that trains its models on the public internet.

The Current Approach to Regulation

In May 2023, Meta was fined a record-breaking €1.2 billion ($1.3 billion) by European Union regulators for violating EU privacy laws by transferring the personal data of Facebook users to servers in the United States. (I’ve compiled a list of the Top 5 GDPR Fines to date.)

GDPR is a strong regulatory framework. Could some version of it be adapted to AI regulation? The spirit of GDPR asserts that your data should be private unless you grant permission for its use. But while the law (as written) may have some applicability to the scraping of publicly available data for AI training (and even that is a bit of a stretch), it has very little overlap with the way AI models store and use data.

A Different Approach to Regulation

If we want meaningful transparency regarding the use of our personal data (including for AI training), which I assert is the fruit of our labor, maybe we should be thinking differently about how to use the laws of the land.

Perhaps, if we can make the case that we (all of us) are employees of the data elite and the AI organizations that use our data, we can collectively bargain for the working conditions and wages we think we deserve. To do this, one human resources lawyer suggests that we ask the data elite organizations what they would have to pay researchers, pollsters, and other gatherers of data if we refused to provide our services to them. This would set a value on our labor.

The American Federation of Users and Data Generators

Imagine forming a union, the American Federation of Users and Data Generators. With enough members, the union could go to Google, Facebook, and the other data elite that use the data generated by the members’ labor (online behaviors) to collectively bargain for total transparency with regard to 1st-, 2nd-, and 3rd-party data usage, rules around data provenance and ownership, and other rights.

This might be the very best way for ordinary Americans, who are not part of the data elite, to gain control of their data destiny.

The National Labor Relations Act

There is a legal mechanism in place to do this. In 1935, Congress enacted the National Labor Relations Act (NLRA) to protect the rights of employees and employers, to encourage collective bargaining, and to curtail certain private-sector labor and management practices that can harm the general welfare of workers, businesses, and the U.S. economy.

A Heavy Lift

Legally speaking, this is a heavy lift. But we need new language to describe what is actually happening in the transition from the Information Age to what we’re going to call the Age of AI.

When intelligence and consciousness are fully decoupled, and algorithms make decisions for other algorithms that make decisions for other algorithms that decide what we see, where we go, and how we get there; and when still other algorithms use long-accumulated, poorly calculated proxy data to make decisions that influence other decisions made by algorithms we can’t even comprehend but that directly impact our lives, we are going to look back and wish that sometime in 2019–2020, we had put a stake in the ground and declared: “We are humans and we choose humanity.”

Portions of this article were originally published as Free Search & Social: We Are NOT the Product; We Are Underpaid Workers on September 8, 2019. The original article was revised and republished on September 5, 2022, under the title We Are NOT the Product; We Are Underpaid Workers.

Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it.
