In the rapidly evolving landscape of artificial intelligence, a new economy has emerged where individuals sell their personal data, including moments from their daily lives, to train AI models. As reported by The Guardian, this phenomenon is part of a broader trend where people are monetizing their identities to feed the growing demand for high-quality, human-grade data.
Jacobus Louw, a 27-year-old from Cape Town, South Africa, is one such individual who has found a way to earn money by uploading videos and photos of his everyday life to apps like Kled AI. In one instance, he recorded his feet and the view of the pavement as he walked, earning $14, roughly half a week's worth of groceries. The income is erratic and does not cover his full monthly expenses, but it has been crucial for Louw, who struggled with a nervous disorder and could not secure a job.
Similarly, Sahil Tigga, a 22-year-old student from Ranchi, India, earns over $100 a month by letting Silencio access his phone's microphone to capture ambient city noise and by uploading recordings of his voice. Ramelio Hill, an 18-year-old welding apprentice from Chicago, made a couple of hundred dollars by selling his private phone chats with friends and family to Neon Mobile, a conversational AI training platform.
The Rise of Gig AI Training
Gig AI training has emerged as a substantial new category of work, according to Bouke Klein Teeselink, an economics professor at King's College London. AI companies are willing to pay individuals for their data both to avoid copyright disputes and to obtain the high-quality data needed to model new behaviors in their systems. Veniamin Veselovsky, an AI researcher, notes that human data is currently the gold standard for sampling outside a model's existing distribution.
However, the pitfalls of gig AI training are multifaceted. Data trainers often grant irrevocable, royalty-free licenses that allow companies to create derivative works, potentially leading to deepfakes, identity theft, and digital exploitation. The lack of transparency in these marketplaces means that users' data could end up in facial recognition databases or predatory advertisements without their knowledge or consent.
Structural Concerns and Precarious Work
Mark Graham, a professor of internet geography at the University of Oxford, warns that gig AI training is structurally precarious, non-progressive, and effectively a dead end. The work relies on a "race to the bottom in wages" and a temporary demand for human data, leaving workers with no protections, no transferable skills, and no safety net once the demand shifts.
Jennifer King, a data privacy researcher at the Stanford Institute for Human-Centered Artificial Intelligence, finds it concerning that AI marketplaces are unclear about how and where users' data will be deployed. Without negotiating or knowing their rights, consumers run a significant risk of their data being repurposed in ways they don't like or understand, with little recourse if this happens.
Case Studies and Concerns
Adam Coy, an actor from New York, sold his likeness for $1,000 to Captions, an AI-powered video editor, under an agreement stipulating that his identity wouldn't be used for political purposes or to sell alcohol, tobacco, or pornography, and that the license would expire in a year. Even so, his friends started forwarding him videos featuring his face and voice that had garnered millions of views, including one in which his AI replica claims to be a "vagina doctor" promoting unproven medical supplements.
Coy's experience highlights the risks and embarrassments of selling one's likeness. He hasn't signed up for any AI data gigs since and would consider it again only for substantially higher compensation. As law professor Enrico Bonadio notes, the terms of these agreements permit platforms and their clients to do almost anything with the material, forever, with no further payment and no realistic way for the contributor to withdraw consent or renegotiate.
Conclusion and Future Implications
The rise of gig AI training and the sale of personal data to feed AI models raise significant concerns about privacy, exploitation, and the future of work. As AI continues to evolve, it is crucial to address these issues and to ensure that individuals are protected and compensated fairly for their contributions. The future of AI development must prioritize transparency, consent, and the well-being of the people whose data trains these models.

