Digital services feel instantaneous: a ride arrives, a video is moderated, a package route is updated in real time. But behind this convenience is a large, invisible workforce labelling data, filtering harmful content, training AI models, handling customer support, and performing micro-tasks for low wages. This digital labour is growing faster than the protections meant to keep people safe at work. Because platform companies span borders, workers often face unclear contracts, constant surveillance, and little access to grievance processes. The result is a modern human rights challenge hidden inside everyday apps.
The hidden workforce powering the internet
From clickwork sites to outsourced moderation teams, work on the internet is often divided up and distributed so it can be performed anywhere. This structure helps companies grow fast, but it also blurs responsibility. Once a worker is hired through a vendor, paid per task, and managed by an algorithm, it becomes difficult to determine who is obliged to provide fair wages, decent working conditions, and due process.
Where rights risks show up
Numerous harms map directly onto rights at work. Wage theft can appear through unpaid “training,” rejected tasks with no appeal, and piece-rate pay that falls below minimum-wage equivalents. Privacy is at risk when platforms capture screens, keystrokes, location, or biometrics. Health harms arise when content moderators continually watch graphic content without proper counselling or rotation. Across the platform economy, workers may also face discrimination through opaque rating systems that silently reduce their access to jobs.
AI supply chains and accountability gaps
As AI grows, demand rises for data annotation and model testing—often outsourced to regions with fewer labour inspections. Platforms may classify these workers as independent contractors, but the reality often resembles employment: workers must follow strict rules, are scored on performance, and can be dismissed without warning. If digital labour underpins AI products sold globally, then human rights responsibilities also travel globally—through procurement, vendor contracts, and audit standards.
What better protection looks like
Better protection does not mean inventing new rights; it means applying existing ones online. Key steps include transparent payment and rejection policies, working-time limits, mental-health support that moderators can actually reach, and meaningful appeals against account deactivation. Governments can update labour codes for platform work, while companies can publish supplier lists, adopt human rights due diligence, and allow independent worker representation.