The Moral Laundromat of AI
Somewhere in the Philippines, a gig worker logs onto a platform. Their task for the next hour, for a wage that would be illegal in California, is to teach a machine how to see. They are not looking at abstract shapes or lines of code. They are looking at a Tuesday afternoon in suburban Atlanta. A car, a license plate, a person walking their dog. For a few cents, they are annotating the raw feed of American life, flagging a vehicle’s make, a piece of clothing, or perhaps, as one of the system’s patents suggests, a person’s race.
This is the engine room of Flock, the AI-powered surveillance company whose cameras are quietly blanketing thousands of American communities. And according to a recent leak, this is the reality of how its all-seeing eye is trained. The predictable outrage focuses on privacy, on the unsettling idea of American surveillance data being viewed overseas. But to fixate on privacy is to miss the point entirely. This isn’t a story about a data leak. This is a story about a perfectly functioning machine.
That machine is the great moral laundromat of modern AI development.
Like money laundering, which washes the stain of criminality from cash, moral laundering is a process designed to wash the ethical stain from acts of power and control. It works by creating a supply chain of abstraction, distance, and plausible deniability. It takes a morally fraught action—say, building a nationwide network to spy on your fellow citizens—and atomizes it into a million sanitized micro-tasks, distributing them to a global workforce so disconnected from the consequences that the work feels morally neutral.
Consider the architecture. Flock, the corporation, doesn’t directly employ a legion of watchers. Instead, it uses a platform like Upwork, a frictionless intermediary that rebrands people as “AI services.” The task is no longer “surveil a community”; it is “annotate vehicle assets.” The worker in the Philippines isn’t participating in the construction of a police state; they are completing a task in a queue, earning a living in a digital factory. The geographical and cultural distance is a feature, not a bug. It ensures the annotator has no context for the lives they are cataloging, no stake in the society being monitored. They are the ultimate disinterested observers, their labor stripped of all moral weight.
This is not a new invention. It is colonialism with a software update.
For centuries, empires have relied on the same principle: outsourcing the dirty work of extraction and control to distant territories where labor is cheap and accountability is diffuse. The raw material has changed, from rubber and cotton to data, and the direction of flow has reversed, but the labor arrangement is identical. The data is harvested from the citizens of the Global North, a resource they generate simply by existing. It is then shipped, via undersea cables, to the digital factory towns of the Global South for processing. The finished product is then repatriated and sold back to the powerful: actionable intelligence, predictive policing, social control.
This is where the grand illusion of “AI Ethics” is revealed for what it truly is: a branding exercise. While executives in Silicon Valley deliver keynote speeches on “responsible innovation” and “human-in-the-loop” systems, their entire business model is predicated on this invisible, morally compromised supply chain. “Ethical AI” is the marketing department for the moral laundromat. It produces white papers, convenes panels, and drafts principles that serve as a sophisticated smokescreen, providing moral cover for a system that is, at its core, indefensible.
The system is working perfectly. The leak is not evidence of failure, but of success at an industrial scale. It demonstrates that the moral cost of building an omniscient surveillance apparatus can be successfully externalized, pushed down the supply chain onto the shoulders of those with the least power to object.
So, the next time you hear a tech company champion its commitment to ethics, remember the worker in the Philippines. Remember the quiet transaction that turns one person’s Tuesday afternoon into another person’s data point. The surveillance state isn’t just being built by algorithms and cameras. It’s being built by this global machine that launders accountability, outsources morality, and calls the entire process “progress.” And we, the users of its myriad convenient services, are all its silent shareholders.