Flock’s Use of Overseas Gig Workers for AI Training Raises Privacy Concerns

This article was generated by AI and cites original sources.

Flock, known for its automatic license plate readers and AI-powered cameras deployed across the US, has come under scrutiny for using overseas gig workers hired through Upwork to train its machine learning algorithms. An accidental leak exposed training material instructing workers in the Philippines on how to review and categorize footage captured by Flock’s surveillance systems in the United States.

This practice has raised questions about the privacy and security implications of outsourcing sensitive surveillance tasks to remote workers. While it is common for companies to hire overseas workers to train AI models because of the cost savings, Flock’s focus on continuous monitoring of US residents’ movements makes the data involved especially sensitive.

Flock’s cameras are designed to scan and analyze details of passing vehicles, including license plates, color, and make, and even to detect the race of individuals. This level of surveillance has prompted concern from civil rights organizations, particularly over the potential misuse of the collected data by law enforcement agencies.

As Flock’s presence expands across American communities, the revelation of its reliance on offshore labor for AI training underscores the need for transparent practices and robust privacy safeguards in the development and deployment of surveillance technologies.

Source: WIRED