Meta cancelled its contract with data annotation firm Sama, putting 1,108 Kenyan workers out of a job. The timing looks awful. Less than two months before the termination, workers had told the Swedish newspapers Svenska Dagbladet and Göteborgs-Posten that they reviewed graphic, intimate footage captured by Meta's smart glasses, including people having sex and using the bathroom. Meta claims it dropped Sama for failing to meet its standards. Sama says that's false. "At no point were we notified of any failure to meet those standards," the company said in a statement.
One worker told the Swedish papers: "We see everything, from living rooms to naked bodies." In one case, a man's glasses were left recording in a bedroom and captured a woman undressing without her knowledge. Data annotators reviewed this footage to train Meta's AI to interpret images. They also checked transcripts of user interactions with the AI. Meta says users consent to human review in its terms of service and that a recording light activates when the camera is on.
Naftali Wambalo of the Africa Tech Workers Movement doesn't buy Meta's explanation for ending the contract. "What I think are the standards they are talking about here are standards of secrecy," he told BBC News. Mercy Mutemi, a lawyer representing petitioners, told the BBC that Kenya shouldn't build its AI industry on work like this, calling it "a very flimsy foundation." This isn't the first controversy either. A previous Sama contract for Facebook content moderation led to lawsuits from workers exposed to traumatising material. Sama later said it regretted taking that work.
Regulators are now involved. The UK's Information Commissioner's Office wrote to Meta calling the reports "concerning." Kenya's Office of the Data Protection Commissioner launched its own investigation. The core problem is straightforward: always-on cameras need human reviewers, and those humans end up seeing things nobody intended to share. Meta hasn't solved this.