HerdNews
πŸ‘Absorbed: 0/14

What Happened

Over 1,000 workers in Kenya reviewed videos from Meta's Ray-Ban smart glasses for the company and its subcontractor. Some clips allegedly showed users having sex. Meta and the subcontractor now dispute why the workers were made redundant after raising concerns.

Why You Should Care

If you wear smart glasses, your steamy moments might get human eyes on them before the AI does, and those eyes could get fired for noticing.

📚 The Basics

Content moderation means humans (often low-paid workers in countries like Kenya) review user videos to flag illegal or harmful material until AI can take over. Smart glasses like Meta's Ray-Bans have built-in cameras that record video hands-free. Subcontractors are hired by big tech to do the dirty work cheaply, and the companies often dodge blame when things go wrong.

🧠 Look Smart At Dinner

Say This

Axing 1,000 of Meta's outsourced moderators means a flood of unfiltered smart glasses footage with no one left to properly check it.

Context

Kenya hosts massive moderation hubs for US tech giants because labor is cheap; these workers reviewed 1B+ pieces of content before getting canned.

Avoid Saying

"Fired for watching porn": they were professional moderators axed after flagging real issues, not perving out.

The Approved Opinion™

"Tech companies should ensure fair treatment and safe conditions for their content moderators worldwide."

πŸ‘ What The Herd Is Saying

πŸ‘β€œMeta: making sure no one watches your smart glasses sex tapes but us.”
πŸ‘β€œ1000 jobs gone because Zuck couldn't handle the mirror of his creepy glasses empire.”
πŸ‘β€œNext glasses update: auto-blur for when users get freaky.”

Get 6 of these in your inbox every morning.