An Engineer's Post Protesting Laptop Surveillance Is Going Viral Inside Meta
“Selfishly, I don't want my screen scraped because it feels like an invasion of my privacy,” wrote an engineer in an internal post seen by nearly 20,000 coworkers this week. “But zooming out, I don't want to live in a world where humans, employees or otherwise, are exploited for their training data.”
The message aimed to rally support for a petition circulating inside the company since last Thursday that demands an end to what Meta calls the Model Capability Initiative. It's a piece of mandatory software that Meta began installing on the laptops of US employees last month. The tool records employees' screens when they use certain apps, with the goal of collecting “real examples of how people actually use” computers, including “mouse movements, clicking buttons, and navigating dropdown menus,” according to Reuters.
“I'm mixed on AI. On one hand, I really enjoy using it to write software. On the other hand, I'm really nervous about its impact on the world,” the engineer wrote in an internal forum for coders. “And what kind of norms are we establishing about how the technology is used, and how people are going to be treated?”
The petition, also seen by WIRED, states that “It should not be the norm that companies of any size are permitted to exploit their employees by nonconsensually extracting their data for the purposes of AI training.” In the US, employers generally have wide latitude to monitor workers' devices for security, training, evaluation, and safety purposes. But using these tools to build datasets that instruct AI systems on navigating computers without human supervision appears to be a new tactic, and one that doesn't sit right with many Meta workers.
Meta began installing mandatory software on US employees' laptops last month to collect real examples of how people use computers. The tool records screens when employees use certain apps, capturing mouse movements, button clicks, and dropdown-menu navigation. An internal post from an engineer raised concerns about privacy and the broader risk of exploiting humans for their training data. The post supported an internal petition demanding an end to the Model Capability Initiative. The petition argues that companies should not nonconsensually extract employee data for AI training. US employers can monitor devices for security, training, evaluation, and safety, but many employees view using that monitoring to build AI datasets for unsupervised computer navigation as a new and troubling approach.
Read at WIRED