The ACLU of Massachusetts says the cameras - marketed as neighborhood safety tools - enable broad government surveillance by collecting data on everyone's movements, not just those suspected of wrongdoing.
Details have emerged of a troubling case in which a basic engineering mistake wrecked a digital evidence investigation and led to wrongful accusations. An open judgment [PDF] published by the UK's Investigatory Powers Tribunal, the body responsible for investigating claims that British authorities have abused their powers in the course of an investigation, detailed the impact on three people wrongly accused of child sex offences.
"To maximize freedom for our users, only sexual content involving minors is considered prohibited," reads an updated company document about what will be allowed, suggesting wide latitude for developers to use the company's platform to craft naughty experiences for users. As observers quickly pointed out, it was a pretty astonishing reversal for the company. Just two months ago, its CEO Sam Altman had boasted on a podcast that OpenAI hadn't "put a sexbot avatar in ChatGPT yet" - even though, he conceded at the time, doing so would be sure to boost engagement.
In this issue, we're helping you take control of your online privacy with Opt Out October; explaining the UK's attack on encryption and why it's bad for all users; and covering shocking new details about an abortion surveillance case in Texas. Prefer to listen in? Check out our audio companion, where EFF Security and Privacy Activist Thorin Klosowski explains how small steps to protect your privacy can add up to big changes. Catch the conversation on YouTube or the Internet Archive.
In the screenshot, you can see that the "About this account" page shows the date the user joined X, the number of times the username changed and the date of last change, the location the account is "based in," and a "Connected via" field that shows how the user is getting onto X. Bier's post generated a series of follow-up comments, some of which he responded to with more details about the service.
"All people are by nature free and independent and have inalienable rights. Among these are enjoying and defending life and liberty, acquiring, possessing, and protecting property, and pursuing and obtaining safety, happiness, and privacy." The prescience of this section is hard to overstate, especially as those inalienable rights have been attacked. Specifically, the right to privacy has progressively eroded in San Francisco thanks to partnerships between the police and private companies.
It is often difficult for people in India to remember life before Aadhaar. The digital biometric ID, allegedly available to every Indian citizen, was introduced only 15 years ago, but its presence in daily life is ubiquitous. Indians now need an Aadhaar number to buy a house, get a job, open a bank account, pay their taxes, receive benefits, buy a car, get a SIM card, book priority train tickets and admit children into school.
LinkedIn Corp. must face three related lawsuits alleging it collected the sensitive information of visitors to several health-related websites without their consent in violation of California privacy laws. The individual plaintiffs in two of the proposed class actions adequately pleaded claims of invasion of privacy under the California Constitution and violations of section 632 of the California Invasion of Privacy Act, Judge Edward J. Davila of the US District Court for the Northern District of California said Oct. 10.
Our live panel featured EFF's Associate Director of Community Organizing, an EFF Staff Technologist, Mitch Stoltz (EFF IP Litigation Director), and Yael Grauer, Program Manager at Consumer Reports. Together, they unpacked how we arrived at a point where a handful of major tech companies dictate so much of our digital rights, how these monopolies erode privacy, what real-world consequences come from constant data collection, and, most importantly, what you can do to fight back.
It's a nervy time to be a frontline worker in a call center or back-office hub. Startups are advertising 'AI employees' and the likes of venture capital firm Andreessen Horowitz are talking of AI 'productizing and unbundling' the business process outsourcing (BPO) sector that executes the core functions of corporations around the globe. No doubt customer service, HR, and IT workers in the industry are wondering how their employers will respond, and whether their livelihoods are at risk.
There's lots of models that help make that very financially attainable, especially for startups and especially for companies that are gathering intimate information, because that's what data brokers want.
"So, I work from home and my employer just started time tracking," Tim began in his video. "It takes screenshots every 10 minutes or so, tracks my mouse activity, keyboard activity, the URLs I visit, and what percentage of time I spend on doing whatever." Tim explained that despite his company's best efforts to monitor the work that he's doing and ensure that he's actually completing it on time, at the end of the day, none of that changes what he does.
Together, these teams would operate as intelligence arms of ICE's Enforcement and Removal Operations division. They would receive tips and incoming cases, research individuals online, and package the results into dossiers that field offices could use to plan arrests. The scope of information contractors are expected to collect is broad. Draft instructions specify open-source intelligence: public posts, photos, and messages on platforms from Facebook to Reddit to TikTok.
Traditional car insurance sets premiums based partly on estimated annual mileage. Pay-per-mile splits the cost in two: a base rate, which covers risks like theft, fire, or weather-related damage, and a per-mile rate, a set amount for every mile driven. Mileage is confirmed through telematics devices, odometer photos, connected-car systems, or smartphone apps. Many plans include a daily mileage cap, so the occasional road trip doesn't blow your monthly total.
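As a rough illustration of that structure, here is a minimal sketch of how such a bill might be computed; the base rate, per-mile rate, and daily cap below are hypothetical placeholder values, not figures from any actual insurer.

```python
# Rough sketch of a pay-per-mile premium calculation.
# BASE_RATE, PER_MILE_RATE, and DAILY_MILE_CAP are hypothetical
# illustration values, not figures from any actual insurer.

BASE_RATE = 29.00      # flat monthly charge covering theft, fire, weather damage, etc.
PER_MILE_RATE = 0.06   # dollars billed for each mile driven
DAILY_MILE_CAP = 250   # miles beyond this in a single day are not billed

def monthly_premium(daily_miles):
    """Return the month's bill: base rate plus capped per-mile charges."""
    billed_miles = sum(min(miles, DAILY_MILE_CAP) for miles in daily_miles)
    return BASE_RATE + PER_MILE_RATE * billed_miles

# Example: 29 ordinary commuting days plus one 600-mile road-trip day.
miles_per_day = [30] * 29 + [600]
print(f"${monthly_premium(miles_per_day):.2f}")  # only 250 of the 600 trip miles are billed
```

In this sketch the daily cap is what keeps the one long road trip from blowing up the monthly total: only the capped portion of that day's mileage is multiplied by the per-mile rate.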
The ads on my phone were getting too personal. I could look up headphones once and then see them everywhere, from YouTube to random free games. Even after I stopped shopping, the same product continued to follow me. It became a steady reminder that my activity might be linked across apps, and I could not ignore it. I opened my privacy settings to see what I could change.
At EFF, we believe that tech rights are workers' rights. Since the pandemic, workers of all kinds have been subjected to increasingly invasive forms of surveillance. These are the "algorithmic management" tools that surveil workers on and off the job, often running on devices that (nominally) belong to workers, hijacking our phones and laptops. On the job, digital technology can become both a system of ubiquitous surveillance and a means of total control.
The British government has ordered Apple to hand over personal data uploaded by its customers to the cloud for the second time this year, in an ongoing privacy row that has raised concerns among civil liberties campaigners. The Home Office issued a demand in early September for the tech behemoth to create a so-called back door that would allow the authorities access to private data uploaded by United Kingdom Apple customers, after a previous attempt that included customers in the United States failed.
Hyper-personalization - AI's ability to tailor experiences down to the individual level - has become the new norm. For many consumers, these recommendations feel helpful, convenient, and even delightful. Yet, for others, they provoke discomfort, raising questions about just how much these platforms know about us. This paradox is at the heart of a growing debate: Does hyper-personalization build consumer trust and loyalty, or does it erode them by feeling intrusive? And more importantly, how does it shape our purchase intention?
In their gold rush to build cloud and AI tools, Big Tech is also enabling unprecedented government surveillance. Thanks to reporting from The Guardian, +972 Magazine, Local Call, and The Intercept, we have insights into the murky deals between the Israeli Government and Big Tech firms. Designed to insulate governments from scrutiny and accountability, these deals bode a dark future for humanity, one that is built using the same tools that once promised a bright, positive world.
"The preliminary investigation is ongoing, and we are assessing the scope of any concerns and any necessary required remediation," the spokesperson added. "We are in the process of evaluating technical remediation solutions and will act as appropriate. Compliance with the Privacy Act and identifying a solution for this technical problem is critical to the DAF to ensure warfighter readiness and lethality."
Redmond has done so unilaterally, effectively endorsing "shadow IT" - the practice of bringing unapproved software and devices into the workplace. Earlier this year, Microsoft said it had adopted a new approach to shadow IT. "While earlier eras of our IT history focused on trying to prevent shadow IT, we are now concentrating on managing it," the biz said in a blog post. By "managing," Microsoft also means "enabling."