An orthopedic center with several locations in the Capital Region faces a $500,000 fine for failing to protect patient information. The New York Attorney General, Letitia James, said an investigation into Orthopedics NY LLP found the orthopedic medicine and surgery center failed to adequately protect its systems, exposing the personal information of more than 650,000 patients and employees. The AG's office said cyberattackers gained remote access to OrthopedicsNY's patient data in 2023 by using compromised login credentials.
We do not fear advances in technology - but we do have legitimate concerns about some of the products on the market now... AI continues to develop and we are hopeful that we will reach a point in the near future where these reports can be relied on. For now, our office has made the decision not to accept any police narratives that were produced with the assistance of AI.
For example, privileged information shared by foreign partners is currently not overseen by the IPC. It's common practice for national intelligence agencies, such as GCHQ, to receive reports from allies overseas, including from those in the Five Eyes alliance. These reports often contain the kind of privileged information that, in the UK, would require permission from a judicial commissioner, under the IPA, to acquire.
Rehmat Alam operates from the mountains of northern Pakistan, according to one of his online profiles. There, he flaunts his talent for harvesting LinkedIn data and advises YouTube viewers how to earn money off the internet. His company, ProAPIs, allegedly boasted in marketing materials that its software can handle hundreds of requests per second to scrape profiles, selling the underlying data for thousands of dollars a month.
Pasted on the wall next to the locked steel door that seals Laura Poitras's studio from visitors and intruders is a black poster depicting a PGP key that the filmmaker has used in the past to receive encrypted messages. It makes sense that this key, a sort of invitation to send her a secret message, is the only identifiable sign that Poitras edits her movies in this building;
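A PGP key like the one on the poster works as a public lock: anyone can encrypt a message with it, but only the keyholder can decrypt the result. As a purely illustrative sketch (not described in the article), assuming the python-gnupg wrapper around a local GnuPG install and a placeholder key file name, sending such a message might look like this:

```python
# Illustrative only: the key file name is a placeholder, and this assumes
# `pip install python-gnupg` plus a GnuPG binary on the system PATH.
import gnupg

gpg = gnupg.GPG()  # uses the local GnuPG installation and keyring

# Import the published public key (e.g. downloaded from a keyserver).
with open("published_public_key.asc") as f:
    import_result = gpg.import_keys(f.read())
fingerprint = import_result.fingerprints[0]

# Encrypt so that only the holder of the matching private key can read it.
encrypted = gpg.encrypt(
    "A confidential message.",
    recipients=[fingerprint],
    always_trust=True,  # trust the freshly imported key for this demo
)
print(str(encrypted))  # ASCII-armored ciphertext, safe to paste into an email
```

The ciphertext reveals nothing about the plaintext; only the private key held by the recipient can reverse it.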
Internet users have been buzzing ever since learning that the messages they choose to exchange with Meta's AI chatbot will be analyzed and used to personalize advertisements and recommendations across its apps and services. Viral social media posts from tech influencers on the topic sent some users into a panic, worried that this is one example of AI being taken too far and potentially invading privacy.
Shoshana Zuboff (New England, U.S., 1951) joins the video call from her home in Maine, in the northeastern United States, on the border with Canada, where the cold is relentless at this time of year. She sips tea to warm her throat and apologizes for being late; her schedule is so packed these days that it was impossible to find an opportunity to do this interview in person.
As someone with a child in the US, I find this new Trump threat to scrutinise tourists' social media concerning. Providing my user name would be OK (the authorities would get sick of scrolling through chicken pics before they found anything critical of their Glorious Leader), but what if I have to hand over my phone at the border, as has happened to some travellers already?
Sometimes, a false sense of intimacy with AI can lead people to share information online that they never would otherwise. AI companies may have employees who work on improving the privacy aspects of their models, but it's not advisable to share credit card details, Social Security numbers, your home address, personal medical history, or other personally identifiable information with AI chatbots.
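As a rough illustration of that advice (not something the article describes), here is a minimal Python sketch that scrubs a few obvious identifiers from a prompt before it is sent anywhere; the regex patterns are simplistic stand-ins, and real PII detection needs far more than this.

```python
# Hypothetical example: strip obvious identifiers from a prompt before sending
# it to any chatbot. These patterns are illustrative and far from exhaustive.
import re

PII_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace anything matching a known pattern with a placeholder tag."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text

prompt = "My SSN is 123-45-6789 and my card is 4111 1111 1111 1111, can you help?"
print(redact(prompt))
# -> "My SSN is [SSN REDACTED] and my card is [CREDIT_CARD REDACTED], can you help?"
```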
"This took all of 20 minutes," Exempt, a member of the group that carried out the ploy, told WIRED. He claims that his group has been successful in extracting similar information from virtually every major US tech company, including Apple and Amazon, as well as more fringe platforms like video-sharing site Rumble, which is popular with far-right influencers. Exempt shared the information Charter Communications sent to the group with WIRED, and explained that the victim was a "gamer" from New York.
Reddit has launched a challenge in Australia's highest court against the nation's landmark social media ban for children. The online forum is among 10 social media platforms which must bar Australians aged under 16 from having accounts, under a new law which began on Wednesday. The ban, which is being watched closely around the world, was justified by campaigners and the government as necessary to protect children from harmful content and algorithms.
Ncontracts found that leadership support for compliance is rising, with 82% of respondents saying they're satisfied with board and management backing, while 74% are satisfied with their institution's compliance culture. More than half (56%) reported stronger integration of compliance into policies, procedures and training since 2021. Nearly 40% of institutions operate with one or two compliance professionals, while 25% of firms with $1 billion to $10 billion in assets have similarly small teams.
The government surveils you every time you drive through San Jose, collecting a trove of highly revealing data that police search thousands of times per month without ever seeking a warrant. It's an unchecked police power, an end run around judicial oversight and a blatant privacy invasion. It's also a violation of the California Constitution. That's why we at the Electronic Frontier Foundation, with ACLU of Northern California, have sued the city, its police chief and its mayor.
The Indian government is reviewing a proposal by the telecom industry to require smartphone companies to keep satellite-based location tracking enabled at all times. The report states that the Indian government has for years been concerned about its agencies not getting precise locations when legal requests are made to telecom operators during investigations, because the telecom firms are limited to using cellular tower data.
Health data represents one of the most valuable types of personal data available to companies, whether this be for the training of AI (it is worth noting that the AI health care market is estimated to reach a value of around $187bn by 2030) or the development of digital health technology (such as wearables, estimated to be valued at around $76bn by 2030).