Google says new cloud-based "Private AI Compute" is just as secure as local processing
Briefly

"NPUs can't do it all, though. While Gemini Nano is getting more capable, it can't compete with models that run on massive, high-wattage servers. That might be why some AI features, like the temporarily unavailable Daily Brief, don't do much on the Pixels. Magic Cue, which surfaces personal data based on screen context, is probably in a similar place. Google now says that Magic Cue will get "even more helpful" thanks to the Private AI Compute system."
"Google has also released a Pixel feature drop today, but there aren't many new features of note (unless you've been hankering for Wicked themes). As part of the update, Magic Cue will begin using the Private AI Compute system to generate suggestions. The more powerful model might be able to tease out more actionable details from your data. Google also notes the Recorder app will be able to summarize in more languages thanks to the secure cloud."
"There are still reasons to use local AI, even if the cloud system has "the same security and privacy assurances," as Google claims. An NPU offers superior latency because your data doesn't have to go anywhere, and it's more reliable, as AI features will still work without an Internet connection. Google believes this hybrid approach is the way forward for generative AI, which requires significant processing even for seemingly simple tasks."
NPUs on Pixel devices remain limited compared with large, high-wattage server models, so some on-device features, like Daily Brief and Magic Cue, underperform. With the latest Pixel feature drop, Magic Cue will begin generating suggestions through the Private AI Compute system, where a more powerful model may extract more actionable details from personal data; the Recorder app will likewise summarize in more languages via the secure cloud. In short, more user data will be offloaded to the cloud to improve suggestions, while local NPUs continue to provide superior latency and offline reliability in a hybrid approach.
Read at Ars Technica