
"R unpacking items from a box," read one notification from the Nest camera on a shelf in the kitchen. "Jenni cuts a pie / B walks into the kitchen, washes dishes in the sink / Jenni gets a drink from the refrigerator," it continued. Sometimes the alerts sounded like the start of a joke: "A dog, a person, and two cats walk into the room / Two chickens walk across the patio."
But these weren't jokes. They were mostly accurate descriptions of the goings-on in and around my home, where I'd installed several Google Nest cameras powered by Gemini for Home. This is a new AI layer in the Google Home app that interprets footage from the cameras and - combined with Nest's facial recognition feature - delivers a written description of events, including who or what is present, what they're doing, and sometimes even what they're wearing.
Gemini for Home interprets Nest camera footage and generates descriptive notifications identifying people, animals, actions, and sometimes clothing. Notifications shift from generic alerts (e.g., "animal detected") to specific descriptions (e.g., two chickens and one dog), enabling quicker, prioritized responses and, for some doorbell alerts, reducing anxiety. The system uses facial recognition to identify household members by name. Over a 72-hour test, accuracy was generally high, with no hallucinated strangers or wildlife. The enhanced visibility makes the cameras more useful, but it also amplifies concerns about constant surveillance, privacy, and how much AI should be trusted to observe a home.
Read at The Verge