Apple's Visual Intelligence is a built-in take on Google Lens
Briefly

Apple's new Visual Intelligence feature allows users to instantly learn about objects seen through their iPhone camera, making everyday inquiries like identifying dog breeds or finding restaurant hours simpler.
Craig Federighi highlighted that Visual Intelligence is activated through the new capacitive Camera Control button, streamlining interaction while encouraging exploration and learning through visual cues.
The integration with services like Google and ChatGPT showcases Apple's commitment to competing in the AI space by providing tools that make information retrieval more intuitive and accessible.
While the feature's exact launch date remains undisclosed, it represents a significant advancement in Apple's suite of AI capabilities, bridging the gap between visual interaction and information processing.
Read at The Verge