5 new AI-powered features that flew under the radar at Apple's launch event
Briefly

"As an AI reporter, I have covered every major smartphone launch event in the past year. The line between hardware and software is blurring with each release, and new AI features are equally noteworthy company to company. This week, Apple took a quieter approach to embedding AI in its products."
"Ironically, the biggest AI upgrade didn't even occur in the iPhone but rather in the new AirPods Pro 3. Live Translation brings the real-time translating capabilities Apple announced at WWDC back in June in iOS 26 to AirPods. As the name implies, when the feature is activated, users can partake in free-flowing natural conversations and hear the translation of what the other person is saying live in their ears. For users who don't have the AirPods, they can see the live transcription on their iPhone screen."
"After a streak of overpromising and underdelivering in the AI space, Apple went back to basics with its new product drops. The focus of the new smartphones, watches, and AirPods was on the new hardware, including better specs across cameras, battery life, form factor, and more."
Apple prioritized hardware improvements across new iPhones, watches, and AirPods while quietly integrating AI enhancements that improve user experience. AI features were present but often not branded as standalone Apple Intelligence offerings, appearing instead as practical enhancements embedded in device functionality. The most prominent AI upgrade arrived in AirPods Pro 3 with Live Translation, enabling real-time translated conversation audio and on-screen transcription via iOS 26. The approach contrasts with louder AI marketing from other smartphone makers and focuses on seamless, useful features that complement stronger cameras, longer battery life, and refined form factors.
Read at ZDNET