The AirPods Pro 3 Live Translation still proves that Google's AI is miles ahead - Yanko Design
Briefly

"Apple just launched a pretty impressive feature with their AirPods Pro 3 - the ability to actively translate any language in real-time, so you can listen to anyone speak but understand them perfectly. The only problem is that this feature made it to Google's Pixel Buds in 2017 - yes, nearly a decade back. I don't mean to be hard on Apple, the company has a remarkable control over its hardware,"
"The only problem isn't just that this feature's 8 years too late... it's that in the translation, you aren't listening to the opposite person talking, you're listening to the voice of Siri translating things instead. You could be talking to Morgan Freeman and the AirPods Pro 3 still play back his dialogues in Siri's voice. The dissonance can be a bit jarring, which is why Google unveiled Voice Translate this year at their August event."
AirPods Pro 3 include a Live Translation mode that translates any language in real time, allowing listeners to understand speakers immediately. The translated output is rendered in Siri's voice rather than preserving the original speaker's tone and timbre, which can create a dissonant listening experience. Google released conversational translation on Pixel Buds in 2017 and this year introduced Voice Translate, which preserves the original speaker's voice and tone. AirPods Pro 3 also offer health sensing capabilities such as heart rate monitoring, as well as adaptive audio that responds to the listener's emotional state, and prior AirPods models have been designed to function as hearing aids.