On Sept. 15, Apple released its latest “Apple Intelligence” feature for updated AirPods (Pro 2 and later), allowing users to receive live translations of their conversations, synced to a paired iPhone. The update also creates live transcripts of conversations in the iPhone’s Live tab, letting users review and verify the automated translation.
To activate the feature, users can either press both AirPod stems simultaneously and tell Siri to “start Live Translation” or simply press the Action button on their iPhone. The translation isn’t limited to in-person conversations, though: users can also enable the feature for audio calls, FaceTime, and text messages. While the technology isn’t perfect and may occasionally misinterpret voices or phrases, Live Translation changes the translation game where it matters most: live, face-to-face conversation.
While Google Translate can convert pictures of foreign text into English (or whatever language you choose), the new AirPods let users immerse themselves in a foreign language without the delay of pulling up an online dictionary. A simple gesture starts Live Translation, and translated audio begins within seconds of the spoken conversation. Speakers no longer have to limit themselves to simple words, or decide that an extra comment isn’t worth the effort of translating, when they host multilingual conversations. The fluidity of the translation and the ability to hold a conversation face-to-face will likely change the way people communicate and hopefully reduce the impact of language barriers.

While it may take a few months before everyone actually updates their Apple devices, the social effect of Live Translation will be noticeable. In the United States, for example, where 15.4% of the population are immigrants (Pew Research Center), non-English speakers will have an easier time paying for services or finding jobs that would otherwise have been out of reach because of language requirements. As tourists travel to different countries, Live Translation will also make it easier to ask for directions or learn about a town’s history from a local.
Apple’s jump into the translation field follows Apple Intelligence’s broader theme of lowering barriers and elevating self-expression. Features like Genmoji, which generates custom emoji, and Image Playground, which generates custom images, reflect Apple’s dedication to enhancing communication, particularly among Gen Z. Meeting that creative interest, Apple has also opened the same large language model that fuels Apple Intelligence to any developer. The other underlying initiative seems to be to normalize AI in virtually all aspects of life.
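For developers, that access comes through Apple’s Foundation Models framework. The sketch below is a minimal, hedged illustration of calling the on-device model from Swift; the instructions and prompt strings are invented for this example, and the exact API surface may vary across OS versions.

```swift
import FoundationModels

// Minimal sketch of querying the on-device Apple Intelligence model
// via the Foundation Models framework (iOS 26 and later).
// The instructions and prompt below are illustrative, not from Apple.
func suggestTravelPhrase() async throws -> String {
    // A session holds conversation state and optional instructions.
    let session = LanguageModelSession(
        instructions: "You are a concise travel-phrase assistant."
    )

    // Ask the model for a response; inference runs entirely on device.
    let response = try await session.respond(
        to: "Give me a polite way to ask for directions in Spanish."
    )
    return response.content
}
```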
