Every month, people translate around 1 trillion words across Google Translate, Search, and visual translations in Lens and Circle to Search. Now, thanks to AI, we’re making it even more effortless to overcome language barriers. Using the advanced reasoning and multimodal capabilities of Gemini models, we’re bringing two new features to Translate to help with live conversations and language learning.
Live conversations translated in real time
Exploring the world is more meaningful when you can easily connect with the people you meet along the way. To help with this, we’ve introduced the ability to have a back-and-forth conversation in real time with audio and on-screen translations through the Translate app. Building on our existing live conversation experience, our advanced AI models are now making it even easier to have a live conversation in more than 70 languages — including Arabic, French, Hindi, Korean, Spanish, and Tamil.
To try it out, open the Translate app for Android or iOS, tap “Live translate,” select the languages you want to translate and simply begin speaking. You’ll hear the translation aloud and see a transcript of your conversation in both languages on your device. Translate smoothly switches between the two languages you and your conversation partner are speaking, intelligently identifying conversational pauses, accents, and intonations. This allows you to have a natural conversation with just a single tap.