Microsoft is adding an Interpreter feature to Teams meetings that lets each participant speak and listen in the language of their choice. Interpreter uses AI-powered, real-time speech-to-speech translation during meetings.
A preview is planned for early 2025, covering up to nine languages, and will include the option to have the interpreter simulate an individual’s own voice in the translated language.
Interpreter is part of a broader set of AI-powered updates coming to Microsoft Teams. Meeting transcription will soon handle multilingual meetings, with support for translation into up to 31 languages.
In early 2025, alongside the usual transcripts and chat summaries, Microsoft also plans to give Teams the ability to understand and summarize visual content shared on screen during meetings, such as PowerPoint slides or web pages. Copilot will also be able to generate quick summaries of files shared in Teams chat, so you don’t have to open each file.
Microsoft is also leveraging Copilot Plus PCs to enable a Teams Super Resolution feature, which uses the device’s local NPU to improve the quality of video calls. If you’re dialing in over a weak Internet connection, this should boost the quality your colleagues see on their end. Windows app developers will also get access to a similar Image Super Resolution API in January for sharpening blurry images, alongside updates to the Copilot runtime that add image segmentation, object removal, and image description features.