Meta is rolling out an AI-powered voice translation feature to all users on Facebook and Instagram globally, the company announced on Tuesday.
The new feature, which is offered in any market where Meta AI is available, allows creators to translate their content into other languages so it can reach a broader audience.
The feature was first announced at Meta’s Connect developer conference last year, where the company said it would pilot test automatic translations of creators’ voices in reels across both Facebook and Instagram.
Meta notes that the AI translations use the sound and tone of the creator’s own voice, so the dubbed audio sounds authentic when the content is translated into a new language.
In addition, creators can optionally use a lip sync feature that aligns the translation with their lip movements, making the dubbed video appear more natural.
At launch, the feature supports translations between English and Spanish, with more languages to be added over time. These AI translations are available to Facebook creators with 1,000 or more followers and to all public Instagram accounts in markets where Meta AI is offered.
To access the option, creators can click “Translate your voice with Meta AI” before publishing their reel. They can then toggle the setting to turn on translations and choose whether to include lip syncing as well. When they click “Share now” to publish the reel, the translation becomes available automatically.
Creators can view translations and lip syncs before they’re posted publicly, and can toggle off either option at any time. (Rejecting the translation won’t impact the original reel, the company notes.) Viewers watching the translated reel will see a notice at the bottom that indicates it was translated with Meta AI. Those who don’t want to see translated reels in select languages can disable this in the settings menu.

Creators are also gaining access to a new metric in their Insights panel, where they can see their views by language. This can help them better understand how their content is reaching new audiences via translations — something that will be more helpful as additional languages are supported over time.
Meta recommends that creators who want to use the feature face forward, speak clearly, and avoid covering their mouths when recording. Minimal background noise or music also helps. The feature supports up to two speakers, who should not talk over each other for the translation to work.
Plus, Facebook creators will be able to upload up to 20 of their own dubbed audio tracks to a reel to expand their audience beyond English- and Spanish-speaking markets. This option is offered in the “Closed captions and translations” section of the Meta Business Suite and, unlike the AI feature, supports adding translations both before and after publishing.

Meta says more languages will be supported in the future, but did not detail which ones would be next to come or when.
“We believe there are lots of amazing creators out there who have potential audiences who don’t necessarily speak the same language,” explained Instagram head Adam Mosseri, in a post on Instagram. “And if we can help you reach those audiences who speak other languages, reach across cultural and linguistic barriers, we can help you grow your following and get more value out of Instagram and the platform.”
The launch of the AI feature comes as multiple reports indicate that Meta is restructuring its AI group again to focus on four key areas: research, superintelligence, products, and infrastructure.