
Meta just announced that it is expanding its AI offerings with a new translation and dubbing feature for Reels on Facebook and Instagram. The feature automatically translates audio between English and Spanish, clones the creator’s voice, and syncs the dubbed track with the creator’s lip movements.
Creators can activate the feature before publishing by selecting “Translate voices with Meta AI” and previewing the dubbed version. Translated Reels are labeled as AI-generated and surfaced to users who prefer the translated language.
The rollout applies to Facebook creators with at least 1,000 followers and to all public Instagram accounts.
Part of a broader AI push
This update comes as Meta continues investing heavily in AI products. At its Meta Connect 2024 event, the company previewed the dubbing feature alongside other creator tools, including AI-generated backgrounds and filters. The new dubbing function demonstrates Meta’s strategy of embedding AI directly into creative workflows, reducing barriers for creators who want to reach audiences across languages.
Meta has already built several large-scale language initiatives. Its SeamlessM4T model, introduced in 2023, supports speech-to-text and speech-to-speech translation across nearly 100 languages. The company also launched the “No Language Left Behind” project, an open-source effort that provides high-quality translation for more than 200 languages. The new dubbing feature narrows that scope to a single high-impact pairing, English and Spanish, where demand is strongest.
Also see: Meta’s standalone AI app built with Llama 4 highlights how Meta is extending generative voice technologies across formats.
What Meta says
“We’re testing a Meta AI translation tool that will automatically translate the audio of Reels, so more people can enjoy your content, even if you speak different languages,” Meta wrote when introducing the technology at Connect 2024.
In February 2025, Meta reiterated its longer-term commitment with the launch of the Language Technology Partner Program.
“We’re seeking partners to collaborate with us on advancing and broadening Meta’s open source language technologies, including AI translation technologies,” the company said. “Our efforts are especially focused on underserved languages, in support of UNESCO’s work.”
What it means for creators
For creators, this tool offers a cost-free alternative to professional dubbing or subtitling services. Until now, a US creator hoping to reach Spanish-speaking audiences might have needed to hire translators or rely on captions. Auto-dubbing allows them to distribute the same content to new regions without additional production costs.
On Instagram, where Reels compete with TikTok videos for attention, the ability to speak directly to viewers in their preferred language, even through an AI-generated voice, could boost engagement metrics such as watch time, shares, and follows. For Facebook creators with large followings, the feature opens opportunities in Latin America and among bilingual communities in the US.
Enterprise and marketing implications
The tool also carries implications beyond individual creators. Brands can use AI dubbing to localize product campaigns, influencer partnerships, and short-form ads at scale. Instead of producing multiple versions of a campaign, companies can publish one and let Meta’s AI handle translation and dubbing.
This reduces time-to-market for global marketing strategies and could help Meta position its platforms as more competitive for international advertisers. The move aligns with Meta’s broader attempt to keep creators and brands producing directly for Instagram and Facebook, rather than outsourcing to other platforms.
Risks and limitations
As with any AI system, risks remain. Translation accuracy is not guaranteed: idioms and cultural nuances may be lost or distorted. Lip-syncing may also feel unnatural in some cases, potentially distracting viewers.
Voice cloning raises ethical concerns as well. Meta says videos will carry an AI-translation disclosure, but creators and audiences may still worry about consent, misuse, or the possibility of deepfake manipulation. For now, creators must opt in manually, which may help mitigate unintended use.
The feature is also limited to English and Spanish, leaving out other major creator languages. Expanding language support will be critical for Meta to broaden adoption.
Also see: Zuckerberg’s vision for “personal superintelligence” demonstrates Meta’s motivation behind embedding AI in creative tools.
What to watch next
Meta has made clear this is just the beginning. Future updates may expand dubbing to additional languages and integrate with other products, such as Ray-Ban Meta smart glasses, which already offer real-time translation in a more personal setting.
Another open question is whether Meta will expose this technology as an API for businesses, schools, or enterprises to use outside of Reels. That could put Meta in closer competition with startups offering enterprise-grade dubbing and localization services.
Key facts
- August 19, 2025: Meta launched AI-powered translation and dubbing for Facebook and Instagram Reels.
- Available for Facebook creators with at least 1,000 followers and all public Instagram accounts.
- Launches with English ↔ Spanish support, including voice cloning and lip-syncing.
- Announced at Meta Connect 2024 and tested with select creators.
- Builds on Meta’s SeamlessM4T model and No Language Left Behind project.
Implications
- Enterprise and marketing: Brands can scale campaigns more easily across language markets.
- Risk/opportunity: Greater global reach for creators, offset by ethical concerns about voice cloning.
- What to watch: Expansion into new languages, accuracy improvements, and potential enterprise integrations.