Meta's artificial intelligence chatbot, Meta AI, now has over 500 million monthly active users, CEO Mark Zuckerberg said during the company's earnings call on October 30, highlighting the rapid adoption of Meta's AI offerings. Meta AI launched in September of last year to compete with OpenAI's ChatGPT and Google's Gemini. Initially rolled out in over a dozen countries, Meta AI is now available in around 43 countries and multiple languages.
Meta AI’s Global Reach and Adoption
The chatbot is accessible across most Meta apps, including Facebook, WhatsApp, Instagram, and Messenger, with a dedicated web app also available.
"We estimate that there are now more than 3.2 billion people using at least one of our apps each day - and we're seeing rapid adoption of Meta AI and Llama, which is quickly becoming a standard across the industry," Mark Zuckerberg said during the call.
The CEO noted AI's positive impact across various facets of the business, from core engagement and monetization to the long-term roadmaps for new services and computing platforms.
Boosting Engagement and Ad Conversion
Discussing improvements to Meta's AI-driven feed and video recommendations, Zuckerberg mentioned that these enhancements have led to an 8 percent increase in time spent on Facebook and a 6 percent increase on Instagram this year.
"More than a million advertisers used our GenAI tools to create more than 15 million ads in the last month, and we estimate that businesses using Image Generation are seeing a 7 percent increase in conversions," he added.
Zuckerberg also reported exponential growth in the usage of Llama tokens this year. As Llama gains traction as an industry standard, he expects ongoing improvements in quality and efficiency to benefit all Meta products. This quarter, Meta released Llama 3.2, including small models for on-device use and open-source multi-modal models.
Infrastructure Expansion
Llama 4 is currently in development, with Zuckerberg revealing, "We're training the Llama 4 models on a cluster with more than 100,000 H100 GPUs, which is larger than anything I've seen reported elsewhere." The smaller Llama 4 models are expected to be ready first.
"First, it's clear that there are a lot of new opportunities to use new AI advances to accelerate our core business that should have strong ROI over the next few years, so I think we should invest more there. Second, our AI investments continue to require serious infrastructure, and I expect to continue investing significantly there too," Zuckerberg said.
Zuckerberg also shared updates on the AI-powered Ray-Ban Meta glasses, stating that demand remains strong. Meta's first full holographic AR glasses are also in development.
CFO Susan Li added that Meta AI is on track to become the most-used AI assistant globally by the end of the year. "Last month, we began introducing Voice, so you can speak with Meta AI more naturally, and it's now fully available in English to people in the US, Australia, Canada and New Zealand. In the US, people can now also upload photos to Meta AI to learn more about them, write captions for posts, and add, remove, or change things about their images with a simple text prompt. These are all built with our first multi-modal foundation model, Llama 3.2," she explained.
Li noted that Meta AI is being used for various tasks. "We're seeing frequent use cases like information gathering, ‘how-to’ tasks, deeper exploration of interests, content discovery, and image generation, which has become a popular use case," she said.
"In the near term our focus is really on making Meta AI increasingly valuable for people and if we're successful, we think there will be a broadening set of queries that people use it for, including more monetizable queries over time," Li said.
In-House Search Offering
Asked about plans to build the company's own in-house search offering, Li replied, "Meta AI draws from content across the web to address timely questions from users and provides sources for those results from our search engine partners. We've integrated with Bing and Google, both of which offer great search experiences."
"Like other companies, we also train our Gen AI models on content that is publicly available online, and we crawl the web for a variety of purposes," she added.
Zuckerberg highlighted the essential role of AI in products like Ray-Ban Meta glasses, AI Studio, and many other initiatives, saying AI will continue to be a significant factor in their development.
Productivity Gains with AI
Regarding AI's role in employee productivity, Li shared, "I think there are different efficiency opportunities with AI that we’ve been focused on in terms of where we can reduce costs over time and generate savings through increasing internal productivity in areas like coding. For example, it’s early, but we’re seeing a lot of adoption internally of our internal assistant and coding agent, and we continue to make Llama more effective at coding, which should also make this use case increasingly valuable to developers over time."
While Zuckerberg said a final budget has not yet been decided, Meta has raised its annual spending forecast to USD 38-40 billion, up from the range of USD 37-40 billion projected in July. The company expects a "significant acceleration" in infrastructure expenses next year due to expanded investments in servers, data centers, and network infrastructure.
Meta's family daily active people (DAP)—users engaging with any Meta app daily—increased by 5 percent year-over-year this quarter, reaching 3.29 billion.