Meta AI User Base Nears 600 Million Milestone

2024-12-10

Meta has announced that its AI assistant, Meta AI, is approaching a major milestone, with monthly active users nearing 600 million. The figure was shared by Meta CEO Mark Zuckerberg in his latest announcement, underscoring Meta AI's strong growth momentum in the AI industry.

Since its launch last autumn, Meta AI's user base has grown steadily, crossing the 500 million-user threshold this past October. The latest jump in user numbers coincided with the release of Meta's newest text model, Llama 3.3, which pairs stronger performance with lower operating costs and is designed to offer users more efficient and accessible AI services.

Llama 3.3 introduces a new 70-billion-parameter configuration that Meta says rivals the performance of the earlier 405-billion-parameter model at a substantially lower operating cost. To support these claims, Meta's Vice President of Generative AI, Ahmad Al-Dahle, shared comparison data showing Llama 3.3 performing strongly across multiple benchmarks, in some cases surpassing competitors such as Google's Gemini Pro 1.5 and OpenAI's GPT-4.
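For readers who want a concrete sense of the new 70-billion-parameter configuration, the sketch below shows one common way to query the instruct variant through the Hugging Face transformers library. The model identifier meta-llama/Llama-3.3-70B-Instruct and the hardware assumptions (an accepted model license and several high-memory GPUs, or quantization) are illustrative assumptions, not details from Meta's announcement.

```python
# Minimal sketch: querying a Llama 3.3 70B instruct model via Hugging Face transformers.
# Assumes the gated model license has been accepted on the Hub and that enough
# GPU memory is available; the model ID below is an assumption for illustration.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.3-70B-Instruct",  # assumed Hub model ID
    torch_dtype=torch.bfloat16,
    device_map="auto",  # spread the weights across available GPUs
)

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize what changed in Llama 3.3."},
]

output = generator(messages, max_new_tokens=128)
# The pipeline returns the full conversation; the last message is the model's reply.
print(output[0]["generated_text"][-1]["content"])
```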

Meta attributes the milestone to its ongoing innovation in post-training techniques. In particular, methods such as online preference optimization have significantly improved Llama 3.3's performance, laying a solid foundation for Meta AI's future development.
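Meta has not published the exact recipe behind the online preference optimization mentioned above, but the general family of preference-optimization losses can be illustrated with a short, self-contained sketch. The DPO-style loss below, with placeholder log-probabilities and a hypothetical beta temperature, illustrates only the broad idea of training on chosen-versus-rejected response pairs, not Meta's implementation.

```python
# Schematic preference-optimization loss (DPO-style), shown only to illustrate
# the general idea behind preference-based post-training.
# All tensors here are random placeholders, not Meta's data or hyperparameters.
import torch
import torch.nn.functional as F

def preference_loss(policy_chosen_logps, policy_rejected_logps,
                    ref_chosen_logps, ref_rejected_logps, beta=0.1):
    """Mean preference loss over a batch of chosen/rejected response pairs.

    Each argument is a 1-D tensor of summed token log-probabilities for the
    chosen or rejected response under the policy or a frozen reference model.
    """
    # How much more the policy prefers each response than the reference does.
    chosen_ratio = policy_chosen_logps - ref_chosen_logps
    rejected_ratio = policy_rejected_logps - ref_rejected_logps
    # Encourage a larger margin between chosen and rejected responses.
    margin = beta * (chosen_ratio - rejected_ratio)
    return -F.logsigmoid(margin).mean()

# Toy usage with placeholder log-probabilities for a batch of 4 pairs.
batch = 4
loss = preference_loss(
    policy_chosen_logps=torch.randn(batch),
    policy_rejected_logps=torch.randn(batch),
    ref_chosen_logps=torch.randn(batch),
    ref_rejected_logps=torch.randn(batch),
)
print(loss.item())
```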

In the announcement, Zuckerberg also outlined Meta AI's future plans. He said Llama 3.3 will be the final major model update of the year, with the highly anticipated Llama 4 on the horizon. Although specific details about Llama 4 have not yet been disclosed, Zuckerberg noted that the model is being trained on a cluster of more than 100,000 H100 GPUs, and that he expects the smaller Llama 4 models to be released early next year.