Meta Platforms Set to Launch Llama 4 Amid Ongoing AI Developments
Meta Platforms is anticipated to unveil the latest iteration of its large language model, Llama 4, later this month. The release has already experienced multiple delays, according to a report by The Information. Meta, the parent company of Facebook, faces significant competitive pressure in the rapidly evolving AI sector.
There are indications that the rollout of Llama 4 could be postponed further. According to sources familiar with the development, the model has not met certain technical benchmarks that are crucial for Meta's ambitions in AI, particularly in reasoning capabilities and proficiency with mathematical tasks.
Concerns have also been raised about Llama 4's ability to hold human-like conversations, an area in which OpenAI's models are seen as more advanced.
In a bid to strengthen its AI infrastructure amid rising investor expectations, Meta has earmarked up to $65 billion (approximately Rs. 5,39,000 crore) for 2025. The competitive landscape has been reshaped by the emergence of more affordable AI models, notably from companies like DeepSeek, raising questions about whether such hefty investments are necessary for AI development.
The report further highlighted that Llama 4 is set to incorporate certain technical features developed by DeepSeek. At least one version of the model will use a machine-learning approach known as the mixture-of-experts (MoE) method. This technique splits the model into specialized sub-networks, or "experts", each handling particular kinds of inputs, so that only the relevant experts are activated for a given task.
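To make the idea concrete, here is a minimal, illustrative sketch of a mixture-of-experts layer in PyTorch. It is not Meta's or DeepSeek's implementation; the layer sizes, the number of experts, and the top-k routing choice are assumptions chosen only to show how a gating network routes each token to a subset of expert networks.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixtureOfExperts(nn.Module):
    """Toy mixture-of-experts layer: a gating network routes each token
    to its top-k expert feed-forward networks and mixes their outputs."""

    def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each "expert" is a small feed-forward network.
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(num_experts)]
        )
        # The gate scores how relevant each expert is to a given token.
        self.gate = nn.Linear(dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, dim)
        scores = self.gate(x)                              # (tokens, experts)
        top_w, top_idx = scores.topk(self.top_k, dim=-1)   # pick top-k experts per token
        top_w = F.softmax(top_w, dim=-1)                   # normalize their mixing weights
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = top_idx[:, k] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += top_w[mask, k:k + 1] * expert(x[mask])
        return out


if __name__ == "__main__":
    layer = MixtureOfExperts(dim=16)
    tokens = torch.randn(8, 16)    # 8 tokens with 16-dimensional embeddings
    print(layer(tokens).shape)     # torch.Size([8, 16])
```

Because each token only passes through a few experts rather than the whole network, compute per token stays roughly constant even as the total parameter count grows, which is why the approach is associated with cheaper training and inference.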
Additionally, Meta is contemplating a phased approach for Llama 4’s release, starting with an initial introduction through Meta AI, followed by a later open-source version.
Previously, Meta launched its Llama 3 model, which has garnered attention for its ability to communicate in eight languages, produce higher-quality code, and tackle more complex mathematical problems than its predecessors.
Source: www.gadgets360.com