Meta has made a lot of headway with its open source Llama family of large language models, but with increasing competition from China's DeepSeek and Qwen models, the company is ramping up distribution.
For enterprises, the biggest news out of Meta's LlamaCon AI developer conference was the Llama API, which is in limited free preview. With the API, developers will be able to better experiment with Llama models and pair them with the company's software development kits.
Pricing wasn't available for the Llama API, but it's clear that Meta is thinking about ways to monetize its flagship model.
Key takeaways about Llama API:
- The API has one-click API key creation and playgrounds to explore models, including Llama 4 Scout and Llama 4 Maverick.
- SDKs in both Python and TypeScript will be available for Llama app building.
- Llama API is compatible with OpenAI SDK.
- The API will include tools to evaluate and tune custom Llama models starting with Llama 3.3 8B.
- Meta said it is collaborating with Cerebras and Groq to speed up inference on Llama 4 models using Llama API.
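Because the Llama API follows OpenAI SDK conventions, calling a Llama 4 model should feel familiar to developers who already use OpenAI-style chat completions. The sketch below shows what such a request might look like; the base URL, model identifier, and endpoint details are illustrative assumptions, not confirmed specifics from Meta's announcement.

```python
# Minimal sketch of an OpenAI-style chat-completion request to the Llama API.
# The model id and base URL below are assumptions for illustration only.

def build_chat_request(prompt: str) -> dict:
    """Assemble an OpenAI-style chat-completion payload for a Llama model."""
    return {
        "model": "llama-4-maverick",  # assumed model identifier
        "messages": [
            {"role": "user", "content": prompt},
        ],
    }

request = build_chat_request("Summarize this quarter's earnings call.")

# Since the Llama API is OpenAI SDK compatible, sending the request would
# look something like this (requires the `openai` package and an API key):
#
#   from openai import OpenAI
#   client = OpenAI(base_url="https://api.llama.com/v1", api_key="YOUR_KEY")
#   response = client.chat.completions.create(**request)
```

The practical upshot of the compatibility is that existing OpenAI-based code could be pointed at Llama models by swapping the base URL and model name, rather than rewriting the integration.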
On the consumer side of the Llama equation, Meta launched a standalone Meta AI app. The previous strategy for Meta AI revolved around infusing it into the company's family of applications. If you wanted Meta AI, you had to use it through Facebook, Instagram or WhatsApp. The problem is that some of us don't like any of those apps. Now you'll be able to use Meta AI on its own.
The Meta AI app will run on Llama 4 and feature text and voice interfaces. The app will also have Meta AI features such as image generation and editing.