Amazon said it will invest an additional $2.75 billion in Anthropic, bringing its total investment to $4 billion. The deal highlights the urgency of the generative AI arms race as hyperscalers create spheres of large language model influence.
Under the partnership, Anthropic uses AWS as its primary cloud provider and relies on AWS Trainium and Inferentia chips, while gaining more distribution and heft behind its Claude model. AWS and Anthropic outlined their initial partnership in September, and AWS has now exercised its option to invest more in Anthropic.
The LLM orbits break down like this:
- Microsoft and OpenAI.
- Microsoft's partnership with Mistral and its talent grabs, designed to ensure it has LLM brainpower and to keep regulators at bay.
- AWS and Anthropic.
- Google Cloud and Anthropic, to a lesser degree than AWS.
- Open-source models such as Meta's Llama, which can be customized through various marketplaces, GitHub, and Hugging Face.
- Oracle and Cohere.
- Google Cloud and its own Gemini models.
- A bevy of smaller models that are targeted at specific enterprise use cases.
- DBRX, a Databricks entry that goes along with its MosaicML models.
Today, it's clear that enterprise cloud and software giants are teaming up with LLM specialists as fast as possible. It's an arms race, and it's why you'll need a chief AI officer to sort out your LLM strategy.
But the larger question is what happens when LLMs become commoditized. Of course, no one is thinking about that possibility yet since the party is just getting started. These foundation models will lose importance as the game increasingly becomes about customization with company-specific data.
Constellation Research's take
Dion Hinchcliffe:
"Ultimately, it's all about the data. If AI offerings can entangle themselves in their customers' data in a way that is beneficial for the customer, yet hard to leave, then it’s a win. Commodity offerings won’t matter as much when switching costs are high. Such switching costs involve data gravity, product skill switching, lost training time (weeks/months to train the new model on enterprise data), and especially a track record — or a lack thereof — of trust/privacy. AI is likely the new lock-in. Yes, this implies private LLMs are where the big money is, and that is likely where we’ll end up. Commodity AI gets the public model market, hyperscale offerings get the enterprise data market. Use of public models with enterprise data is also another avenue for non-commodity offerings."