The #AI pricing landscape is shifting faster than a quantum particle. For providers, the message is clear: cost-cutting alone is a losing game. The real winners will be those who can offer unique value – be it through industry specialization, edge solutions, or comprehensive platforms. Constellation analyst Andy Thurai sits down with R "Ray" Wang to explain the new #generativeAI price battle and what #CXOs need to know about the shifting generative AI landscape. Watch below ⬇

Read Andy's full blog on pricing wars here 👉 https://www.constellationr.com/blog-news/generative-ai-price-wars-what-c...

View the Full Video Transcript: (Disclaimer: this transcript has not been edited and may contain errors)

Hey everybody, it's Ray here with Andy, and we're going to talk about this big gen AI price war. But more importantly, I want to start with a comment, Andy. I think you've heard me say this many times: unlike the internet age, which was decentralized, cheaper, had a lot more players, and was open, AI is exactly the opposite. It is centralized, it is more expensive, it is closed, and there are only a few players that are going to win. And you're telling me, wait, there's an LLM gen AI price war going on? Because everybody else is saying it's getting more and more expensive, but you're saying, hey, something new is on the horizon.

So tell me more about this. You penned this blog recently, and more importantly, you're talking about something big that's happening on the horizon. First of all, hi everybody. When it comes down to the LLM price wars, or AI pricing, there are two parts to it. The first is the actual LLM pricing itself, right? For example, consumption pricing. A lot of these companies are unable to hold pricing power, so they are cutting, as you might have seen with Google, you know, Anthropic, OpenAI. I'm not talking about your subscription-based pricing where you pay 20 bucks a month.

That's, that's nobody, a couple of consumers with credit cards. We're not talking about them. We're talking about an API-based consumption model in which these companies run millions of, you know, inferences and what have you, right, or even an enterprise license. Those prices have been going down of late. OpenAI, Google, Anthropic, all of these companies have been reducing their prices, simply because there is major price compression, as you call it, but also because, you know, when you come up with a new model, you're top of the chart for maybe a couple of days, if that, and then after that others come up. You know, it's a cyclical phenomenon. So you can't just raise the price and compete, hoping that your price will gain traction. Price is one of the levers these companies are using, reducing the price to reduce the friction so companies can engage with them, so there is a major price reduction. But having said that, there's the other side of it: we're talking about implementation of AI.
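To make concrete why the API consumption side, rather than the 20-dollar-a-month subscription, is where these price cuts actually matter, here is a minimal back-of-the-envelope sketch. All rates, token counts, and request volumes below are hypothetical placeholders for illustration, not any vendor's actual pricing:

```python
# Hypothetical illustration of consumption-based LLM API pricing.
# Every number here is a made-up placeholder, not a real vendor rate.

def monthly_api_cost(requests_per_month: int,
                     avg_input_tokens: int,
                     avg_output_tokens: int,
                     price_in_per_mtok: float,
                     price_out_per_mtok: float) -> float:
    """Estimate monthly spend for a per-token, consumption-priced LLM API.

    Prices are expressed per million tokens (Mtok), the common unit
    for API price lists.
    """
    input_cost = requests_per_month * avg_input_tokens / 1_000_000 * price_in_per_mtok
    output_cost = requests_per_month * avg_output_tokens / 1_000_000 * price_out_per_mtok
    return input_cost + output_cost

# An enterprise running 5M inferences a month at placeholder rates:
cost = monthly_api_cost(5_000_000, 800, 300,
                        price_in_per_mtok=2.50, price_out_per_mtok=10.00)
print(f"${cost:,.0f}/month")  # orders of magnitude above a $20 subscription

# A 50% rate cut (the "price war") flows straight through the bill:
cut = monthly_api_cost(5_000_000, 800, 300,
                       price_in_per_mtok=1.25, price_out_per_mtok=5.00)
print(f"${cut:,.0f}/month after the cut")
```

The point of the sketch: at enterprise volumes the bill scales linearly with token rates, so a provider's headline per-Mtok price cut translates one-for-one into the customer's API spend, which is why this side of the market, not consumer subscriptions, is where the war is fought.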

You and I have talked multiple times about how production of AI, or even productionization of LLMs, is not that easy. There are other things that come with it: the security, the compliance, the bias removal, the platformization. There are so many pieces, even the consulting companies that are coming in and executing this, because there is a talent shortage and a resource shortage on that front, so the prices there are going up. So technically, the overall price, the sticker shock for enterprises, is going up for AI implementation, while the other price is going down. So prices are still going up to be able to do stuff in gen AI and in AI, but at least one of the core components is coming down, is what you're saying. Yep, yes. But here's the challenge, right? I was like, oh, I'm going to change models. And you can't just change models. You've got to train these puppies every time, right? So is this going to facilitate more switching of models? Or how do we actually bring the training with us when we switch models? Can we do that? Yet again, it depends is the answer there as well.

So one of the things we recommend, that I recommend on our, you know, advisory calls when enterprise clients ask for advice: don't choose a model first of all because it's cheap. Okay? Because there is more than the LLM, or even the total AI platform, pricing; the pricing component is just one portion of it. Choose a vendor that you're comfortable with, that you have an existing relationship with, but more importantly, look at the faster innovation cycle these guys have. You know, how often do they train the models? You know, whether they give you an option to fine-tune your models, or how you build RAG, and all of these capabilities.

So you need to look beyond that. Matter of fact, I did mention in my blog an idea, a thought, that people loved: remember, in the world of AI, today's bargain could be tomorrow's technical debt. So don't go with the cheaper option, build something that will be crap, and then tomorrow you have a problem. Find something that is compatible with you. So will open source win, right? Is Llama going to take over the world, versus closed models or very, very industry-specific models? It's too early to answer that question, right? Because look, companies like Aleph Alpha, they started even before OpenAI. They had a ton of traction at one point, but then after that, as you know, just two days ago, they decided to exit the race. They're not even going to be in it. By the way, let me ask you this question: do you know how many LLM providers exited just in 2024 alone?

I heard 20, but there could be more. No, so there are four major ones that left, right? Aleph Alpha is the latest one. The reason being the competitive nature of the business: they can't just monetize their models alone, so they're looking for other ways to engage and substantiate what they already have, including the platformization option. Did I answer your question? I might have. You did. Some of the smaller ones also exited because their verticals were too small; they didn't have enough data. Yeah, like, I know more than a dozen startups that actually thought they were going to build LLMs and jumped into the race and then realized, oh wait, we need data.

Like, yeah, you need lots of data, right? It's kind of funny, but they had the right idea. I think they were better off sitting on top of other people's models. Speaking of that, there's also another issue happening in the industry which I'm kind of concerned about as well. Remember you said a lot of the smaller companies don't have enough data, because collecting, curating, getting the data ready, and cleansing it to train an LLM is not easy; it's a very expensive process. So what a lot of the smaller companies, as you were suggesting, tried to do instead is use synthetic data creation to create a model, which is creating another problem, because you're using AI to create the AI model, so it's basically getting into a loop. You're using AI to create the data, you're using AI to create the models, and then you're using AI to validate them, so is any of it even real anymore? It's already a loop. Well, hey, anyways,

that's a really hot topic. It's one of the big blog posts people are reading as they try to understand what this price war is and what it means for them. So definitely keep your eyes on this topic. And of course, Andy covers this area; everybody looks at AI, and he's looking at this part of AI. So definitely watch this closely, and of course the shortlists that come with it. All right, hey, Andy, great catching up with you. I have nothing interesting to add. Thanks a lot, everybody.