That is fair, but it doesn't do much to push back against the risk to the independent model vendors, particularly in the consumer segment. They have represented a huge portion of AI capex so far — OAI alone accounted for 2GW of compute in 2025. The point is that they are in a fragile position, and so are the economics of AI DC spending in aggregate.
With respect to Google, I'd also wonder about the economics of AI search vs traditional.
Google's search revenue comes from ads, which depend somewhat on the quality and speed of the search result. Yeah, a better LLM could do it, but so could a better PageRank with NLP that actually works again.
I use DDG instead of Google, but it does show answers. I don't go to Google for chatbots; I go to find answers, and more often than not I'm unsatisfied with the LLM answer, so I end up diving past SEO spam (also LLM-written these days) to find where I need to go. It's very frustrating, and I'm feeling very pessimistic about the future of the web. It seems to be atrophying.
Google's search revenue, for example, was over $200B in 2025. Going forward, that revenue will be tightly coupled to the quality of their AI models.