Key insights of today’s newsletter:
AI companies like OpenAI, Anthropic, Google and Microsoft sell engines, not cars. They don't sell finished products; they sell raw intelligence.
This intelligence is rapidly becoming a commodity. Without large jumps in capabilities, the key differentiators are going to be speed and cost.
While AI may well turn out to be revolutionary, a large gap remains to this day between its promise and the value actually being created with it.
↓ Go deeper (6 min read)
On June 13, Wired published an article arguing the launch of Apple Intelligence proved AI is a feature, not a product. Here’s why that’s relevant.
AI, which has become synonymous with anything generative, has truly captured the world’s imagination. Foundation models have shown a great deal of mundane utility simply because of their ability to manipulate language and code. They are quick and they are versatile.
However, you need more than access to the API if you want to build something useful with them, as anyone who has tried knows. The companies that develop these models know it too, although some would like to convince us otherwise.
The commoditization of AI
The analogy I like to use is that AI companies like OpenAI, Anthropic, Google and Microsoft sell engines, not cars. They don’t sell a product; they sell raw intelligence.
Most people don’t realize it, but even chatbots like ChatGPT and Claude rely on much more than just their underlying model. Deploying these advanced general-purpose AI assistants requires the orchestration of all sorts of systems, from the hardware needed to scale to the tool integrations that allow the AI to write and run code for you.
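To make that concrete, here is a minimal sketch of the orchestration around a single "evaluate Python" tool, written against the OpenAI Python SDK. The model name is just an example, and the `eval` call stands in for what would really be an isolated execution sandbox; this is an illustration of the pattern, not how any particular vendor wires things up internally.

```python
# A minimal sketch of tool orchestration: the model is the engine,
# everything around it (tool schema, execution, message routing) is the car.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

tools = [{
    "type": "function",
    "function": {
        "name": "python_eval",
        "description": "Evaluate a single Python expression and return the result.",
        "parameters": {
            "type": "object",
            "properties": {"expression": {"type": "string"}},
            "required": ["expression"],
        },
    },
}]

messages = [{"role": "user", "content": "What is 2**32? Compute it, don't guess."}]
response = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
msg = response.choices[0].message

# The model only *asks* for the tool; the surrounding system has to run it.
while msg.tool_calls:
    messages.append(msg)
    for call in msg.tool_calls:
        expression = json.loads(call.function.arguments)["expression"]
        result = str(eval(expression))  # stand-in for a real, isolated sandbox
        messages.append({"role": "tool", "tool_call_id": call.id, "content": result})
    response = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
    msg = response.choices[0].message

print(msg.content)
```

The model contributes one thing here: deciding to call the tool and what to pass it. Everything else is plumbing you have to build and operate yourself.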
The models themselves are rapidly becoming a commodity. OpenAI continues to hold on to its position as the market leader, but that is mostly due to first-mover advantage. Intelligence is now available on demand, with the option to switch to a better or cheaper provider with the press of a button.
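Here is a quick sketch of what "press of a button" means in practice, assuming providers that expose OpenAI-compatible endpoints: switching engines is little more than changing a base URL and a model name. The second endpoint and model name below are purely illustrative placeholders, not real values.

```python
# Intelligence on demand: the application code stays the same, only the
# provider configuration changes. Assumes OpenAI-compatible chat endpoints.
import os
from openai import OpenAI

PROVIDERS = {
    # Real OpenAI endpoint; the model name is just an example.
    "openai": {"base_url": "https://api.openai.com/v1", "model": "gpt-4o-mini", "key_env": "OPENAI_API_KEY"},
    # Hypothetical cheaper/faster alternative; these values are placeholders.
    "other": {"base_url": "https://api.example.com/v1", "model": "some-cheaper-model", "key_env": "OTHER_API_KEY"},
}

def ask(provider: str, prompt: str) -> str:
    cfg = PROVIDERS[provider]
    client = OpenAI(base_url=cfg["base_url"], api_key=os.environ[cfg["key_env"]])
    response = client.chat.completions.create(
        model=cfg["model"],
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Same car, different engine:
print(ask("openai", "In one sentence: engine or car, which one am I talking to?"))
```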
In my latest piece about Anthropic, I mentioned that the focus has recently shifted towards price and speed optimizations. This reinforces the idea of commoditization, and I expect the trend to continue if scaling doesn’t get us much further.
A generation-defining wave
This matters because, at the end of the day, most people don’t really care about the processor in their computer or the engine in their car. They only care insofar as it works.
The general assumption is that a huge amount of economic value will be created by AI. Model providers sure hope so. But the onus is not on them to deliver value to end users. It’s on enterprises to turn this raw intelligence into features and functionalities that make money or save money. If they fail to do so, the bubble will inevitably burst. And in case you’re wondering how much return needs to be generated, consider AI’s $600B Question.
Don’t get me wrong, AI may well turn out to be a generation-defining technology, bubble or not. I just have a feeling it’s going to take a generation to materialize. And while we’re out here assembling cars, and AI companies are building ever-bigger engines, Nvidia is happy to provide us all with fuel.
One last thing…
Teaching computers how to talk is a reader-supported newsletter. Any contribution, however small, helps me allocate more time to the newsletter and put out the best articles I can.
Pick your number. Become a donor. Cancel anytime.
Join the conversation 🗣
Leave a like or comment if this article resonated with you.
Get in touch 📥
Shoot me an email at jurgen@cdisglobal.com.