OpenAI might use Apple chipmaker TSMC for its chips
In another interesting move that hints at a symbiotic relationship, ChatGPT maker OpenAI has reportedly followed Apple in becoming a lead customer for TSMC's processors. Given the industry lead Apple has achieved with Apple Silicon, the move reads less as symbiosis than as tacit endorsement, and it follows reports that Apple might take a stake in OpenAI.
These moves by the biggest names in tech underscore the profound difference generative AI (genAI) has made, taking technology that has been part of the industry for decades and placing it at the forefront of the zeitgeist. That OpenAI plans to work with TSMC can also be seen as vindication of Apple's approach to silicon design, acknowledging the computational power these processors provide while meeting real-world constraints on energy supply.
The first OpenAI chips under the purported deal are set to roll off the production lines sometime in 2026.
A new platform battle?
As Apple stands on the cusp of becoming the world's biggest multi-platform AI ecosystem, the move also hints at new competition down the road. After all, it was only earlier this year that OpenAI CEO Sam Altman was reported to be getting into chip manufacturing. Now, the company has reportedly booked early capacity for chips built on TSMC's A16 process, which is expected to enter production in 2026.
Despite using the same foundry, the processors won't be the same as Apple's; they will apparently be designed by Broadcom and Marvell.
While it is very possible that OpenAI wants to use these chips inside its own servers, it is also plausible that it has plans to introduce its own devices, or to offer its AI, baked into silicon, to other computer hardware manufacturers.
It takes energy to make things happen
Everyone with a passing interest in genAI recognizes that the scale of energy consumption required to deliver server-based services using the tech is enormous. Even at this point in genAI deployment, the energy being used exceeds that required by some smaller nations, and those demands will only increase.
With that in mind, Apple's messaging around M-series computational performance per watt turns out to be even more prescient than previously believed. After all, if genAI is to be woven into global use, it must do so without consuming all the world's energy; reducing energy demands is mandatory. This also implies tech firms will continue to make major investments in renewable energy to drive those server farms, and suggests the carbon offset market will be forced to prove its legitimacy, rather than continuing to be a kind of 21st-century equivalent of Papal Indulgences (as George Monbiot once described it).
Power, profit, people
The chips Apple makes deliver excellent computational performance at significantly lower power than rival processors. Once Apple's production moves to TSMC's A16 process, a report claims, we'll see another 8% to 10% gain in performance for up to 20% less power.
That's great for Mac, iPad, and iPhone users; who doesn't want a more powerful device that uses less energy? But for server-based services handling millions of requests daily, that power difference affects both environmental impact and operating costs in the form of energy bills.
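To put those reported figures in rough perspective, here is a back-of-the-envelope sketch. The 10% performance and 20% power numbers come from the report above; the baseline values are arbitrary placeholders rather than real chip measurements.

```python
# Back-of-the-envelope estimate of the performance-per-watt gain implied by
# the reported A16 figures: ~10% more performance for up to 20% less power.
# Baseline numbers are arbitrary placeholders, not real chip measurements.

baseline_perf = 100.0   # arbitrary performance units
baseline_power = 10.0   # arbitrary watts

a16_perf = baseline_perf * 1.10    # +10% performance (upper end of the 8-10% range)
a16_power = baseline_power * 0.80  # -20% power (best case)

baseline_ppw = baseline_perf / baseline_power
a16_ppw = a16_perf / a16_power

print(f"Baseline perf/watt: {baseline_ppw:.2f}")
print(f"A16 perf/watt:      {a16_ppw:.2f}")
print(f"Implied gain:       {(a16_ppw / baseline_ppw - 1) * 100:.0f}%")  # roughly 38%
```

In other words, if both claims hold at once, the implied improvement in performance per watt is closer to a third than to 10%, which is the number that matters most when the same work is repeated millions of times a day in a data center.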
With that in mind, OpenAI doesn't need to become a hardware competitor to unlock value from chip design; its own running costs will be reduced dramatically through the introduction of more efficient chips, particularly as the number of people it serves grows from millions to billions.
While people in tech might see AI everywhere, most people haven’t begun using genAI tools and services just yet — something which is going to change within the next few weeks as Apple ships its AI-ready devices, starting with the next iPhone.
But if the direction of travel is anything to go by — a trajectory in which Apple and Microsoft seem set on investing in a company that could yet compete with both of them — it seems the people at the summit of Tech Power Mountain don’t merely see OpenAI as a service provider, but as a peer player in the future of IT. We just have to hope that neither they, nor the AI, are hallucinating.
Please follow me on LinkedIn or Mastodon, or join me in the AppleHolic's bar & grill group on MeWe.