Omdia: AI chip startups to have a tough year


Analysts from Omdia expect AI chip startups to have a difficult year.

Omdia’s Top AI Hardware Startups Market Radar finds that over 100 venture capitalists have invested more than $6 billion in the top 25 AI chip startups since 2018. However, it seems the good times are not set to last.

The global chip shortage is turning into an inventory crisis. Meanwhile, the economic downturn and tighter monetary policy have made raising funding far more difficult.

“The best-funded AI chip startups are under pressure to deliver the kind of software support developers are used to from the market leader, NVIDIA,” says Alexander Harrowell, Principal Analyst for Advanced Computing at Omdia.

“This is the key barrier to getting new AI chip technology into the market.”

Omdia predicts that at least one major startup will exit the market this year, likely through a sale to a major chipmaker or a hyperscale cloud provider.

“The most likely exit route is probably via trade sales to major vendors,” adds Harrowell.

“Apple has $23 billion in cash on its balance sheet and Amazon $35 billion, while Intel, NVIDIA, and AMD have some $10 billion between them. The hyperscalers have been very keen to adopt custom AI silicon and they can afford to maintain the skills involved.”

Over half of the $6 billion invested in AI chip startups has gone to large-die, coarse-grained reconfigurable array (CGRA) accelerators designed to load entire AI models on-chip. That approach is now being questioned due to the continued growth of AI models.

“In 2018 and 2019, the idea of bringing the entire model into on-chip memory made sense, as this approach offers extremely low latency and answers the input/output problems of large AI models,” explains Harrowell.

“However, the models have continued to grow dramatically ever since, making scalability a critical issue. More structured and internally complex models mean AI processors must offer more general-purpose programmability. As such, the future of AI processors may lie in a different direction.”
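To make the scalability issue concrete, here is a minimal back-of-the-envelope sketch in Python of whether a model’s weights fit in a large-die accelerator’s on-chip memory. The SRAM budget and parameter counts are illustrative assumptions for this sketch, not figures from the Omdia report:

    # Back-of-the-envelope: do a model's FP16 weights fit in on-chip SRAM?
    # All figures below are illustrative assumptions, not from the Omdia report.

    ON_CHIP_SRAM_GB = 1.0    # assumed SRAM budget for a large-die accelerator
    BYTES_PER_PARAM = 2      # FP16 weights: 2 bytes per parameter

    models = {
        "BERT-large (2018)": 340e6,    # ~340M parameters
        "GPT-3-class (2020)": 175e9,   # ~175B parameters
    }

    for name, params in models.items():
        needed_gb = params * BYTES_PER_PARAM / 1e9
        fits = needed_gb <= ON_CHIP_SRAM_GB
        print(f"{name}: needs {needed_gb:,.1f} GB of weights -> fits on-chip: {fits}")

Under these assumptions, a 2018-era model fits comfortably (roughly 0.7 GB of weights), while a GPT-3-scale model overflows on-chip memory by orders of magnitude (around 350 GB), which is why the all-on-chip design premise no longer holds.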

(Photo by Fabrizio Conti on Unsplash)

