June 12, 2025

Challenges in the AI Chip Industry

In an age of rapid technological advancement, artificial intelligence stands out as one of the most dynamic and transformative frontiers. The emergence of DeepSeek has disrupted the AI chip market with the force of a sudden storm, redefining existing norms and expectations in ways previously thought unimaginable.

The launch of DeepSeek's open-source model acted like a stone thrown into a tranquil lake, creating ripples that reverberate throughout the industry. Its impact is illustrated by the billions of dollars wiped off the market cap of chip leader NVIDIA almost overnight. The episode vividly demonstrates how DeepSeek has shaken the old monopolistic structure, compelling the industry to recognize that technological revolutions can happen rapidly and disruptively. Yet this storm does not spell disaster for smaller AI companies; rather, it presents unprecedented opportunities. Many AI chip firms say they see in DeepSeek a beacon for scaling up and achieving breakthroughs, viewing it as a "great opportunity" rather than a threat.

Take Cerebras Systems, a chip startup, as an example.

Its CEO, Andrew Feldman, shared his enthusiasm: "Developers are exhibiting an unprecedented eagerness to replace the pricey closed models of OpenAI with open-source solutions like DeepSeek R1." The open-source nature of the technology gives developers a remarkable degree of freedom to modify and redistribute the source code to fit their specific needs, a stark contrast to the costly restrictions imposed by competitors like OpenAI. For Cerebras, which has long been one of NVIDIA's few challengers in AI model training, this opens new avenues for the cloud-based services it offers through its unique computing clusters. Feldman noted that the demand surge following the R1 model's release represents one of the largest spikes in demand for the company's services to date. He concluded, "The success of the R1 model clearly indicates that the growth of the AI market cannot be monopolized by a single player; open-source models have dismantled the ‘moats’ historically built around traditional hardware and software, promoting a fairer and more diverse competitive landscape."
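
As a rough illustration of what that freedom looks like in practice, the short sketch below loads an open-weight checkpoint locally with the Hugging Face transformers library and generates text from it. The model identifier is an assumption made for illustration only; any openly released checkpoint could be substituted, and nothing here reflects Cerebras's or DeepSeek's own tooling.

# Hedged sketch: running an open-weight model locally with Hugging Face transformers.
# The checkpoint name below is assumed for illustration; swap in any released model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Because the weights are local, developers can fine-tune, quantize, or redeploy
# them on their own infrastructure rather than calling a closed, metered API.
inputs = tokenizer("Explain why open-weight models matter:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))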

Last month, the release of DeepSeek's open-source R1 inference model shook the global market to its core. Not only does the model deliver performance on par with leading American technologies, it does so at a fraction of the cost, a feat that astonished industry observers. Reflecting on this, Feldman remarked, "Declining prices have historically been a key driver in the global spread of technology; look at personal computers and the internet. Today's AI market is following a similar long-term growth trajectory, and DeepSeek is hastening the process."

Within the chip arena, demand for inference chips is accelerating dramatically. Chip startups and industry experts alike assert that by fast-tracking AI's shift from "training" to "inference," DeepSeek will catalyze widespread adoption of new inference chip technologies.

In essence, inference refers to the application of AI to predict outcomes or make decisions based on new information, a process fundamentally distinct from the "training phase" where models are built.

Philip Lee, a semiconductor stock analyst at Morningstar, elaborated: "Training AI is about building the tools or algorithms; inference is about applying those tools in real-world scenarios to realize their value." While NVIDIA currently dominates the market for AI-training GPUs, many competitors see significant room to expand in inference, focusing on delivering efficiency at lower cost to capture market share. Lee emphasized that whereas AI training demands high computational power and relies primarily on powerful GPUs, inference can run on comparatively less powerful chips designed for specific tasks, prioritizing practicality and specialization.
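
To make that distinction concrete, here is a deliberately tiny, purely illustrative sketch in plain NumPy: a compute-heavy training loop fits the parameters of a toy linear model, and inference then amounts to a single cheap forward pass over new data. It is a minimal sketch of the training-versus-inference split in general, not a depiction of any particular vendor's hardware or software.

import numpy as np

# --- Training phase: build the "tool" by fitting parameters to known data ---
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                                     # historical inputs
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=200)   # known outcomes

w = np.zeros(3)
for _ in range(500):                          # iterative, compute-hungry optimization
    grad = 2 * X.T @ (X @ w - y) / len(y)     # gradient of mean squared error
    w -= 0.05 * grad                          # gradient-descent update

# --- Inference phase: apply the fitted model to new, unseen inputs ---
x_new = np.array([[0.3, -1.2, 0.8]])          # fresh data arriving at serving time
prediction = x_new @ w                        # one lightweight forward pass, no gradients
print(prediction)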

As demand grows for DeepSeek's open-source model and its applications, the need for inference chips and inference capacity is becoming increasingly evident. Sid Sheth, CEO of AI chip startup d-Matrix, commented, "In practice, smaller open-source models have proven to be just as powerful as large proprietary ones; in some respects they hold advantages and are more cost-effective. The widespread use of small, functionally focused models marks the dawn of a golden age of inference." He added that the company has observed an exponential increase in global customer interest in accelerating inference strategies, suggesting that the market is on the brink of a breakthrough in inference technology.

Robert Wachen, co-founder and COO of AI chip maker Etched, echoed the sentiment, sharing that since the release of DeepSeek's model, dozens of companies have proactively approached his startup. He noted, "Enterprises are now shifting their spending from training clusters to inference clusters, a clear signal of how market demand is transforming."
