DeepSeek Triggers Plunge in Tech Stocks
In a day that set the global financial community abuzz, markets saw dramatic swings as the assumption that only massive capital expenditure can drive AI progress was put to the test. The computational industry, particularly the "pick-and-shovel" players that profit from supplying the tools for the AI gold rush, was caught in a violent sell-off.
By the close, two high-performance computing powerhouses, Nvidia and Broadcom, had each lost more than 15%. Supply-chain players such as TSMC, ASML, and Tokyo Electron suffered similar declines. The "AI + power/nuclear" theme had posted strong gains earlier in the year, yet stocks such as Constellation Energy, Vistra Energy, GE Vernova, and nuclear names like Oklo and NuScale were hit as well, with intraday drops approaching 20% at their lows.
The catalyst for the upheaval came from within the AI industry itself: DeepSeek published a method for training large models that rival those of industry giants like OpenAI at a fraction of the cost, and that engineering teams around the world could plausibly replicate. The breakthrough raised eyebrows on Wall Street, igniting fierce skepticism about the lofty valuations of the technology giants.
Adding to the tension, much of the US market's gains over the past two years had come from just a handful of tech behemoths. Analysts had reluctantly accepted that these companies' profit growth might not keep pace with their share prices, which were sustained largely by momentum and hype. Any blow to the logic underpinning those valuations therefore made them harder to sustain.
Despite Monday's significant sell-off in AI stocks, some bullish analysts pushed back, arguing that DeepSeek's achievements should not be viewed as purely detrimental to the sector at large.
Cantor Fitzgerald, the investment bank founded by incoming US Commerce Secretary Howard Lutnick, told clients in a note the same day that the emergence of cheaper-to-train large models from China could actually benefit high-end GPU makers and data center builders in the long run.
In the report, C.J. Muse, Cantor's semiconductor sector analyst, noted the collective anxiety about computational demand since DeepSeek released its V3 model: many feared GPU requirements had peaked. Muse called that view fundamentally misaligned with reality. On the contrary, he argued, the advance is strongly "bullish," signaling that artificial general intelligence (AGI) is closer than ever. He also invoked the Jevons paradox, the observation that as the efficiency of a resource's use improves, total consumption of that resource can rise rather than fall, to argue that the AI sector's demand for computing power will only continue to grow.

The timing of Muse's commentary dovetailed with weekend remarks from Microsoft CEO Satya Nadella, who also invoked the Jevons paradox: as AI technologies become more efficient and accessible, he posited, demand for them is likely to surge, turning them into a sought-after commodity.
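The Jevons paradox argument can be made concrete with a toy model. The sketch below (all numbers are made up for illustration, not drawn from the article) assumes constant-elasticity demand for tokens: if demand is price-elastic (elasticity greater than 1), a 95% drop in cost per token *increases* total compute spending rather than shrinking it.

```python
def total_spend(cost_per_token: float, elasticity: float,
                base_cost: float = 1.0, base_demand: float = 1.0) -> float:
    """Total spend under constant-elasticity demand.

    demand = base_demand * (cost/base_cost) ** (-elasticity)
    spend  = demand * cost
    """
    demand = base_demand * (cost_per_token / base_cost) ** (-elasticity)
    return demand * cost_per_token

# Hypothetical elasticity of 1.5 (> 1, i.e. price-elastic demand).
before = total_spend(cost_per_token=1.0, elasticity=1.5)
after = total_spend(cost_per_token=0.05, elasticity=1.5)  # 95% cheaper per token

# Despite each token costing 95% less, total spending rises,
# because usage grows faster than the price falls.
print(before < after)
```

If elasticity were below 1, the same price cut would shrink total spending; the bulls' case implicitly assumes AI demand sits on the elastic side of that line.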
Muse added that the industry will keep investing in pre-training, post-training, and test-time reasoning, and that spending on large-scale compute clusters is bound to accelerate further. He reiterated that the development is a positive signal for rising computational demand, not a declining one.
UBS semiconductor research head Timothy Arcuri echoed the sentiment in a Monday note. While questions remain about the resources used to train the newly released R1 model, he wrote, they do not undermine its efficiency at inference: its reported cost per token is more than 95% lower than that of OpenAI's o1. Developers are likely to fold R1's techniques into their own models, improving performance in the process.
Arcuri concluded that although the trend might appear to reduce computational demand, the reality is that even as models become more efficient, enormous computing power is still needed to keep pushing model performance higher.
Bernstein analysts, meanwhile, offered a distinct take on the market turmoil. They argued that stocks were being swayed excessively by panic stoked on social media. On AI specifically, the team noted that the cost growth of today's large models simply cannot continue without bound; for AI to progress under cost constraints, it needs techniques such as MoE (Mixture of Experts), distillation, and mixed-precision training. These advances can balance cost control with technological breakthroughs and push the sector to new heights.
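Of the cost-saving techniques Bernstein names, MoE is the one most associated with DeepSeek's models. The minimal sketch below (a generic illustration, not DeepSeek's actual architecture; all dimensions and weights are arbitrary) shows the core idea: a gate scores every expert for each token, but only the top-k experts actually run, so the compute spent per token is a small fraction of the model's total parameters.

```python
import math
import random

random.seed(0)
d, n_experts, top_k = 8, 4, 2  # toy sizes, chosen arbitrarily

# Gate: one weight vector per expert. Expert: a single scalar scale,
# standing in for a full feed-forward block.
gate = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n_experts)]
experts = [random.gauss(0, 1) for _ in range(n_experts)]

def moe(x):
    """Route token vector x to its top_k experts, weighted by softmax."""
    scores = [sum(g_i * x_i for g_i, x_i in zip(g, x)) for g in gate]
    top = sorted(range(n_experts), key=lambda i: scores[i])[-top_k:]
    m = max(scores[i] for i in top)
    w = [math.exp(scores[i] - m) for i in top]
    s = sum(w)
    w = [wi / s for wi in w]
    # Only the top_k selected experts run; the others cost no compute.
    return [sum(wi * experts[i] * x_j for wi, i in zip(w, top)) for x_j in x]

token = [random.gauss(0, 1) for _ in range(d)]
out = moe(token)  # per token, 2 of 4 experts are active: half the FLOPs
```

The same logic scales up: a model can hold hundreds of experts' worth of parameters while each token only pays for a handful, which is exactly the kind of cost-for-capability trade Bernstein is describing.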
Bernstein's analysts also endorsed the Jevons paradox, arguing that any newly unlocked computing capacity is more likely to be absorbed by increased usage and demand than to reduce long-term spending. AI's computational demands, they asserted, are nowhere near their ceiling.