DeepSeek R1's Implications: Winners and Losers in the Generative AI Value Chain
R1 is largely open, on par with leading proprietary models, appears to have been trained at substantially lower cost, and is cheaper to use in terms of API access, all of which point to a disruption that could change competitive dynamics in the field of generative AI.
- IoT Analytics sees end users and AI application providers as the biggest winners of these recent developments, while proprietary model providers stand to lose the most, based on value chain analysis from the Generative AI Market Report 2025-2030 (published January 2025).
Why it matters
For suppliers along the generative AI value chain: Players along the (generative) AI value chain may need to reassess their value propositions and adjust to a possible reality of low-cost, lightweight, open-weight models. For generative AI adopters: DeepSeek R1 and other frontier models that may follow present lower-cost options for AI adoption.
Background: DeepSeek's R1 model rattles the markets
DeepSeek's R1 model rocked the stock markets. On January 23, 2025, China-based AI startup DeepSeek released its open-source R1 reasoning generative AI (GenAI) model. News about R1 quickly spread, and by the start of stock trading on January 27, 2025, the market capitalization of many major technology companies with large AI footprints had fallen sharply:
- NVIDIA, a US-based chip designer best known for its data center GPUs, dropped 18% between the market close on January 24 and the market close on February 3.
- Microsoft, the leading hyperscaler in the cloud AI race with its Azure cloud services, dropped 7.5% (Jan 24-Feb 3).
- Broadcom, a semiconductor company focused on networking, broadband, and custom ASICs, dropped 11% (Jan 24-Feb 3).
- Siemens Energy, a German energy technology vendor that supplies energy solutions for data center operators, dropped 17.8% (Jan 24-Feb 3).
Market participants, and particularly investors, reacted to the narrative that the model DeepSeek released is on par with state-of-the-art models, was allegedly trained on just a few thousand GPUs, and is open source. However, since that initial sell-off, reports and analysis have shed some light on the initial hype.
The insights from this article are based on IoT Analytics' Generative AI Market Report 2025-2030 (published January 2025).
Download a sample to learn more about the report structure, select definitions, select market data, additional data points, and trends.
DeepSeek R1: What do we know so far?
DeepSeek R1 is a cost-efficient, state-of-the-art reasoning model that matches leading competitors while fostering openness through publicly available weights.
- DeepSeek R1 is on par with leading reasoning models. The performance of the largest DeepSeek R1 model (685 billion parameters) is on par with or even better than some of the leading models from US foundation model providers. Benchmarks show that DeepSeek's R1 model performs on par with or better than leading, better-known models like OpenAI's o1 and Anthropic's Claude 3.5 Sonnet.
- DeepSeek R1 was trained at a significantly lower cost, but not to the extent that initial reports suggested. Initial reports indicated that training costs were over $5.5 million, but the true cost of not only training but developing the model overall has been debated since its release. According to semiconductor research and consulting firm SemiAnalysis, the $5.5 million figure is only one component of the costs, leaving out hardware spending, the salaries of the research and development team, and other factors.
- DeepSeek's API pricing is over 90% cheaper than OpenAI's. Regardless of the true cost to develop the model, DeepSeek offers a much cheaper proposition for using its API: input and output tokens for DeepSeek R1 cost $0.55 per million and $2.19 per million, respectively, compared to OpenAI's $15 per million and $60 per million for its o1 model.
- DeepSeek R1 is an innovative model. The accompanying scientific paper released by DeepSeek describes the methodologies used to develop R1 based on V3: leveraging the mixture-of-experts (MoE) architecture, reinforcement learning, and clever hardware optimization to create models that require fewer resources to train and also fewer resources to perform AI inference, resulting in the aforementioned API usage costs.
- DeepSeek is more open than most of its competitors. DeepSeek R1 is available for free on platforms like Hugging Face and GitHub.
While DeepSeek has made its weights available and described its training methodologies in its research paper, the original training code and data have not been released, so a skilled person cannot build a comparable model, which is part of the definition of an open-source AI system according to the Open Source Initiative (OSI). Though DeepSeek has been more open than other GenAI companies, R1 falls into the open-weight category when measured against OSI standards. However, the release sparked interest in the open-source community: Hugging Face has launched an Open-R1 initiative on GitHub to create a full reproduction of R1 by building the "missing pieces of the R1 pipeline," moving the model toward fully open source so anyone can reproduce and build on top of it.
- DeepSeek released powerful small models alongside the major R1 release. DeepSeek released not only the major large model with more than 680 billion parameters but also, as of this article, six distilled models of DeepSeek R1. The models range from 70B down to 1.5B parameters, with the latter fitting on many consumer-grade devices. As of February 3, 2025, the models had been downloaded more than 1 million times on Hugging Face alone.
- DeepSeek R1 was possibly trained on OpenAI's data. On January 29, 2025, reports emerged that Microsoft is investigating whether DeepSeek used OpenAI's API to train its models (a violation of OpenAI's terms of service), though the hyperscaler also added R1 to its Azure AI Foundry service.
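The pricing gap described above can be checked with simple arithmetic. A minimal sketch: the per-million-token prices are the ones quoted in this article, while the workload size is an arbitrary assumption chosen for illustration.

```python
# Per-million-token prices (USD) as quoted above.
DEEPSEEK_R1 = {"input": 0.55, "output": 2.19}
OPENAI_O1 = {"input": 15.00, "output": 60.00}

def api_cost(pricing: dict, input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for a workload under a per-million-token price list."""
    return (pricing["input"] * input_tokens
            + pricing["output"] * output_tokens) / 1_000_000

# Illustrative workload: 1M input tokens and 1M output tokens.
deepseek = api_cost(DEEPSEEK_R1, 1_000_000, 1_000_000)  # 0.55 + 2.19 = 2.74
openai = api_cost(OPENAI_O1, 1_000_000, 1_000_000)      # 15.00 + 60.00 = 75.00
savings = 1 - deepseek / openai

print(f"DeepSeek R1: ${deepseek:.2f}, OpenAI o1: ${openai:.2f}, savings: {savings:.1%}")
```

On this workload the savings come out to roughly 96%, consistent with the "over 90% cheaper" figure cited above.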
Understanding the generative AI value chain
GenAI spending benefits a broad market value chain. The graphic above, based on research for IoT Analytics' Generative AI Market Report 2025-2030 (published January 2025), depicts key beneficiaries of GenAI spending across the value chain. Companies along the value chain include:
- End users: Consumers and businesses that use a generative AI application.
- GenAI applications: Software vendors that include GenAI features in their products or offer standalone GenAI software. This includes enterprise software companies like Salesforce, with its focus on agentic AI, and startups specifically focusing on GenAI applications like Perplexity or Lovable.
- Tier 1 beneficiaries: Providers of foundation models (e.g., OpenAI or Anthropic), model management platforms (e.g., AWS SageMaker, Google Vertex AI, or Microsoft Azure AI), data management tools (e.g., MongoDB or Snowflake), cloud computing and data center operations (e.g., Azure, AWS, Equinix, or Digital Realty), AI consultants and integration services (e.g., Accenture or Capgemini), and edge computing (e.g., Advantech or HPE).
- Tier 2 beneficiaries: Those whose products and services regularly support tier 1 services, including providers of chips (e.g., NVIDIA or AMD), network and server equipment (e.g., Arista Networks, Huawei, or Belden), and server cooling technologies (e.g., Vertiv or Schneider Electric).
- Tier 3 beneficiaries: Those whose products and services regularly support tier 2 services, such as providers of electronic design automation (EDA) software for chip design (e.g., Cadence or Synopsys), semiconductor fabrication (e.g., TSMC), heat exchangers for cooling technologies, and electrical grid technology (e.g., Siemens Energy or ABB).
- Tier 4 beneficiaries and beyond: Companies that in turn support the tier above them, such as providers of lithography systems (tier 4) needed for semiconductor fabrication equipment (e.g., ASML) or companies that supply those providers (tier 5) with lithography optics (e.g., Zeiss).
Winners and losers along the generative AI value chain
The rise of models like DeepSeek R1 signals a potential shift in the generative AI value chain, challenging existing market dynamics and reshaping expectations for profitability and competitive advantage. If more models with similar capabilities emerge, certain players may benefit while others face increasing pressure.
Below, IoT Analytics assesses the key winners and likely losers based on the innovations introduced by DeepSeek R1 and the broader trend toward open, cost-efficient models. This assessment considers the potential long-term impact of such models on the value chain rather than the immediate effects of R1 alone.
Clear winners
End users
Why these developments are positive: The availability of more and cheaper models will ultimately lower costs for end users and make AI more accessible.
Why these developments are negative: No clear argument.
Our take: DeepSeek represents AI innovation that ultimately benefits the end users of this technology.
GenAI application providers
Why these developments are positive: Startups building applications on top of foundation models will have more options to choose from as more models come online. As stated above, DeepSeek R1 is far cheaper than OpenAI's o1 model, and though reasoning models are rarely used in an application context so far, it shows that ongoing advancements and innovation improve the models and make them cheaper.
Why these developments are negative: No clear argument.
Our take: The availability of more and cheaper models will ultimately lower the cost of including GenAI features in applications.
Likely winners
Edge AI/edge computing companies
Why these developments are positive: During Microsoft's recent earnings call, Satya Nadella explained that "AI will be much more ubiquitous," as more workloads will run locally. The distilled smaller models that DeepSeek released alongside the powerful R1 model are small enough to run on many edge devices. While small, the 1.5B, 7B, and 14B models are also comparably capable reasoning models. They can fit on a laptop and other less powerful devices, e.g., IPCs and industrial gateways. These distilled models have already been downloaded from Hugging Face hundreds of thousands of times.
Why these developments are negative: No clear argument.
Our take: The distilled models of DeepSeek R1 that fit on less powerful hardware (70B and below) were downloaded more than 1 million times on Hugging Face alone. This shows a strong interest in deploying models locally. Edge computing manufacturers with edge AI solutions like Italy-based Eurotech and Taiwan-based Advantech stand to profit. Chip companies that specialize in edge computing chips, such as AMD, Arm, Qualcomm, or even Intel, may also benefit. NVIDIA also operates in this market segment.
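A rough way to see why the model sizes above map to different hardware classes is to estimate the memory that the weights alone require. This is a back-of-the-envelope sketch: the bytes-per-parameter figures are the standard values for FP16 and 4-bit quantization, and overhead for activations and KV cache is deliberately ignored.

```python
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate memory for model weights alone, in gigabytes.
    Billions of parameters times bytes per parameter gives GB directly."""
    return params_billion * bytes_per_param

# Distilled R1 sizes discussed above: FP16 (2 bytes/param) vs. 4-bit (0.5 bytes/param).
for size in (1.5, 7, 14, 70):
    fp16 = weight_memory_gb(size, 2.0)
    q4 = weight_memory_gb(size, 0.5)
    print(f"{size:>4}B params: ~{fp16:.1f} GB FP16, ~{q4:.2f} GB 4-bit")
```

The 1.5B model needs only about 3 GB in FP16 (under 1 GB at 4-bit), which is why it fits on consumer laptops and industrial gateways, while the 70B model (around 140 GB in FP16) still calls for server-class hardware.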
Note: IoT Analytics' SPS 2024 Event Report (published in January 2025) dives into the latest industrial edge AI trends, as seen at the SPS 2024 fair in Nuremberg, Germany.
Data management services providers
Why these developments are positive: There is no AI without data. To develop applications using open models, adopters will need a variety of data for training and during deployment, requiring proper data management.
Why these developments are negative: No clear argument.
Our take: Data management is becoming more important as the number of different AI models increases. Data management companies like MongoDB, Databricks, and Snowflake, as well as the respective offerings from hyperscalers, stand to profit.
GenAI service providers
Why these developments are positive: The sudden emergence of DeepSeek as a top player in the (western) AI ecosystem shows that the complexity of GenAI will likely keep growing for some time. The greater availability of different models can lead to more complexity, driving more demand for services.
Why these developments are negative: When leading models like DeepSeek R1 are available for free, the ease of experimentation and implementation may limit the need for integration services.
Our take: As new innovations come to market, demand for GenAI services increases as enterprises try to understand how best to use open models for their business.
Neutral
Cloud computing providers
Why these developments are positive: Cloud players rushed to include DeepSeek R1 in their model management platforms. Microsoft included it in their Azure AI Foundry, and AWS enabled it in Amazon Bedrock and Amazon SageMaker. While the hyperscalers invest heavily in OpenAI and Anthropic (respectively), they are also model agnostic and enable hundreds of different models to be hosted natively in their model zoos. Training and fine-tuning will continue to happen in the cloud. However, as models become more efficient, less investment (capital expenditure) will be required, which will increase profit margins for hyperscalers.
Why these developments are negative: More models are expected to be deployed at the edge as the edge becomes more powerful and models become more efficient. Inference is likely to move towards the edge going forward. The cost of training cutting-edge models is also expected to go down further.
Our take: Smaller, more efficient models are becoming more important. This lowers the demand for powerful cloud computing, both for training and inference, which may be offset by greater overall demand and lower CAPEX requirements.
Electronic design automation (EDA) software providers
Why these developments are positive: Demand for new AI chip designs will increase as AI workloads become more specialized. EDA tools will be crucial for designing efficient, smaller-scale chips tailored for edge and distributed AI inference.
Why these developments are negative: The move towards smaller, less resource-intensive models could reduce the need for designing cutting-edge, high-complexity chips optimized for massive data centers, potentially leading to reduced licensing of EDA tools for high-performance GPUs and ASICs.
Our take: EDA software providers like Synopsys and Cadence could benefit in the long term as AI specialization grows and drives demand for new chip designs for edge, consumer, and low-cost AI workloads. However, the market may need to adapt to shifting requirements, focusing less on large data center GPUs and more on smaller, efficient AI hardware.
Likely losers
AI chip companies
Why these developments are positive: The supposedly lower training costs for models like DeepSeek R1 could ultimately increase the overall demand for AI chips. Some referred to the Jevons paradox, the idea that efficiency gains lead to more demand for a resource. As the training and inference of AI models become more efficient, demand could grow as greater efficiency leads to lower costs. ASML CEO Christophe Fouquet shared a similar line of thinking: "A lower cost of AI could mean more applications, more applications means more demand over time. We see that as an opportunity for more chips demand."
Why these developments are negative: The allegedly lower costs for DeepSeek R1 are based mainly on the need for less advanced GPUs for training. That casts some doubt on the sustainability of large-scale projects (such as the recently announced Stargate project) and the capital expenditure of tech companies mainly earmarked for buying AI chips.
Our take: IoT Analytics research for its latest Generative AI Market Report 2025-2030 (published January 2025) found that NVIDIA leads the data center GPU market with a market share of 92%. NVIDIA's near-monopoly characterizes that market. However, it also shows how strongly NVIDIA's fate is tied to the continued growth of spending on data center GPUs. If less hardware is required to train and deploy models, this could seriously hurt NVIDIA's growth story.
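The Jevons paradox argument above can be made concrete with a toy model. All numbers in this sketch are hypothetical assumptions: an efficiency gain cuts the cost per query, and total compute spend then rises or falls depending on how strongly usage responds to the lower price.

```python
def total_compute_spend(baseline_queries: float, cost_per_query: float,
                        efficiency_gain: float, demand_multiplier: float) -> float:
    """Total spend after an efficiency gain: cost per query falls by
    efficiency_gain, while usage grows by demand_multiplier (both hypothetical)."""
    return baseline_queries * demand_multiplier * cost_per_query / efficiency_gain

base = total_compute_spend(1_000_000, 1.0, 1, 1)     # $1.0M baseline spend
# 10x efficiency, usage only grows 5x: spend halves (bearish for chip vendors).
modest = total_compute_spend(1_000_000, 1.0, 10, 5)
# 10x efficiency, usage grows 20x: spend doubles (the Jevons outcome).
jevons = total_compute_spend(1_000_000, 1.0, 10, 20)
print(base, modest, jevons)
```

The bull and bear cases for AI chip vendors thus hinge on a single question: whether demand growth outpaces the efficiency gain.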
Other categories related to data centers (networking equipment, electrical grid technologies, electricity providers, and heat exchangers)
Like AI chips, models are likely to become cheaper to train and more efficient to deploy, so the expectation for further data center infrastructure build-out (e.g., networking equipment, cooling systems, and power supply solutions) would decrease accordingly. If fewer high-end GPUs are needed, large-capacity data centers may scale back their investments in associated infrastructure, potentially dampening demand for supporting technologies. This would put pressure on companies that supply critical components, most notably networking hardware, power systems, and cooling solutions.
Clear losers
Proprietary model providers
Why these developments are positive: No clear argument.
Why these developments are negative: The GenAI companies that have collected billions of dollars of funding for their proprietary models, such as OpenAI and Anthropic, stand to lose. Even if they develop and release more open models, this would still cut into the revenue stream as it stands today. Further, while some framed DeepSeek as a "side project of some quants" (quantitative analysts), the release of DeepSeek's powerful V3 and then R1 models proved to be far beyond that notion. The question going forward: What is the moat of proprietary model providers if cutting-edge models like DeepSeek's are being released for free and becoming fully open and fine-tunable?
Our take: DeepSeek released powerful models for free (for local deployment) or very cheaply (its API is an order of magnitude more affordable than comparable models). Companies like OpenAI, Anthropic, and Cohere will face increasingly strong competition from players that release free and customizable cutting-edge models, like Meta and DeepSeek.
Analyst takeaway and outlook
The emergence of DeepSeek R1 reinforces a key trend in the GenAI space: open-weight, cost-efficient models are becoming viable competitors to proprietary alternatives. This shift challenges market assumptions and forces AI providers to rethink their value propositions.
1. End users and GenAI application providers are the biggest winners.
Cheaper, high-quality models like R1 lower AI adoption costs, benefiting both enterprises and consumers. Startups such as Perplexity and Lovable, which build applications on foundation models, now have more choices and can significantly reduce API costs (e.g., R1's API is over 90% cheaper than OpenAI's o1 model).
2. Most experts agree the stock market overreacted, but the innovation is real.
While major AI stocks dropped sharply after R1's release (e.g., NVIDIA and Microsoft down 18% and 7.5%, respectively), many analysts view this as an overreaction. However, DeepSeek R1 does mark a genuine breakthrough in cost efficiency and openness, setting a precedent for future competition.
3. The recipe for building top-tier AI models is open, accelerating competition.
DeepSeek R1 has demonstrated that releasing open weights and a detailed methodology can drive success and caters to a growing open-source community. The AI landscape continues to shift from a few dominant proprietary players to a more competitive market where new entrants can build on existing breakthroughs.
4. Proprietary AI providers face increasing pressure.
Companies like OpenAI, Anthropic, and Cohere must now differentiate beyond raw model performance. What remains their competitive moat? Some may shift towards enterprise-specific solutions, while others might explore hybrid business models.
5. AI infrastructure providers face mixed prospects.
Cloud computing providers like AWS and Microsoft Azure still benefit from model training but face pressure as inference moves to edge devices. Meanwhile, AI chipmakers like NVIDIA could see weaker demand for high-end GPUs if more models are trained with fewer resources.
6. The GenAI market remains on a strong growth path.
Despite disruptions, AI spending is expected to expand. According to IoT Analytics' Generative AI Market Report 2025-2030, global spending on foundation models and platforms is forecast to grow at a CAGR of 52% through 2030, driven by enterprise adoption and ongoing efficiency gains.
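To put the quoted growth rate in perspective, compounding it over the forecast horizon gives the overall market multiple. The 52% rate is from the report citation above; treating 2025-2030 as five years of compounding is an assumption for illustration.

```python
cagr = 0.52   # 52% compound annual growth rate, as cited above
years = 5     # assumed horizon: 2025 through 2030

multiple = (1 + cagr) ** years
print(f"Market multiple over {years} years: {multiple:.1f}x")
# A 52% CAGR sustained for five years implies roughly an 8x market expansion.
```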
Final Thought:
DeepSeek R1 is not just a technical milestone; it signals a shift in the AI market's economics. The recipe for building strong AI models is now more widely available, ensuring greater competition and faster innovation. While proprietary model providers must adapt, AI application providers and end users stand to benefit the most.
Disclosure
Companies mentioned in this article, along with their products, are used as examples to showcase market developments. No company paid or received preferential treatment in this article, and it is at the discretion of the analyst to select which examples are used. IoT Analytics makes efforts to vary the companies and products mentioned to help shine attention to the numerous IoT and related technology market players.
It is worth noting that IoT Analytics may have commercial relationships with some companies mentioned in its articles, as some companies license IoT Analytics market research. However, for confidentiality, IoT Analytics cannot disclose individual relationships. Please contact compliance@iot-analytics.com for any questions or concerns on this front.
More information and further reading
Are you interested in learning more about Generative AI?
Generative AI Market Report 2025-2030
A 263-page report on the enterprise Generative AI market, incl. market sizing & forecast, competitive landscape, end-user adoption, trends, challenges, and more.
Download the sample to learn more about the report structure, select definitions, select data, additional data points, trends, and more.
Already a customer? View your reports here →
Related posts
You may also be interested in the following articles:
- AI 2024 in review: The 10 most notable AI stories of the year
- What CEOs talked about in Q4 2024: Tariffs, reshoring, and agentic AI
- The industrial software market landscape: 7 key stats going into 2025
- Who is winning the cloud AI race? Microsoft vs. AWS vs. Google
Related publications
You may also be interested in the following reports:
- Industrial Software Landscape 2024-2030
- Smart Factory Adoption Report 2024
- Global Cloud Projects Report and Database 2024
Subscribe to our newsletter and follow us on LinkedIn to stay up-to-date on the latest trends shaping the IoT markets. For complete enterprise IoT coverage with access to all of IoT Analytics' paid content & reports, including dedicated analyst time, check out the Enterprise subscription.