    DeepSeek Shifts Smaller AI To Huawei Chips

By AI Logic News | September 2, 2025

DeepSeek will use Huawei AI chips instead of Nvidia's to train its smaller AI models, part of an effort to reduce its reliance on Nvidia processors. The shift comes as DeepSeek tests new AI GPU accelerators from several manufacturers.

According to the report, DeepSeek plans to adopt Huawei chips for its smaller AI models in order to reduce its dependence on Nvidia. The company is currently evaluating new AI GPU accelerators from Huawei, Baidu, and Cambricon for training models smaller than its R2 version.

DeepSeek intends to keep using Nvidia processors for its R2 large language model (LLM), viewing them as a reliable foundation for its current products. The company had previously considered the Ascend processor for its next-generation AI reasoning model but may defer that plan.


DeepSeek has encountered challenges with the upcoming R2 model. Despite engineering support from Huawei, development issues forced a postponement of the R2 launch, which is now expected later this year.

DeepSeek is relying on Nvidia chips to build the more powerful R2 reasoning model, while using Huawei Ascend processors to train and refine smaller variants of R2. The company has not given a debut date for consumer platforms running LLMs powered by Huawei AI chips.

An Nvidia spokesperson stated, “The competition has undeniably arrived. The world will choose the best tech stack for running the most popular applications and open-source models. To win the AI race, U.S. industry must earn the support of developers everywhere, including China.”
