DeepSeek’s latest update is a serious threat to OpenAI and Google — here’s why

By AI Logic News | June 1, 2025

Chinese AI startup DeepSeek is quickly gaining momentum in the global AI race. The company just released DeepSeek-R1-0528, proving once again that it is a company to watch. The powerful update is already challenging rivals like OpenAI’s GPT-4o and Google’s Gemini.

    The new version delivers major performance gains in complex reasoning, coding and logic, which are areas where even top-tier models often stumble.

With its open-source license and lightweight training demands, DeepSeek is proving it can build competitive models faster and more cheaply than its closed-source rivals.



    A leap in benchmark performance

From DeepSeek’s announcement on X (May 29, 2025): “🚀 DeepSeek-R1-0528 is here! Improved benchmark performance, enhanced front-end capabilities, reduced hallucinations, and support for JSON output & function calling. No change to API usage.”

    In recent benchmark tests, DeepSeek-R1-0528 achieved an 87.5% accuracy on the AIME 2025 test.

    This is a notable jump from the previous model’s 70%. It also improved significantly on the LiveCodeBench coding benchmark, moving from 63.5% to 73.3%, and more than doubled its performance on the notoriously difficult “Humanity’s Last Exam,” rising from 8.5% to 17.7%.

In plain terms, these results suggest DeepSeek’s model can keep pace with, and in some cases outperform, its Western rivals in specific domains.
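Taken together, the reported scores imply sizable relative gains. A quick back-of-the-envelope calculation, using only the numbers above, makes that concrete:

```python
# Relative improvements implied by the reported benchmark scores.
def relative_gain(old: float, new: float) -> float:
    """Return the improvement of `new` over `old` as a fraction of `old`."""
    return (new - old) / old

aime = relative_gain(70.0, 87.5)   # AIME 2025 accuracy: 70% -> 87.5%
lcb = relative_gain(63.5, 73.3)    # LiveCodeBench: 63.5% -> 73.3%
hle = 17.7 / 8.5                   # "Humanity's Last Exam": 8.5% -> 17.7%

print(f"AIME 2025: +{aime:.0%} relative")        # +25% relative
print(f"LiveCodeBench: +{lcb:.1%} relative")     # +15.4% relative
print(f"Humanity's Last Exam: {hle:.2f}x")       # 2.08x
```

The last figure is what the article means by “more than doubled”: 17.7% is roughly 2.08 times the previous 8.5%.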

    Open-source and easy to build on

[Image: DeepSeek on Android (credit: Pexels)]

    Unlike OpenAI and Google, which tend to guard their best models behind APIs and paywalls, DeepSeek is keeping things open. R1-0528 is available under the MIT License, giving developers the freedom to use, modify, and deploy the model however they like.


    The update also adds support for JSON outputs and function calling, making it easier to build apps and tools that plug directly into the model.
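As a sketch of what that enables, here is what a structured-output request with a tool definition might look like. The payload follows the widely used OpenAI-compatible chat-completions format; the model name `deepseek-reasoner`, the `lookup_weather` function, and the exact field names are illustrative assumptions, not verified against DeepSeek’s current documentation:

```python
import json

# Illustrative request payload combining JSON-mode output with a declared
# tool (function). Field names follow the OpenAI-compatible chat format;
# the model name and tool are hypothetical examples.
payload = {
    "model": "deepseek-reasoner",
    "messages": [
        {"role": "user",
         "content": "Extract the city from: 'Flights to Paris are cheap.'"}
    ],
    # Ask the model to emit valid JSON only.
    "response_format": {"type": "json_object"},
    # Declare a function the model may choose to call.
    "tools": [{
        "type": "function",
        "function": {
            "name": "lookup_weather",
            "description": "Get current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}

# This is the JSON body an app would POST to the chat-completions endpoint.
body = json.dumps(payload, indent=2)
print(body.splitlines()[1])  # first field of the serialized request
```

Because the payload is plain JSON, existing tooling built around the OpenAI-style format can target the model with little or no code change, which is presumably what the announcement’s “no change to API usage” refers to.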

    This open approach not only appeals to researchers and developers but also makes DeepSeek an increasingly attractive option for startups and companies seeking alternatives to closed platforms.

    Trained smarter, not harder

[Image: DeepSeek logo on a smartphone in front of computer data (credit: NurPhoto / Getty Images)]

One of the more impressive aspects of DeepSeek’s rise is how efficiently it’s building these models. According to the company, earlier versions were trained in just 55 days on roughly 2,000 GPUs at a cost of $5.58 million, a fraction of what it typically costs to train models at this scale in the U.S.
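Those figures imply a strikingly low effective compute price. A rough sanity check, assuming the roughly 2,000 GPUs ran continuously for the full 55 days (which the company has not specified):

```python
# Back-of-the-envelope cost per GPU-hour implied by the reported figures.
# Assumes all ~2,000 GPUs ran continuously for the full 55 days.
gpus = 2_000
days = 55
total_cost = 5_580_000  # USD, as reported by the company

gpu_hours = gpus * days * 24          # total GPU-hours consumed
cost_per_gpu_hour = total_cost / gpu_hours

print(f"{gpu_hours:,} GPU-hours")                 # 2,640,000 GPU-hours
print(f"${cost_per_gpu_hour:.2f} per GPU-hour")   # about $2.11 per GPU-hour
```

Roughly two dollars per GPU-hour is well below typical on-demand cloud pricing for high-end accelerators, which is the sense in which the figure is “a fraction” of comparable U.S. training budgets.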

    This focus on resource-efficient training is a key differentiator, especially as the cost and carbon footprint of large language models continue to draw scrutiny.

    What this means for the future of AI

    DeepSeek’s latest release is a sign of shifting dynamics in the AI world. With strong reasoning abilities, transparent licensing, and a faster development cycle, DeepSeek is positioning itself as a serious competitor to industry heavyweights.

    And as the global AI landscape becomes more multipolar, models like R1-0528 could play a major role in shaping not just what AI can do, but who gets to build it, control it and benefit from it.

