In early January, there was little contention that booming appetite for all things artificial intelligence would spur growth in digital infrastructure. Fast forward to the end of the month, and Chinese tech start-up DeepSeek upended all that.
Released out of the blue, the firm’s AI model triggered a $700 billion sell-off as the stock market reacted to the landmark announcement. With the model reportedly produced for far less than its Western rivals, investors feared lower computational requirements would directly reduce long-term demand for hyperscale and co-location capacity, downgrading capital assumptions previously taken for granted.
“The announcement of DeepSeek represented several changes to the AI landscape, and not just technological,” says Simon Duckett, chief digital officer at renewable energy producer Sonnedix. “Similar to other disruptive moments in technology, it showed how limited resources (in this case, access to the latest graphics processing units) can have a positive effect in stimulating people and organisations to find alternative and innovative methods to reach their goals.”
The Chinese start-up reportedly drew upon existing ideas and open-source materials, built on the outputs of others and added new ideas to refine their model. “Outside of the technology, the breakthrough showed that there is a broader ecosystem of contributors beyond the big tech players, and that some participants are more willing to share the workings of their innovation so that others can benefit,” adds Duckett.
Fork in the road?
Time has passed since DeepSeek’s big reveal, and the initial market panic has given way to closer scrutiny. Industry experts have had time to dig into the details and reflect on the realistic impact on previous capital assumptions.
“While the widely cited $6 million training cost for DeepSeek’s v3 model has caught attention, it likely understates the true investment required,” says Jonathan McMullan, global technology specialist at Schroders. “That figure only covers the final GPU training run, omitting significant costs like R&D, data acquisition, engineering talent and hardware infrastructure. DeepSeek also appears to have reduced costs by leveraging distillation techniques – essentially building upon existing frontier models.”
McMullan explains that initial concerns that DeepSeek might dampen the growth assumptions for AI-driven data centres also appear largely overstated. “In many ways, this resembles the Jevons paradox: gains in efficiency tend to drive greater, not lesser, overall demand. As models become more powerful and reasoning-intensive, their compute requirements grow exponentially.”
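The Jevons paradox argument can be made concrete with some back-of-the-envelope arithmetic. The figures below are purely illustrative, not drawn from any cited forecast: if a tenfold efficiency gain cuts the compute cost of each AI query, and the resulting cheapness drives a (hypothetical) fiftyfold jump in usage, total compute demand rises rather than falls.

```python
# Purely illustrative numbers: efficiency gains can increase, not reduce,
# total compute demand when cheaper queries unlock far more usage.

cost_per_query = 10          # arbitrary compute units before the gain
queries_per_day = 1_000_000  # hypothetical baseline usage

baseline_demand = cost_per_query * queries_per_day  # 10,000,000 units

# A 10x efficiency gain cuts per-query compute cost...
new_cost_per_query = cost_per_query // 10  # 1 unit
# ...and cheaper queries drive, say, a 50x jump in usage (assumed elasticity)
new_queries_per_day = queries_per_day * 50

new_demand = new_cost_per_query * new_queries_per_day  # 50,000,000 units

print(new_demand / baseline_demand)  # 5.0 -> demand grows fivefold overall
```

Whether demand actually grows depends on how elastic AI usage proves to be; the sketch simply shows the mechanism behind the paradox.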
McMullan is far from the only person to take this view. Sebastian Domenech, executive director of asset management at Australian asset manager IFM Investors, believes DeepSeek’s progression in reducing the cost of computing power is a natural evolution.
“Continuous improvements in efficiency are not just beneficial but necessary for the industry’s growth, and we believe this trajectory is the only viable path forward,” he says. “We think this is what will enable sustained growth in AI demand – and, with it, demand for computing power.”
Moving closer to the user
One argument runs that decentralised workloads could challenge, or at least dilute, the dominance of hyperscalers, the reasoning being that improvements in AI inference and training efficiency, combined with less powerful GPUs, could accelerate demand for edge and modular data centres.
“There is a long history of using supposedly ‘inferior’ hardware to achieve impressive goals in technology – but it doesn’t prevent cutting-edge technology developing in parallel,” says McMullan. “Instead, it reinforces that there are many approaches and use cases for AI that will need different levels of compute power. Essentially, we must pick the right tool for the job.”
Jim Footh, managing director of data centre investments and portfolio management at PGIM Real Estate, believes that inference workloads will expand significantly as AI applications are developed and broaden across industry and personal use.
To distinguish the two processes: training involves feeding an AI model curated data, while inference is the process by which a trained model makes decisions or predictions based on new, previously unseen data.
“The machine learning phase is not particularly latency sensitive, which allows data centres to be located in less expensive rural settings,” says Footh. “The inference phase of AI, while less compute intensive, is more latency sensitive, which is essential for applications like autonomous vehicles and Internet of Things devices. This is driving demand for edge computing solutions as AI technologies push data processing closer to both the end user and data generation.”
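The training/inference split can be sketched in a few lines of code. This is a deliberately toy example (a hand-rolled one-variable linear model, not any production AI system): the compute-heavy parameter fitting stands in for training, and the lightweight prediction on unseen input stands in for inference.

```python
# Toy sketch of the two phases: "training" fits model parameters from
# curated data; "inference" applies the fitted model to unseen inputs.

def train(xs, ys):
    """Training: fit a 1-D linear model y = a*x + b by least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

def infer(model, x):
    """Inference: predict for an input the model has never seen."""
    a, b = model
    return a * x + b

# Training: compute-heavy but not latency-sensitive, so it can run in a
# remote, cheaper data centre.
model = train([1, 2, 3, 4], [2, 4, 6, 8])

# Inference: lightweight but latency-sensitive, so it suits the edge.
print(infer(model, 10))  # 20.0
```

The asymmetry the quote describes falls out naturally: training happens once, wherever compute is cheapest, while each inference call must return quickly, close to the user.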
Another likely scenario is that firms employ hybrid AI solutions depending on their own individual needs. DeepSeek’s GPUs are less powerful than the industry standard, meaning companies may use hyperscale data centres for the training stage and decentralised edge computing for the less data-intensive inference stage, ultimately benefiting growth at both points.
“As AI adoption accelerates, the need for specialised infrastructure to support GPU-powered workloads will only intensify,” says Sikander Rashid, head of Europe for Brookfield Asset Management’s infrastructure group. “With capacity demand expected to outstrip supply, the market is open to innovative solutions, including modular data centres, to bridge the gap.”
Widening the investment landscape
Improvements in AI efficiency and greater decentralisation could equally affect capex spend on cloud computing. The shift towards more edge computing, with data processed closer to the source, holds the potential to reshape existing investment patterns.
A greater mix of cost-efficient models should similarly help democratise access to AI solutions, fostering a more diverse ecosystem and potentially resulting in a sector that is less monopolistic and less dependent on huge general-purpose training models.
“This is already happening, and we are seeing a broadening of opportunities for growth across the AI ecosystem, in start-ups, smaller companies, academia and beyond,” says Duckett. “There will still be growth in the hyperscalers, but also for data centres with smaller footprints.”
Emily Foshag, head of listed infrastructure at global investment manager Principal Asset Management, takes a similar view that greater innovation should be seen as a positive.
“We believe reduced infrastructure barriers could enable a broader set of participants to engage more actively in AI deployment,” she explains. “This may foster a more decentralised ecosystem and reduce reliance on traditional hyperscale providers. From an investment standpoint, we see this as a potential catalyst for innovation in co-location, edge computing and hybrid cloud solutions, particularly in markets where hyperscale expansion is constrained by power or land availability.”
Power and land limitations are obvious handicaps to data centre growth, but another open question is whether AI workloads processed with fewer resources will strengthen the push by regulators and enterprises for more energy-efficient alternatives.
“We believe the emergence of more resource-efficient AI models could reinforce the growing emphasis on sustainability among regulators and enterprises,” says Foshag.
“While this may moderate the pace of hyperscale expansion in certain regions, we continue to see strong demand for power-intensive infrastructure – particularly where it is supported by renewable energy and sustainable design. In our view, aligning AI infrastructure growth with sustainability goals represents a compelling long-term opportunity.”
Geopolitical and cybersecurity implications
DeepSeek’s technological breakthrough should catalyse more AI innovation, but also poses potential security risks.
DeepSeek’s model may have been trained at a fraction of the cost of other AI models, but research suggests this could have come at the expense of adequate safety measures. US tech firm Cisco has already highlighted several security flaws in the frontier model: using algorithmic jailbreaking techniques, researchers were able to attack it with a 100 percent success rate.
Showcasing the lack of strong security guardrails, hackers and criminals have been able to manipulate the model to generate inappropriate content as well as malware. Research from security platform Enkrypt AI found that DeepSeek was 11 times more likely to create harmful output than OpenAI’s rival model, and four times more likely to generate insecure code.
Using the platform to generate malignant content is one concern; another is its direct link to the People’s Republic of China. Analysis from the Center for Strategic & International Studies (CSIS) points to DeepSeek’s privacy policy, which states that generated data is stored, with the analysis suggesting the Chinese government would have direct access to it.
“Despite technical progress, Chinese AI models like DeepSeek will likely face regulatory and geopolitical headwinds – particularly in markets like the EU and the US, where concerns around data sovereignty and national security remain high,” says Jonathan McMullan, a global technology specialist at Schroders. “However, DeepSeek’s open-source nature introduces an important distinction. When deployed through trusted cloud providers such as AWS or Azure, enterprises can retain full control over data and infrastructure, easing many of the concerns tied to foreign technology.”
More tangible risks arise when using DeepSeek’s proprietary chatbot service, where data could potentially flow back to China – an outcome that would be far less acceptable for some use cases under emerging regulatory frameworks, says McMullan. The platform’s own website contains obfuscated code linked to China Mobile, a state-owned telecoms firm banned in the US over its ties to the Chinese military.