
The AI boom has not gone bust, but AI computing is definitely changing

Don’t pay too much attention to the AI bears. They want you to believe that the great boom in AI investment is over; that the market’s enthusiasm, and the vast sums spent on huge AI training systems powered by fleets of high-performance GPUs, were misplaced; and that expectations for the AI era should be fundamentally scaled back.

But if you look closely at the plans of the major hyperscalers, AI investment is alive and well. Meta, Amazon, Microsoft, and Google have all doubled down on their investments in AI technology recently. According to a recent story in the Financial Times, their collective capital expenditure commitments for 2025 total more than $300 billion. Microsoft CEO Satya Nadella said Microsoft alone may spend $80 billion on AI this year. “We plan to invest $60-65B in capex this year, while also growing our AI teams significantly, and we have the capital to continue investing in the years ahead,” Mark Zuckerberg, founder and CEO of Meta, said on Facebook.

This is not the sound of an AI bust, but there is growing unease about how AI will pay off. After at least two years of tech giants saying they saw a clear need for more computing power to train large AI models, 2025 has begun with those same companies being called on the carpet daily in the business media for building up so much AI hype.

Why the sudden shift from hope to apprehension? The answer can be found partly in the rapid rise of new AI applications from China. But to fully understand what is actually happening, and what it means for AI investment and technology initiatives in the coming years, we need to recognize that the AI era is moving into a new stage of its development.

Searching for the truth about DeepSeek

By now, the world is well aware of DeepSeek, the Chinese AI company touting how it has used inference engines and statistical reasoning to train large language models far more efficiently, and at far lower cost, than other companies have trained their models.

Specifically, DeepSeek claims its techniques allowed it to train its models with far fewer GPUs (reportedly fewer than 2,048) and with less powerful GPUs (NVIDIA H800s) than the many thousands of top-performance GPUs (think NVIDIA H100s) other companies need to train their models. As for cost savings, DeepSeek reportedly spent $6.5 million to train its R1 model, while OpenAI has spent billions training ChatGPT.

It should be noted that many experts doubt DeepSeek’s spending claims, but the damage was done, as news of its different approach sent the stock values of the hyperscalers, and of the companies with which they have spent billions to train their AI models, into a steep dive.

But a few key points were lost in the chaos. One is that DeepSeek did not “invent” a new way of working with AI. The second is that much of the AI ecosystem has long been aware of a coming shift in how AI investment dollars are spent, and in how AI itself will work, over the next few years.

Regarding DeepSeek’s approach, the concepts of AI inference engines and statistical reasoning are nothing new. Statistical reasoning is one aspect of the broader concept of inference-based model reasoning, which involves an AI’s ability to make inferences based on pattern recognition. This is much like the way humans learn to solve problems by comparing possible solutions to find the best one. Inference-based model reasoning is available today and is not unique to the Chinese startup.

Meanwhile, the AI ecosystem has for some time anticipated a fundamental change in how we work with AI and in the computing resources that work requires. The first years of the AI era were dominated by training large AI models on massive datasets, all of which demands enormous amounts of processing, complex calculations, weight adjustments, and heavy memory use. Once an AI model is trained, the situation changes: the model can use inference to apply everything it has learned to new datasets, tasks, and problems. Inference is a far less computationally intensive process than training, so it does not require as many GPUs or other computing resources.
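The compute asymmetry between training and inference can be sketched in a few lines of Python. This is a toy NumPy illustration, not any production framework: a training step runs a forward pass plus a backward pass and a weight update, while inference needs only the forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer model: y = x @ W
n_features, n_outputs = 512, 10
W = rng.standard_normal((n_features, n_outputs)) * 0.01

def forward(x, W):
    # Inference: one matrix multiply per batch.
    return x @ W

def train_step(x, y_true, W, lr=0.01):
    # Training: forward pass, plus a backward pass to compute
    # gradients, plus a weight update -- roughly three times the
    # arithmetic of inference, before optimizer state and
    # activation memory are even counted.
    y_pred = forward(x, W)            # forward
    grad_out = 2 * (y_pred - y_true)  # backward: loss gradient
    grad_W = x.T @ grad_out           # backward: weight gradient
    return W - lr * grad_W            # update

x = rng.standard_normal((32, n_features))
y = rng.standard_normal((32, n_outputs))

W = train_step(x, y, W)   # what the last few years of spending paid for
preds = forward(x, W)     # what monetization looks like
print(preds.shape)        # (32, 10)
```

The point of the sketch is the structural difference, not the exact ratio: real training also pays for optimizer state, activation storage, and many epochs over huge datasets, while a deployed model pays only the forward cost per request.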

The ultimate truth about DeepSeek is that while its approach did not shock those of us in the AI ecosystem as much as it did casually interested stock market investors, it did highlight that inference will be one of the keys to the next stage of AI’s evolution.

AI: The next generation

AI’s promise and potential have not changed. The ongoing massive AI investments of the major hyperscalers demonstrate their belief in the future value of AI and in the ways it can change how nearly every industry works and how almost everyone lives their daily lives.

What is changing for these hyperscalers is how those dollars are spent. In the first years of the AI era, most of the investment went to training. If you think of AI as a child still in development, we have been spending heavily to send it to the best schools and universities. Now, that child is an educated adult who needs to get a job and support itself. In practical terms, we have made enormous investments in training AI, and now we need to see a return on that investment by using AI to generate new revenues.

To achieve that return on investment, AI needs to work more efficiently and at lower cost, helping companies maximize the market appeal and utility of as many applications as possible. The most profitable new services will be autonomous ones that do not require human monitoring and management.

For many companies, this will mean fast, cost-efficient machine-to-machine communications using resource-efficient AI computing techniques such as inference-based model reasoning. In the wireless industry, for example, AI can be used to autonomously analyze real-time data on spectrum utilization in mobile networks to optimize channel usage and mitigate interference between users, ultimately allowing mobile operators to support more dynamic spectrum sharing across their networks. This kind of more efficient, autonomous, AI-driven machine-to-machine communication will define the next generation of AI.
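As a concrete, entirely hypothetical illustration of that wireless example, an inference step for channel selection can be as simple as scoring recent utilization readings per channel and steering new traffic to the least-loaded one, with no human in the loop and no training-scale compute. The function name, channel IDs, and scoring rule below are illustrative, not any operator’s API:

```python
# Hypothetical sketch: autonomous channel selection from real-time
# spectrum-utilization readings. Names and the scoring rule are
# illustrative assumptions, not a real operator API.
from statistics import fmean

def pick_channel(utilization_history: dict[int, list[float]]) -> int:
    """Return the channel with the lowest recent average load.

    utilization_history maps channel id -> recent utilization
    samples, each in [0.0, 1.0].
    """
    return min(utilization_history,
               key=lambda ch: fmean(utilization_history[ch]))

readings = {
    36: [0.82, 0.79, 0.85],   # heavily loaded
    40: [0.41, 0.38, 0.44],   # moderately loaded
    44: [0.12, 0.09, 0.15],   # mostly idle
}
best = pick_channel(readings)
print(best)  # 44
```

A production system would fold in interference estimates, regulatory constraints, and a learned model of demand, but the shape is the same: a cheap, repeated inference over fresh data rather than an expensive one-off training run.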

AI computing is evolving, just as every other major computing era has. If the history of computing teaches us anything, it is that new technologies always require large upfront investments, but costs fall and efficiency rises as we begin leveraging improved technologies and better practices, creating more useful, affordable products and services that appeal to the largest possible markets. Innovation always finds a way.

If you listen to the AI bears, the AI sector has suffered a setback lately. But the dollars the hyperscalers are spending this year, and the growing use of inference-based techniques, tell a different story: AI computing is indeed changing, but the promise of AI remains fully intact.
