Deepseek: Chinese AI apparently consumes more energy than assumed



The Chinese AI Deepseek is currently the subject of debate: despite its efficient training, the language model apparently consumes more energy than previously assumed.

Deepseek caused a sensation in the AI industry. According to the company, the language model is not only supposed to keep up with ChatGPT but also to be particularly energy-efficient. However, current data available to MIT Technology Review paints a different picture: Deepseek may actually consume more energy than comparable AI models when generating answers.

The Chinese AI uses an improved variant of the so-called "mixture of experts" approach, in which only a portion of the model's parameters is active during training, which is meant to save energy. The real sticking point, however, is the use of the model (inference): because of its elaborate "chain of thought" reasoning, Deepseek needs significantly more computing power to generate answers.
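To make the "mixture of experts" idea concrete, here is a minimal, generic sketch of top-k expert routing in Python. It is not Deepseek's implementation; all names, sizes, and the routing scheme are illustrative assumptions. The point is simply that only a small subset of the expert weights is used for each token, which is where the savings come from.

```python
import numpy as np

# Minimal, generic sketch of top-k "mixture of experts" routing.
# All sizes and names are illustrative; this is not Deepseek's actual code.

NUM_EXPERTS = 8      # total experts in the layer
TOP_K = 2            # experts actually run per token
HIDDEN_DIM = 16      # toy hidden size

rng = np.random.default_rng(0)
experts = [rng.standard_normal((HIDDEN_DIM, HIDDEN_DIM)) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((HIDDEN_DIM, NUM_EXPERTS))

def moe_forward(token: np.ndarray) -> np.ndarray:
    """Route one token through only TOP_K of NUM_EXPERTS experts."""
    scores = token @ router                           # router score per expert
    top = np.argsort(scores)[-TOP_K:]                 # pick the best-scoring experts
    weights = np.exp(scores[top]) / np.exp(scores[top]).sum()  # softmax over chosen experts
    # Only TOP_K expert matrices are multiplied, so only a fraction of the parameters is active.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(HIDDEN_DIM)
print(moe_forward(token).shape)  # (16,)
```

The inference-side cost described in the article is separate from this routing: chain-of-thought answers simply generate many more tokens, and every extra token means another forward pass.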

Deepseek consumes significantly more energy after training

In concrete terms: while the researchers optimized training, Deepseek needs more energy for each individual request than other models, in some cases up to 87 percent more than a comparable Meta model with 70 billion parameters. Among other things, this is because Deepseek's answers are often significantly longer.
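As a rough illustration of why longer answers matter, the following back-of-the-envelope calculation assumes that per-request energy grows roughly linearly with the number of generated tokens. The figures are hypothetical, not measured values from the article.

```python
# Back-of-the-envelope illustration (hypothetical numbers, not measured values):
# per-request energy scales roughly with the number of generated tokens.

ENERGY_PER_TOKEN_WH = 0.002   # assumed energy per generated token in watt-hours

def request_energy_wh(tokens_generated: int) -> float:
    """Rough per-answer energy estimate under the linear-in-tokens assumption."""
    return tokens_generated * ENERGY_PER_TOKEN_WH

short_answer = request_energy_wh(300)     # concise reply
long_answer = request_energy_wh(1_200)    # verbose chain-of-thought reply

print(f"short: {short_answer:.2f} Wh, long: {long_answer:.2f} Wh "
      f"({long_answer / short_answer:.0f}x more)")
```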

One problem is the potential spread of this technology. If other companies copy Deepseek's approach and transfer it to many AI applications, energy consumption could increase massively. As with the development of generative AI so far, demand for powerful models could again wipe out the efficiency gains from optimized training, a classic example of the so-called Jevons paradox.
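The Jevons paradox can be illustrated with a small, entirely made-up calculation: even if each request becomes more efficient, total energy consumption still rises once usage grows faster than the efficiency gain.

```python
# Toy illustration of the Jevons paradox with made-up numbers:
# per-request efficiency improves, but usage grows even faster,
# so total energy consumption still rises.

baseline_requests = 1_000_000
baseline_energy_per_request = 1.0      # arbitrary units

efficient_energy_per_request = 0.6     # 40% more efficient per request
efficient_requests = 2_500_000         # cheaper requests invite far more usage

baseline_total = baseline_requests * baseline_energy_per_request
efficient_total = efficient_requests * efficient_energy_per_request

print(baseline_total, efficient_total)  # 1000000.0 vs 1500000.0 -> total goes up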

Long-term problems in terms of sustainability

The long-term effects are still unclear. But experts like Sasha Luccioni from Hugging Face warn that the hype around Deepseek could lead to this computation-intensive technology being used unnecessarily often. If AI models rely on chain-of-thought processes across the board in the future, this could significantly worsen the overall energy efficiency of the industry.


Deepseek is a highly competitive model, but it also reveals the challenges of new AI technologies. Although it was trained more efficiently, its higher energy consumption during use could become a serious sustainability problem. The coming months will show whether Deepseek's approach really prevails or whether the energy demand prompts some companies to rethink.

Also interesting:

  • Robots recognize human touch without artificial skin
  • Self-healing power grid: Artificial intelligence is supposed to prevent blackouts
  • AI gap: Artificial intelligence creates an even deeper "digital divide"
  • AI as a judge: The advantages and disadvantages of artificial intelligence in the judiciary

The article "Deepseek: Chinese AI apparently consumes more energy than assumed" by Felix Baumann first appeared on Basic Thinking.

