The Cost and Savings of AI and Server Tech

AI has become a crucial part of the modern world, though its influence is often underestimated and misunderstood. From helping with user efficiency to protecting computers from harm, AI is the invisible hand that guides hundreds of millions of users every day. Because this AI lives on servers, however, it also comes with a cost. Using some simple energy calculations, we want to investigate just how expensive AI can be for the global ecology, and why this cost alone is a poor indicator of worth.

What is AI?

Before going into calculations, we first need to note that AI is much simpler than the grand visions of science fiction make us think. AI isn't artificial human intelligence; rather, it usually refers to a very narrow set of functions designed to achieve a specific task. Essentially, AI is a machine, just one that deals primarily with data rather than physical systems, though there is significant overlap.


Image: “Machine Learning & Artificial Intelligen” (CC BY 2.0) by mikemacmarketing

For an example of this in action, consider value bet AI algorithms. Developed for horse racing, this kind of AI draws on a whole range of contemporary and historical data, using advanced statistical systems to find connections that humans alone could not. In this instance, the output is a set of predictions of the most statistically probable winners. While hugely helpful to those who like to gamble on the races, such AI is so specialized that it isn't suitable for any other task.
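To make the value-bet idea concrete, here is a minimal sketch of the underlying logic: a bet is a "value bet" when the model's estimated win probability exceeds the probability implied by the bookmaker's odds. The function names, probabilities, and odds below are illustrative assumptions, not the actual algorithm the article refers to.

```python
# Illustrative sketch of the value-bet principle: a bet has positive
# expected value when the model's win probability beats the probability
# implied by the bookmaker's decimal odds. All numbers are made up.

def implied_probability(decimal_odds: float) -> float:
    """Win probability implied by decimal odds (ignoring bookmaker margin)."""
    return 1.0 / decimal_odds

def is_value_bet(model_win_prob: float, decimal_odds: float) -> bool:
    """True when expected value per unit staked is positive.

    EV = p * (odds - 1) - (1 - p), which is positive iff p * odds > 1.
    """
    return model_win_prob * decimal_odds > 1.0

# A model that rates a horse at 30% to win against odds of 4.0 sees an
# edge, since odds of 4.0 only imply a 25% chance.
print(implied_probability(4.0))   # 0.25
print(is_value_bet(0.30, 4.0))    # True
```

A real racing model would estimate `model_win_prob` from historical and current form data; the value-bet check itself is just this comparison.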

Calculating the Cost

Directly coming up with a set cost for AI isn't easy, owing to how widely power demands vary between systems. For this reason, we're going to have to paint in broad strokes and work with averages to estimate the total costs that AI produces. As a baseline, this means treating every server as a system that uses AI, which roughly reflects how modern servers manage data input and output.

With this in mind, assume an average server uses about 1 kWh if it is on for 24 hours a day, which translates to about $1.37 per day, or roughly $500 a year. While current server numbers are unknown, 2020 had around 100 million active servers in the world. This would mean a total energy use of 100 GWh, or 100 million kWh, over a day. That same day would generate a total energy cost of $137 million, or around $50 billion over a year.
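The arithmetic above can be checked with a quick back-of-the-envelope script. The inputs are the article's own rough averages (1 kWh and $1.37 per server per day, 100 million servers), not measured values.

```python
# Back-of-the-envelope check of the article's server-cost figures.
# Inputs are the article's rough averages, not measured data.

SERVERS = 100_000_000          # active servers worldwide (2020 estimate)
KWH_PER_SERVER_DAY = 1.0       # assumed average energy use per server
COST_PER_SERVER_DAY = 1.37     # assumed USD cost per server per day

daily_energy_kwh = SERVERS * KWH_PER_SERVER_DAY
daily_cost = SERVERS * COST_PER_SERVER_DAY
yearly_cost = daily_cost * 365

print(f"Daily energy: {daily_energy_kwh / 1e6:.0f} million kWh")  # 100 million kWh = 100 GWh
print(f"Daily cost:   ${daily_cost / 1e6:.0f} million")           # $137 million
print(f"Yearly cost:  ${yearly_cost / 1e9:.1f} billion")          # ~$50 billion
```

Note that $1.37 per day compounds to about $500 per server per year, so the fleet-wide annual bill lands near $50 billion rather than in the hundreds of billions.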

Introducing Complications

While the above calculations might make it seem like AI is a waste of energy, taking a broader look, the exact opposite is usually the case. Running AI on servers can cost a lot, but the efficiency savings from AI versus traditional systems are enough that AI comes out ahead. To lean on the horse racing example above, consider that this system, while powerful, only utilizes a small number of servers. Achieving the same accuracy and calculations by hand, without server systems, would be far more costly and energy inefficient, meaning the AI alternative easily comes out ahead. In other words, while the ecological cost of AI might look high without context, it often delivers better ecological outcomes than more traditional alternatives.


Image: “server” (CC BY 2.0) by dariorug

Like any machine, the proper application of AI makes many aspects of our lives simpler and more streamlined. It can be easy to take for granted, but it's something worth taking a step back and appreciating from time to time. Even if we don't see AI, it's there, and it's saving us more time and effort than we know.
