Showing posts with label energy consumption. Show all posts

Sunday, November 02, 2025

Surging Power Costs Are Putting the Squeeze on Residential and Business Customers

The Wall Street Journal seems to blame data centers! TechCrunch also just published an article on this subject, based on a very recent business survey commissioned by a solar installer [???].

So is this alarmism and hysteria, created to sell worried consumers and businesses more batteries and solar panels? 😊

"... The report, commissioned by solar installer Sunrun, found that 80% of consumers are worried about the impact of data centers on their utility bills. ..."

"... The report finds that data centers consumed about 4.4% of total U.S. electricity in 2023 and are expected to consume approximately 6.7 to 12% of total U.S. electricity by 2028. The report indicates that total data center electricity usage climbed from 58 TWh in 2014 to 176 TWh in 2023 and estimates an increase between 325 to 580 TWh by 2028. ..."

Surging Power Costs Are Putting the Squeeze on Customers - WSJ (behind paywall) "Data centers contribute to higher prices as fat energy bills electrify local politics"

Rising energy prices put AI and data centers in the crosshairs "As tech companies tout their plans for massive new data centers, consumers are increasingly worried the AI-driven gold rush will ultimately drive up the price they pay for electricity, according to a new survey."

New Sunrun Survey Finds Soaring Electricity Demand and Extreme Weather Are Fueling Homeowner Anxiety; 80% Fear Data Centers Will Drive Up Utility Prices "With the majority of Americans saying they feel unprepared for grid instability, electricity outages, and extreme weather, Sunrun offers peace of mind with home battery storage and solar"





Thursday, June 05, 2025

ML & AI Use More Energy, and They Also Save Energy

Good news! Human ingenuity handles challenges like this!

I blogged here recently that we should not worry very much about the energy consumption of ML & AI.

"AI’s thirst for energy is growing, but the technology also could help produce huge energy savings over the next five to 10 years, according to a recent report. ... The International Energy Agency (IEA) ... performed a comprehensive analysis of AI’s energy consumption including energy required to obtain critical materials needed for chips and data centers.  ...

AI already makes energy generation, distribution, and use more efficient. The authors expect these savings to accelerate. 

Existing AI algorithms predict energy generation and consumption. This makes it easier to integrate renewable energy sources into the grid, which reduces reliance on fossil fuels and cuts the resulting pollutants and greenhouse gases.  ...
Widespread adoption of existing AI applications that streamline energy consumption in industry, transportation, and buildings ...
For example, scaling up existing AI optimization of heating, ventilation, and air-conditioning systems would save 300 TWh, about one-third of total energy used by data centers. ...
The energy costs of training, inference, and cooling hardware are expected to fall further thanks to trends in AI models (fewer parameters, more efficient algorithms, task-specific models), hardware (more energy-efficient chips, improved cooling methods), and usage (batch processing, running smaller models locally rather than in the cloud)."

DeepSeek-R1 Refreshed, AI’s Energy Conundrum, Agents Get Phished, Machine Translation in Action

AI is set to drive surging electricity demand from data centres while offering the potential to transform how the energy sector works "Major new IEA report brings groundbreaking data and analysis to one of the most pressing and least understood energy issues today, exploring AI’s wide range of potential impacts"

Monday, June 02, 2025

ML & AI and power consumption. Don't panic! No government interventionism please!

Update of 6/4/2025: Remember the early mainframe computers with electromechanical components and vacuum tubes.

It appears that the huge power consumption of running ML & AI models is becoming more and more of a political issue.

ML & AI are still in their very dynamic infancy. Excessive concerns could become a drag on their development. Don't let this happen! Government intervention at this point would be counterproductive and undesirable!

As time goes by, we will find solutions to reduce the power consumption of ML & AI. Don't worry! E.g. the high electricity bills will be a constant reminder!

It could also very well be that we need more electricity generation overall! Think, e.g., about Africa, where about six hundred million Africans still live without electricity in their daily lives.


Friday, February 14, 2025

IEA: Growth in global electricity demand is set to accelerate in the coming years due to developing economies and China

More people around the world can afford air conditioning! Electrification is fast progressing! Then there are data centers.

For some odd reason, the IEA fails to mention India! Very strange! What a blind spot!

How does China meet its rising, enormous demand for electricity? By constructing hundreds of coal fired power plants! The West is so stupid demanding net zero and other nonsense!

"... Most of the additional demand over the next three years will come from emerging and developing economies, which account for 85% of the demand growth. The trend is most pronounced in China where electricity demand has been growing faster than the overall economy since 2020. China's electricity consumption rose by 7% in 2024 and is expected to grow by an average of around 6% through 2027. ...

The new report forecasts that growth in low-emissions sources – primarily renewables and nuclear – is sufficient, in aggregate, to cover all the growth in global electricity demand over the next three years. ...

In the United States, a strong increase in electricity demand is expected to add the equivalent of California's current power consumption to the national total over the next three years. 

Electricity demand growth is forecast to be more modest in the European Union, only rising back to its 2021 levels by 2027, following the major declines in 2022 and 2023 ..."

Growth in global electricity demand is set to accelerate in the coming years as power-hungry sectors expand - News - IEA "Increase in electricity consumption through 2027 expected to average around 4% annually, driven by growing use for industry, air conditioning, electrification and data centres"

Saturday, October 19, 2024

Integer addition algorithm could reduce energy needs of AI by 95%

Good news!

"... As just one example, ChatGPT now requires roughly 564 MWh daily, or enough to power 18,000 American homes. ...

that AI applications might be using around 100 TWh annually in just a few years ...

The one drawback it has is that it requires different hardware than that currently in use. But the research team also notes that the new type of hardware has already been designed, built and tested. ..."
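The quoted homes figure is internally consistent; a quick check (my arithmetic, not from the article):

```python
# 564 MWh per day spread over 18,000 American homes:
kwh_per_home_per_day = 564 * 1000 / 18_000
assert 31 < kwh_per_home_per_day < 32  # ~31.3 kWh/day, near the US household average
```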

From the abstract:
"Large neural networks spend most computation on floating point tensor multiplications. In this work, we find that a floating point multiplier can be approximated by one integer adder with high precision. We propose the linear-complexity multiplication L-Mul algorithm that approximates floating point number multiplication with integer addition operations. The new algorithm costs significantly less computation resource than 8-bit floating point multiplication but achieves higher precision. Compared to 8-bit floating point multiplications, the proposed method achieves higher precision but consumes significantly less bit-level computation. Since multiplying floating point numbers requires substantially higher energy compared to integer addition operations, applying the L-Mul operation in tensor processing hardware can potentially reduce 95% energy cost by element-wise floating point tensor multiplications and 80% energy cost of dot products. We calculated the theoretical error expectation of L-Mul, and evaluated the algorithm on a wide range of textual, visual, and symbolic tasks, including natural language understanding, structural reasoning, mathematics, and commonsense question answering. Our numerical analysis experiments agree with the theoretical error estimation, which indicates that L-Mul with 4-bit mantissa achieves comparable precision as float8_e4m3 multiplications, and L-Mul with 3-bit mantissa outperforms float8_e5m2. Evaluation results on popular benchmarks show that directly applying L-Mul to the attention mechanism is almost lossless. We further show that replacing all floating point multiplications with 3-bit mantissa L-Mul in a transformer model achieves equivalent precision as using float8_e4m3 as accumulation precision in both fine-tuning and inference."

Integer addition algorithm could reduce energy needs of AI by 95%



pJ = picojoule





Thursday, August 08, 2024

Europe's reliance on and import of Russian gas through a pipeline across Ukraine continues despite the Russo-Ukraine War

Highly recommended! Why do Western media not report on this huge scandal? And why is Ukraine not stopping it?

Reality is stranger than fiction! Despite the war, the gas continues to flow through Ukraine unabated.


Monday, April 22, 2024

U.S. Power Grid Struggles to Keep Up with Data Center Growth

The naked truth about AI! Its power consumption is enormous! Just how many solar power panels would be needed to power AI?

"The exponential growth of data centers with a tremendous appetite for electricity rapidly is outpacing the capacity of utilities to meet their needs, pushing data center developers to prioritize new markets where they can be sure they can hook up to the grid.

Electric utilities across the U.S. are doubling their forecasts of how much additional power they’ll need by the end of this decade to meet surging demand not only for power-hungry AI data processing facilities but also from a resurgence of U.S. manufacturing and the conversion to EVs.
The demand for electricity from data centers alone is projected to consume nearly 10% of the nation’s electricity before the end of this decade.

There were about 2,700 data centers in the U.S. in 2022, consuming an estimated 4% of the nation’s power output. In 2024, the total number of data hubs has grown to nearly 5,400, with power needs expanding from an estimated 16 gigawatts in 2022 ..."

U.S. Power Grid Struggles to Keep Up with Data Center Growth | GlobeSt

Saturday, April 20, 2024

New window film drops temperature, slashes energy consumption

Good news! Human ingenuity at its best!

"Assisted by quantum physics and machine learning, researchers have developed a transparent window coating that lets in visible light but blocks heat-producing UV and infrared. The coating not only reduces room temperature but also the energy consumption related to cooling, regardless of where the sun is in the sky. ..."

"... “Our coating maintains functionality and efficiency whatever the sun’s position in the sky.” ..."

From the highlights and abstract:
"Highlights
• A wide-angle spectral filter is designed by quantum annealing-enhanced active learning
• Band-selective transmission is achieved from the optimized spectral filter
• Experiments verify the optical characteristics and cooling performance
• The filter for windows can reduce annual cooling energy by ∼97.5 MJ/m2 in hot climates
Summary
Multi-band spectral filters that can transmit visible light but block UV and infrared light in the solar spectrum are applicable to energy-saving windows. However, such filters are usually designed to consider normal incident light only. Here, we report photonic structures allowing selective solar spectrum transmission in wide angles using a quantum-computing-enhanced active learning scheme, which includes machine learning, quantum annealing, and wave-optics simulation in an iterative loop. We experimentally demonstrate the optical characteristics of the photonic structure and its capability to reduce the temperature rise in an enclosure when combined with a thermal radiation layer (temperature reduction of 5.4°C–7.2°C and annual energy saving of ∼97.5 MJ/m2). This structure can be incorporated into existing windows in buildings or automobiles to reduce cooling energy consumption, and the active learning scheme can be applied to design materials with complex properties in general."

New window film drops temperature, slashes energy consumption


Graphical abstract


Tuesday, February 27, 2024

How much electricity does AI consume?

One of those big questions! How many windmill farms does it take or is it time for more nuclear power? 😊

"AI's energy consumption, particularly during training, is difficult to quantify due to lack of transparency from companies, but estimates suggest it could reach significant levels, potentially comprising a substantial portion of global electricity consumption by 2027."

"... One important factor we can identify is the difference between training a model for the first time and deploying it to users. Training, in particular, is extremely energy intensive, consuming much more electricity than traditional data center activities. Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt hours (MWh) of electricity; about as much power as consumed annually by 130 US homes. To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. That means you’d have to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3. ...
ran tests on 88 different models spanning a range of use cases, from answering questions to identifying objects and generating images. In each case, they ran the task 1,000 times and estimated the energy cost. Most tasks they tested use a small amount of energy, like 0.002 kWh to classify written samples and 0.047 kWh to generate text. If we use our hour of Netflix streaming as a comparison, these are equivalent to the energy consumed watching nine seconds or 3.5 minutes, respectively. (Remember: that’s the cost to perform each task 1,000 times.) The figures were notably larger for image-generation models, which used on average 2.907 kWh per 1,000 inferences. As the paper notes, the average smartphone uses 0.012 kWh to charge — so generating one image using AI can use almost as much energy as charging your smartphone. ...
By combining this data, ... calculates that by 2027 the AI sector could consume between 85 to 134 terawatt hours each year. That’s about the same as the annual energy demand of ... country, the Netherlands. ..."
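The article's comparisons are easy to reproduce from the quoted numbers (my arithmetic):

```python
# GPT-3 training (~1,300 MWh) vs. one hour of Netflix (~0.8 kWh):
hours_of_netflix = 1300 * 1000 / 0.8
assert abs(hours_of_netflix - 1_625_000) < 1

# Per-1,000-inference costs expressed as streaming time:
classify_seconds = 0.002 / 0.8 * 3600  # ~9 seconds of Netflix
text_gen_minutes = 0.047 / 0.8 * 60    # ~3.5 minutes of Netflix
assert round(classify_seconds) == 9
assert round(text_gen_minutes, 1) == 3.5
```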

Last Week in AI #259: Google's Gemini AI controversy 🎭, Google's open-sources Gemma models 🌐, Moonshot AI's billion-dollar boost 🚀, and more!

How much electricity does AI consume? It’s not easy to calculate the watts and joules that go into a single Balenciaga pope. But we’re not completely in the dark about the true energy cost of AI.

Thursday, May 25, 2023

Breakthrough in computer chip energy efficiency could significantly cut data center and supercomputer electricity use

Good news! Sounds almost spectacular!

"... new, ultra-energy-efficient method to compensate for temperature variations that degrade photonic chips. Such chips “will form the high-speed communication backbone of future data centers and supercomputers,” ...
The issue with photonic chips is that up until now, significant energy has been required to keep their temperature stable and performance high. The team led by Wang, however, has shown that it’s possible to reduce the energy needed for temperature control by a factor of more than 1 million. ..."

From the abstract:
"Silicon microring resonators (Si-MRRs) play essential roles in on-chip wavelength division multiplexing (WDM) systems due to their ultra-compact size and low energy consumption. However, the resonant wavelength of Si-MRRs is very sensitive to temperature fluctuations and fabrication process variation. Typically, each Si-MRR in the WDM system requires precise wavelength control by free carrier injection using PIN diodes or thermal heaters that consume high power. This work experimentally demonstrates gate-tuning on-chip WDM filters for the first time with large wavelength coverage for the entire channel spacing using a Si-MRR array driven by high mobility titanium-doped indium oxide (ITiO) gates. The integrated Si-MRRs achieve unprecedented wavelength tunability up to 589 pm/V, or VπL of 0.050 V cm with a high-quality factor of 5200. The on-chip WDM filters, which consist of four cascaded ITiO-driven Si-MRRs, can be continuously tuned across the 1543–1548 nm wavelength range by gate biases with near-zero power consumption."

Breakthrough in computer chip energy efficiency could cut data center electricity use (secondary source)


Fig. 5 (a) Optical microscope image of the fabricated on-chip WDM filters consisting of four cascaded tunable Si-MRRs and testing setup (b) Zoom-in view of the individual tunable Si-MRR of the on-chip WDM filters. The dashed line highlights the ITiO gate. (c) The simulated carrier concentration (Nc), refractive index (n), and extinction coefficient (k) distributions with different applied biases at the ITiO/HfO2 and the Si/HfO2 interfaces.


Saturday, December 31, 2022

Dendrocentric AI Could Run on Watts (on your smartphone), Not Megawatts

Good news! This seems to be early-stage research! One of the biggest obstacles to broader application of AI & machine learning is the voracious consumption of energy during training!

New, less energy-hungry computer hardware is direly needed! There are also some serious computing bottlenecks!

If this researcher is not overpromising, then this could well become a game changer!

"... For instance, to train its state-of-the-art neural network GPT-3, OpenAI spent US $4.6 million to run 9,200 GPUs for two weeks. ...
AI currently advances by performing twice as many computations every two months. However, the electronics industry doubles the devices required to perform these operations only once every two years. This has meant that AI is typically limited to the cloud, which can provide the many thousands of processors needed for it. ...
Based on these findings, ... developed a computational model of a dendrite that responded only if it received signals from neurons in a precise sequence. This means that each dendrite could encode data in more than just base two—one or zero, on or off—as is the case with today’s electronic component. It will use much higher base systems, depending on the number of connections it has and the length of the sequences of signals it receives. ...
that a string of ferroelectric capacitors could emulate a stretch of dendrite and replace the gate stack of a field-effect transistor to form a ferroelectric FET (FeFET). A 1.5-micrometer-long FeFET with five gates could emulate a 15-µm-long stretch of dendrite with five synapses ..."

From the abstract:
"Artificial intelligence now advances by performing twice as many floating-point multiplications every two months, but the semiconductor industry tiles twice as many multipliers on a chip every two years. Moreover, the returns from tiling these multipliers ever more densely now diminish because signals must travel relatively farther and farther. Although travel can be shortened by stacking tiled multipliers in a three-dimensional chip, such a solution acutely reduces the available surface area for dissipating heat. Here I propose to transcend this three-dimensional thermal constraint by moving away from learning with synapses to learning with dendrites. Synaptic inputs are not weighted precisely but rather ordered meticulously along a short stretch of dendrite, termed dendrocentric learning. With the help of a computational model of a dendrite and a conceptual model of a ferroelectric device that emulates it, I illustrate how dendrocentric learning artificial intelligence—or synthetic intelligence for short—could run not with megawatts in the cloud but rather with watts on a smartphone."

Dendrocentric AI Could Run on Watts, Not Megawatts - IEEE Spectrum Artificial intelligence that mimics dendrites could enable powerful AIs to run on smartphones instead of the cloud

Dendrocentric learning for synthetic intelligence (no public access)

In this concept drawing of a dendrite-like nanoscale device, voltage pulses applied consecutively to all five gates from left to right flip all electric dipoles in the ferroelectric insulating layer from down to up.


Monday, November 08, 2021

Global Energy Review: CO2 Emissions in 2020 – Analysis - IEA

It seems the International Energy Agency forgot to research how much of this drop is due to the increased prevalence of working from home!

One can also safely assume that the Covid-19 restrictions still imposed by so many Western countries serve predominantly to avoid CO2 emissions! What a scandal! The return to prepandemic normal is long overdue (since January, or March 2021 at the latest)!

Since working from home, which reduces the daily commute and CO2 emissions, is here to stay, the hysterical demagoguery about 2 degrees Celsius, the Global Warming hoax, and the Climate Change religion is even more ridiculous!

"... As primary energy demand dropped nearly 4% in 2020, global energy-related CO2 emissions fell by 5.8% according to the latest statistical data, the largest annual percentage decline since World War II. In absolute terms, the decline in emissions of almost 2 000 million tonnes of CO2 is without precedent in human history – broadly speaking, this is the equivalent of removing all of the European Union’s emissions from the global total.... A common theme across all economies is the scale of the impact of the pandemic and lockdown measures on transport activity. The decline in CO2 emissions from oil use in the transport sector accounted for well over 50% of the total global drop in CO2 emissions in 2020, with restrictions on movement at local and international levels leading to a near 1 100 Mt drop in emissions from the sector, down almost 14% from 2019 levels."

Global Energy Review: CO2 Emissions in 2020 – Analysis - IEA



Monday, August 16, 2021

Green progressives pushed for pot legalization for years. Now marijuana is posing a major 'inconvenient' threat to the climate

Getting high while binging on energy! What kinds of substitutes would be available?


"... The growth of dope's acceptance in American culture and the state-by-state rules governing its cultivation has made it "one of the most energy-intensive crops in the nation," ...
A big reason for grass' inordinate levels of energy consumption is the fact that it's grown mostly indoors with special lighting and environmental controls that consume up to 2,000 watts of electricity per square meter, Politico said — 40 times what it takes for leafy greens like lettuce to be grown indoors. ...
researchers estimate that Massachusetts' new blaze industry accounted for 10% of the state's entire industrial electricity consumption last year;
a study reported that the energy required to grow enough bud for a single joint (one gram) was equivalent to the energy used to drive a fuel-efficient car 20 miles ..."

Green progressives pushed for pot legalization for years. Now marijuana is posing a major 'inconvenient' threat to the climate. - TheBlaze

An inconvenient truth (about weed) Federal laws bar cannabis from crossing state lines, driving up the cost — and the emissions — of an industry using indoor grow operations.