Is AI Increasing Costs?

Cotswold

If, as it says, taxpayers are funding new power sources and distributing the costs, then we are bearing the cost of providing funding support to the AI companies in order to assist them in further regulating our lives.

 
I haven't been made aware of any specific government subsidies beyond the normal things governments do to try to attract businesses to the local area. Is such a thing occurring that's particular or special to AI?
 
Just the fact of higher demand implies costs, because we surely don't have a surplus of power. Therefore, increased demand = increased costs, at a minimum because the power companies have to gear up to produce more power. Since governments USUALLY regulate public utilities, I'm sure that government has its fingers in the pie. But just the fact of having regulations is enough to cause costs to go up, because regulations NEVER result in a price reduction. They ALWAYS result in the utility having to work harder to conform to someone's idiotic standards that don't have anything to do with safety or production. See also the USA Supreme Court case now referred to as Loper Bright.
 
Very true. I was listening to a podcast the other day in which a guy recalled working, as a kid, for a company digging pipelines. California was special, because when they got to CA they had to hire a paleontologist full time to do nothing other than stand there looking down into the hole as it was dug, 8 hrs/day, to help avoid digging up something precious. This special regulation applied only in California.
So everything is more expensive, and yes, it has to do with the layers upon layers upon layers of regulations and special hoops businesses have to jump through to get things done = higher cost.
 
SOME regulations make sense, though not all can make that claim. Here's one that DID actually make a lot of sense.

MANY years ago - before the Navy job, I worked for a company that made computer-based oil-and-gas pipeline control systems - an example of a SCADA (supervisory control and data acquisition) product. Long before AI, because the "supervisory" part means a PERSON always made any decisions based on the data we acquired for them. One of the common requirements was "leak detection" - the ability to recognize that your pipeline was not intact and was dripping or venting product. Besides the obvious desire to not vent valuable product, there was another reason that leak detection was important.

I helped to design their third generation of SCADA product to include leak detection that would notify you in 1 minute for sudden catastrophic failures, or at 5, 10, 15, and 30-minute detection thresholds. If you ever got the 1-minute alarm, you had a super serious problem because our "scan all sensors" minimum time was 1 minute. The 30-minute alarm meant a slow leak, for which you might do nothing more than dispatch an inspection team. We could use some advanced resonance testing to locate slow leaks to within about 500 feet. That was considered advanced for the time.

Just for comparison, the way this USED to be done before we developed remote-readable digitized sensor units was that folks would visit pipeline pumping stations and read all the dials, then compute the flow throughput at each point by hand to verify that the figures matched. The fastest the hand method could ever detect a leak was about 15-30 minutes, the interval between two readings. But most companies did that on an hourly basis or longer.
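
For anyone curious, here is a rough sketch of the balance-check idea behind both the hand method and the automated version (illustrative only - the numbers, tolerances, and windows are my assumptions for this post, not our actual algorithm):

```python
# Minimal sketch of volume-balance leak detection: compare metered inflow and
# outflow over a window of scans and alarm when the imbalance exceeds a tolerance.
# The scan interval, tolerances, and sample data are invented for illustration.

SCAN_SECONDS = 60  # the "scan all sensors" minimum time mentioned above

def leak_alarm(inflow_bbl, outflow_bbl, window_scans, tolerance_bbl):
    """Return True if the cumulative in/out imbalance over the last
    `window_scans` readings exceeds `tolerance_bbl`. A slow-leak test
    uses a longer window and a tighter tolerance."""
    recent_in = sum(inflow_bbl[-window_scans:])
    recent_out = sum(outflow_bbl[-window_scans:])
    return (recent_in - recent_out) > tolerance_bbl

# Example: 30 one-minute scans where outflow consistently reads a little low.
inflow = [100.0] * 30
outflow = [99.0] * 30

print(leak_alarm(inflow, outflow, window_scans=1, tolerance_bbl=20))   # catastrophic test: False
print(leak_alarm(inflow, outflow, window_scans=30, tolerance_bbl=10))  # slow-leak test: True
```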

Anyway, the reason that leak detection became such a hot topic was that a pipeline, run by an interstate carrier, ran underneath an elevated railroad line that was on top of something like a levee. One day an extra-heavy rail cargo put too much pressure on a pipeline that DIDN'T have leak detection. The pipeline cracked and developed a medium-slow leak. The leak wasn't detected right away. It wasn't easily visible because there was a sewer grate at the bottom of the levee. A lot of distilled product went into the sewers for an estimated couple of hours. The local sewer system went under a small, isolated neighborhood before it joined the main line. It was a working-class neighborhood and nobody was home to smell the oil leak. Another train went by later that afternoon, a train that had problems that caused it to give off sparks. One hot spark went down the drain full of distilled petroleum, and within a couple of minutes an entire neighborhood - maybe 40 houses or so - was burning. Nobody died, but every house had to be rebuilt.

I won't name the company because they are still operating, but they had a huge property damage lawsuit on their hands and I don't recall the EXACT verdict, but the settlement for the residents was well over $30 million. Once the settlement was made, the president of the pipeline company came to us with a contract to update and digitize all 12 of their pipelines to include our best leak detection software. And he told the president of our company those words that make ANY company president happy to oblige... "Money is no object." The economics worked because our systems usually cost much less than $300K depending on the number of remote terminal units.

That one incident went a long way towards the modern regulation about leak detection capabilities on petroleum pipelines. From what I understood in our later contracts, the other pipeline companies saw that incident, saw the size of the settlement, did the math, and came to us or one of our competitors. But we had one of the better systems at the time, so we had a lot of that business. In any case, this was a regulation that none of the pipeline operators ignored, because they saw the economic risks involved in NOT having modern leak detection.
 
The moral of stories like this is that risk management is a complex calculation between competing priorities and costs. A cost-benefit analysis is appropriate in this context.

I'll go down memory lane a bit and try to recollect the major outlines.

Basically, one can construct a four-quadrant matrix for the probability of a loss occurring and the relative cost of that occurrence.

In the upper-left quadrant you have low-probability/low-cost. In the upper-right quadrant you have low-probability/high-cost.
In the lower-left is high-probability/low-cost and in the lower-right is high-probability/high-cost.

You may expand that to a 3×3 grid, with low, medium, and high values on each axis.

How much you budget for each risk depends on which of those quadrants it falls into.

The_Doc_Man's scenario is probably low-probability/high-cost, and the mitigation costs are determined accordingly. Even though the probability of a repeat of the incident he described is not great, the high cost of that potential occurrence calls for an appropriate budget and management to prevent or mitigate it.

An example of a high-probability/low-cost occurrence might be burnt-out light bulbs in your kitchen. You know it's going to happen, sooner or later. But when it does happen, the cost of a replacement is minimal. You don't invest hundreds of dollars in some scheme to monitor light usage in your kitchen to predict the next burnt out bulb.
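
If you wanted to make that bucketing concrete, a toy sketch might look like this (the 10% and $100K cutoffs are invented purely for illustration):

```python
# Toy version of the probability/cost matrix described above.
# The cutoffs (10% annual probability, $100K cost) are invented for illustration.

def bucket(value, cutoff):
    return "high" if value >= cutoff else "low"

def classify(probability, cost_usd):
    """Place a risk in one of the four quadrants as (probability bucket, cost bucket)."""
    return (bucket(probability, 0.10), bucket(cost_usd, 100_000))

# A burnt-out kitchen bulb: near-certain, trivial cost -> no monitoring scheme needed.
print(classify(0.9, 5))               # ('high', 'low')
# A pipeline rupture: unlikely in any one year, very expensive when it happens.
print(classify(0.02, 30_000_000))     # ('low', 'high')
```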

However, if you add government regulation to the calculation, it skews things, sometimes dramatically.

For example, we've all heard the argument, "If it helps one person, it's worth it." That blows any rational considerations out of the water. Is it a low-risk, low-cost occurrence? Doesn't matter; we will make laws to prevent the occurrence, because if it helps one person, it's worth the cost of the preventive measure. And, of course, it's not the government who incurs the cost of that preventative measure.

I'm not sure how to apply that risk management matrix to the question of AI, Data Centers and electrical power generation. How do you quantify the risks involved? How do you quantify the cost of "an occurrence"? I am pretty sure, though, there are a lot of arguments based on whose interests are at stake, and not so much on objective measures regarding the costs of action vs non-action.

 
George is of course right about the risk analysis. We were the beneficiaries of a case where the risk for a single pipeline to break was not super low and the cost of mitigation was on the order of 1% to 5% of the risk value. So for this to NOT be favorable, the probability would have to be less than 1% for a smaller pipeline and 5% for the larger pipelines. BUT the probability of some junction failing in California (earthquake central) figured into that analysis. I'll give you the short answer - we got LOTS of business.
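
To put rough numbers on that break-even point, here is a back-of-the-envelope sketch using the approximate figures from the posts above (the $30M loss and $300K system cost are illustrative, not our actual contract numbers):

```python
# Back-of-the-envelope break-even check: mitigation pays off when
# probability_of_failure * expected_loss > mitigation_cost.
# Figures are rough illustrations taken from the posts above, not real contracts.

def mitigation_worthwhile(probability, loss_usd, mitigation_usd):
    return probability * loss_usd > mitigation_usd

loss = 30_000_000          # on the order of the settlement mentioned earlier
mitigation = 300_000       # upper end of the quoted system cost
break_even = mitigation / loss

print(f"Break-even probability: {break_even:.1%}")      # 1.0%
print(mitigation_worthwhile(0.02, loss, mitigation))    # True at a 2% failure chance
print(mitigation_worthwhile(0.005, loss, mitigation))   # False at 0.5%
```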
 
I haven't been made aware of any specific government subsidies beyond the normal things governments do to try to attract businesses to the local area. Is such a thing occurring that's particular or special to AI?
As the article states:
In New Jersey, which is home to the largest concentration of data centres in the world, electricity prices have risen by 34% in the last year
That to me indicates governmental adjustments, or taxes on electricity charges specifically in New Jersey.

In the UK it is understood that the daily charge added to all bills is actually 100% a tax element. For some time, other hidden charges have also been added to pay for the NetZero nonsense as well as general taxation; I wonder whether around 20% of a utility bill is actually tax. It is a tax that is, of course, unavoidable.

As an aside:
I was with E.ON for some time in the past, and their bills never totalled correctly, invariably being a few pence high. Not enough to complain about, but multiplied by 8 or 9 million bills a month it could have added maybe half a million pounds to their income each month.

One thing you should do with your gas and electric is to check your meter readings and invoices. You may be being overcharged.
You will only see overcharges if you check your monthly invoices over a year, each year. Just checking individual months as the invoices arrive will not show the so-called audit that causes overcharges. Take a meter reading from an invoice at a selected month and then another 12 months later. Then add up the total number of units charged across your 12 invoices. The difference between the two meter readings should be exactly the same as the total units charged on the 12 invoices, but don't be surprised if they are not! You then need to locate the invoice(s) that slipped in the overcharge and claim it back.
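
If you want to automate that yearly reconciliation, a minimal sketch might look like this (the readings and unit figures below are made up for illustration):

```python
# Minimal reconciliation sketch for the check described above: the change in
# meter reading over the year should equal the sum of units billed across the
# 12 invoices. The sample figures are invented for illustration.

def check_year(start_reading_kwh, end_reading_kwh, units_billed_kwh):
    metered = end_reading_kwh - start_reading_kwh   # what the meter says you used
    billed = sum(units_billed_kwh)                  # what the 12 invoices charged for
    discrepancy = billed - metered                  # positive means over-billed
    return metered, billed, discrepancy

units = [310, 295, 280, 260, 240, 225, 230, 245, 265, 285, 300, 315]  # one entry per invoice
metered, billed, diff = check_year(start_reading_kwh=41_200,
                                   end_reading_kwh=44_440,
                                   units_billed_kwh=units)

print(f"Metered usage: {metered} kWh, billed: {billed} kWh, discrepancy: {diff} kWh")
# A positive discrepancy means you were billed for more units than the meter recorded.
```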
 
Correlation doesn't prove causation. I'm not going to argue whether the increase in number and size of Data Centers in New Jersey has or has not contributed to increases in electricity prices there. It may well be a contributing factor.

I am going to suggest that it doesn't necessarily follow from the facts available. A lot of other factors can be contributors, including government actions.

In some places, government mandates for so-called "Green" energy sources can contribute to rising costs. In other places, it might reflect taxes to some extent.

If the price of keeping energy costs low is the suspension or banning of AI related Data Centers, though, I would argue that the supposed short-term benefit is hard to justify against the cost of losing out to other actors who have no such qualms about exploiting AI.
 
Muddying the waters ever so slightly. The current plight of technology-centric forums like AWF is clearly related to the use of AI. It's a case where I believe the correlation between increasing availability and usability of AI and decreasing activity at such forums does indicate a high likelihood of causation. Of course, we can't pin it all on AI. Perhaps decreasing use of Access overall could be a contributor as well. In the absence of other compelling factors, though, it's hard not to come to that conclusion.

Along with increasing government intervention in, and regulation of, public speech, the rise of competent AI search engines like ChatGPT, Claude, and Gemini is, in my opinion, diverting traffic from forums. How much and to what degree those factors are responsible, I can't say.

For such forums, though, curtailing AI might seem like a good thing. Protectionism has its benefits, and its costs. I can't say that I'd support shutting down AI data centers for the goal of extending the longevity of other venues. Much as I miss UA, life goes on without it. The same would be true for others, I dare say.
 
I read an article which mentioned that the US data centers will require nearly 90% of the global supply of high end computer chips.
 
As the article states:
In New Jersey, which is home to the largest concentration of data centres in the world, electricity prices have risen by 34% in the last year
That to me indicates governmental adjustments, or taxes on electricity charges specifically in New Jersey.
The increase in cost in New Jersey could be because of some other factor that has nothing to do with data centers, but yeah, I see what you mean; it's possible. What I said, though, was different: I said I didn't see any evidence of governments subsidizing them more than they already subsidize businesses in general to come to their local area.
 
I read an article which mentioned that the US data centers will require nearly 90% of the global supply of high end computer chips.
Sounds like one more terrifying dependency we will then have. I know, I know, it's a global world and dependencies can't be avoided. But they still make me really nervous.
 
I saw another item in my local news-rag that was something I hadn't considered before. These things use a LOT of electricity. They cool themselves with a water cooling system that will require a LOT of water, and the suggestion is that it will hit not only electricity costs, but water costs in the area. We're fairly well-supplied with water in southeast Louisiana, but farmers will be in competition for this water. Put one of the data centers in an area with an already-stressed water distribution system and you'll have one of those B-movie monster riots on your hands, complete with torches and pitchforks.
 
Who better to ask than one of the AIs?

This is from Claude.
Based on recent research and reporting, here are the key locations where AI data centers are being constructed with significant power and water constraints:

Major Constrained Locations

Northern Virginia (Loudoun & Fairfax Counties)
  • Known as "Data Center Alley," this region handles roughly 70% of global internet traffic and hosts over 500 data center facilities
  • Loudoun County alone has nearly 6 GW of operating and under-construction data centers
  • Power constraints: Officials report they're "toward the end of growth for data centers" due to energy constraints
  • Water constraints: Even recycled water doesn't return to the Potomac River, impacting downstream water supplies including Washington DC
  • Cost impact: Residential electricity bills expected to rise by up to 25% in central and northern Virginia by 2030, and $18/month in western Maryland

Phoenix/Arizona
  • Second-largest data center hub in the country with about 707 megawatts of IT capacity
  • Power constraints: Arizona Public Service (APS) reports they "do not have the energy and transmission infrastructure" to support demand - if every proposed data center was built, demand would reach 19,000 MW vs. current peak of 8,200 MW
  • APS is currently turning away data center customers, telling them "We can serve you, but just not now"
  • Water constraints: Annual water use from data center electricity demand expected to increase by 400% in the Phoenix region - enough to supply Scottsdale (240,000+ people) for over 2 years
  • Arizona faces severe drought and Colorado River water cuts, with data centers competing directly with housing development for limited water supplies

Texas
  • 395 data centers, making it the second-highest state by count
  • Water constraints: Data centers in Texas projected to use 49 billion gallons of water in 2025, rising to 399 billion gallons by 2030
  • Specific locations include Dallas, Austin, San Antonio, and West Texas (Laredo area)

California (Silicon Valley)
  • California's chronic water scarcity clashes with heavy cooling needs - a single large facility can use 300,000 to 550,000 gallons of drinking water per day

Ohio (PJM Region)
  • Power constraints: Part of the PJM electricity market seeing residential bills rise by $16/month due to data center demand

Key Global Patterns

Nearly 600 data centers (under 10% of the global total) are located in areas with average annual temperatures above 27°C, making cooling particularly challenging. In 21 countries including Singapore, Thailand, Nigeria, and UAE, all data centers face excessive heat constraints.

Around one-quarter of existing facilities and nearly one-third of data centers under construction are in regions projected to face greater water scarcity by 2050.

Recommended Alternative Locations

Cornell University researchers recommend building in the Midwest and Plains states including Nebraska, South Dakota, Montana, and parts of Texas, which offer abundant water, clean-energy potential, and more stable grids.

The core challenge is that electricity has become "the limiting factor" for data center expansion, with power constraints now determining where and how quickly capacity can come online.
 
