As someone who used to help manage data-centers, I think this is fine. I had to load test the generators quarterly; it's good to ensure the transfer switches and generators are working, and nothing beats a real-world load test. The data-centers have contracts with diesel fuel providers to keep the main tanks full. They can run on diesel any time the infrastructure load is high and write off the cost on their taxes (check with your tax lawyer). There may even be other tax advantages, since the need to run the generators would be compelled by the state; perhaps a tax lawyer could find ways to make a generator tech refresh a bigger write-off, or to write off better noise abatement walls since usage will increase. If a company was running with scissors and did not buy enough generator capacity to run everything, then they will have to get their act together now rather than later.
This is true, but how long will firm load shed events last, and how many of them will happen? In California, when CAISO has events that lead to firm load shedding, they're predictable rolling blackouts: everyone knows how long they'll last, and they're assigned on a customer-specific basis. You know your power will be cut, and you know it will be a certain amount of time, and you know roughly where you are in the schedule.
I could see operators of datacenters in Texas wondering about this. Also, it's underrated how much critical infrastructure is dependent on datacenters running. Like, are you going to pull someone's EHR system down that serves a local hospital, while keeping the local hospital on a critical circuit?
If power is out everywhere for an extended period of time, they aren’t doing anything but life saving surgery. Pulling up an EMR will be near the bottom of the list of concerns.
> You know your power will be cut, and you know it will be a certain amount of time, and you know roughly where you are in the schedule.
Knowing when the power will be cut will not help, unless I am misunderstanding you. If the data-center loses power for even a minute, the generators will all fire up and then every ATS will count down and transfer in 30 seconds. Battery backup only lasts just long enough to do a quick return to service on a failed generator, and even that is sketchy at best. A properly engineered data-center can run on reduced generator capacity.
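To put rough numbers on that sequence (all of these are assumed, illustrative values, not specs from any real facility), the ride-through math is basically:

    # Illustrative ride-through check: does the UPS bridge the gap between
    # utility loss and the ATS transferring to generator power?
    # Every number here is an assumption for the sketch, not a real spec.
    ups_runtime_s = 300    # battery autonomy at full load (assumed ~5 minutes)
    gen_start_s = 10       # generators start and reach speed (assumed)
    ats_delay_s = 30       # ATS countdown before transfer (per the comment above)
    retry_budget_s = 120   # margin to restart one failed generator (assumed)

    gap_s = max(gen_start_s, ats_delay_s) + retry_budget_s
    print(f"Worst-case gap on battery: {gap_s}s of {ups_runtime_s}s available")
    assert gap_s < ups_runtime_s, "UPS would not bridge a failed-start retry"

If the retry budget eats most of the battery autonomy, you get the "sketchy at best" situation described above.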
Some data-centers are indeed on circuits deemed critical, but I could see regulations changing this so that they are "business critical" vs. "life support critical", and some changes could be made at substations so that data-centers could participate in shedding. I think you are right that they will be thinking about this, and, adding to this, probably filing preemptive lawsuits to protect their business. Such changes could violate the SLA contracts businesses have with power companies, and Texas is very pro-business, so I can not compare it to California.
Texas has shown no interest in life support critical. They prioritized operator profits over uptime. Hundreds died as a result.
> Knowing when the power will be cut will not help, unless I am misunderstanding you. If the data-center loses power for even a minute, the generators will all fire up and then every ATS will count down and transfer in 30 seconds. Battery backup only lasts just long enough to do a quick return to service on a failed generator, and even that is sketchy at best. A properly engineered data-center can run on reduced generator capacity.
If you know when the power will be cut, you can start the generators before the cut, and depending on your equipment, you may be able to synchronize the generator(s) with the grid and switch over without hitting the batteries. I assume big datacenters are on three-phase; can you switch over each phase as it crosses zero, or do you need to do them all at once?
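As far as I know you can't close each phase separately; a three-pole breaker parallels all three at once, so the check is on the whole rotating phasor: voltage, frequency, and phase angle all within tolerance. A minimal sketch of that kind of sync check, with made-up tolerance values (real sync-check relays use site-specific settings):

    def ok_to_parallel(gen_v, grid_v, gen_hz, grid_hz, phase_angle_deg,
                       v_tol=0.05, hz_tol=0.2, angle_tol_deg=10.0):
        """Rough sync-check: all three conditions must hold before closing
        the breaker. Tolerances are assumptions for illustration only."""
        voltage_ok = abs(gen_v - grid_v) / grid_v <= v_tol
        freq_ok = abs(gen_hz - grid_hz) <= hz_tol
        angle_ok = abs(phase_angle_deg) <= angle_tol_deg
        return voltage_ok and freq_ok and angle_ok

    # Generator slightly fast and 5 degrees ahead of the grid: OK to close.
    print(ok_to_parallel(gen_v=480, grid_v=482, gen_hz=60.05, grid_hz=60.0,
                         phase_angle_deg=5.0))

Whether that even matters depends on the plant topology, as the replies below point out.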
Again, that doesn't matter, because everyone knows the grid isn't 99% reliable. They just pull the big power switch to the whole building and watch all the backup systems work. If anything fails, it was broken already and they fix/replace it. Because this happens often, they have confidence that most systems will work. Even where they don't, computers fail unexpectedly anyway, so they have redundant computers (and, if it is really important, a redundant data center in a different state/country).
Synchronizing generators is a thing, but it isn't useful for this situation, since they need to be able to handle sudden, without-warning power loss, where generators cannot be synchronized anyway.
At least in the data-centers I helped manage the inverters were in-line running 100% duty-cycle, meaning frequency sync is not required as there is no bypass. The servers never see the raw commercial power. Data-centers in the US are indeed 3-phase. FWIW the big Cats did have controllers that would maintain sync even when commercial power was gone but we did not need it. There wasn't even a way to physically merge commercial and generator power. ATS inputs and outputs were a binary choice.
I know what you mean though; the generators I worked with in the military had a manual frequency sync that required slowly turning a dial and watching light bulbs that got brighter with the frequency offset. Very old equipment for Mystic Star, post-WWII era gear in service from the '50s to the '90s.
In the facilities I have been in (not managed), they were all in-line as you describe as well. Mains power is dirty. Having a data center without line conditioning on mains would be insane.
Manually syncing several of the MEP012 generators was always far more stressful to me than any physical dangers!
All of the major cloud EHRs run in multiple availability zones.
The data center might be... but are all fiber routers, amplifiers, PoPs etc. along the line from the datacenter to the hospitals and other healthcare providers backed up as well?
Particularly the "last mile" is often only equipped with batteries to bridge over small brownouts or outages, not with full-fledged diesel engines.
And while hospitals, at least those that operate on patients, are on battery banks and huge-ass diesel engines... private small practices usually are not; if you're lucky, the main server has a half-broken UPS where no one has looked at that "BATTERY FAULT" red light for a year. But the desktop computers, VPN nodes, card readers, or medical equipment? If it's not something that a power outage could ruin (such as an MRI machine), it's probably not even battery-backed.
There's a saying: "the emperor has no clothes." When it comes to the resilience of our healthcare infrastructure, the emperor isn't just naked; the emperor's skin is rotting away.
Starlink
If they cannot handle the grid power being pulled due to load shedding, they have no business handling critical applications.
In particular, there is no way you can predict when a backhoe will take out your power line. (Even if they call to get lines located, sometimes the mark is in the wrong spot, though most of the time they didn't call in the first place.) There are lots of other ways the power can go down suddenly, and nothing you can do about it except have some other redundancy.
For any big facility there will be pretty strict EPA limits on how long you can run the generators each year.
The EPA? Are they still a thing? I doubt anyone is concerned about the EPA under current management.
Indeed. I have faith that Texas will find a way around such rules especially if they are being regulated into running them. A Texas company I worked for was highly proficient in maximum shrugs.
The EPA doesn't really have the resources to enforce that. And it certainly won't have that capability under the Trump administration.
Nice to know that, but are the data-centers Carrington Event proof? Always wondered that, but never worked in a data-center.
Hospitals, first responders, and such are all cloud-driven operations nowadays.
It is really strange to see them categorized under
> Data centers and other large, non-critical power consumers
Whose fault is it if hospitals place critical data and services on systems for which the SLA cannot guarantee the service they require?
Google Cloud in Texas:
  Region: us-south1 (Dallas)
  Zones: us-south1-a, us-south1-b, us-south1-c
  Availability: Dallas, Texas is also listed as a location for Vertex AI.

Azure in Texas:
  Region: "South Central US" (paired with "North Central US")
  Azure Government: region in Austin, Texas
  Availability: Azure has a physical presence in Texas.
Imagine a vendor picking multi-zone, or even multi-cloud... This sounds like someone was trying to mitigate the "AI is eating our small town's electricity" problem and threw the baby out with the bathwater.
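Worth noting that all three us-south1 zones sit in Dallas, so a grid-wide ERCOT event hits them together; escaping it means multi-region, not just multi-zone. As a toy sketch of what that failover looks like from the client side (the URLs are invented and nothing here is tied to a real provider API):

    import urllib.request
    import urllib.error

    # Hypothetical health-check endpoints for the same service deployed in
    # two regions. The URLs are made up for illustration.
    ENDPOINTS = [
        "https://ehr.example.com/us-south1/health",  # Dallas (ERCOT grid)
        "https://ehr.example.com/us-east1/health",   # out-of-state fallback
    ]

    def first_healthy(endpoints, timeout=2.0):
        """Return the first endpoint that answers 200, trying them in order."""
        for url in endpoints:
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    if resp.status == 200:
                        return url
            except (urllib.error.URLError, OSError):
                continue  # region unreachable or shed; try the next one
        raise RuntimeError("no healthy region")

    try:
        print("active endpoint:", first_healthy(ENDPOINTS))
    except RuntimeError as err:
        print(err)  # expected here, since the example URLs don't exist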
Aren't there special clouds accredited for such use cases? Ones that probably won't be targeted by this sort of bill?
This seems like the sort of thing that ought to be negotiated into the data center’s contract. Why force the grid operator to make every data center subject to curtailment?
Of course, it is easy enough to deal with curtailment for many services. But, it should be on the table, either way.
Yeah, if they needed a law it was to say you must have X% of your load subject to a load shed agreement.
The main problem with the Texas grid is really very simple.
Their "free market" real time power auction makes no allowances for long term concerns like reliability. Any provider who spends money to address these sort of issues is immediately priced out of the market.
Some things are too important to leave to the "free market".
I hear opinions like yours often but I am not sure it’s simple or that the reasons you’re citing are grounded in reality.
What is the alternative? A state-imposed monopoly with a single power company like PG&E, which I would not see as an ideal operator either? The same can be true for a lot of other similar generators across the country.
You'll probably bring up the winter storm outage, which is inexcusable, but their neighbor to the north, SPP, had very similar failings in preparedness and only fared better because they have interconnects.
Texas has had some of the fastest adoption of wind and solar. It is far from perfect, but I also think there is benefit to having multiple generation companies supplying the grid. You have companies with different expertise and perhaps innovation.
They could just connect to the rest of the national grid to get power from other states when running short. That would work fine.
Except then they’d be subject to regulations. And we can’t have that now can we?
> What is the alternative

Other markets in the US are generally energy + capacity markets -- you get paid both for what you actually provide and for your ability to provide a certain level of power, whereas Texas is an energy-only market (EOM). It needn't be the case that if you don't do an EOM, you have to have a monopoly.
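A toy comparison with completely made-up numbers, just to show where the capacity payment changes the incentive to keep rarely-used plant around:

    # Hypothetical figures purely to illustrate market structure;
    # these are not actual ERCOT or PJM prices.
    capacity_mw = 100
    hours_dispatched = 500       # a peaker that rarely runs (assumed)
    energy_price = 80            # $/MWh when it does run (assumed)
    capacity_price = 50_000      # $/MW-year capacity payment (assumed)

    energy_only = capacity_mw * hours_dispatched * energy_price
    with_capacity = energy_only + capacity_mw * capacity_price

    print(f"Energy-only market: ${energy_only:,.0f}/yr")
    print(f"Energy + capacity:  ${with_capacity:,.0f}/yr")
    # In the energy-only case the plant's fixed costs have to be recovered
    # from scarcity pricing alone, which is the reliability gap at issue.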
Definitely you could operate on a capacity model instead of generation. There are a lot of levers. My issue is mostly around how much uninformed hate a “free market” energy system gets.
PG&E only runs the grid(s), not generation
The government can structure a public market in many different ways (they do this in many aspects of the economy). It’s not limited to real time auction vs single provider.
Incorrect. They do have generation, but they're not a majority producer. My point being that PG&E still helps set the rate via the CPUC. They are purchasing power through some of the spot markets. They are unable to even manage their own transmission lines effectively, though.
This can be a "have your cake and eat it too" situation. California, most of Northern Europe and many other jurisdictions have a power market without this problem. They do this by also establishing a market for capacity and/or reliability.
> California
> without a problem
That doesn't really line up.
California's myriad problems are delivery side, not supply side.
In a free market, Texas could connect to the interstate grid, which would average out some of the localized reliability issues.
It’s because they want their “free market” that they don’t connect to the national grid.
Honestly, not that big of a deal. The price of batteries per kWh is so low, and the cost of generation is so low, that in the case of a major crisis they could just keep them running. https://ourworldindata.org/grapher/average-battery-cell-pric...
And prioritizing humans (home heating, food freezing, etc.) over servers is a good thing.
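Rough numbers on what "low" means here; the facility size, event length, and cell price are assumptions for the sketch, not figures from the linked chart:

    # Back-of-the-envelope: battery cells to ride through one load-shed event.
    # All inputs are assumed for illustration.
    facility_load_mw = 50      # assumed critical load of the data center
    event_duration_h = 4       # assumed length of a firm load-shed event
    cell_price_per_kwh = 100   # assumed $/kWh at the cell level (packs, inverters extra)

    energy_needed_kwh = facility_load_mw * 1_000 * event_duration_h
    cell_cost = energy_needed_kwh * cell_price_per_kwh
    print(f"{energy_needed_kwh:,.0f} kWh of storage, roughly ${cell_cost:,.0f} in cells")

Call it $20M in cells alone for a 50MW site under those assumptions; cheap relative to the facility, though "just keep them running" for a multi-day event is a different story.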
Sodium batteries are about to enter the market at a fraction of the cost of current tech.
Recently I became aware of "heat batteries" that might also be useful for these applications. https://www.technologyreview.com/2023/10/24/1082217/heat-bat...
This seems like it will disincentivize power plants from being built in Texas, which is probably a good thing in the long run.
Don't grid operators already have this power? That's the sort of thing they do during "rolling blackouts", isn't it?
This is a terrible idea. We live in the digital age, so almost everything depends on data. As with all mechanical devices, generators have a tendency to either break down (let's not forget about the associated switching gear) or become subject to maintenance cutbacks instituted by buffoons in management. We cannot depend on humans not to make errors, but we can depend on the electric grid, if properly handled and maintained. Depending on generators just adds another link in the failure chain.
Texas is the perfect example of how not to run an electrical grid by not allowing other states to assist in an emergency.
> This is a terrible idea.
No, it isn't. Any decent datacenter will have on-site generation in the event of a power grid failure anyway. When I was an intern, the company I worked for would routinely go off-grid during the summer at a call from the electric company. The electric company actually gave us significant incentives to do so, because us running on our own 12MW generator was effectively like the grid operator farming out a 12MW peaker unit.
Not only will a data center have a generator, they will test it regularly and if it doesn't work get it fixed.
The power company has a long list of who has backup power. I know of one factory where the generator was installed in the 1920s on a boiler from the 1880s - it is horribly inefficient, but the power company still gives the owners an incentive to keep it working, because for 4x the normal cost of power and 12 hours' notice that generator can run the entire town it is in, which they do every 5 years or so when things really go wrong with the grid.
> they will test it regularly and if it doesn't work get it fixed.
What is your definition of regularly, and what qualifies as getting it fixed? I know lots of places that had things scheduled, but on the day of, something "came up" and the test was pushed. I've seen others where they tested by only firing up the generator, but didn't actually use it to power the facility. I've also seen repair tags that sat unlooked-at for years.
Not every facility is managed/financed the same for such a blanket statement as yours.
Exactly! And one of the points, based on real work experience, that I was trying to make.
There was recent news that a datacenter is going to be built that will consume a few times more power than all the homes in the state. I don't think they are going to have on-site backup power, although they'll probably have an on-site power plant for normal operations.
> We live in the digital age, so almost everything depends on data

Data that I can't consume if my house is browned out and my router doesn't work (on top of heating/cooling, lights, and other basic living-related services that are less essential than the almighty ONT).

> As with all mechanical devices, generators have a tendency to either break down (let's not forget about the associated switching gear) or become subject to maintenance cutbacks instituted by buffoons in management

Famously, power infrastructure relies on no moving parts whatsoever since the abolition of contactors, relays, rotors (but not stators), turbines (both water and wind), and control rod actuators, though even before abolition, none of these devices needed any maintenance.

> We cannot depend on humans not to make errors, but we can depend on the electric grid, if properly handled and maintained

The electric grid, which famously has no human or mechanical errors like line sag or weirdly-designed interconnects or poorly-timed load shedding.

> Depending on generators just adds another link in the failure chain.

Weird way to frame a redundancy layer, but sure.

> Texas is the perfect example of how not to run an electrical grid by not allowing other states to assist in an emergency.

Again, weird way to frame this. You're actually technically right, but the redundancy offered through a better-integrated interconnect goes both ways, rather than just externalizing weaknesses in TX's own interconnect design.

IIRC, the reason Texas cannot get "assistance" from other states is that the feds made it illegal to connect to most interstate grids without following their regulatory regimes. I believe Texas does connect to Mexico and possibly some other regional grids, although I don't really understand the exemption for those.
In this case it's not really Texas 'not allowing' other states to help but the other states not allowing Texas. Conceivably federal law could be updated to remove those regulations and Texas would absolutely connect to the interstate grid at that point.
> Conceivably federal law could be updated to remove those regulations and Texas would absolutely connect to the interstate grid at that point.
That, of course, ignores the fact that those regulations are in place for a reason. Texas refuses to play by the rules, and the impact of that is that they don't get help when it's important. It is unfortunate, but a direct consequence of the choices they made.
The Texas Interconnection does tie in to surrounding grids through DC-ties. Those are limited in how much power can be sent through them and ultimately isolate the AC frequency.
Rules like hardening your system to be resilient in high or low temperatures.
Super abusive. Let’s do away with safety systems that literally save human lives. Heck Texas doesn’t like them so let’s do away with them for the whole nation? How many people died during the last couple heat and cold induced grid outages in Texas? I lost count after a couple dozen. But those people were weak or poor anyway right? Texas strong!
The whole point of using a cloud data center is to be able to handle grid outages. I'd be using the cabinet under my table otherwise.
Don't put all your digital services in a Texas datacenter with no fallback then...
I'm happier with Texas being independent. Why should my state brown out because a bunch of companies put data centers in the hottest part of the continent?
> We live in the digital age, so almost everything depends on data
Agreed, the datacenters need to be extremely durable. What's more durable than proving you're able to withstand a power outage event? The grid does go down from time to time; they need to be ready to handle it. That's not a Texas-only kind of thing; power outages happen all over the US.
If the datacenter can't handle the outage that was announced as a probability ahead of time, they have no business running critical applications.