Flexential PDX02 reportedly lost power around the same time this started.

Seems related; Cloudflare just posted this:

> Identified - Cloudflare is assessing a loss of power impacting data centres while simultaneously failing over services.

Yikes, I'm curious how that happened. Our Flexential facility has five generators and a basement full of battery backups, along with isolated zones.

https://puck.nether.net/pipermail/outages/2023-November/0149...

> Looks like they lost utility, switched to generator, and then generator failed (not clear on scope of Gen failure yet).

Our local facility tests those about once a month. Sounds like our account manager will get a quick check-in after this settles to make sure we won't have the same issue.

Testing those is SOP, but it’s easy to do wrong: I’ve heard of multiple cases where the tests passed but the system still failed in a true emergency because of some other factor.

The best one was probably the time Internap took a cheap shot at AWS about outages before Hurricane Sandy hit, only to have their NYC data center go offline once their generators burned through the small amount of fuel stored near them, because the main fuel tank's pump flooded (AWS was completely unaffected). You just don’t tempt the DR fates like that…

Same. A local company had a NOC, and power went out in the area. They had moved into the building a long time ago, with two backup generators already installed at the back of the parking lot. Power went out, and the first generator didn't start. The second generator started, but it was now undersized and immediately tripped its breaker. The first generator tried to start again and failed again.

Another local company had a backup generator in the basement. They were doing maintenance work on the power and messed up, so the generator kicked in. Since it was in the basement, a belt-driven fan handled the exhaust. The generator kicked on, the fan belt snapped, exhaust filled the entire basement, rolled into the service elevator shaft, and started coming out on the 3rd floor (the top of the shaft), and the building was evacuated.

Last generator story: a small IT shop with a small colo in the 2000s moved into a new building and had a natural-gas generator installed, set to run a self-diagnostic once a month. Power went out in the first month: the lights went out, the generator kicked on, the lights came back, then the lights went back off. Turns out the electrician didn't have a 60-amp breaker that day and put in a 20-amp breaker for a 60+ amp generator. They meant to come back out and fix it and never did.

> The generator kicked on, the fan belt snapped, exhaust filled the entire basement, rolled into the service elevator shaft, and started coming out on the 3rd floor (the top of the shaft), and the building was evacuated.

I feel like a ton of these stories come down to infrequent use. I remember one where they’d tested the generator, but only for a couple of minutes each month, so the fuel in the tank was really old by the time there was a real outage. The generator clogged expensively about an hour in, right as people were relaxing and thinking it’d be smooth sailing until the main power line was restored.

Preventing such generator start failures is straightforward with industrial-grade fuel filters like MicFil.

These filters have a 0.5-micron filtration capability, making them a proven choice for generators and ships, where failure is not an option.

Implementing a system that uses a bypass pump to circulate the generator's fuel/oil twice daily for a set duration is a straightforward way to keep it clean. Batteries, fuel, and water (via condensation) are the top culprits for generator start failures and should be addressed accordingly.
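For illustration, here's a minimal sketch of what such a circulation schedule could look like in Python. The relay/sensor interface is hypothetical: set_bypass_pump, read_battery_voltage, and read_water_sensor are placeholder names, not a real vendor API, and all thresholds and durations are illustrative, not spec values.

```python
# Minimal sketch of a twice-daily fuel-polishing cycle plus the basic
# start-failure checks (battery, water-in-fuel). Hardware calls are stubs.

import time
from datetime import datetime

POLISH_TIMES = ("06:00", "18:00")   # run the bypass pump twice daily
POLISH_DURATION_S = 30 * 60         # circulate fuel for 30 minutes per cycle
MIN_BATTERY_V = 12.4                # flag a weak starter battery before it fails

def set_bypass_pump(on: bool) -> None:
    """Hypothetical relay control; replace with your PLC/GPIO call."""
    print(f"bypass pump {'ON' if on else 'OFF'}")

def read_battery_voltage() -> float:
    """Hypothetical ADC read of the starter battery."""
    return 12.7

def read_water_sensor() -> bool:
    """Hypothetical water-in-fuel sensor; True means water detected."""
    return False

def polish_cycle() -> None:
    # Circulate fuel through the filter to strip particulates and water.
    set_bypass_pump(True)
    try:
        time.sleep(POLISH_DURATION_S)
    finally:
        set_bypass_pump(False)  # always shut the pump off, even on error

def health_checks() -> None:
    # Batteries, fuel, and water are the usual start-failure culprits.
    if read_battery_voltage() < MIN_BATTERY_V:
        print("ALERT: starter battery below threshold")
    if read_water_sensor():
        print("ALERT: water detected in fuel")

if __name__ == "__main__":
    while True:
        if datetime.now().strftime("%H:%M") in POLISH_TIMES:
            health_checks()
            polish_cycle()  # takes 30 min, so the trigger minute has passed
        time.sleep(60)      # poll once a minute
```

The point of the sketch is just that the scheduling logic is trivial; the real work is the plumbing (the bypass pump and filter loop) and actually alerting on the battery and water checks instead of printing.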

It's still a mystery to me why datacenters think they're special.

Which would you guess, cause or effect?