Had something similar happen at a telecom I worked at for years. We had a diesel generator and a couple of (bathroom-sized) rooms full of (what looked like) car batteries. My understanding was that the two rooms were for redundancy. The batteries could power the DC for hours, but they were only meant to carry the load until the generator was ready.

The area our DC was located in was impressively reliable power-wise and -- in fact -- the backup systems had carried us through the multi-state power outage in the early 2000s without a hitch (short of nearly running out of fuel because our fuel supplier was ... just a little overwhelmed).

A few years later, a two-minute power outage caused the DC to go dark for a full day. When the power failed, the batteries kicked in, the generator fired up a few minutes after that, and then the DC went into holy terror.

About a minute after the generator kicked in, power to the DC blinked and died. The emergency lights came on, the evacuation alarm sounded[0], and panic ensued.

My very pedestrian understanding of the problem was that a few things failed in sequence -- when the generator kicked in, something didn't switch power over correctly, something else didn't trip in response to that, and a set of four batteries caught fire (destroying several nearby). They were extinguished by our facilities manager with a nearby fire extinguisher. He, incidentally, was the one who pulled the alarm (which wouldn't, on its own, trigger the Halon system, I think). The remainder of the day was spent dealing with the aftermath.

We were a global, multi-national telecom with a mess of procedures in place for this sort of thing. Everything was installed by electricians to very exacting standards[1], but -- as with most things "backup" -- the way it was tested and the frequency of those tests were inadequate.

From that point forward (going on over a decade) they thoroughly tested the battery/generator backup once a quarter.

[0] We were warned to GTFO if that alarm ever went off, because of the chemical flooding that would follow a few minutes later. That part didn't happen.

[1] I remember the DC manager who took over in Cleveland making his staff work weeks of overtime replacing zip ties with waxed lacing cord (done NASA style). We're talking thousands and thousands of runs stretching across two complete floors of a skyscraper.

I lost track of how many datacenter outages we caused testing the power backup/failover back at eBay in the mid-2000s.

There's no winning when it comes to power redundancy systems.