Why doesn't this just produce a shutdown? Inverters have to track voltage and current on the input and output sides, and can turn themselves off. They shouldn't be that close to the absolute maximum voltage ratings on the components.

Too much current is a heat dissipation problem, and you've got some time to deal with that, at least tens of milliseconds.
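
Something like this is what I'd expect the input-protection firmware to be doing (a minimal sketch, purely my own assumptions about thresholds and timing, not anything from an actual teardown):

    # Hypothetical inverter input-protection loop, called once per millisecond.
    # All thresholds are made-up illustrative numbers.
    PV_TRIP_VOLTS = 58.0        # trip a bit below the parts' absolute maximum
    I_MAX_AMPS = 15.0           # continuous input current limit
    OVERCURRENT_GRACE_MS = 20   # heat builds slowly, so tolerate brief overcurrent

    def protection_step(v_in, i_in, overcurrent_ms):
        """Return (shutdown, overcurrent_ms) for one 1 ms control tick."""
        if v_in >= PV_TRIP_VOLTS:
            return True, overcurrent_ms          # overvoltage: shut down immediately
        if i_in >= I_MAX_AMPS:
            overcurrent_ms += 1                  # overcurrent: count how long it lasts
            if overcurrent_ms >= OVERCURRENT_GRACE_MS:
                return True, overcurrent_ms      # too hot for too long: shut down
        else:
            overcurrent_ms = 0
        return False, overcurrent_ms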

Anyone have a teardown on these things? Are they using under-rated MOSFETs? That's all too common in solid state relays from China.

High voltage, low RDSON FETs are (slightly) more expensive, and these products are cheap. A better design would use a higher-voltage rated input switch with poor (slow) switching performance, like an IGBT. Don’t design critical infrastructure around EcoFlow hardware.

Toshiba, which sells MOSFETs for this application, writes: "Firstly devices should be rated at 600V or 650V, as this will generally provide more than adequate protection against the threat of high voltage transients."[1] That's a nice big safety margin. It should hold until the voltage monitoring shuts the whole thing off.
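
Back-of-the-envelope on that margin (my arithmetic, not from the Toshiba note):

    import math

    v_rms_high = 230.0 * 1.10           # 230 V mains at the +10% tolerance limit
    v_peak = v_rms_high * math.sqrt(2)  # worst-case peak voltage
    print(f"peak: {v_peak:.0f} V")                             # ≈ 358 V
    print(f"headroom to a 650 V part: {650 - v_peak:.0f} V")   # ≈ 292 V left for transients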

Not seeing UL certification on this thing.

If we're going to have US protectionism against China, a good first step would be to require UL-type testing, carried out in the US, on all imported electrical devices that run on more than 12VDC or contain a battery chemistry capable of thermal runaway. Electrical safety is a solved problem if you can keep people from cheating.

[1] https://toshiba.semicon-storage.com/eu/semiconductor/design-...

The voltage ratings of the MOSFETs used for 220/230/240 V applications have been increased over the years.

Decades ago, when bipolar transistors were used, they were rated for 350 V, which is barely enough for 220 V + 10%.

When everybody started to design universal converters usable for 220/230/240 V, the ratings were increased to 400 V. The first power MOSFETs were also rated thus.

Then too many converters were destroyed by random voltage spikes, so the standard ratings were increased to 500 V. That still proved not to be enough in many places around the world, so the ratings were raised to 600 V or 650 V, already many years ago, to make destruction of the transistors by voltage spikes far above the nominal mains voltage extremely unlikely.

600 V or 650 V is used for converter topologies where the transistors see only the peak input voltage. For converter topologies that use fewer transistors, but those see peak-to-peak voltages, the rating of the transistors must be 1200 V.
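
Rough numbers behind those steps (my own back-of-the-envelope, assuming a +10% mains tolerance):

    import math

    for v_rms in (220.0, 230.0, 240.0):
        v_peak = v_rms * 1.10 * math.sqrt(2)   # worst-case RMS, then peak
        print(f"{v_rms:.0f} V mains: peak ≈ {v_peak:.0f} V, "
              f"peak-to-peak ≈ {2 * v_peak:.0f} V")
    # 220 V: peak ≈ 342 V, so a 350 V bipolar rating left almost no margin.
    # 240 V: peak ≈ 373 V, hence the move to 400 V, then 500 V, then 600/650 V parts.
    # Peak-to-peak runs ≈ 684-747 V, which is why those topologies need 1200 V devices.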

For 650 V, gallium nitride FETs are the best available devices, while for 1200 V or higher voltages silicon carbide transistors are the best. Silicon transistors are the best only for ratings much lower than 100 V, but they may still be preferred at high voltages because they are much cheaper.

My understanding is that MOSFETs themselves are usually not UL certified/listed. I recently did a UL certification of a power supply, and the IGBTs we used were themselves also not UL certified. The UL certification was more about the overall system design.

[deleted]

> They shouldn't be that close to the absolute maximum voltage ratings on the components.

This appears to be a situation where the engineering team determined the absolute maximum input voltage and the marketing/product people put that number straight into the documentation.

Standard practice with electronic parts is to determine the absolute maximum rating, then to specify a recommended maximum that allows for some safety margin and variation.

Instead, this company determined the absolute maximum and then just shipped it.
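
To make that concrete with made-up numbers (the usual ~80% derating habit, nothing specific to this product):

    # Hypothetical example of absolute maximum vs. recommended maximum.
    absolute_max_v = 60.0                      # what the parts can just survive
    recommended_max_v = 0.8 * absolute_max_v   # leave ~20% headroom for spikes and tolerance
    print(recommended_max_v)                   # 48.0 -> the number that belongs in the manual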

One way or another, many of us are in agreement that the company screwed up and it's on them to fix it - whether that's their marketing, their manual, their lack of overvoltage protection, whatever it is, it's their fault.

Yet so many people in this thread are so keen to blame the customer, it’s pure ego from them. “I’m too smart for that to happen, so it’s all their fault!” they sneer. Classic bad faith forum behaviour…

That's humanity in general. But yeah - the general guidance says "add up your open-circuit voltage and don't exceed that." If something fails because the panel manages to get more than 1000 W/m² of solar flux and is cold... that's the manufacturer's problem.

I need to actually look up why the extra flux increases voltage. Maybe it really doesn't, and instead just moves the MPP to a higher voltage by supplying more current.
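
In the meantime, the cold-temperature effect alone is easy to sanity-check (a sketch with made-up but datasheet-typical panel numbers; as far as I know, irradiance only nudges Voc up logarithmically, while cold raises it roughly linearly):

    # Rough string open-circuit voltage check on a cold morning.
    # Panel figures are hypothetical, datasheet-style values.
    voc_stc = 41.0          # V, open-circuit voltage at 25 °C (STC)
    temp_coeff = -0.0029    # per °C (about -0.29 %/°C for Voc)
    panels_in_series = 2
    cell_temp_c = -10.0

    voc_cold = voc_stc * (1 + temp_coeff * (cell_temp_c - 25.0))
    print(f"Voc per panel: {voc_cold:.1f} V")                   # ≈ 45.2 V
    print(f"String Voc: {voc_cold * panels_in_series:.1f} V")   # ≈ 90.3 V, to compare against the input limit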