Typical hybrid inverters have an output rating around half the theoretical max input of the panels. This is because the theoretical max panel output is very rare or even impossible in normal conditions on Earth, because an attached battery can soak up part of the input, and because of the general cost-benefit trade-off in solar equipment (more throughput means more heat, which means bigger heatsinks, which means heavier and more expensive).

You can definitely get equipment that can do symmetrical input/output, but if you actually model the supply and demand curves of the system, it's usually not worth the extra up-front expense: peak input is only a small portion of the day, and that extra hardware would mostly sit idle.

For that matter, people often design systems where peak input can't even be accepted by the inverter and the excess power is simply wasted, because steady input over a long period is more valuable than maximizing the daily peak.

Yes, my grid-tied system is like this. The panels are ~410W and each one has a microinverter with a ~390W maximum or thereabouts. The more expensive inverters weren't worth it just to capture the peak; you're better off putting that money into more panels.
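The clipping loss that implies can be sketched with a toy model (the half-sine production curve and 12-hour solar day here are illustrative assumptions, not measurements from this system):

```python
import math

PANEL_W = 410      # panel rating, from the comment above
INVERTER_W = 390   # approximate microinverter cap

# Toy model: panel output follows a half-sine over a 12-hour solar day,
# peaking at the full panel rating on a perfect day.
dt = 0.1  # hours per step
hours = [i * dt for i in range(121)]  # 0.0 .. 12.0

raw = [PANEL_W * math.sin(math.pi * h / 12) for h in hours]
clipped = [min(p, INVERTER_W) for p in raw]  # inverter caps the output

raw_wh = sum(p * dt for p in raw)
clipped_wh = sum(p * dt for p in clipped)
loss_pct = 100 * (raw_wh - clipped_wh) / raw_wh

print(f"Energy lost to clipping: {loss_pct:.1f}%")
```

In this toy model the clipping loss comes out to roughly one percent of daily production, which is why the cheaper microinverter wins.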

In the US, most home solar installations don't have an in-home battery, and it's not uncommon for rooftop solar to produce >90% of nominal max for hours at a time.

I know multiple people with solar and have discussed their specs with them extensively. Zero of them have inverters or microinverters sized below the theoretical max of their array.

Are you thinking of a purely off-grid setup without actually saying so?

Nope, but I'm in a different market, so that makes sense. Those are probably pure grid-tie inverters, which I don't have a lot of experience with because they're not commonly used here. I do see the EG4 hybrid has a similar ratio (we have the same tech here under the Luxpowertek brand).

Even without a battery, people here usually choose a hybrid inverter, which can function both on and off grid.

Also, to be honest, I'm mostly looking at larger inverters, so maybe that colors my view. Not many users here need 24,000 watts continuous outside a commercial context, for instance, so an inverter with 24,000 watts of DC input but 12,000 watts of continuous AC output doesn't seem weird, since part of that DC can be sent to the battery.
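That split can be sketched as a simple power budget (the 12 kW battery charge limit and the load figures are made up for illustration):

```python
def route_power(dc_in_w, ac_load_w, ac_max_w=12000, charge_max_w=12000):
    """Toy hybrid-inverter power budget: serve the AC load up to the
    inverter's AC rating, send surplus DC to the battery up to its
    charge limit, and report whatever still can't be used."""
    ac_out = min(dc_in_w, ac_load_w, ac_max_w)
    to_battery = min(dc_in_w - ac_out, charge_max_w)
    curtailed = dc_in_w - ac_out - to_battery
    return ac_out, to_battery, curtailed

# 24 kW of DC input against a 9 kW household load:
print(route_power(24000, 9000))  # -> (9000, 12000, 3000)
```

So even at full DC input, the asymmetric inverter only curtails whatever exceeds the AC rating plus the battery's charge rate.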

Ok, yeah, that makes sense. Over here people usually get direct grid-tie inverters; if there's no battery, there's no reason for a hybrid inverter. The cheapest way to do it is panels -> inverter -> grid, with no cutoff switch, so the inverters stop functioning if the grid goes out.

Then it's just a race to pay back the panels, which are most of the cost, so undersizing the inverter is wasting energy and leaving money on the table.

In my case I have 4500Wp of panels and the inverter is sized at 4200W. The next step up (4800W or 5200W) was twice as expensive, adding about €600. I'm not sure I would ever have made that back; I only hit the maximum for a few weeks in spring.
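A rough back-of-the-envelope check of that decision (the clipped-energy estimate and electricity price below are assumed figures, not measurements; only the €600 comes from the comment above):

```python
upgrade_cost_eur = 600       # price difference quoted above
clipped_kwh_per_year = 40    # assumed: a few spring weeks of mild clipping
price_eur_per_kwh = 0.30     # assumed retail electricity price

savings_per_year = clipped_kwh_per_year * price_eur_per_kwh
payback_years = upgrade_cost_eur / savings_per_year

print(f"Payback time: {payback_years:.0f} years")  # -> Payback time: 50 years
```

Even with generous assumptions, the payback period lands far beyond the lifetime of the inverter, which matches the conclusion above.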

Also, just as a follow-on: my assumption is that it's much easier and cheaper to scale the DC side, since it's often in the 400-500V range (for example, 10 panels in series with an open-circuit voltage of 49V and an operating voltage around 43V each), versus the AC side in the 230V range, since the resulting amperage is about half. That may account for the ratio.
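The amperage difference is easy to check with the numbers given (taking the 4200W inverter mentioned earlier in the thread as the power level; single-phase 230V AC is assumed):

```python
power_w = 4200    # inverter rating from earlier in the thread
dc_voltage = 430  # 10 panels in series at ~43V operating voltage each
ac_voltage = 230  # assumed single-phase AC

dc_amps = power_w / dc_voltage
ac_amps = power_w / ac_voltage

print(f"DC side: {dc_amps:.1f} A, AC side: {ac_amps:.1f} A")
# -> DC side: 9.8 A, AC side: 18.3 A
```

At the same power, the high-voltage DC side carries a bit over half the current of the 230V AC side, so conductors and switching components on the DC side can be sized smaller.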