This is what happens when something goes mainstream. The background knowledge that "everybody knows" when it's niche, because only turbo-nerds are into it, simply isn't common sense for everybody in the wider population.
Back when Home Power magazine started up, the panels were super expensive, and squeezing out every watt was important. Since high temperatures decrease voltage and output, keeping the panels cool (while baking in the sun!) was top-of-mind for every installation. And right along with learning that critical consideration, everyone also learned the flip side: in the bitter cold, that very same phenomenon means the panels can produce significantly more. Temperature coefficient was simply something "everyone knew".
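If you want to put rough numbers on it, here's a tiny Python sketch of the effect; the -0.3%/°C coefficient and 40 V open-circuit rating are made-up illustrative values, not from any particular panel (your datasheet has the real ones):

```python
# Illustrative only: the coefficient and ratings below are assumed, not taken
# from any specific panel. Check the datasheet for your module's actual numbers.
STC_TEMP_C = 25.0          # Standard Test Conditions cell temperature
TEMP_COEFF_VOC = -0.003    # assumed Voc temperature coefficient, -0.3 %/degC
VOC_STC = 40.0             # assumed open-circuit voltage at STC, volts

def voc_at_temperature(cell_temp_c: float) -> float:
    """Open-circuit voltage adjusted for cell temperature."""
    return VOC_STC * (1 + TEMP_COEFF_VOC * (cell_temp_c - STC_TEMP_C))

print(voc_at_temperature(65))    # hot roof in summer: ~35.2 V
print(voc_at_temperature(-20))   # bitter cold morning: ~45.4 V
```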
Now they're so cheap nobody cares. The magazine shut down because "alternative power" and EVs aren't exactly alternative anymore; you can buy one off the dealer's lot, it's nuts. And the panels are crazy cheap now. If you lose 10% because the panels are hot, it's likely cheaper to just buy 10% more panels than to redesign your support brackets to allow better airflow. But nobody highlights the phenomenon behind the efficiency loss.
"Everybody knew" that the ratings on the panel are at Standard Test Conditions: 25°C and 1000W/m². That's almost never the conditions in the real world, but it establishes a legal baseline whereby panels can be compared apples-to-apples and advertising kept honest (if anyone cared), but deviate from STC and output will go down, or up. Again, ask today's consumer what the ratings on the label mean, and most of 'em have never heard of STC nor could define how the nameplate wattage is just one point on a curve.
Is this the panel manufacturer's fault? They're labeling things precisely the same as they've labeled them for 40-plus years. (Perhaps there's even more data on the panel label now; Vmp and Imp are typically specified, and they weren't always universal.)
Edit to add: The label doesn't typically specify the temperature coefficient, but for every panel I've checked, it is in the datasheet. But who reads datasheets?
Is it the inverter manufacturer's fault? They're labeling things precisely the same as they've labeled them for 40-plus years. The input max is a hard limit where the silicon can take no more, and there's a certain amount of headroom required between that and the panels' max, after compensating for temperature coefficient. Of course you calculate your panel voltage for your local conditions before comparing it to the inverter input, duh!
Everyone knows that! Except now they don't.
For most inverters on the market now, the input max is a setpoint where they will simply disconnect, stop generating power and go into an error state that you need to hard-reset if you want to use the inverter again. It's pretty typical to have a 1000V hard limit on a nominally 800V or 600V system. If you ride close to the limit, then on some days you will see overvoltage and disconnects, so it is something you simply should not do. But people think that if the label says 1000V then 990V total nominal panel voltage should be fine, which it obviously is not.

Panels are analog devices: they will produce an open-circuit voltage much higher than their nominal operating voltage, and on a switched-mode inverter that means the average voltage may well be 'in spec' while the voltage from one millisecond to the next is outside of it, if the system was built 'on the edge'. And because inverters have to take the grid side into account as well (they are not allowed to exceed certain voltages), there is always the risk of not being able to load the panels enough to get the voltage to drop.

So you should size your system so that the open-circuit voltage of your panels under ideal conditions is still comfortably lower than the max input voltage of the inverter. 800V nominal is pretty close to that limit, 700 is better and 600 is playing it safe.
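To put rough numbers on that sizing rule, a quick sketch; the per-panel Voc, temperature coefficient and record-low temperature are assumptions rather than figures for any specific module, so treat it as an illustration of the arithmetic, not a design tool:

```python
# Illustrative string-sizing check against the inverter's hard input limit.
# All panel numbers are assumptions; use your module's datasheet values.
INVERTER_MAX_V = 1000.0    # hard input limit of the inverter
VOC_STC = 41.0             # assumed per-panel open-circuit voltage at STC
TEMP_COEFF_VOC = -0.0028   # assumed Voc temperature coefficient per degC
RECORD_LOW_C = -20.0       # assumed coldest expected cell temperature

def string_voc(panels_in_series: int, cell_temp_c: float) -> float:
    """Worst-case open-circuit voltage of the whole string."""
    per_panel = VOC_STC * (1 + TEMP_COEFF_VOC * (cell_temp_c - 25.0))
    return panels_in_series * per_panel

for n in (16, 18, 20, 22):
    v = string_voc(n, RECORD_LOW_C)
    verdict = "comfortable" if v < INVERTER_MAX_V * 0.9 else "too close to the limit"
    print(f"{n} panels in series: {v:.0f} V open circuit on a cold morning -> {verdict}")
```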
I think one of the main reasons installers tend to overprovision voltage-wise is that they accept the inverter switching off the whole string every now and then in exchange for being able to make more power under normal conditions without a lot of additional wiring. The net effect of that trade is positive.
Low-quality inverters (the ones without the ability to disconnect the HV side autonomously) should be avoided like the plague anyway; those are simply unsafe and as far as I am concerned should not be allowed for resale at all.
My own system can make 17 kW on a very good day in March at 1 pm or so, but normally it is closer to 12 kW even in the summer. So those peaks are actually substantially over the normal output. Over a given day the average is about 5 kW or so from sunrise to sundown, and those first and last hours hardly contribute.
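For what it's worth, the back-of-the-envelope arithmetic behind those figures looks something like this; the daylight-hours number is an assumed round value, the power figures are the ones above:

```python
# Rough daily-energy estimate from the figures above; daylight hours are assumed.
peak_kw = 17.0          # best-case peak on a cold, clear spring day
typical_peak_kw = 12.0  # more usual midday output
average_kw = 5.0        # average from sunrise to sundown
daylight_hours = 14.0   # assumed round number for a long day

print(f"Peak is {peak_kw / typical_peak_kw:.0%} of a typical good midday")  # ~142%
print(f"Daily energy: roughly {average_kw * daylight_hours:.0f} kWh")       # ~70 kWh
```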