Bear in mind that charging and discharging batteries each carries an efficiency penalty - perhaps 98% efficient in each direction. So 74 kWh stored is worse than 72 kWh used directly, because 74 -> 72.52 stored -> ~71.07 discharged.
And usually the efficiency is much worse than 98%.
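A quick back-of-the-envelope for the comparison above, as a Python sketch (the 98%-each-way figure is just the example used here, not a measured number):

    # Round-trip efficiency sketch: how much energy survives a charge/discharge cycle.
    charge_eff = 0.98      # assumed charging efficiency
    discharge_eff = 0.98   # assumed discharging efficiency

    generated_kwh = 74.0
    stored_kwh = generated_kwh * charge_eff       # 72.52 kWh into the battery
    delivered_kwh = stored_kwh * discharge_eff    # ~71.07 kWh back out

    print(f"{generated_kwh} kWh generated -> {delivered_kwh:.2f} kWh delivered")
    # 74 kWh routed through the battery only beats 72 kWh used directly if the
    # round-trip efficiency exceeds 72/74 ~ 97.3%; at 0.98 * 0.98 ~ 96.0% it doesn't.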
Oh, and also, batteries such as the Tesla Powerwall can only be charged and discharged about 1000 times before they have lost a lot of capacity. So generating at the time you consume also makes your batteries last much longer. You could think of this as a cost of battery depreciation per kWh stored.
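To put a rough number on that depreciation idea, here's a sketch; the installed cost, usable capacity, and cycle counts below are placeholder assumptions, not vendor figures:

    # Rough battery depreciation per kWh of throughput.
    # All figures below are illustrative assumptions, not real quotes.
    battery_cost_usd = 8000.0      # assumed installed cost
    usable_kwh = 13.5              # assumed usable capacity
    avg_depth_of_discharge = 0.8   # assumed average depth per cycle

    def wear_cost_per_kwh(cycles_to_70pct: float) -> float:
        """Cost per kWh cycled through the battery over its useful life."""
        lifetime_throughput_kwh = usable_kwh * avg_depth_of_discharge * cycles_to_70pct
        return battery_cost_usd / lifetime_throughput_kwh

    for cycles in (1000, 3650, 6000):   # pessimistic / warranty-like / LFP-ish assumptions
        print(f"{cycles} cycles -> ${wear_cost_per_kwh(cycles):.2f} per kWh stored")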
That 1000-cycle figure is for NMC batteries, and it's the point where they drop to 70% capacity. Still enough to be useful.
Also, there are a lot of factors at play. For example, this assumes the batteries are fully charged and discharged. If you do something smarter, like cycling between 40% and 80%, they end up being able to handle a lot more cycles. In fact, battery age starts to matter more than the cycle count.
But besides that, LFP batteries are currently being used in home battery storage (including Powerwalls) because they're cheaper and good for 5,000-10,000 cycles before dropping to 70% capacity.
Generally, though, I'd agree that having more generation throughout the day is better than having perfectly optimized generation.
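Purely to illustrate the trade-offs above (every number here is made up for the example), you can compare lifetime throughput under full versus shallow cycling and see where a calendar-age limit starts to dominate:

    # Illustrative only: shallow cycling trades energy-per-cycle for more cycles,
    # and calendar life can cap the benefit regardless of cycle count.
    usable_kwh = 13.5

    scenarios = {
        # name: (depth of discharge, assumed cycles to 70% capacity)
        "full 0-100% cycles": (1.0, 1000),
        "shallow 40-80% cycles": (0.4, 4000),
        "LFP, full cycles": (1.0, 6000),
    }

    calendar_life_years = 15        # assumed age limit independent of cycling
    cycles_per_year = 365           # one cycle a day

    for name, (dod, cycle_limit) in scenarios.items():
        cycles_by_age = calendar_life_years * cycles_per_year
        effective_cycles = min(cycle_limit, cycles_by_age)
        throughput = usable_kwh * dod * effective_cycles
        limited_by = "age" if cycles_by_age < cycle_limit else "cycles"
        print(f"{name}: ~{throughput:,.0f} kWh lifetime, limited by {limited_by}")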
> Oh, and also, batteries such as the Tesla Powerwall can only be charged and discharged about 1000 times before they have lost a lot of capacity.
Powerwall's cycle life is much better than 1000. The Powerwall warranty guarantees 70% capacity after 10 years of daily cycles (i.e. 3650 cycles). This means they expect the capacity to be substantially above 70%.
We posted an analysis of Powerwall capacity retention: https://www.netzero.energy/content/2025-02/powerwall-analysi...
> The Powerwall warranty guarantees 70% capacity after 10 years of daily cycles (i.e. 3650 cycles). This means they expect
... to have pulled some corporate restructure which leaves a bankrupt legal entity responsible for the warranty claims before they start costing any real money.
The Powerwall solar controller prioritizes the home before sending surplus to the grid. And in the home, the controller will send power direct to any running load first, then to the Powerwall battery. Any spillover then goes to the grid. It’s very dynamic. I would go with the algorithm to “capture the rain into my rain buckets the moment I can” because the rain could stop. Solar irradiance is unpredictable.
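In pseudocode, that priority order looks roughly like this (my sketch of the behavior described above, not Tesla's actual controller logic):

    # Sketch of the dispatch priority described above: running loads first,
    # then the battery, then export any spillover to the grid.
    def dispatch(solar_w: float, home_load_w: float,
                 battery_headroom_w: float) -> dict:
        to_load = min(solar_w, home_load_w)
        surplus = solar_w - to_load
        to_battery = min(surplus, battery_headroom_w)   # limited by charge rate / state of charge
        to_grid = surplus - to_battery                  # spillover export
        return {"load": to_load, "battery": to_battery, "grid": to_grid}

    # Example: 5 kW of sun, 1.2 kW house load, battery can accept 3 kW.
    print(dispatch(5000, 1200, 3000))   # {'load': 1200, 'battery': 3000, 'grid': 800}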
If you have time-based billing you can also input that into the system and it's even more effective. For example, if you tell it that electricity is cheap from 9am-5pm (peak solar) and expensive from 5pm-9pm (peak residential demand), it will take your trending consumption, decide when your solar production isn't keeping up with forecasted demand, and let you charge from the grid at the cheap rate to make up the shortfall and minimize cost. It even factors in things like grid charging speed and total site usage limits, which is great given my 100 amp panel.
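That decision looks roughly like the sketch below (my guess at the logic, not the actual firmware; the numbers are placeholders):

    # Sketch of TOU-aware grid charging: if forecasted solar won't cover the
    # expensive evening window, top up from the grid while power is cheap.
    def grid_charge_needed_kwh(forecast_solar_kwh: float,
                               forecast_evening_load_kwh: float,
                               battery_soc_kwh: float,
                               max_grid_charge_kw: float,
                               cheap_window_hours: float) -> float:
        shortfall = forecast_evening_load_kwh - (forecast_solar_kwh + battery_soc_kwh)
        if shortfall <= 0:
            return 0.0                                        # solar + battery cover the peak window
        grid_limit = max_grid_charge_kw * cheap_window_hours  # charge-rate / site limit cap
        return min(shortfall, grid_limit)

    # e.g. 6 kWh of remaining solar forecast, 10 kWh of evening load, 1 kWh in the
    # battery, 5 kW max grid charging over an 8-hour cheap window (9am-5pm).
    print(grid_charge_needed_kwh(6, 10, 1, 5, 8))   # 3.0 kWh topped up at the cheap rate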
Charging the battery from the grid on its own is interesting in places where the TOU rate spikes between 4-7pm (or whatever your peak window is locally).
Here the peak rate is more than 3x the off-peak rate, so if I can charge a battery and run off of that for those 3 hours, I am saving money.
And it's not like I can lose money: a charge in the battery doesn't go stale.
Don't forget to account for the additional battery wear from the extra charging and discharging. However, your cost saving probably exceeds the wear cost.
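One way to sanity-check that, using the 3x ratio mentioned above and made-up absolute prices, efficiency, and wear cost:

    # Peak-shaving arbitrage per kWh, net of round-trip losses and battery wear.
    # Prices, efficiency, and wear cost are illustrative assumptions.
    off_peak_price = 0.12      # $/kWh paid to charge
    peak_price = 0.36          # $/kWh avoided during the 4-7pm window (3x)
    round_trip_eff = 0.90      # assumed round-trip efficiency
    wear_cost = 0.10           # assumed $/kWh of battery depreciation

    # To deliver 1 kWh at peak you must buy 1 / round_trip_eff kWh off-peak.
    cost_per_peak_kwh = off_peak_price / round_trip_eff + wear_cost
    saving_per_peak_kwh = peak_price - cost_per_peak_kwh
    print(f"net saving: ${saving_per_peak_kwh:.3f} per kWh shifted into the peak window")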
> Solar irradiance is unpredictable
Except in my case it's predictable! I want to do most of my charging at noon, when I'll have far more than I can export. Export first, then charge!
That strategy makes sense if your goal is longevity for your first battery units. But with most early adopters, the attitude is often that you'll be upgrading somewhat frequently (3-5 years) due to rapid advances in the tech, until more advanced offerings come around that could last 10-20 years.