Man it drives me crazy when people/products use Ah instead of Wh as a way to specify battery "capability".
Without knowing more details about the battery, "20Ah" alone does not convey enough information to determine how long the battery could power a given load for. If I need to power a 100 watt lightbulb, will a 20Ah battery power it for an hour? 10 hours? 10 days? No way to know.
Wh is the unit of stored energy, Wh is what I want to see. Even the official Amazon product page for it doesn't list a Wh figure.
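To make the ambiguity concrete, here's a quick sketch (the candidate voltages are arbitrary examples, not anything the label tells you) of how the same "20Ah" rating yields wildly different runtimes for a 100W load:

```python
# Why "20 Ah" alone is ambiguous: runtime of a 100 W load depends
# entirely on the (unstated) battery voltage.
CAPACITY_AH = 20
LOAD_W = 100

for voltage in (3.7, 12, 48):  # plausible nominal voltages, picked arbitrarily
    energy_wh = CAPACITY_AH * voltage
    hours = energy_wh / LOAD_W
    print(f"{voltage:>5} V pack: {energy_wh:.0f} Wh -> {hours:.2f} h of runtime")
```

Same label, anywhere from under an hour to most of a day.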
I think they really just do it because generally everything you connect to the battery will, for the most part, be using the same voltage, and it's just easier to do the math in your head. Remember, most people can't calculate a 20% tip in their head without having to think about it for a while or use an app.
Also, in this particular instance, phone batteries are measured in milliamp hours, so the thing I actually want to know, how many times it can charge my cell phone, becomes really easy to figure out.
But as somebody who tinkers with inverters and such, I agree, it is annoying. It is still generally not that hard to do in my head, and trivial with a calculator. But I’m with you.
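The mental arithmetic described above is just a ratio when both figures are in mAh. A minimal sketch, assuming a hypothetical 4,000 mAh phone battery:

```python
# Naive charge-count estimate: both figures in mAh, so it's a plain ratio.
# Ignores conversion losses and voltage differences (see the replies).
POWER_BANK_MAH = 20_000
PHONE_MAH = 4_000        # assumed example phone battery

charges = POWER_BANK_MAH / PHONE_MAH
print(f"Naive estimate: {charges:.1f} full charges")
```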
That's wrong: if they only specify Ah, they usually imply cell voltage (3.7V avg.) and not the 5V you're drawing over USB.
Even if you assume you're charging a phone with that, you first need to subtract 25-40% total losses. And then consider that phone batteries are LiHV with 3.85-3.9V nowadays.
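Redoing the charge-count estimate with those corrections gives noticeably smaller numbers. A sketch using the loss range and voltages from the comment above (the 4,000 mAh phone capacity is an assumed example):

```python
# Charge-count estimate corrected per the comment: the bank's Ah rating
# implies ~3.7 V cell voltage, 25-40% is lost in conversion, and modern
# phone cells are LiHV at ~3.85 V. Phone capacity is an assumed example.
BANK_MAH = 20_000
BANK_CELL_V = 3.7
PHONE_MAH = 4_000
PHONE_CELL_V = 3.85

bank_wh = BANK_MAH / 1000 * BANK_CELL_V
phone_wh = PHONE_MAH / 1000 * PHONE_CELL_V

for loss in (0.25, 0.40):
    usable_wh = bank_wh * (1 - loss)
    print(f"{loss:.0%} losses: {usable_wh / phone_wh:.1f} charges")
```

So the naive "5 charges" figure lands closer to 3-3.6 in practice.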
Well, and if you are on that train: instead of Wh, why not use J?
(Though for American hikers it would be somewhat fun to use a unit of 'pound feet'.)
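For the curious, the conversions are straightforward (the 74 Wh pack size here is just an example; the foot-pound factor is the standard ~1.35582 J):

```python
# Converting a pack's energy between Wh, joules, and foot-pounds.
# 1 Wh = 3600 J exactly; 1 ft*lbf ~= 1.35582 J.
WH_TO_J = 3600.0
FTLB_TO_J = 1.35582

pack_wh = 74.0           # e.g. a "20 Ah @ 3.7 V" pack, as an example
joules = pack_wh * WH_TO_J
foot_pounds = joules / FTLB_TO_J
print(f"{pack_wh} Wh = {joules:.0f} J = {foot_pounds:,.0f} ft*lbf")
```

Enough to lift roughly 196,000 pounds one foot, which would at least make for memorable packaging.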
Isn't that 20Ah figure always relative to the internal voltage of the lithium batteries, 3.7v? At least that's what I always assumed.
10 years ago most of these battery packs were relative to the 5V output voltage, so they advertised lower amp-hours. That stopped making sense with fast charging at higher voltages so they restandardized on 3.7V.
But watt-hours would make infinitely more sense for all batteries.
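The re-standardization described above is easy to see numerically: the same stored energy advertises as a bigger number at the lower reference voltage (74 Wh here is an example figure):

```python
# Same stored energy, different advertised Ah depending on which
# reference voltage the marketing picks. Wh is the honest figure.
ENERGY_WH = 74.0   # example pack energy

for label_v in (3.7, 5.0):
    ah = ENERGY_WH / label_v
    print(f"Advertised at {label_v} V: {ah:.1f} Ah ({ah * 1000:.0f} mAh)")
```

20,000 mAh at 3.7V versus 14,800 mAh at 5V, for the identical pack.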
Huh, I didn't get that memo, so I've been played for a fool assuming 5V until now. Advertising mAh at an unadvertised voltage different from the output voltage is just misleading advertising.
Then the fact that we've “standardized” on mAh as the unit is just another entry in our long proud tradition, same reason we still use “watts” as the main measure of light output, even when it has to be made up for use on LED bulbs.
Could at least use coulombs (or just count electrons), instead of taking Current = Charge / Time and multiplying by time again.
For what it's worth: 1 mAh ~ 2.25 * 10^19 electrons. Or with SI prefixes: 22.5 exa-electrons (= 22.5 Ee).
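The arithmetic behind that figure, for anyone checking: 1 mAh is 3.6 coulombs, divided by the elementary charge.

```python
# 1 mAh in coulombs and electrons.
E_CHARGE = 1.602_176_634e-19   # C per electron (exact SI value)

mah = 1
coulombs = mah / 1000 * 3600   # 1 mAh = 0.001 A * 3600 s = 3.6 C
electrons = coulombs / E_CHARGE
print(f"{mah} mAh = {coulombs} C = {electrons:.3e} electrons")
```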
Mostly all flashlights advertise in lumen though? I suppose you mean light bulbs?
Yes, I mean the bulbs. Generally in the US at least we find them in the store labeled like "40 watt equivalent (small print: 2.1W)"
Not necessarily, no. If you have multiple batteries in series for a higher voltage pack, then it'll be less for the same amount of energy stored. But then the marketing for these packs will happily abuse the units to get the biggest numbers.
Internal voltage is chemistry-dependent; in other words, it's invariant across cell manufacturers. It's 3.7V for NMC, 3.2V for LFP, 2.3V for LTO, and 3.0V for Na-ion.
Technically speaking, the pack voltage as well as the Ah rating should be that of the pack, not the cumulative total of the cells; two NMC 18650s in series should be 7.4V 2600mAh, not 3.7V 5200mAh. But denoting them as if all cells were in parallel allows this figure to be maximally inflated, so that's what manufacturers do.
High-voltage charging etc. isn't relevant here. High-voltage assembled battery packs, though, should be marked with amp-hour ratings at that higher pack voltage.
The technical reason amp-hour ratings exist is that some parameters depend on amperage rather than energy or voltage, such as the thickness of the wire used in the device or the cycle life of the cell. The voltage of a battery also changes roughly in proportion to the energy remaining in it, and values like 3.7 for NMC or 1.5 for alkaline are averaged, roughly 50% state-of-charge values.
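The series-vs-parallel point above can be checked in a few lines, using the 18650 figures from the comment: the Ah and V labels differ, but the energy is identical either way.

```python
# Pack ratings for two identical NMC 18650 cells (3.7 V, 2600 mAh):
# series adds voltage, parallel adds Ah, but the Wh figure is the
# same either way, which is why Wh is the comparable number.
CELL_V, CELL_MAH = 3.7, 2600

series   = (2 * CELL_V, CELL_MAH)        # 7.4 V, 2600 mAh
parallel = (CELL_V, 2 * CELL_MAH)        # 3.7 V, 5200 mAh

for name, (v, mah) in [("series", series), ("parallel", parallel)]:
    print(f"{name:>8}: {v} V {mah} mAh = {v * mah / 1000:.2f} Wh")
```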
There's a technical reason for it: the voltage sags when the battery is discharged quickly. Ah is relatively constant, but Wh decreases significantly at faster discharge rates, so it can't be specified as a single figure.
That's a bit of a cursed mental model, tbf... The voltage of a battery is, in the first place, a function of state of charge: 100% = 4.2V, 0% = ~2.7V, 50% is 3.7V (by volume or something). (2.7V is also the technical absolute minimum; the cutoff voltage is usually more like 3.2V. Please don't abuse the battery in the range between 3.2V and 2.7V, let alone below.)
Charge/discharge current capacity is constant throughout, at least so battery manufacturers say, at 1-20x the amp-hour capacity depending on the cell. Usually 5x or less.
Since power = voltage x current, instantaneous wattage capability is higher at first, dropping as the supply side rather than the load side becomes the limit.
But all of that is irrelevant to why everyone uses mAh: it's because products with the biggest numbers sell fastest. Marking capacity in Wh is noble, but it's a clearance-worthy sin if you ask the shelves.
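The falling-power point above can be sketched crudely. This assumes a linear voltage-vs-state-of-charge curve and a constant current limit, both simplifications (real discharge curves are not linear, and the 10A limit is an invented example):

```python
# Crude sketch: max continuous power falls with state of charge
# because P = V * I and V sags as the cell empties.
# Linear V(SoC) and the fixed current limit are simplifying assumptions.
V_FULL, V_EMPTY = 4.2, 3.2   # full voltage, and the usual ~3.2 V cutoff
I_MAX = 10.0                 # assumed constant current limit, amps

for soc in (1.0, 0.5, 0.0):
    v = V_EMPTY + (V_FULL - V_EMPTY) * soc
    print(f"SoC {soc:.0%}: ~{v:.2f} V -> max ~{v * I_MAX:.0f} W")
```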
Is that because of internal resistance of the battery, or some other effect?
Yeah, but pretty much every power bank uses lithium ion batteries (or used to, at least), so you use 3.7 average volts.
If you want to quadruple your amp hours without modifying the hardware, just treat 25% of the voltage gradient of each battery as your unit of measurement.
Alternatively, since this is USB-C, and we assume the marketing copy is honest, use the max voltage USB-C can deliver: 20v.
So, draw 20V from this device and measure the amp hours it outputs.
Wh is really the only sane way to go.
Lithium NMC, which I think is the most common, is 3.7V. Lithium iron phosphate, which some power packs do use, has a nominal voltage of 3.2V.
And LTO, which is what I actually want, is 2.3 - 2.4 V.
Yeah, if you publish Ah, you should also publish voltage. Then it's just Ah * V = Wh.
…just remember V is a rather complicated function of many parameters instead of some fixed value.
Battery capacity is always measured in amp hours not watt hours, because it’s telling you more than just capacity. The rating is a measure of how many amps the battery can emit continuously for one hour. You can estimate how long the battery will last at different loads, but it won’t match up 1:1 because of efficiency differences depending on how fast the battery is discharged, and if it has a chance to recover between discharges. Basically watt hour measurements are path dependent, and using a fungible unit like watt hours obscures the meaning of the measurement.
> The rating is a measure of how many amps the battery can emit continuously for one hour.
This is absolutely not true at all. 'Ah' is a measure of capacity and 'amps' is a measure of current. Batteries typically have three measurements: nominal voltage, capacity (Ah or mAh), and rated continuous current (amps).
> watt hour measurements are path dependent
Watt hour is a normalized measurement of a battery's capacity. For example, it lets me compare a 12v/100Ah LiFePO4 battery against a 3.7v/3Ah Li-ion 18650 battery in terms of each battery's capacity (in this case 1200Wh versus 11Wh).
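That comparison in code form, using the two batteries from the comment:

```python
# Wh as the normalizing unit: comparing two batteries with very
# different voltages and Ah ratings on one scale.
batteries = {
    "12 V / 100 Ah LiFePO4": 12 * 100,
    "3.7 V / 3 Ah 18650":    3.7 * 3,
}
for name, wh in batteries.items():
    print(f"{name}: {wh:.1f} Wh")
```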
Batteries have different capacities at different C-rate discharges. 1C is the amperage equal to the battery's amp-hour rating, but at higher discharge rates, batteries have a lower capacity. Some batteries handle higher discharge better than others (staying closer to the rated capacity). You can of course estimate the battery capacity in watt hours, but it's not how the battery is classified (e.g. in a data sheet).
Sort of. C-rate discharge is just another (confusing) way of stating CDR (continuous discharge rating). You might as well just say the CDR in amps. C-rate is really just there to compare battery chemistries at a normalized value, illustrating how well different chemistries can deliver power (and how the chemistry could in theory scale up).
For example, 1C is rated continuous discharge amps, which means a 1C rated battery will provide 1 * Ah. So if a 20,000 mAh battery is rated for 20,000 mAh @ 1C, it will (in theory) discharge 20,000 mAh at 20A in one hour.
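The C-rate-to-amps conversion above is one multiplication; a small helper (the function name is mine) with the comment's own figures:

```python
# C-rate to current: I = c_rate * capacity_in_Ah.
# Helper name is illustrative, not from any library.
def c_rate_to_amps(capacity_mah: float, c_rate: float) -> float:
    return capacity_mah / 1000 * c_rate

print(c_rate_to_amps(20_000, 1))   # 20.0 A, drains in ~1 h
print(c_rate_to_amps(20_000, 2))   # 40.0 A, drains in ~0.5 h (in theory)
```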
> You can of course estimate the battery capacity in watt hours, but it’s not how the battery is classified (eg in a data sheet)
You're right but this is irrelevant because real life usage highly varies. Data sheets are just guides.
Ah does not mean it has to be discharged over exactly one hour in some test. It's simply a unit just like kWh. And most batteries are not optimised for one hour discharge. There's batteries with insanely high C-rates for delivering a lot of power quickly, discharging in 10 minutes. And some with really low that can standby for years. They'll still indicate the Ah capacity the same way.
The C rate still comes from the amp hour number on the battery. I’m aware you can discharge faster or slower than that, but the capacity measured will not be the same
I can see why you would think that, but no, that's not the case. My RV battery is 200Ah but can discharge at 380 amps and drain itself in well under an hour. (That's a peak rate which it cannot sustain long enough to discharge itself, but it can still stay well above 200A the entire time.)
Most lithium batteries can drain themselves much faster than an hour.
I understand that. What I’m saying is that the capacity is different depending on how quickly you drain it. Different battery construction can mitigate this effect to a higher or lower degree, but the battery is rated for 200A discharge in one hour. At 380A you will get fewer total watt-hours out of the battery than you would at 200A, i.e. it will discharge in less than the ~31.6 minutes you would otherwise expect.
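The effect described here is classically modeled by Peukert's law, which nobody above names; note it fits lead-acid chemistry better than lithium, and the exponent below is an assumed illustrative value:

```python
# Peukert's law: t = H * (C / (I * H)) ** k
#   C: rated capacity (Ah), H: rated discharge time (h),
#   I: actual discharge current (A),
#   k: Peukert exponent (chemistry-dependent; 1.1 is assumed here).
def runtime_hours(c_ah, h_rated, i_amps, k=1.1):
    return h_rated * (c_ah / (i_amps * h_rated)) ** k

naive = 200 / 380                      # ~0.53 h (~31.6 min) if capacity held
peukert = runtime_hours(200, 1, 380)   # shorter, due to the higher draw
print(f"naive: {naive * 60:.1f} min, Peukert: {peukert * 60:.1f} min")
```

With k = 1.1 the 380A runtime drops to roughly 29.6 minutes, matching the direction of the claim above.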
If your 100W light bulb is a USB bulb running at 5V (to make the math a bit easier), with 20Ah you’ll get 100Wh, ergo it’ll run your bulb for an hour. You just need to specify the voltage. There are numerous online calculators to do this math for you.
https://www.inchcalculator.com/ah-to-wh-calculator/
Watt-hours won’t save you, because we don’t know what voltage your bulb needs. Don’t assume it’s 120/240V.
I mean, I know what voltage/wattage my devices need. I have no idea what voltage the battery bank is going to offer.
...by that logic if my 100W bulb is running on 10V, with my same 20Ah I'll get 200Wh, so I'll get a bulb of the same wattage, with the same battery, running twice as long? Magical double energy!
...this is why we should measure the total energy in Wh, not Ah.
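The reductio in this subthread fits in a few lines: hold the Ah rating and the bulb wattage fixed, vary only the assumed voltage, and the "stored energy" changes, which is exactly the absurdity.

```python
# A fixed Ah rating "stores" different energy depending on the assumed
# voltage, which is why Ah alone can't describe capacity.
CAPACITY_AH = 20
BULB_W = 100

for bus_v in (5, 10):
    wh = CAPACITY_AH * bus_v
    print(f"At {bus_v} V: {wh} Wh -> {wh / BULB_W:.0f} h for a {BULB_W} W bulb")
```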