Say the car gets 4 miles per kWh. So a 500 W charging rate (neglecting losses) can be expressed as 2 mph.
Compare to a fast charger, which will be several hundred mph.
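The conversion above can be sketched in a couple of lines. The 150 kW fast-charger figure below is an illustrative assumption, not from the comment; the 4 miles/kWh and 500 W numbers are the ones stated.

```python
def charge_speed_mph(power_w: float, efficiency_mi_per_kwh: float) -> float:
    """Miles of range added per hour of charging (losses neglected)."""
    return (power_w / 1000.0) * efficiency_mi_per_kwh

print(charge_speed_mph(500, 4))      # 500 W solar trickle: 2.0 mph
print(charge_speed_mph(150_000, 4))  # assumed 150 kW fast charger: 600.0 mph
```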
I think this is a flawed comparison. You only care about speed when driving, but charging we care about whenever the car gets sunlight. I would argue for most people car in sunlight time is a multiple of car driving time. Still pretty abysmal, but less bad than 2 mph.
Not sure if I've slipped a 0 here, but 500 W taken over the year, at say a 10% capacity factor, comes to about 438 kWh, or roughly 1,750 miles of range per year. A fair bit short of the average mileage (in the UK somewhere around 10k) but still more significant than I expected. Of course 500 W is a lot of solar for a car, and 4 miles/kWh is also quite efficient.
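A quick check of that annual figure, using the comment's own assumptions (500 W array, 10% capacity factor, 4 miles/kWh):

```python
power_kw = 0.5            # 500 W solar array
capacity_factor = 0.10    # assumed 10% average output over the year
hours_per_year = 24 * 365
efficiency = 4            # miles per kWh

annual_kwh = power_kw * hours_per_year * capacity_factor  # 438 kWh
annual_miles = annual_kwh * efficiency                    # 1752 miles
print(annual_miles)
```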
This is a good way to look at it, but perhaps a new unit, like range per hour? Since mph is already a unit of velocity.
Range isn't a unit though, so it isn't actually telling you anything technical. Since range is a distance, it would still be "miles per hour" or "kilometers per hour" or "meters per second" or anything to let you know how long it will take to top up to full range.
Could be "%/minute" maybe, but that is less useful: if you know you need to go 45 miles, you want to know how many hours (or fraction thereof) that would take.
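The point about usefulness can be made concrete: with a rate in miles per hour, the time to add a given amount of range is a single division. The 2 mph and 600 mph rates here are illustrative (the solar figure from the thread, the fast-charger figure an assumption).

```python
def hours_to_add(range_miles: float, charge_mph: float) -> float:
    """Hours of charging needed to add the given miles of range."""
    return range_miles / charge_mph

print(hours_to_add(45, 2))    # 22.5 hours on the solar trickle
print(hours_to_add(45, 600))  # 0.075 hours, i.e. 4.5 minutes
```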
It expresses how many miles you can get in a given number of hours. It is a velocity.
Same dimensions, same units. Sure, it can be expressed more specifically, e.g. "miles of nominal range per hour", but it's still miles per hour, to facilitate mental calculation.
Miles (of range) per hour (of charge) is somewhat widely (and accurately) used as a metric for charging speed.