From smaller devices like an Xbox controller to bigger devices like a battery-powered leaf blower or even a car, here’s how to figure out how much it costs to recharge the batteries.
Why Calculate Rechargeable Battery Costs?
You certainly don’t have to calculate the cost of using rechargeable batteries to use them. Perhaps the convenience of never running to the store to buy more batteries for your Xbox controllers is high enough that you don’t care if the rechargeable batteries save you any money and you’d use them regardless.
But maybe you’re considering buying all lithium-ion-powered lawn and garden tools to replace your gas-powered devices, and you’re wondering if recharging them week after week in the summer will end up more expensive than you anticipate.
Or, perhaps, you’re like us and just relentlessly curious about everything and would love to know exactly how much it costs to charge and use your drone, leaf blower, or other battery-powered devices.
Whatever your motivation, it’s not hard to figure out how much rechargeable batteries big and small cost to charge.
How to Calculate Rechargeable Battery Costs
There are two ways to go about calculating how much it costs to recharge a rechargeable battery. One is very precise as it allows you to measure the exact amount of energy used—at the outlet level—by the charger and the battery combined. However big or small the overhead of using the charger is, this method accounts for that.
The other is less precise because it uses basic back-of-the-envelope calculations to determine how much power the batteries require to reach 100% capacity. That said, you might be surprised just how accurate simple calculations are, thanks to improvements in charger efficiency and battery technology.
Let’s start with the most precise method, in case you’re curious to get the exact numbers, and then move on to using a simple calculation to help you (with a fair degree of accuracy) estimate the usage.
Whatever method you opt to use, however, you’ll need a crucial bit of information. Before proceeding, grab your latest utility bill or check your electrical provider’s website for the cost of a kilowatt hour (kWh) in your area. The U.S. national average, as of the time of this article in mid-2022, is $0.14. That’s the value we’ll use.
Measure the Charger’s Real-Time Energy Use
If you want the easiest and most accurate answer to “How much does it cost to charge my device?” you’ll need to measure the charger itself in use, rather than calculate the cost based on the energy capacity of the battery alone.
The reason for this is simple: Every use of electricity incurs a loss in some form. If you’ve ever unplugged a charger for your phone or laptop and noticed the charger was warm to the touch, you’ve experienced a bit of that loss—some of the electricity flowing to the charging device radiated into the room as heat instead of ending up stored in the battery.
We have a detailed guide to measuring the energy use of devices and appliances around your home, and we’re going to recommend the same device meter method outlined in that guide. We’ve used a P3 International 4460 Kill-A-Watt device meter for years and can’t recommend it enough for these kinds of projects.
The nicest thing about the Kill-A-Watt is that you can plug the per kWh cost in and it will display the actual energy cost and not just the energy used—which allows you to skip doing any manual calculations to figure out how much running the charger is actually costing you.
Simply configure the Kill-A-Watt, plug in the battery charger, and reference the device when it is done charging the battery.
For the sake of example (and to compare it with the manual calculation technique we’re about to discuss in the next section), we charged a completely depleted 40V 6.0Ah lithium-ion battery while using the Kill-A-Watt meter. According to the meter, it cost $0.03 to charge the battery. Subsequent tests recharging the battery after using it to exhaustion yielded the same value: 3 cents a charge.
Do note that using a Kill-A-Watt or similar product only works if the device is charged through a standard outlet. For items, such as electric vehicles, that are charged through dedicated infrastructure wired into your home, you’ll need to use some of the other techniques in our guide to measuring energy use, such as using your electric meter as a monitor. Or you can move on to the next step and use a simple calculation to get a very reasonable estimate.
Calculate Cost Based on Battery Capacity
Although we spent a bit of time emphasizing factors like energy loss, factoring in the charger in the cost analysis, and such, one thing you’ll find—especially with smaller batteries—is that the simple manual calculations are surprisingly accurate.
Charger and battery technology are constantly improving. Most charging setups are highly efficient, delivering 85-95% of the energy drawn from the outlet into the battery.
For small batteries that cost pennies to charge, a loss of 5-15% is a fraction of a cent worth of electricity. Even when you scale up to electric vehicle-size batteries in the 30-100 kWh range, the overhead loss per charge is around a dollar.
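To put that scaling claim in concrete terms, here’s a quick back-of-the-envelope sketch in Python. The battery size and efficiency figures below are illustrative assumptions, not measurements from the article; only the $0.14/kWh rate comes from earlier in the piece.

```python
# Illustrative only: cost of charging losses at EV scale (assumed numbers)
battery_kwh = 75          # assumed EV battery capacity, in kWh
efficiency = 0.90         # assumed 90% charger efficiency (10% loss)
rate_per_kwh = 0.14       # the $0.14/kWh U.S. average rate used in this article

energy_drawn = battery_kwh / efficiency                  # kWh pulled from the wall
loss_cost = (energy_drawn - battery_kwh) * rate_per_kwh  # cost of the wasted energy

print(f"Energy lost per full charge: {energy_drawn - battery_kwh:.1f} kWh")
print(f"Cost of that loss: ${loss_cost:.2f}")            # about $1.17
```

Even with a 10% loss on a large EV battery, the wasted electricity per full charge costs only around a dollar.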
So, regardless of battery size, how do you get from the details you know about a given battery to figuring out how much it costs to charge it from empty to 100%?
First, you need to know the voltage (V) and amp-hour (Ah) capacity of the given battery. These are usually printed directly on the battery casing, but in instances where you don’t have direct access to the battery inside a device, you’ll need to check the technical documentation for the device. For some devices, like console controllers, it’s common to list the amp-hour capacity but not the voltage, so you’ll need to dig a little.
In some instances, such as with generator-replacement “power stations” and electric vehicles, the batteries are already labeled with Wh or even kWh—if that’s the case with your battery you can skip the Wh conversion and Wh-to-kWh conversion steps, when applicable.
For smaller batteries, however, once you know the values for V and Ah, you can use a simple equation to determine the watt-hours (Wh) of energy the battery can hold.
V * Ah = Wh
So in the case of our example 40V 6.0Ah lithium-ion battery, we get the following watt-hours:
40V * 6.0Ah = 240Wh
Next, we divide the watt-hours by 1000 to convert them to kilowatt-hours (which is the unit electrical use is billed by).
240Wh / 1000 = 0.24 kWh
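If you’d rather let a few lines of code do the arithmetic, the two conversion steps above look like this in Python, using the article’s example battery (the variable names are our own):

```python
# Convert a battery's voltage and amp-hour capacity to kilowatt-hours
voltage = 40.0        # V, from the battery label
amp_hours = 6.0       # Ah, from the battery label

watt_hours = voltage * amp_hours        # 40 * 6.0 = 240 Wh
kilowatt_hours = watt_hours / 1000      # 240 / 1000 = 0.24 kWh

print(f"{watt_hours:.0f} Wh = {kilowatt_hours} kWh")
```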
Next, we calculate the cost. If you really want to pad your estimate, you can replace the placeholder (1) in the equation below with a new value to stand in for the known or estimated charging efficiency.
If, for example, you know that the electric vehicle charger in your garage is rated for 85% efficiency (which means 15% of the energy is lost to the charging process) you can replace the 1 with 1.15 to add 15% to the amount of energy required to fully charge the battery.
But again, for smaller chargers where the efficiency is unknown and the expense of waste is trivial (fractions of a cent, even) feel free to not worry about it.
So, as our last step, we multiply the kWh value by the cost we pay per kilowatt-hour. In this case, that’s $0.14 per kWh.
0.24 kWh * (1) * $0.14 = $0.0336
Round that value off to the two decimal places the Kill-A-Watt meter displays and you get $0.03—the same 3-cent recharge it calculated for us. Not bad—both our real-world measurement and our calculation match up.
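If you want to run these numbers for several batteries, the whole calculation fits in one small function. This is just a sketch of the steps above (the function name and the `overhead` parameter are our own; `overhead=1.0` mirrors the placeholder 1 in the equation, and a value like 1.15 pads the estimate for a charger with a known 15% loss):

```python
def charge_cost(voltage, amp_hours, rate_per_kwh, overhead=1.0):
    """Estimate the cost of charging a battery from empty to full.

    overhead: 1.0 assumes no charging loss; pass e.g. 1.15 to pad the
    estimate for a charger that loses roughly 15% of the energy.
    """
    kwh = voltage * amp_hours / 1000      # V * Ah = Wh, then Wh / 1000 = kWh
    return kwh * overhead * rate_per_kwh

# The article's example battery: 40V, 6.0Ah, at $0.14 per kWh
print(f"${charge_cost(40, 6.0, 0.14):.2f}")        # $0.03, matching the meter
print(f"${charge_cost(40, 6.0, 0.14, 1.15):.2f}")  # padded 15%-loss estimate
```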
And that’s it! Whether you use a physical monitor to get the to-the-penny cost of charging your battery or do a little back-of-the-envelope math to get a pretty good estimate, now you can figure out how much it costs to charge everything from a game controller to an electric car.