So what is wattage? Put simply, wattage is a measurement of the amount of work electricity is doing. We need a unit to quantify the energy used or needed for an electronic device to function. Now let’s look at a few examples to help explain wattage.
Charger output in watts and why it is important…
I’ve seen several posts on internet forums that start like this: “My new charger says it can do 10A but it only seems to be charging at 3A …” or “I can charge my 3s pack at 4A but when I put on a 6s pack, it won’t go any higher than 2.2A …”. The simple answer is that you have to take the maximum wattage into account, not just the amps.
This limitation is mostly due to the maximum wattage (output) of your charger. Many smaller chargers are limited to 50W or 80W. I know that number means little to most, so here is how to understand how it affects your charging…
First off, wattage is calculated by measuring the voltage and amperage of a circuit and then multiplying them together. The equation (often called Watt’s Law) looks like this:
Watts = Volts * Amps
How do we use it? I learn best by example, so here are some examples.
Say you want to charge a 2200mAh 3s pack at 1C. How many watts does it require for charging? We know the fully charged voltage of the battery (4.2 volts/cell, so 12.6V for 3s), and since the pack’s capacity is 2200mAh, the 1C charge rate is 2.2A. Now we plug them in:
Watts = (12.6V) * (2.2A) = 27.7W
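The calculation above is easy to sketch in code. Here is a minimal Python illustration (the function name and the 4.2V/cell default are my own, matching the article’s assumptions):

```python
def charge_watts(cells, charge_amps, volts_per_cell=4.2):
    """Peak charging power in watts: Watts = Volts * Amps.

    Uses the pack's fully charged voltage (4.2V per cell),
    since that is when the charger must supply the most power.
    """
    return cells * volts_per_cell * charge_amps

# 2200mAh 3s pack charged at 1C (2.2A):
print(round(charge_watts(3, 2.2), 1))  # 27.7
```

The same function handles the 6s example later in this article: `charge_watts(6, 10)` gives 252W.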
So the charger will be outputting at most 27.7W. We say at most because the voltage of the battery changes throughout the charge cycle. It will likely start in the mid-11V range (storage voltage) and finish when it reaches 12.6 volts. So in order to calculate the maximum wattage needed, we use the maximum voltage involved.
Now let’s look at what it would take to charge a big pack quickly: a 5000mAh 6s pack at 2C. The charger will need to be set to 25.2V (6s) and 10A (5A × 2C).
Watts = (25.2V) * (10A) = 252W
As you can see, it takes nearly 10 times the wattage to charge this pack as it did the first one.
Now that we have calculated watts, let’s swap the equation around and find the maximum amps for a charger with a 50W output. Rearranging the above equation gives:
Amps = Watts / Volts
Now we plug in numbers. Let’s start with a 3s pack, using its maximum voltage.
Amps = charger wattage / battery voltage = 50W / 12.6V ≈ 4A
Now for a 6s pack.
Amps = 50W / 25.2V ≈ 2A
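The rearranged equation can be sketched the same way (again a minimal Python illustration; the function name is my own):

```python
def max_charge_amps(charger_watts, cells, volts_per_cell=4.2):
    """Maximum charge current a wattage-limited charger can sustain:
    Amps = Watts / Volts, using the pack's fully charged voltage."""
    return charger_watts / (cells * volts_per_cell)

# A 50W charger on 3s and 6s packs:
print(round(max_charge_amps(50, 3), 2))  # 3.97 -> about 4A
print(round(max_charge_amps(50, 6), 2))  # 1.98 -> about 2A
```

Notice the current available halves when the cell count doubles, which is exactly the forum complaint this article started with.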
As you can see, there is a reason new chargers offer more than 50W of output. When charging 6s lipos, some as large as 5800mAh, more wattage is needed to charge the packs in a reasonable amount of time.
Other places wattage shows up…
A common question on the forums is “How can I charge my battery at 20A if my wall outlet is only rated at 15A?” Well, the simple answer is that you cannot compare amps directly; you have to compare watts.
The average US household outlet is rated for 120V and 15A. If you take those numbers and plug them into the wattage equation above, you get
Watts = (120V) * (15A) = 1800W
Now let’s compare that to a charger charging a 6s lipo at 20A.
Watts = (25.2V) * (20A) = 504W
Even charging at this high rate on a relatively large lipo, we see it is using less than 1/3 of what the wall outlet is capable of supplying. There are of course other considerations, such as efficiency losses, but even with those factored in, it is very apparent that the standard wall outlet is capable of powering any normal charging needs.
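The outlet comparison works out like this in Python (a quick illustration using the article’s numbers; variable names are my own):

```python
# Standard US household outlet: 120V at 15A.
outlet_watts = 120 * 15            # 1800W available

# Charger pushing a 6s lipo (25.2V) at 20A.
charger_watts = 25.2 * 20          # 504W drawn (before efficiency losses)

print(round(charger_watts / outlet_watts, 2))  # 0.28 -> well under 1/3
```

Even allowing generous headroom for charger and power-supply inefficiency, the outlet has capacity to spare.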
Wattage plays a crucial role in charging. Don’t forget to take it into account as you choose chargers, choose power supplies, and even plug into the wall. I’ll discuss this further at the next meeting.