
Parameters defining the charging rate

Electroenthusiast

Active Member
I was charging my cellphone from a wall wart, through a USB cable, and also through a low-rated wall wart. What I found was that the charging speed was different with different wall warts, and one charging cable doesn't seem to charge as fast when I connect it to the PC.

What does the charging time depend on? Does using a cheap cable increase the charging time? If so, how? My meter always shows the cable resistance as zero.
 
The point was that the charger in the phone isn't compatible with the latest variants of power supplies. That means the high-power capability wouldn't be supported by PSUs from the older generation.
 
That wasn't what you said!

So, he meant that his phone would explode if he uses a fast charging unit for it.

Whereas in fact it won't, and it's perfectly compatible, working as a 'normal' charger.
 
It depends on the output current rating of the USB source, and how it is configured.

A standard PC USB-2 port has a maximum current rating of 500mA.
Red USB-2 sockets have higher current ratings, and USB-3 sockets are also usually rated at higher currents.

Different cables can have a drastic effect, as the cheap / poorly made ones can have very high resistance in comparison to a good charging cable.
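
To put a rough number on that, here is a quick Python sketch of the voltage drop across a cable; the resistance and current figures are illustrative assumptions, not measurements:

# Rough illustration of how cable resistance eats into the 5V the phone sees.
# The figures below are assumptions for illustration only.
SOURCE_V = 5.0   # nominal USB supply voltage
CHARGE_I = 2.0   # current the phone tries to draw, in amps

for label, cable_r in [("good cable", 0.1), ("cheap cable", 0.5)]:
    drop = CHARGE_I * cable_r          # V = I * R across both conductors
    at_phone = SOURCE_V - drop
    print(f"{label}: {cable_r} ohm round trip -> {at_phone:.2f} V at the phone")

# A phone that sees the voltage sagging will usually throttle its charge
# current back, so the poor cable ends up charging noticeably slower.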

USB-C ones often have a charging power rating, though that will usually relate to the maximum USB-C output voltage - e.g. a "60W" cable may be rated for 3A at 20V.

With USB-A power units, there are different combinations of connections or bias voltages on the data lines that can indicate to the connected device what maximum current it can draw.
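
As a very simplified sketch of the idea in Python - only the USB BC 1.2 "D+ shorted to D-" (dedicated charging port) case is shown, the various proprietary schemes are left out, and the function name is just for illustration:

# Simplified sketch of how a device might pick its input current limit
# from a USB-A port. Real detection (USB BC 1.2 plus proprietary schemes)
# is more involved; treat this as an outline only.
def max_input_current_ma(dplus_shorted_to_dminus: bool, usb3_port: bool) -> int:
    if dplus_shorted_to_dminus:
        # BC 1.2 dedicated charging port: D+ and D- tied together,
        # so the device may draw up to around 1.5 A.
        return 1500
    # Plain data port with no charger handshake: stick to the USB defaults.
    return 900 if usb3_port else 500

print(max_input_current_ma(True, False))   # wall charger   -> 1500
print(max_input_current_ma(False, False))  # PC USB-2 port  -> 500
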
I currently have a simple plug-and-play Traxxas charger for my NiMH batteries. I just bought a Hitec X4 and it's on the way. I've been reading online about it and it seems there's a lot I need to know just to charge a battery.

Right now I have a 7.2V 3800mAh, a 7.2V 3000mAh, an 8.4V 3000mAh, and an 8.4V 3300mAh. Reading about this charger, it seems like I need a quick education on discharge rates and a "C" rate - not sure what that is. How long to charge, and at what amperage? I know nitro, but this battery thing sounded simple until I bought this charger.

Also, I know LiPos are better, but right now the electrics are my kids' and I'm trying to keep the speed down. Any keep-it-simple info would be greatly appreciated. Thanks
 
"C" rate is the one hour rate, the current that would charge or discharge in one hour.
eg. for 3800mAH (= 3.8AH), the C rate would be 3.8A; 3800mA

[Note for other users - this relates specifically to high current battery packs made for radio controlled models & using dedicated RC battery chargers with balance connections, not generic lithium cells or chargers].

I'd suggest setting the charge current to a maximum of C/2, or preferably 0.5A if you have the time to allow a slower charge.
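
If it helps, here is that arithmetic for the packs listed above, done in a few lines of Python (capacities are taken from the post; the C/2 figure is just the suggestion above, not a manufacturer rating):

# C rate and suggested maximum charge current (C/2) for each pack.
packs_mah = {"7.2V 3800mAh": 3800, "7.2V 3000mAh": 3000,
             "8.4V 3000mAh": 3000, "8.4V 3300mAh": 3300}

for name, mah in packs_mah.items():
    c_rate_a = mah / 1000            # 1C in amps, e.g. 3800mAh -> 3.8A
    charge_a = c_rate_a / 2          # suggested maximum charge current
    hours = (mah / 1000) / charge_a  # rough full-charge time at that current
    print(f"{name}: 1C = {c_rate_a:.1f}A, C/2 = {charge_a:.2f}A, "
          f"roughly {hours:.0f}h from flat (ignoring charge losses)")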

You need to connect both the main power cables and the balance cable connector to the charger.

Set it for the correct type and number of cells, with a per-cell voltage no greater than 4.2V (or a total voltage of no more than 4.2V times the number of cells).
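
As a quick sanity check on that voltage setting (assuming lithium packs, as per the note above):

# Maximum charge voltage is 4.2V per cell times the number of cells.
MAX_CELL_V = 4.2
for cells in (2, 3, 4):
    print(f"{cells}-cell pack: set no more than {cells * MAX_CELL_V:.1f}V total")
# e.g. a 2-cell pack should never be set above 8.4V.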

Use the balance charge mode whenever possible, to keep the cells balanced. If they are unbalanced, you will lose running time to some extent, depending on the imbalance. If it gets bad, the battery can be damaged or wrecked.


Never leave them flat for long periods - recharge as soon as possible, within a day or so if you can. If the cell voltage drops too low, they can develop internal shorts which wreck the cell.

Also, never leave them fully charged for long periods (again, more than a day or two, if you can avoid it) as it accelerates cell aging.
That's what the "Storage" charge mode on the charger is for; it will typically set them to somewhere around 40 - 60% charge, which is the optimum for preserving them.


Re. the battery voltages:
Originally, cell or battery voltages were given as the average between full charge and flat, so 3.6V or 3.7V for typical rechargeable lithium cells.
Now, purely as an advertising stunt, they are often given as 4V or 4.2V per cell.

In other words, the 7.2V and 8.4V packs use the same cells, just described differently: two cells at 3.6V nominal gives 7.2V, while the same two cells at 4.2V fully charged gives 8.4V.

(As another example, some lithium power tool batteries that were originally "18V" magically became "20V" with no changes other than labelling).
 
1) Just an update: I got a new 65W laptop adapter delivered today. The 'black box' that contains the electronic circuitry seems lighter than the one I used previously. The previous one was 10 years old.

So, does that mean the new one uses an SMPS, while the old one used a bulkier transformer that made it heavy?

2) Remember the chargers we used in the 90s to charge vehicle batteries? How did they work? I don't think they had overcharge protection, yet we used to leave them charging overnight (even the mechanic shops did).
 
