Charging 20 Thunderskys

How do you store and manage your electricity?
Post Reply
Loose_Bruce
Noobie
Posts: 16
Joined: Tue, 20 Apr 2010, 03:10
Real Name: Simon Williams
Location: Hobart

Charging 20 Thunderskys

Post by Loose_Bruce »

What should the output voltage of the charger be when charging 20 Thunderskys? I'm getting a Kelly charger... they say 82 V; shouldn't this be 72 V?
User avatar
coulomb
Site Admin
Posts: 4498
Joined: Thu, 22 Jan 2009, 20:32
Real Name: Mike Van Emmerik
Location: Brisbane
Contact:

Charging 20 Thunderskys

Post by coulomb »

Loose_Bruce wrote: What should the output voltage of the charger be when charging 20 Thunderskys? I'm getting a Kelly charger... they say 82 V; shouldn't this be 72 V?

Those options are 4.1 and 3.6 VPC (volts per cell) average respectively. It depends on how high you want to charge each cell; there is very little SOC change from about 3.5 VPC up.

Thunder Sky / Winston cells are supposed to be able to charge to 4.1 V, so 82 V should be OK. It's strange, though, that Sky Energy / CALB cells are not supposed to exceed 3.60 VPC.

In short, anywhere from 72 V to 82 V inclusive should work. I'd be tempted to go for 3.65 VPC, as this is a commonly used value for LiFePO4 chargers.
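The volts-per-cell arithmetic above is just pack voltage divided by the cell count; a minimal sketch (the 3.65 VPC figure is the commonly used LiFePO4 value mentioned in the post):

```python
# Converting a charger output voltage to an average volts-per-cell
# (VPC) figure for a series string of 20 cells.

def volts_per_cell(pack_voltage: float, cells: int = 20) -> float:
    """Average volts per cell for a series string."""
    return pack_voltage / cells

# The two figures discussed above:
assert volts_per_cell(82.0) == 4.1   # Kelly's suggested setting
assert volts_per_cell(72.0) == 3.6   # the asker's expectation

# The commonly used 3.65 VPC end-of-charge target would need:
print(round(3.65 * 20, 2))  # 73.0 V charger output
```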
Nissan Leaf 2012 with new battery May 2019.
5650 W solar, 2xPIP-4048MS inverters, 16 kWh battery.
1.4 kW solar with 1.2 kW Latronics inverter and FIT.
160 W solar, 2.5 kWh 24 V battery for lights.
Patching PIP-4048/5048 inverter-chargers.
T1 Terry
Senior Member
Posts: 1210
Joined: Thu, 30 Sep 2010, 20:11
Real Name: Terry Covill
Location: Mannum SA

Charging 20 Thunderskys

Post by T1 Terry »

With the cell resistance being so low, will the cells accept charge even if the input voltage is only 0.1 V higher than the terminal voltage? I.e. if the operating voltage is 3.2 V per cell, will they recharge at 3.3 V per cell? Would the charge rate in amps be the same as it would if the charge voltage was 3.6 V per cell?
The reason I ask is that I'm looking at the per-cell charge board method rather than the full string charge method, and current inrush can be an issue, causing the boards to overheat. Over on the DIY forum, Jimdear2 and rwaudio seem to be having some success with the single cell charging method, and overheating was one of the issues.

T1 Terry
Green but want to learn
antiscab
Senior Member
Posts: 2720
Joined: Mon, 26 Nov 2007, 05:39
Real Name: Matthew Lacey
Location: Perth, WA

Charging 20 Thunderskys

Post by antiscab »

The charge profile is maximum current until the voltage reaches 3.6 V, then hold 3.6 V until the current falls below 0.05C.

So yes, the charge current at 3.3 V will be the same as the charge current at 3.599 V.

72 V CV is the best solution.

If the charger output voltage is not changeable, does the rest of your system support more cells?
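A minimal sketch of the CC/CV profile Matt describes (constant current to 3.6 V, then constant voltage until the current tapers to 0.05C); the simple state logic and the example cell capacity are illustrative assumptions:

```python
# One control-loop step of a CC/CV charger: which mode should it be in?

def charger_setpoint(cell_voltage, charge_current, capacity_ah,
                     cv_limit=3.6, cutoff_c=0.05):
    """Return 'CC', 'CV' or 'DONE' for one control step."""
    if cell_voltage < cv_limit:
        return "CC"                  # below the limit: push full current
    if charge_current > cutoff_c * capacity_ah:
        return "CV"                  # hold cv_limit; current tapers off
    return "DONE"                    # current below 0.05C: terminate

# Early in the charge (3.3 V) and just before CV (3.599 V) the charger
# behaves identically: full current in both cases.
assert charger_setpoint(3.3, 40, 100) == "CC"
assert charger_setpoint(3.599, 40, 100) == "CC"
assert charger_setpoint(3.6, 40, 100) == "CV"    # taper phase
assert charger_setpoint(3.6, 4, 100) == "DONE"   # 4 A < 0.05 * 100 Ah
```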

Matt
Matt
2017 Renault zoe - 25'000km
2007 vectrix - 156'000km
1998 prius - needs Batt
1999 Prius - needs batt
2000 prius - has 200 x headway 38120 cells
User avatar
coulomb
Site Admin
Posts: 4498
Joined: Thu, 22 Jan 2009, 20:32
Real Name: Mike Van Emmerik
Location: Brisbane
Contact:

Charging 20 Thunderskys

Post by coulomb »

T1 Terry wrote: With the cell resistance being so low, will the cells accept charge even if the input voltage is only 0.1 V higher than the terminal voltage? I.e. if the operating voltage is 3.2 V per cell, will they recharge at 3.3 V per cell?

Yes, low internal resistance means that the voltage rise needed to get charge current flowing is quite low; 0.1 V would be plenty. So you need a charge method that is inherently current limited, as well as voltage limited. But the charge system also has to track the voltage of the cells; it's no good having something that is essentially a 3.3 V constant voltage source, since the cells need to charge well above that (to 3.6 V) near the end of charge. While charging, LiFePO4 cells spend most of their time between 3.30 and 3.45 V.
Would the charge rate in amps be the same as it would if the charge voltage was 3.6 V per cell?
I think what you're asking is this: if I have a charge method that puts a constant voltage across the cells, and that constant voltage is 0.1 V higher than the terminal voltage, will this induce about the same current when the terminal voltage is 3.2 V as when it's 3.6 V?

If that's the question, the answer is no, the apparent internal resistance near end of charge seems to be higher. So you will push less current in near the end than at the beginning, but that's not necessarily a bad thing.

Any kind of purely constant voltage charge method (e.g. transformer with rectifier) would seem less than ideal for charging LiFePO4 cells. But maybe I should find that thread and read the details.
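The ΔV/R relationship above can be sketched numerically; the internal resistance figures here are illustrative assumptions, chosen only to show why the current falls near end of charge when the apparent resistance rises:

```python
# With a fixed-offset constant-voltage source, charge current is simply
# (source voltage - terminal voltage) / internal resistance. This is why
# the charge method must be inherently current limited: a small voltage
# overhead into a very low resistance still means a very large current.

def charge_current(source_v, terminal_v, internal_r_ohm):
    """Charge current in amps for a simple resistive cell model."""
    return (source_v - terminal_v) / internal_r_ohm

# Same 0.1 V overhead in both cases, but a higher apparent internal
# resistance near end of charge means less current is pushed in:
early = charge_current(3.3, 3.2, 0.001)  # ~100 A into a 1 milliohm cell
late = charge_current(3.7, 3.6, 0.004)   # ~25 A with resistance up 4x
assert early > late
```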
Tritium_James
Senior Member
Posts: 683
Joined: Wed, 04 Mar 2009, 17:15
Real Name: James Kennedy
Contact:

Charging 20 Thunderskys

Post by Tritium_James »

It also depends on when the cells were manufactured. The datasheet on the cells we put in the Civic 2-3 years ago says 4.25 V top of charge, so that would be 85 V for your 20 cells.
User avatar
coulomb
Site Admin
Posts: 4498
Joined: Thu, 22 Jan 2009, 20:32
Real Name: Mike Van Emmerik
Location: Brisbane
Contact:

Charging 20 Thunderskys

Post by coulomb »

Assuming that this is JimDear2's idea:

http://www.diyelectriccar.com/forums/sh ... post227582

he seems to be using 48 V to ~ 3.6 V DC/DC converters. These usually have a method of adjusting the output voltage (within a limited range, but 3.0 to 3.6 should be achievable), and you could attach a small circuit there to monitor the charging current, and achieve a current and voltage limited charger. This would be ideal for main charging and also possibly for "charge balancing" (propping up the weakest cells while driving).

I'd disagree with JimDear2's use of 48 V DC/DCs when his pack is 170 V. It seems to me that it would be better to run each DC/DC at about pack voltage, for charge balancing.

But either way, you need a separate box (charger?) to supply the DC/DCs when mains charging. I suppose if your mains is 120 VAC, you could just rectify it and use the resultant 170 VDC to run the DC/DCs. But the power factor with that scheme is terrible, and a battery charger pulls non-trivial mains current.

Then there is the issue of distributing pack voltage all through the pack, for the DC/DCs. It's a nightmare of isolation and protection. From that point of view, the 48 V DC bus makes more sense, though you'll need quite a high-current 48 V charger (edit: or power supply). It also means that while driving, you'll have two conversion steps for charge balancing: pack voltage to 48 V, then 48 V to cell voltage. It starts to get rather inefficient, to eke out that last bit of capacity from your pack.

Edit: You also get two conversion steps while home charging: mains to 48V, and 48 V to cell voltage.
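Rough numbers for the two-conversion-step concern; the 90% per-stage efficiency figures are illustrative assumptions, not measured values for any particular DC/DC:

```python
# Two cascaded conversion stages multiply their efficiencies, so even
# two reasonably good converters lose a meaningful fraction of the
# balancing (or charging) energy.

stage1 = 0.90   # pack (or mains) -> 48 V bus
stage2 = 0.90   # 48 V bus -> cell voltage

overall = stage1 * stage2
print(f"{overall:.0%}")   # 81%: nearly a fifth of the energy is lost
                          # across the two stages
```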

Edit: typo
Last edited by coulomb on Sat, 16 Apr 2011, 07:03, edited 1 time in total.
T1 Terry
Senior Member
Posts: 1210
Joined: Thu, 30 Sep 2010, 20:11
Real Name: Terry Covill
Location: Mannum SA

Charging 20 Thunderskys

Post by T1 Terry »

The thought I had was to tap off the pack at 48 V intervals and power the single-cell boards from there, while limiting the total pack charge voltage to 3.3 V per cell. This way each cell gets top balanced, but the chance of a single cell being pushed over the 4.0 V threshold is reduced.

T1 Terry
User avatar
coulomb
Site Admin
Posts: 4498
Joined: Thu, 22 Jan 2009, 20:32
Real Name: Mike Van Emmerik
Location: Brisbane
Contact:

Charging 20 Thunderskys

Post by coulomb »

T1 Terry wrote: The thought I had was to tap off the pack at 48 V intervals and power the single-cell boards from there, while limiting the total pack charge voltage to 3.3 V per cell. This way each cell gets top balanced, but the chance of a single cell being pushed over the 4.0 V threshold is reduced.

If you charge each cell only with the cell-top boards (no regen), then the chance of overcharging one cell becomes zero.

The trouble with tapping the pack is that you are encouraging imbalance. Suppose the 3 lowest cells are all in the same group of 15 or 16 (for the 48 V nominal of the DC/DCs); then this group will get depleted relative to the other groups.

However, the purpose of the "charge balancing" is presumably not to balance every cell's voltage, but to charge the lowest voltage few from the higher voltage majority. So that may not be too bad; the imbalance would be taken care of next time you garage charge.

You also need a different threshold while driving compared to garage charging (perhaps the 3.3 V you suggested while driving), so that you don't try to circulate charge all the time while driving. But once the group average drops below 3.3 V, you will be constantly charging all the cells, wasting energy. So perhaps it would be best to have some sort of intelligence (it could be quite simple, analogue for example) that figures out the present average cell voltage for the group, and sets the recharge threshold a little lower than that (say 0.5 V lower). That way, only the lowest-voltage cells will get charged from the group.

[ Edit: with the DC/DCs connected to taps of the pack, you presumably need contactors to change over to a DC bus while garage charging. Or maybe I have that wrong: maybe the idea is to garage charge with a standard charger, and the DC/DCs would effectively provide top balancing. Or maybe you use a low threshold on the DC/DCs, so that they bottom balance automatically, if the pack is depleted enough. I think that this idea has some merit; thanks. ]
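The adaptive-threshold idea above can be sketched as follows; the group voltages are made-up example values, and the 0.5 V offset is the figure suggested in the post:

```python
# Set each group's recharge threshold a fixed offset below the present
# group-average cell voltage, so only the lowest cells get propped up.

def recharge_threshold(cell_voltages, offset=0.5):
    """Threshold a little below the group's average cell voltage."""
    avg = sum(cell_voltages) / len(cell_voltages)
    return avg - offset

def cells_to_charge(cell_voltages, offset=0.5):
    """Indices of cells far enough below the group average to charge."""
    t = recharge_threshold(cell_voltages, offset)
    return [i for i, v in enumerate(cell_voltages) if v < t]

# One weak cell in a group of four: only it falls below the threshold,
# so only it gets charged from the group.
group = [3.30, 3.28, 3.29, 2.50]
assert cells_to_charge(group) == [3]
```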
Last edited by coulomb on Sun, 17 Apr 2011, 08:37, edited 1 time in total.
T1 Terry
Senior Member
Posts: 1210
Joined: Thu, 30 Sep 2010, 20:11
Real Name: Terry Covill
Location: Mannum SA

Charging 20 Thunderskys

Post by T1 Terry »

The thought train was to have a cut-out/cut-in circuit as part of the DC/DC boards, so that when 3.5 V was reached the DC/DC cut out, and didn't cut back in until the cell dropped to 2.5 or maybe 2.8 V. I was thinking a basic BMS unit like the one Rod Dikes is marketing could possibly be adapted for the job. That way, even if the balancing gizmo went astray, the cell would be protected from being fully discharged.
The 48 V packs going out of balance was something that had slipped my mind in a high-voltage pack.
What if the 48 V packs were charged separately via a solar controller or mains charging, whatever was available at the time? If a contactor separated the packs for charging and disconnected the charger when the packs were linked, could a single mains charger be used? It would need to be a break-before-make type switch contact to be fail-safe, I'm guessing, but that's how a switching relay works anyway, isn't it? This could be used to drive a high-power contactor for linking the battery pack ready for driving.
Does this sound feasible, or have I run off the tracks somewhere?
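The cut-out/cut-in behaviour Terry describes is a simple hysteresis switch; a minimal sketch, using the 3.5 V and 2.8 V thresholds from the post (the class and its names are illustrative, not any real BMS board's interface):

```python
# Per-cell DC/DC enable logic with hysteresis: cut out at the top
# threshold, stay off until the cell falls to the bottom threshold.

class CellChargeSwitch:
    def __init__(self, cut_out=3.5, cut_in=2.8):
        self.cut_out = cut_out
        self.cut_in = cut_in
        self.charging = True

    def update(self, cell_v):
        """Feed in a cell voltage reading; return whether to charge."""
        if self.charging and cell_v >= self.cut_out:
            self.charging = False    # cell full: DC/DC cuts out
        elif not self.charging and cell_v <= self.cut_in:
            self.charging = True     # cell depleted: cut back in
        return self.charging

sw = CellChargeSwitch()
assert sw.update(3.4) is True    # still charging
assert sw.update(3.5) is False   # cut out at the top
assert sw.update(3.2) is False   # stays off through the hysteresis band
assert sw.update(2.8) is True    # cuts back in at the bottom
```

The hysteresis band is what prevents the board chattering on and off around a single threshold as the cell voltage relaxes after charge is removed.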

T1 Terry
Post Reply