
Automotive electronics - Honda charging system

Started by Peabody December 7, 2016
pcdhobbs@gmail.com wrote:

>>As I brought up the voltage, the current rose from a very low level,
>>perhaps 100mA, then suddenly went into current limiting around 14.5
>>Volts. It stayed in current limit for only a few minutes, then
>>gradually tapered off in a linear fashion down to 50 mA. Their
>>threshold is very sharp, perhaps 100 to 200mV.

>>This tells me the alternator voltage must be high enough to rise above
>>the battery threshold in order to charge the battery. A lower voltage,
>>even a couple of tenths of a volt, will do little to charge the
>>battery.
> IOW large lead-acid batteries have very low impedances, which is more
> or less the point. The open-circuit alternator voltage will be nearly
> proportional to rotor speed till iron and copper loss take over.
> That's why I suggested a smaller alternator pulley (upthread
> someplace.)
> Cheers
> Phil Hobbs
Yes, I saw that. The technique was popular with mobile hams who needed large current when the car was idling.

However, since I started monitoring the alternator voltage, I found it can easily reach 15V when the engine is idling. At this level, the battery only takes current for a short time, then the drain drops off to near zero.

So the issue is not so much getting enough voltage from the alternator, but rather to intercept the PWM commands from the engine control and insert commands that will tell the alternator to put out the correct voltage to charge the battery.

Since modern engines start so quickly, there is not much energy taken from the battery. It can easily be replaced, even on a short trip, if the alternator voltage is set correctly. It appears that the Ford Taurus and Focus, my neighbour's Trans Am, and the Camry and Honda fail to do so.
On Sun, 16 Jul 2017 00:20:54 GMT, Steve Wilson <no@spam.com> wrote:

>[...]
>
>So the issue is not so much getting enough voltage from the alternator,
>but rather to intercept the PWM commands from the engine control and
>insert commands that will tell the alternator to put out the correct
>voltage to charge the battery.
>
>[...]
So roll your own regulator. I've previously posted the voltage required versus temperature.

                                        ...Jim Thompson
--
| James E.Thompson                                 |    mens     |
| Analog Innovations                               |     et      |
| Analog/Mixed-Signal ASIC's and Discrete Systems  |    manus    |
| STV, Queen Creek, AZ 85142    Skype: skypeanalog |             |
| Voice:(480)460-2350  Fax: Available upon request |  Brass Rat  |
| E-mail Icon at http://www.analog-innovations.com |    1962     |

I'm looking for work... see my website.

Thinking outside the box...producing elegant & economic solutions.
On Saturday, July 15, 2017 at 4:30:38 PM UTC-7, Steve Wilson wrote:

> The alternator is starting to put out 14.2 to 14.3V,
> which I believe is too low to charge the battery. Accordingly, the
> unloaded battery voltage is starting to decrease. It was 12.2v this
> morning. So I believe the battery is heading to an early death, which
> could be prevented if the alternator put out a higher voltage.
That voltage range should be fine if the battery is in good condition and you drive reasonable length trips. Charging systems today are designed to maintain a battery but not necessarily do a full recharge, which takes many hours. One exception might be driving several hundred miles. A sulfated battery is very difficult to charge because of higher internal resistance, and surface charge will build up quickly and reduce the charge current.

I assume you have a conventional flooded lead acid battery. It might be best to put it on a good quality charger and see if you can restore the charge. I would also check the specific gravity of each cell for any variation.

When my battery needed a jump, I found that driving 20 miles still left the specific gravity bordering on empty. I connected a Black & Decker 2 amp charger/maintainer and the 2 amp charge terminated after about 20 hours, but the specific gravity had only moved up a small amount. I left the charger/maintainer connected for several weeks and to my surprise the specific gravity slowly increased to 1.295, which is the upper limit. It was a very slow process, all occurring as the battery maintainer float voltage rose from 13.51 volts initially to 13.68 volts at the end of the period.

Make sure you measure battery voltage at the battery posts. In regards to your power supply reading 14.5 volts, there could be significant voltage drop depending on wire size and length. Reading the voltage at the supply could show a higher voltage than what is actually seen at the battery posts. The state of charge would also be a factor and the voltage would be higher if the battery was close to full charge.
kt77 <kawill70@gmail.com> wrote:

> [...]
>
> Make sure you measure battery voltage at the battery posts. In regards
> to your power supply reading 14.5 volts, there could be significant
> voltage drop depending on wire size and length. Reading the voltage at
> the supply could show a higher voltage than what is actually seen at
> the battery posts. The state of charge would also be a factor and the
> voltage would be higher if the battery was close to full charge.
Yes, the no load voltage shows the state of charge. The trend is obvious - last fall it was 12.4V, a while ago it was 12.3V, now it is 12.2V. The battery will probably not last through the next winter.

The voltage drop in the car is negligible. The wiring diagram shows a 30A fuse between the 12V cigarette lighter and the battery. The voltmeter has a 100k resistor at the input. The drain is I = E / R = 12 / 1e5 = 0.00012 A.

The wiring and fuse have very low resistance. Say we allow a 0.1 volt drop at 30 A. The resistance is R = E / I = 0.1 / 30 = 3.33e-3 Ohms. With the current drain from my voltmeter, the drop is E = I * R = 0.00012 * 3.33e-3 = 4e-7 V, or 400 nanovolts. The least significant digit on my voltmeter is 0.001 Volt. I cannot even see 400 nanovolts.

When charging the battery, the power supply is connected to the battery with short lengths of 16Ga wire. The supply shows the voltage during current limiting. I never measured the voltage during this interval since it was so short. However, when it came out of current limit and the current decayed, the voltage quickly rose to 14.5V. So I take that as the voltage required to charge the battery.

It is clear the alternators in these discussions do not meet the minimum voltage required to charge the battery. At most, perhaps a few hundred mA flow, which will not restore the charge on a short trip.

I can't change the battery chemistry, or the length of the trips. But I can change the alternator output voltage. I am studying the schematics for the Ford Focus, and it seems the interface is very simple. It uses open-collector drivers for the command to the alternator and the feedback from the alternator. The frequency appears to be 150 Hz.

It looks like all I have to do is cut the command line and insert my own signal, with the duty cycle selecting the desired alternator voltage. A very simple microprocessor can measure the battery voltage and change the duty cycle to drive the alternator voltage to 14.5V, and increase or decrease the voltage according to the ambient temperature.

I have high confidence this will work on the Ford Focus. I do not have schematics for the Honda, Camry, or other cars, so I cannot tell how difficult it would be to convert their systems to a controlled charging voltage. However, car manufacturing is driven by cost, and other manufacturers will probably use a very similar system.
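[To make the duty-cycle scheme concrete, here is a minimal C sketch of the feedback loop described above. It is a host-side simulation only, not automotive code: the alternator_volts() plant model and the gain are invented placeholder numbers, and a real version would read the battery through an ADC and regenerate the ~150 Hz PWM command with the computed duty cycle.]

    /* Minimal sketch of the proposed duty-cycle feedback loop.
     * Plant model and gain are invented round numbers. */
    #include <stdio.h>

    #define TARGET_V 14.5   /* charging setpoint, volts */
    #define KI       0.02   /* integral gain, duty per volt per step */

    /* Stand-in for the real alternator: output rises with command duty. */
    static double alternator_volts(double duty)
    {
        return 12.0 + 4.0 * duty;   /* invented: 12.0 V at 0%, 16.0 V at 100% */
    }

    int main(void)
    {
        double duty = 0.5;                      /* initial PWM duty, 0..1 */

        for (int step = 0; step < 50; step++) {
            double v = alternator_volts(duty);  /* stand-in for an ADC read */
            duty += KI * (TARGET_V - v);        /* integrate the error */
            if (duty < 0.0) duty = 0.0;
            if (duty > 1.0) duty = 1.0;
            printf("step %2d: duty=%.3f volts=%.2f\n", step, duty, v);
        }
        return 0;
    }

[A plain integral term is enough for a fixed setpoint; temperature compensation would simply move TARGET_V, as discussed in the posts below.]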
On Monday, July 17, 2017 at 3:11:25 AM UTC-7, Steve Wilson wrote:

> It is clear the alternators in these discussions do not meet the
> minimum voltage required to charge the battery.
It might be worth reconnecting your power supply to verify the charging threshold. If the wiring resistance was 0.05 ohms that would drop 0.25 volts at 5 amps. My battery at nearly full charge measured 14.24 volts with a 2 amp charge. The amateur radio site that I referenced earlier mentioned 14.2-14.5 volts to charge in a reasonable time.

The Battery University website has information on the temperature compensation, which is apparently 3 mV per cell per degree centigrade. That would be 0.18 volts for a six cell battery for a change of 10 degrees C or 18 degrees F.

I just realized that the alternator will warm up much faster than the heavy battery, so the voltage may actually be too low until the battery warms up to compensate. That may never happen on a short trip.

Here is something else that might be of interest. There is a thread online mentioning that the internal resistance of a car battery is in the area of 0.02 ohms. That would drop the voltage 2 volts during a 100 amp load test. It would also mean that the charging voltage would need to rise an extra 0.2 volts under a 10 amp charge.

I actually calculated the same value earlier. My high beam headlights draw 10 amps. I turned on the headlights until the voltage stabilized and then turned on the high beams. The additional voltage drop was about 0.2 volts. Using Ohm's law the series resistance would be 0.2/10 = 0.02 ohms.

My previous car was a 2000 Camry. A Sears Diehard battery installed in 2005 lasted 8 years, which was quite a surprise. The battery still started the engine but the specific gravity of one cell indicated half charge. I now wish I had measured the alternator voltage under different conditions. It's not hard to guess that the voltage regulator specifications differed from those of current models.

I think my battery was slowly discharged over time as a result of infrequent use, short trips, and parasitic current drain. The surprise is that the battery maintainer appears to have restored the battery to almost new condition.
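[As a quick sanity check on those figures, here is a small self-contained C sketch. The 3 mV/cell/degC slope and the 0.2 V / 10 A headlight numbers come from the post above; the 14.5 V at 25 C reference point is an assumption carried over from earlier in the thread, not something from Battery University.]

    /* Back-of-envelope helpers for the numbers quoted above. */
    #include <stdio.h>

    /* Temperature-compensated charge setpoint for a 6-cell battery,
     * assuming 14.5 V at 25 C and -3 mV/cell/degC. */
    static double charge_setpoint(double temp_c)
    {
        return 14.5 - (temp_c - 25.0) * 6.0 * 0.003;   /* volts */
    }

    /* Internal resistance from the headlight test: R = dV / dI. */
    static double internal_resistance(double dv, double di)
    {
        return dv / di;   /* ohms */
    }

    int main(void)
    {
        for (int t = -10; t <= 40; t += 10)
            printf("%3d C -> setpoint %.2f V\n", t, charge_setpoint(t));

        /* about 0.2 V of extra sag when the 10 A high beams come on */
        printf("R_int = %.3f ohms\n", internal_resistance(0.2, 10.0));
        return 0;
    }

[At 35 C this gives 14.32 V, matching the 0.18 V per 10 C figure, and the headlight test gives the same 0.02 ohms quoted from the online thread.]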
kt77 <kawill70@gmail.com> wrote:

> On Monday, July 17, 2017 at 3:11:25 AM UTC-7, Steve Wilson wrote:
>
>> It is clear the alternators in these discussions do not meet the
>> minimum voltage required to charge the battery.
>
> [...]
>
> Here is something else that might be of interest. There is a thread
> online mentioning that the internal resistance of a car battery is in
> the area of 0.02 ohms. That would drop the voltage 2 volts during a 100
> amp load test. It would also mean that the charging voltage would need
> to rise an extra 0.2 volts under a 10 amp charge.
>
> [...]
The charging current drops rapidly as the battery is charged. At 50mA, the voltage drop is negligible. Say the resistance is 0.02 ohms. The voltage drop is then E = I * R = 50e-3 * 0.02 = 0.001 Volt, or 1 millivolt. That is the least significant digit on my voltmeter.

I think the problem is not so much measuring the battery voltage as convincing the alternator to ignore the commands from the PCM and set the charging voltage to 14.5V. Then the battery will be fully charged, even on short trips.

This voltage is mentioned in numerous places. But none of the cars deliver it.
On Monday, July 17, 2017 at 11:36:02 AM UTC-7, Steve Wilson wrote:

>14.5V
It looks like fuel economy standards have really impacted voltage regulator specifications to reduce alternator load. I also would benefit if the charging system in my car was smart enough to properly charge the battery under all conditions.

I think 14.2 volts would still be adequate if it stayed at that level somewhat longer. The charge current should be a few amperes under those conditions and probably higher for a partially discharged battery. The voltage would need to be under 14.2 volts at higher temperatures.

My thought on your power supply would be to adjust the voltage to 15 volts and let the current limit take over. In that case you should have a constant current 5 amp charge. The voltage at the battery posts would rise as the battery charged but you could still measure the approximate threshold where the battery allowed a charge current of several amperes. The increase in voltage should be very gradual as it can take a long time to fully charge a battery. If you did this starting from 12.2 volts, I believe the voltage would rise fairly quickly as surface charge builds up on the plates. You could then see where the voltage leveled off before rising on a more gradual basis.
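[For illustration, here is a rough C model of that experiment: a supply set to 15.0 V with a 5 A current limit feeding a half-charged battery. Every battery parameter in it (the EMF curve, the 0.02 ohm internal resistance, the 50 Ah capacity) is an invented round number, so the printout only shows the general constant-current to constant-voltage handover, not real charge times.]

    /* Rough model of the suggested test. All battery parameters
     * are invented round numbers, not measured values. */
    #include <stdio.h>

    #define V_SET  15.0    /* supply voltage setting            */
    #define I_LIM   5.0    /* supply current limit, amps        */
    #define R_INT   0.02   /* assumed internal resistance, ohms */
    #define CAP_AH 50.0    /* assumed battery capacity, Ah      */

    int main(void)
    {
        double soc = 0.5;                         /* start half charged */

        for (int minute = 0; minute <= 480; minute += 10) {
            double emf = 12.0 + 3.0 * soc;        /* toy EMF while charging */
            double i = (V_SET - emf) / R_INT;     /* CV-mode current        */
            if (i > I_LIM) i = I_LIM;             /* CC mode: limit wins    */
            if (i < 0.0)  i = 0.0;

            if (minute % 60 == 0)
                printf("t=%3d min  I=%.2f A  V(posts)=%.2f V\n",
                       minute, i, emf + i * R_INT);

            soc += i * (10.0 / 60.0) / CAP_AH;    /* 10-minute time step */
            if (soc > 1.0) soc = 1.0;
        }
        return 0;
    }

[The supply holds 5 A while the battery voltage is low, then the current tapers once the terminal voltage reaches the 15 V setting; the knee in the printed voltage is the threshold the post describes.]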
kt77 <kawill70@gmail.com> wrote:

> On Monday, July 17, 2017 at 11:36:02 AM UTC-7, Steve Wilson wrote:
>>14.5V
> It looks like fuel economy standards have really impacted voltage
> regulator specifications to reduce alternator load. I also would
> benefit if the charging system in my car was smart enough to properly
> charge the battery under all conditions.
I doubt fuel economy has much to do with it. Modern cars start so quickly that little energy is drawn from the battery. At 14.5V, the energy is replaced in a few minutes, then the battery drain drops to a negligible level. Where you get the huge drain is heated seats, rear window defrost, high power audio systems, and all the various electronic gadgets that add to the alternator drain. Charging the battery is a completely negligible load, if the car manufacturers would settle on the correct voltage.
> I think 14.2 volts would still be adequate if it stayed at that level
> somewhat longer. The charge current should be a few amperes under
> those conditions and probably higher for a partially discharged
> battery. The voltage would need to be under 14.2 volts at higher
> temperatures.
The battery accepts very little current at 14.2V. Many sites recommend 14.5V.
> My thought on your power supply would be to adjust the voltage to 15
> volts and let the current limit take over. In that case you should
> have a constant current 5 amp charge. The voltage at the battery posts
> would rise as the battery charged but you could still measure the
> approximate threshold where the battery allowed a charge current of
> several amperes. The increase in voltage should be very gradual as it
> can take a long time to fully charge a battery. If you did this
> starting from 12.2 volts, I believe the voltage would rise fairly
> quickly as surface charge builds up on the plates. You could then see
> where the voltage leveled off before rising on a more gradual basis.
I did walk the current up at the beginning, but after a half-dozen or so times, I found it did exactly the same thing every time, so I went to straight 14.5V and let it go into current limiting. The whole charge cycle only lasted a few minutes, then the battery drain rapidly dropped to 50mA or so and stayed there. The same thing would happen in a car.

Again, many sites recommend 14.5V to charge the battery.

Just for information, here is the lab power supply I used:

http://cgi.ebay.com/272724909909

I bought three on the recommendation of another poster on this newsgroup.

It turns out to be very poor for electronics work. There is a huge capacitor on the output. This will blow LEDs and ICs if you rely on current limiting to save the device. In addition, there is a large overshoot on power on, which could blow sensitive devices like laser diodes. I definitely do not recommend this unit for lab work, but it is OK for charging batteries. But I bought too many and paid too much.
Steve Wilson wrote:
> [...]
>
> Just for information, here is the lab power supply I used:
>
> http://cgi.ebay.com/272724909909
>
> I bought three on the recommendation of another poster on this
> newsgroup.
>
> It turns out to be very poor for electronics work. There is a huge
> capacitor on the output. This will blow LEDs and ICs if you rely on
> current limiting to save the device. In addition, there is a large
> overshoot on power on, which could blow sensitive devices like laser
> diodes. I definitely do not recommend this unit for lab work, but it
> is OK for charging batteries. But I bought too many and paid too much.
I bought an Instek (Goodwill Industries) GPS1850D version for $17, but I prefer the 50 year old HP power supplies that I buy 'not working/for parts only' and rebuild. Some are so large that they need a decent relay rack to handle their weight. :)

--
Never piss off an Engineer!
They don't get mad. They don't get even.
They go for over unity! ;-)
On Tue, 18 Jul 2017 02:55:02 GMT, the renowned Steve Wilson
<no@spam.com> wrote:

>[...] Where you get the huge drain is heated seats, rear window
>defrost, high power audio systems, and all the various electronic
>gadgets that add to the alternator drain. [...]
Not just gadgets, many cars have electric power steering, electric water pumps, etc.

--sp
--
Best regards,
Spehro Pefhany