Since my last post I have carried out a test to see how efficient it is to use the auxiliary battery as a power supply in the field, in my case as a supply for my ham radio activity.
I attached a car headlamp to the auxiliary battery and measured the voltage, current, and power in watts. I had previously charged the battery to full so the test would start from a maximum charge. I turned the ignition to Ready and took measurements every 15 minutes until the fuel gauge went down by one notch.
The voltage held at 13.96 V throughout, so the DC-to-DC converter was operational. The current read 4.3 A, and the power reading was 60.73 W. It took 4.25 hours to consume one notch, so the load consumed about 258 Wh.
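For anyone who wants to check my arithmetic, here is a quick sketch in plain Python with my logged values plugged in. Note that V times I comes out a touch under the meter's watt reading, which I put down to the ammeter rounding to 0.1 A:

```python
# Sanity check of the measured figures (values from my log above)
voltage = 13.96        # V, DC-to-DC converter output
current = 4.3          # A, headlamp draw (meter resolution ~0.1 A)
power_reading = 60.73  # W, as shown on the meter
hours = 4.25           # time for the gauge to drop one notch

print(f"V * I  = {voltage * current:.1f} W")        # ~60.0 W, close to the meter's 60.73 W
print(f"Energy = {power_reading * hours:.0f} Wh")   # ~258 Wh drawn by the bulb
```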
To cross-check the consumption figure, I recharged the battery to full and measured the energy in kWh needed to reach full charge. This read 1.49 kWh.
My question: if the bulb only used 258 Wh, why did it take 1.49 kWh to replenish the battery? I would expect the two figures to be close. What is consuming the power that makes up the difference?
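To put a rough number on the gap (treating my 1.49 kWh wall reading as 1490 Wh; charger losses between the wall and the pack would also land in this bucket):

```python
# Quantifying the gap I'm asking about
load_wh = 258       # Wh consumed by the headlamp
recharge_wh = 1490  # Wh from the wall to refill the notch
gap_wh = recharge_wh - load_wh

print(f"Unaccounted energy: {gap_wh} Wh")                 # 1232 Wh
print(f"Average extra draw: {gap_wh / 4.25:.0f} W")       # ~290 W, if it all accrued during the 4.25 h test
```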