suradasa
Member
Hi there.
I figured that the best way to measure capacity loss over time is to measure how many kWh I'm putting into the battery per bar. Then I can track how it changes over time.
I've done it twice so far to benchmark the initial state, but the two numbers are further apart than I expected:
1) Charged from just below 2 bars to just above 12 bars (L1, had to go to work). I guesstimated that I put about 10.1-10.2 bars in, measured 10.87 kWh --> 1.065-1.075 kWh/bar
2) Charged from almost exactly 7 bars to full. So about 9 bars in, measured 10.28 kWh --> 1.14 kWh/bar
So that's a 7-ish% difference. Any thoughts? Maybe the bar readings aren't very precise? Obviously, charging from near-zero to full would be ideal for this, but I try not to fully discharge it (and it takes a long time).
I can just take a bunch of readings over time and see what happens, but I'm curious what people think.
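For anyone who wants to sanity-check the math, here's a rough Python sketch of the two sessions with an error band. The half-bar reading uncertainty is my assumption (the display is quantized, so I figure each end of a charge could be off by up to half a bar), not anything official:

```python
# Rough kWh-per-bar estimate with an error band from bar quantization.
# ASSUMPTION: each bar reading is only good to about +/- 0.5 bar, so the
# bar delta for a session can be off by up to ~1 bar total.

sessions = [
    # (kWh measured, estimated bars added)
    (10.87, 10.15),  # session 1: just below 2 bars -> just above 12 bars
    (10.28, 9.0),    # session 2: almost exactly 7 bars -> full
]

BAR_UNCERTAINTY = 0.5  # assumed +/- half a bar at each end of the charge

for kwh, bars in sessions:
    low = kwh / (bars + 2 * BAR_UNCERTAINTY)   # pessimistic: more bars than guessed
    high = kwh / (bars - 2 * BAR_UNCERTAINTY)  # optimistic: fewer bars than guessed
    print(f"{kwh:.2f} kWh over ~{bars} bars -> "
          f"{kwh / bars:.3f} kWh/bar (range {low:.3f}-{high:.3f})")
```

Running that, session 1 comes out around 0.98-1.19 kWh/bar and session 2 around 1.03-1.29, so the two ranges overlap. If the half-bar assumption is anywhere near right, the 7% gap could just be rounding in the bar display.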
Surdas