I'm trying to calculate the total energy usage of a dynamic system that is powered by a bank of lead-acid batteries.
My method is simple - log the current (using a current clamp) and the voltage across the battery, then post-process the logged data:
a) For each sample point, Instantaneous Power = V*I
b) Energy used over that sample = Power * Sample Period
c) Total Energy Used in log period is the sum of all energies calculated in (b)
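For reference, the post-processing is essentially this (a minimal sketch with made-up sample data; the variable names and the 0.1 s sample period are just placeholders, not my actual log format):

```python
dt = 0.1                       # sample period in seconds (placeholder)
voltage = [12.6, 12.5, 12.5]   # logged battery voltage, V (made-up data)
current = [2.0, 2.1, 1.9]      # logged clamp current, A (made-up data)

# a) instantaneous power V*I per sample
# b) energy per sample = power * dt
# c) total energy = sum over all samples
total_energy_j = sum(v * i * dt for v, i in zip(voltage, current))
```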
This works OK, but it is subject to quite a bit of error if there is any offset in the current clamp at 0A. If the clamp is noisy and reads, say, 0.1A when the true current is 0A, this produces a false power reading from V*I, and the error accumulates across the entire measurement cycle, eventually becoming quite large.
Is there a more clever way to calculate energy usage that nullifies this current-offset error? I was thinking of trying to measure the drop in charge of the battery box, but for most of the trials we do this drop is negligible and almost unmeasurable (at least, that is, based on my understanding of lead-acid batteries).