## watt time is left

The simplest interface to get battery status info is to run *apm*. This gives us both percentage and an estimate of time remaining.

`Battery state: high, 78% remaining, 197 minutes life estimate`

Is it accurate? Where do the numbers come from? Looking at the acpibat sensors on a modern system with *sysctl* helps.

```
hw.sensors.acpibat0.volt0=15.20 VDC (voltage)
hw.sensors.acpibat0.volt1=15.87 VDC (current voltage)
hw.sensors.acpibat0.power0=11.41 W (rate)
hw.sensors.acpibat0.watthour0=48.07 Wh (last full capacity)
hw.sensors.acpibat0.watthour1=2.40 Wh (warning capacity)
hw.sensors.acpibat0.watthour2=0.20 Wh (low capacity)
hw.sensors.acpibat0.watthour3=37.55 Wh (remaining capacity), OK
hw.sensors.acpibat0.watthour4=50.08 Wh (design capacity)
hw.sensors.acpibat0.raw0=1 (battery discharging), OK
```

We take remaining capacity, divide by power (the discharge rate), and voila. `37.55 / 11.41 * 60 = 197.45836985100788781720`

Apple can’t handle the precision!
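Spelled out as a tiny Python sketch (values copied from the sysctl output above):

```python
# apm's estimate: remaining capacity (Wh) divided by discharge rate (W)
# gives hours; times 60 gives minutes.
remaining_wh = 37.55  # hw.sensors.acpibat0.watthour3
rate_w = 11.41        # hw.sensors.acpibat0.power0

minutes = remaining_wh / rate_w * 60
print(int(minutes))  # 197, matching apm's estimate
```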

Of course, that’s with me deliberately burning up CPU to spike the power rate. At idle, we have numbers more like this.

```
hw.sensors.acpibat0.power0=5.72 W (rate)
hw.sensors.acpibat0.watthour3=35.91 Wh (remaining capacity), OK
Battery state: high, 74% remaining, 376 minutes life estimate
```

Math checks out here too.

These numbers don’t actually come from OpenBSD, though. It’s just reading the ACPI values. There’s a bit of hysteresis applied (I assume), but maybe we can do better. Apparently Apple’s problem (or one problem) is that the remaining time is inaccurate because background tasks keep starting and stopping, so it’s not possible to calculate what normal is; every time you measure, there’s a new normal.

At least on my Thinkpad, the watthour3 sensor is fairly precise. Even while idle, I can watch it tick down a centiwatthour every 10 seconds. Instead of using the instantaneous discharge rate, we can watch remaining capacity over a longer time, calculate an effective rate, and then calculate remaining time with that.

Just to demonstrate, here’s a shell script that gives decent, if not great, results.

```
aa=`sysctl -n hw.sensors.acpibat0.watthour3 | cut -f1 -d ' '`

while : ; do
	bb=$aa
	sleep 60
	aa=`sysctl -n hw.sensors.acpibat0.watthour3 | cut -f1 -d ' '`

	delta=`dc -e "$bb $aa - p"`

	echo -n "Time remaining... "
	dc -e "$aa $delta / p"
done
```

A one minute capacity delta ends up having less precision than the currently measured discharge rate, so the estimate flops around a bit: 32.7 divided by 0.09 is 363, but 32.7 divided by 0.10 is 327.

```
Time remaining... 364
Time remaining... 327
Time remaining... 326
Time remaining... 361
```

Running apm, my estimated battery life is closer to 347 minutes. A step backwards? We wanted more accurate and more consistent, not less. We can adjust the sleep interval, however. Maybe five minutes? `32.7 * 5 / 0.47 = 348`

That’s enough time to let the delta calculation soak up some more precision. Our estimate is no longer immediately responsive to changing conditions, but that’s exactly what we want. If I run a quick CPU intensive task, my battery life doesn’t immediately flatline. Or, if not programming in shell, we might consider a trailing average of samples. In the end, we’re just reinventing whatever the ACPI firmware does to calculate discharge rate, but with more insight into the source of the numbers.
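The trailing average idea might look something like this Python sketch. The sample readings and the five-sample window are made up for illustration; in practice each reading would come from the watthour3 sysctl once a minute.

```python
from collections import deque

def estimate_minutes(samples, interval_s=60):
    """Estimate minutes remaining from a trailing window of
    remaining-capacity (Wh) samples taken every interval_s seconds."""
    if len(samples) < 2:
        return None
    # Effective discharge rate over the whole window, in Wh per second.
    drained = samples[0] - samples[-1]
    elapsed = (len(samples) - 1) * interval_s
    if drained <= 0:
        return None  # charging, or no measurable drain yet
    rate = drained / elapsed
    return samples[-1] / rate / 60

# Keep the last five one-minute readings, oldest first.
window = deque(maxlen=5)
for wh in (36.28, 36.19, 36.09, 36.00, 35.91):  # made-up readings
    window.append(wh)
print(round(estimate_minutes(window)))  # 388
```

Because the window only ever holds the last five samples, a short CPU spike gets averaged against four calmer minutes instead of instantly cratering the estimate.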

Or we could choose to not run dozens of background tasks, har har.