The pedometer is one of the most useful functions in many of today's wearable electronic devices. The steps you take during the day are used to calculate both distance traveled and calories burned.
According to historians, in 15 BC the Roman architect and engineer Vitruvius mounted a large wheel of known circumference in a small frame. When it was pushed along the ground by hand, it automatically dropped a pebble into a container at each revolution, giving a measure of the distance traveled. Essentially, that was the first odometer. Today's electronic gadgets don't feature wheels; they rely on indirect methods for measuring distance and counting steps. They do this using data from microelectromechanical systems (MEMS), such as gyroscopes and accelerometers. While different manufacturers have developed various proprietary methods, the general principle remains the same: the devices count steps by registering the spatial oscillations produced by the user walking around.
So, without further ado, let's look at a couple of ways to estimate the step count based on data from an LIS331DLH sensor manufactured by STMicroelectronics.
The LIS331DLH sensor is a 3-axis accelerometer that senses motion along three axes: x, y, and z. As an example, let's look at the acceleration data along the x-axis generated by walking:
As you can see, the peaks in the chart clearly approximate the user’s steps. Unfortunately, the sensor is far from ideal and produces quite a bit of noise, which will be important to keep in mind when deciding on the algorithm later. In fact, every single accelerometer on the market generates some level of noise. Take a look at the data feed from a device that’s just sitting on a desk:
As you can see, even a device at rest does not return a flat line.
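Because of this noise, some smoothing is usually applied to the raw signal before any step detection. A minimal sketch of one common approach, a simple moving average, is shown below; the sample values are made up for illustration, and real readings would come from the device driver.

```python
def moving_average(samples, window=4):
    """Smooth a list of readings by averaging each value with up to
    `window - 1` of its predecessors."""
    smoothed = []
    for i in range(len(samples)):
        start = max(0, i - window + 1)  # clamp the window at the start
        chunk = samples[start:i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

# Illustrative noisy readings from a sensor at rest (hypothetical values)
noisy = [0.02, -0.01, 0.03, -0.02, 0.01, 0.00, -0.03, 0.02]
print(moving_average(noisy))
```

The window size is a trade-off: a wider window suppresses more noise but also flattens the genuine peaks produced by steps.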
Each MEMS sensor returns data measured along three orthogonal axes. Since we don’t know in advance the device’s physical orientation on the user’s person, we need to process the data from all three axes.
One simple but effective method is averaging the values from all three axes. This approach yields decent results, as you can see on the chart:
This method produces clearly identifiable steps while reducing the noise through the averaging process.
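The idea can be sketched in a few lines: collapse each (x, y, z) sample into a single averaged value, then count rising threshold crossings as steps. This is a simplified illustration, not the exact production algorithm; the sample data and the threshold value are hypothetical and would need tuning per device.

```python
THRESHOLD = 1.2  # hypothetical threshold in g; must be calibrated per device

def axis_average(sample):
    """Average the readings from all three axes into one value."""
    x, y, z = sample
    return (x + y + z) / 3.0

def count_steps(samples, threshold=THRESHOLD):
    """Count one step each time the averaged signal rises above the
    threshold (a rising-edge crossing)."""
    steps = 0
    above = False
    for s in samples:
        value = axis_average(s)
        if value > threshold and not above:
            steps += 1       # rising edge: a new peak, i.e. a step
            above = True
        elif value <= threshold:
            above = False    # re-arm once the signal drops back down
    return steps

# Illustrative walk: two peaks separated by quiet samples -> two steps
walk = [(3.6, 3.6, 3.6), (0, 0, 0), (3.9, 3.9, 3.9), (0, 0, 0)]
print(count_steps(walk))
```

Counting only rising edges (rather than every sample above the threshold) keeps a single wide peak from being counted as several steps.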
Another way to tackle the problem would be to calculate the angle between the sensor's acceleration vectors at the current and the previous sample points. Since the device registers its data at equal time intervals, the angle between the two vectors gives us the sensor's angular velocity. The angle between two vectors a and b is calculated using the standard dot-product formula: cos θ = (a · b) / (|a| |b|).
A device at rest (zero angle between consecutive vectors) yields a value of 1; the stronger the oscillation, the larger the change in angle, and the further the value drops below 1. These results correlate with the user's steps pretty nicely; however, this algorithm is more sensitive to noise.
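This approach can be sketched as follows, again as an illustrative implementation rather than any particular vendor's algorithm. It computes the cosine of the angle between each acceleration vector and its predecessor via the dot product, so a device at rest produces a flat signal at 1.0.

```python
import math

def cos_angle(a, b):
    """Cosine of the angle between two 3-axis acceleration vectors:
    cos(theta) = (a . b) / (|a| * |b|)."""
    dot = sum(ai * bi for ai, bi in zip(a, b))
    mag_a = math.sqrt(sum(ai * ai for ai in a))
    mag_b = math.sqrt(sum(bi * bi for bi in b))
    if mag_a == 0 or mag_b == 0:
        return 1.0  # treat a zero vector as "no rotation"
    return dot / (mag_a * mag_b)

def angle_signal(samples):
    """Cosine of the angle between each sample and its predecessor;
    values near 1.0 mean the device is steady, lower values mean motion."""
    return [cos_angle(prev, cur) for prev, cur in zip(samples, samples[1:])]
```

Step detection on this signal would then look for dips below some calibrated level, mirroring the threshold logic used for the axis-averaging method.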
Obviously, neither of these algorithms is perfect. The quality of their results depends on a number of variables, both well known (such as the device's own sensor noise) and less predictable. Even the device's position on the user's person (in a pants pocket, on the hip, or on the wrist) will affect the results. We have tested a number of pedometer applications on both iOS and Android, and they all produced diverging results, as did several fitness wristbands from Fitbit, Nike, and others.
There are several methods for refining the results:
As far as the hardware is concerned, we could always use more accurate MEMS sensors. I particularly like the MPU-6000 sensors, designed specifically for use in smartphone platforms and wearables. Obviously, the devices already in circulation will not be getting a hardware upgrade, so we need to look to future products for these improvements.
I'd like to note an interesting tack taken by Apple, who introduced a dedicated motion co-processor (Apple M7). Here's why it's so cool: algorithms like the ones described above require an application to run constantly and make calculations, which drains the mobile or wearable device's battery and reduces the system's overall snappiness. Apple's engineers moved all these calculations into a separate ARM Cortex-M3 compatible co-processor that computes and stores the motion data and serves it up to applications on request. The data available to iOS developers through the Core Motion framework includes Motion Activity (the type of motion: walking, running, driving) and Step Counter. An application can read the current motion data, as well as request data from a specified period in the past.
This is how step counting works in today's electronic devices. As you can see, it is not super complicated, and the accuracy is sufficient for most applications.