Counting Steps with MEMS
The pedometer is one of the most useful functions in today’s wearable electronic devices. The steps you take during the day are used to calculate both distance traveled and calories burned.
According to historians, in 15 BC the Roman architect and engineer Vitruvius mounted a large wheel of known circumference in a small frame. When it was pushed along the ground by hand, it automatically dropped a pebble into a container at each revolution, giving a measure of the distance traveled. Essentially, that was the first odometer. Today’s electronic gadgets don’t feature wheels – they rely on indirect methods for measuring distance and the number of steps. The only way to do that is by using data from microelectromechanical systems (MEMS), such as gyroscopes and accelerometers. While different manufacturers have developed various proprietary methods, the general principle remains the same: the devices count the steps by registering spatial oscillations that result from the user walking around.
So, without further ado, let’s look at a couple of ways to estimate the step count based on the data from an LIS331DLH sensor manufactured by STMicroelectronics.
The LIS331DLH sensor is a 3-axis accelerometer that senses motion along three axes: x, y and z. As an example, let’s look at the acceleration data from the X-axis generated by walking:
As you can see, the peaks in the chart clearly approximate the user’s steps. Unfortunately, the sensor is far from ideal and produces quite a bit of noise, which will be important to keep in mind when deciding on the algorithm later. In fact, every single accelerometer on the market generates some level of noise. Take a look at the data feed from a device that’s just sitting on a desk:
As you can see, even a device at rest does not return a flat line.
Each MEMS sensor returns data measured along three orthogonal axes. Since we don’t know in advance the device’s physical orientation on the user’s person, we need to process the data from all three axes.
Processing The Results
One simple but effective method is averaging the values from all three axes. This approach yields decent results, as you can see on the chart:
This method produces clearly identifiable steps while reducing the noise through the averaging process.
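As an illustration, here is a minimal Python sketch of this approach. The sample readings and the detection threshold below are invented for the example; a real pedometer would need calibration against actual sensor data.

```python
def average_axes(samples):
    # Collapse each (x, y, z) reading into a single value by averaging
    # the three axes; this damps some of the axis-specific noise.
    return [(x + y + z) / 3.0 for x, y, z in samples]

def count_steps(signal, threshold):
    # Naive step detector: count upward crossings of the threshold,
    # treating each crossing as one step.
    steps = 0
    above = False
    for value in signal:
        if value > threshold and not above:
            steps += 1
            above = True
        elif value <= threshold:
            above = False
    return steps
```

For example, feeding `count_steps` an averaged signal with two clear peaks above a threshold of 0.5 would report two steps.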
Another way to tackle the problem is to calculate the angle between the sensor’s acceleration vectors at the current and the previous sample point. Since the device registers its data at equal time intervals, the angle between the two vectors gives us the sensor’s angular velocity. The angle between two vectors A and B is found from their dot product: cos θ = (A · B) / (|A| · |B|).
A device at rest (zero change in angle between successive vectors) produces a constant value of 1; the greater the oscillation, the larger the change in the angle. These results correlate with the user’s steps quite nicely; however, this algorithm is more sensitive to noise.
Perfecting The Algorithms
Obviously, none of these algorithms are perfect. The quality of their results will depend on a number of variables, both well known (such as the devices’ own noise) and less predictable. Even the device’s position on the user’s person (in the pants pocket, on the hip, or on the wrist) will affect the results. We have tested a number of pedometer applications on both iOS and Android, and they have all produced diverging results, as have several fitness wristbands such as Fitbit, Nike and others.
There are several methods for refining the results:
- Averaging: This is the most direct and simple method; it corrects the most glaring errors.
- Digital filters: Most of the noise consists of high-frequency signals that can be cut off by a low-pass filter with the right cutoff frequency (for step counting, about 10 Hz). The software implementation of a first-order or even a second-order filter is easily achievable even on low-power wearable devices.
- Using inertial measurement units (IMU): Utilizing additional data from other sensors may enhance the results, e.g. by combining the data from an accelerometer and a gyro, or from an accelerometer and a compass. This will reduce the number of “false positives”. Additionally, by throwing more data into the mix (e.g., the feed from a GPS chip) we can make some smart adjustments to the algorithm, for example, based on the distance traveled by the user; it would also be possible to estimate the user’s stride length, and even his or her height.
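To make the digital-filter point concrete, here is a sketch of a first-order low-pass filter (simple exponential smoothing). The 100 Hz sampling rate and 10 Hz cutoff in the usage note are assumptions for the example, not values from the article.

```python
import math

def low_pass(signal, sample_rate_hz, cutoff_hz):
    # First-order IIR low-pass filter (exponential smoothing).
    # The smoothing factor alpha follows from the cutoff frequency
    # and the sampling interval.
    dt = 1.0 / sample_rate_hz
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    filtered = [signal[0]]
    for sample in signal[1:]:
        filtered.append(filtered[-1] + alpha * (sample - filtered[-1]))
    return filtered
```

Calling `low_pass(raw, 100.0, 10.0)` on a raw accelerometer trace would pass the slow, step-frequency oscillations through while damping the faster sensor noise.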
As far as the hardware is concerned, we could always use more accurate MEMS. I particularly like the MPU-6000 sensors designed specifically for use in smartphone platforms and wearables. Obviously, the devices already in circulation will not be getting a hardware upgrade, so we need to look to the future products for these improvements.
I’d like to note an interesting tack taken by Apple, who introduced a dedicated motion co-processor (Apple M7). Here’s why it’s so cool: all algorithms like the ones described above require the application to be constantly running and making calculations, reducing the mobile or wearable device’s battery life and the system’s overall responsiveness. Apple’s engineers decided to move all these calculations into a separate ARM Cortex-M3 compatible co-processor that calculates and stores the motion data and serves it up to applications on request. The data available to iOS developers through the Core Motion framework include Motion Activity (type of motion: walking, running, driving) and Step Counter. The user can read the current motion data, as well as request data from a specified period of time in the past.
This is how step counting works in today’s electronic devices – as you can see, it is not terribly complicated, and the accuracy is sufficient for most applications.