Improving orientation tracking – handling sensor fusion
One of the limitations of sensor-based tracking is the sensors themselves. As we introduced before, some of them are inaccurate, noisy, or subject to drift. A technique to compensate for their individual weaknesses is to combine their values to improve the overall orientation estimate; this technique is called sensor fusion. There are different methods for fusing the sensors; we will use the method presented by Paul Lawitzki, with source code under the MIT License available at http://www.thousand-thoughts.com/2012/03/android-sensor-fusion-tutorial/. In this section, we will briefly explain how the technique works and how to integrate sensor fusion into our JME AR application.
Sensor fusion in a nutshell
The fusion algorithm proposed by Paul Lawitzki merges accelerometer, magnetometer, and gyroscope sensor data. Similar to what is done with the software sensors of the Android API, the accelerometer and magnetometer readings are first merged to get an absolute orientation...
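To make this first step concrete, the following is a minimal sketch of how accelerometer and magnetometer readings can be combined into an absolute orientation with the standard Android API (SensorManager.getRotationMatrix() and SensorManager.getOrientation()). The class and field names are ours for illustration and are not taken from Lawitzki's code; the gyroscope part of the fusion is not shown here.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Illustrative sketch: fuses the two absolute (but noisy) measurements,
// accelerometer and magnetometer, into one orientation estimate. This is
// the same computation Android's software orientation sensor performs.
public class AccMagOrientationListener implements SensorEventListener {

    private final float[] accel = new float[3];
    private final float[] magnet = new float[3];
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3]; // azimuth, pitch, roll (radians)

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            System.arraycopy(event.values, 0, accel, 0, 3);
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            System.arraycopy(event.values, 0, magnet, 0, 3);
            break;
        }
        // getRotationMatrix() returns false when the device is in free fall
        // or the readings are otherwise unusable.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
            SensorManager.getOrientation(rotationMatrix, orientation);
            // orientation now holds an absolute, drift-free but noisy rotation.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}

The result is absolute (referenced to gravity and magnetic north) and therefore free of drift, but it is noisy; this is exactly why the fusion algorithm complements it with the smooth, fast, but drifting gyroscope signal.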