Sensor's sampling period, power, and battery consumption
When you register an event listener, you can suggest a sampling period, or delay, between sensor event values in microseconds. This sampling period is only a hint to the operating system to deliver the sensor values at the suggested rate via the onSensorChanged() method. The operating system might choose a larger delay, depending on the load on the processor, which is why it is discouraged to build time-sensitive logic that relies on the delay between sensor events.
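The following is a minimal sketch of such a registration, assuming a plain Activity; the class name and the 10,000-microsecond value are only illustrative, while SensorManager, SensorEventListener, and registerListener() are the standard Android APIs:

```java
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class SamplingPeriodActivity extends Activity implements SensorEventListener {

    private SensorManager sensorManager;
    private Sensor accelerometer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    @Override
    protected void onResume() {
        super.onResume();
        if (accelerometer != null) {
            // The third argument is only a hint to the OS: here we suggest a
            // sampling period of 10,000 microseconds (100 Hz).
            sensorManager.registerListener(this, accelerometer, 10000);
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // The actual interval between calls may be longer than requested,
        // depending on system load.
        float x = event.values[0];
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}
```

Registering in onResume() and unregistering in onPause() keeps the sensor from consuming power while the activity is not in the foreground.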
You can only specify an absolute delay value from Android 2.3 (API Level 9) onwards. Prior to this version, you could only use the following four constants supported by the platform (a usage sketch follows the list):
SENSOR_DELAY_FASTEST
: This has a default value of 0 microseconds. It is not recommended to use this delay, as it multiplies the CPU cycles consumed and drains the battery much faster.

SENSOR_DELAY_GAME
: This has a default value of 20,000 microseconds. It is recommended only for games that need a high degree of precision and accuracy.

SENSOR_DELAY_UI
: This has a default value of 66,667 microseconds and is recommended for most cases, such as updating the UI.

SENSOR_DELAY_NORMAL
: This has a default value of 200,000 microseconds and is used to reduce extra CPU cycles and save battery.
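As a sketch of the two styles, the third argument of registerListener() can be either one of these constants or an explicit period in microseconds; here sensorManager and accelerometer are assumed from the earlier example, and listener stands for any SensorEventListener (such as the activity in the previous sketch):

```java
// Using a platform constant; the framework maps it to its predefined period.
sensorManager.registerListener(listener, accelerometer,
        SensorManager.SENSOR_DELAY_UI);

// Using an absolute period (API Level 9 and above): 50,000 microseconds,
// which suggests a rate of 20 Hz. The value here is just an example.
sensorManager.registerListener(listener, accelerometer, 50000);
```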
It's up to the developer to either use the delay constants or specify their own delay value. Power consumption and the required degree of precision are the two important factors to consider when deciding on the right sampling period. The power consumption of any sensor can be checked via the getPower() method of the sensor object, which returns the power in mA. Among the physical sensors, the accelerometer is the most power efficient and consumes the least battery; the gyroscope and magnetometer follow it in terms of power efficiency and battery consumption.
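As an illustration, the reported power of the three physical sensors can be compared at runtime. The sensor types and the getPower() call are part of the Android API; the class name and log tag below are just placeholders:

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.util.Log;

public final class SensorPowerLogger {

    private static final String TAG = "SensorPower"; // placeholder tag

    // Logs the power (in mA) reported by the accelerometer, gyroscope,
    // and magnetometer, if they are present on the device.
    public static void logPhysicalSensorPower(Context context) {
        SensorManager sensorManager =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        int[] types = {
                Sensor.TYPE_ACCELEROMETER,
                Sensor.TYPE_GYROSCOPE,
                Sensor.TYPE_MAGNETIC_FIELD
        };
        for (int type : types) {
            Sensor sensor = sensorManager.getDefaultSensor(type);
            if (sensor != null) {
                Log.i(TAG, sensor.getName() + " draws " + sensor.getPower() + " mA");
            }
        }
    }
}
```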
You will often hear the terms delay and sampling period used interchangeably because they mean the same thing. There is another term, sampling frequency, which is the inverse of the sampling period (in seconds) and is measured in hertz (Hz). For example, if you use a sampling period of 60,000 microseconds for a sensor, the sampling frequency will be about 16.66 Hz. The conversion is a two-step process: first convert the time into seconds (1 second is 10^6 microseconds, so 60,000 microseconds is 0.06 seconds), then take the inverse of the period, 1/0.06 ≈ 16.66 Hz.
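The same conversion can be expressed as a small helper method; the method name is only illustrative:

```java
// Converts a sampling period given in microseconds to a sampling
// frequency in Hz: 60,000 us -> 0.06 s -> 1 / 0.06 ≈ 16.66 Hz.
public static double samplingPeriodToHz(long periodMicroseconds) {
    double periodSeconds = periodMicroseconds / 1_000_000.0;
    return 1.0 / periodSeconds;
}
```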