Android Sensor Programming By Example: Take your Android applications to the next level of interactivity by exploring the wide variety of Android sensors

Author: Nagpal
eBook | Apr 2016 | 194 pages | 1st Edition

Android Sensor Programming By Example

Chapter 1. Sensor Fundamentals

In this chapter, we will understand the fundamentals of sensors and explore what the sensor world looks like from an Android perspective. We will also look at the classes, interfaces, and methods provided by the Android platform to access sensors. This chapter will also focus on the standards and best practices for using Android sensors.

You will learn the following topics in this chapter:

  • What sensors are
  • The different types of sensors and sensor values
  • Individual sensor descriptions and their common usage
  • How to use the sensor coordinate system
  • What the Android Sensor Stack is
  • The sensor framework APIs and classes
  • The sensor sampling period, frequency, and reporting modes
  • Specific sensor configuration and sensor availability based on the API level
  • Best practices for accessing and using sensors

What are sensors?

In simple words, sensors measure a particular kind of physical quantity, such as the force acting on a device, the light falling on a surface, or the temperature in a room. These are examples of basic physical quantities that sensors can measure. Most Android phones come with advanced sensors that can measure valuable information such as relative humidity, atmospheric pressure, magnetic field, steps taken, the rate of rotation of the device around the x, y, and z axes, proximity to an object, and much more. The majority of these sensors are micro-electro-mechanical systems (MEMS), which are made on a tiny scale (in micrometers), usually on a silicon chip, with mechanical and electrical elements integrated together.

The basic working principle behind MEMS is to measure the change in an electric signal arising from mechanical motion. This change in the electric signal is converted to digital values by electric circuits. The accelerometer and gyroscope are the main examples of MEMS. Most of the sensors in an Android phone consume minimal battery and processing power. We will discuss all the important sensors in detail in the coming chapters.

Types of sensors

Sensors can be broadly divided into the following two categories:

  • Physical Sensors: These are the actual pieces of hardware that are physically present on the device. They are also known as hardware sensors. Accelerometers, gyroscopes, and magnetometers are examples of physical sensors.
  • Synthetic Sensors: These are not physically present on the device, and they are instead derived from one or more sensors. They are also called virtual, composite, or software sensors. Gravity, linear acceleration, and step detector are examples of synthetic sensors.

The Android platform doesn't make any distinction when dealing with physical sensors and synthetic sensors. The distinction is mostly theoretical to understand the origin of the sensor values.

Types of sensor values

Sensor values can be broadly divided into the following three categories:

  • Raw: These values are directly given by the sensor. The operating system simply passes these values to the apps without adding any correction logic. Accelerometers, proximity sensors, light sensors, and barometers are sensors that give raw values.
  • Calibrated: These values are computed by the operating system by applying extra correction algorithms, such as drift compensation and the removal of bias and noise, over the raw values given by the sensors. The step detector, step counter, and significant motion sensors give calibrated values by using the accelerometer as their base sensor. The magnetometer and gyroscope are special kinds of sensors that give both raw and calibrated values.
  • Fused: These values are derived from a combination of two or more sensors. Generally, these values are calculated by leveraging the strength of one sensor to accommodate the weaknesses of other sensors. Gravity and linear acceleration give fused values by using the accelerometer and gyroscope.
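The fused gravity/linear-acceleration split can be illustrated with a simple low-pass filter over raw accelerometer readings: the slowly varying component is treated as gravity, and whatever remains is linear acceleration. This is only a sketch of the idea (Android's real fused sensors also use the gyroscope and are more accurate), and the smoothing factor ALPHA is an assumed tuning value, not something prescribed by the platform:

```java
public class SensorFusionSketch {
    // Smoothing factor for the low-pass filter (assumed tuning value).
    private static final float ALPHA = 0.8f;

    private final float[] gravity = new float[3];
    private final float[] linearAcceleration = new float[3];

    // Feed one raw accelerometer sample (m/s^2, including gravity).
    public void onAccelerometerSample(float[] values) {
        for (int i = 0; i < 3; i++) {
            // The low-pass filter isolates the slowly changing gravity component.
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * values[i];
            // Subtracting gravity leaves the linear acceleration.
            linearAcceleration[i] = values[i] - gravity[i];
        }
    }

    public float[] gravity() { return gravity.clone(); }
    public float[] linearAcceleration() { return linearAcceleration.clone(); }
}
```

With a device lying still on a table, the gravity estimate converges toward (0, 0, 9.81) while the linear acceleration tends to zero.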

Motion, position, and environmental sensors

The Android platform supports three broad categories of sensors: motion, position, and environmental sensors. This categorization is based on the type of physical quantity detected and measured by the sensors.

Motion sensors

Motion sensors measure any kind of force that could potentially create motion along the x, y, and z axes of the phone. The motion could be either linear or angular movement in any direction. This category includes the accelerometer, gravity, gyroscope, and rotation vector sensors. Most of these sensors report values on the x, y, and z axes, and the rotation vector additionally reports a fourth value, the scalar component of the rotation vector.

The following list summarizes each motion sensor's type, values, underlying sensors, common usage, and power consumption:

  • Accelerometer (physical; raw values; underlying sensor: accelerometer): This measures the acceleration force along the x, y, and z axes, including gravity. Unit: m/s². It can be used to detect motion such as shakes, swings, tilt, and physical forces applied to the phone. Power consumption: low.
  • Gravity (synthetic; fused values; underlying sensors: accelerometer, gyroscope): This measures the force of gravity along the x, y, and z axes. Unit: m/s². It can be used to detect when the phone is in free fall. Power consumption: medium.
  • Linear Acceleration (synthetic; fused values; underlying sensors: accelerometer, gyroscope): This measures the acceleration force along the x, y, and z axes, excluding gravity. Unit: m/s². It can be used to detect motion such as shakes, swings, tilt, and physical forces applied to the phone. Power consumption: medium.
  • Gyroscope (physical; raw and calibrated values; underlying sensor: gyroscope): This measures the rate of rotation of the device around the x, y, and z axes. Unit: rad/s. It can be used to detect rotational motion such as spins, turns, and any angular movement of the phone. Power consumption: medium.
  • Step Detector (synthetic; calibrated values; underlying sensor: accelerometer): This detects walking steps. It can be used to detect when a user starts walking. Power consumption: low.
  • Step Counter (synthetic; calibrated values; underlying sensor: accelerometer): This measures the number of steps taken by the user since the last reboot while the sensor was activated. It keeps track of the steps taken by the user per day. Power consumption: low.
  • Significant Motion (synthetic; calibrated values; underlying sensor: accelerometer): This detects significant motion of the phone caused by walking, running, or driving. It can be used to detect a significant motion event. Power consumption: low.
  • Rotation Vector (synthetic; fused values; underlying sensors: accelerometer, gyroscope, magnetometer): This measures the rotation vector component along the x axis (x * sin(θ/2)), y axis (y * sin(θ/2)), and z axis (z * sin(θ/2)), plus the scalar component of the rotation vector (cos(θ/2)). Unitless. It can be used in 3D games based on phone direction. Power consumption: high.
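The four rotation vector components listed above are simply the parts of a unit quaternion, which is why they are unitless: for a unit rotation axis (x, y, z) and angle θ, the squared components always sum to 1. A small standalone sketch (the axis and angle in the usage note are arbitrary example inputs, not values from this chapter):

```java
public class RotationVectorDemo {
    // Build the four rotation-vector components for a rotation of
    // angleRad radians about the unit axis (ax, ay, az).
    public static double[] components(double ax, double ay, double az, double angleRad) {
        double s = Math.sin(angleRad / 2);
        return new double[] {
            ax * s,                // x * sin(θ/2)
            ay * s,                // y * sin(θ/2)
            az * s,                // z * sin(θ/2)
            Math.cos(angleRad / 2) // scalar component, cos(θ/2)
        };
    }
}
```

For a 90-degree rotation about the z axis, components(0, 0, 1, Math.PI / 2) yields (0, 0, sin 45°, cos 45°), whose squared sum is 1.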

Position sensors

Position sensors are used to measure the physical position of the phone in the world's frame of reference. For example, you can use the geomagnetic field sensor in combination with the accelerometer to determine a device's position relative to the magnetic north pole. You can use the orientation sensor to determine the device's position in your application's frame of reference. Position sensors also report values on the x, y, and z axes.

The following list summarizes each position sensor's type, values, underlying sensors, common usage, and power consumption:

  • Magnetometer (physical; raw and calibrated values; underlying sensor: magnetometer): This measures the geomagnetic field strength along the x, y, and z axes. Unit: μT. It can be used to create a compass and to calculate true north. Power consumption: medium.
  • Orientation (deprecated) (synthetic; fused values; underlying sensors: accelerometer, gyroscope, magnetometer): This measures the azimuth (the angle around the z axis), pitch (the angle around the x axis), and roll (the angle around the y axis). Unit: degrees. It can be used to detect the device's position and orientation. Power consumption: medium.
  • Proximity (physical; raw values; underlying sensor: proximity): This measures the distance of an object relative to the view screen of the device. Unit: cm. It can be used to determine whether a handset is being held up to a person's ear. Power consumption: low.
  • Game Rotation Vector (synthetic; fused values; underlying sensors: accelerometer, gyroscope): This measures the rotation vector component along the x axis (x * sin(θ/2)), y axis (y * sin(θ/2)), and z axis (z * sin(θ/2)), plus the scalar component of the rotation vector (cos(θ/2)). Unitless. It is based only on the gyroscope and accelerometer and does not use the magnetometer. It can be used in 3D games based on phone direction. Power consumption: medium.
  • Geomagnetic Rotation Vector (synthetic; fused values; underlying sensors: accelerometer, magnetometer): This measures the rotation vector component along the x axis (x * sin(θ/2)), y axis (y * sin(θ/2)), and z axis (z * sin(θ/2)), plus the scalar component of the rotation vector (cos(θ/2)). Unitless. It is based only on the magnetometer and accelerometer and does not use the gyroscope. It can be used in augmented reality apps, which are based on the phone and compass direction. Power consumption: medium.

Environmental sensors

Environmental sensors measure environmental properties, such as the temperature, relative humidity, light, and air pressure near the phone. Unlike motion and position sensors, which report sensor values as multi-dimensional arrays, environmental sensors report a single sensor value.

The following list summarizes each environmental sensor's type, values, underlying sensors, common usage, and power consumption:

  • Ambient Temperature (physical; raw values; underlying sensor: thermometer): This measures the ambient air temperature. Unit: degrees Celsius. It is used for monitoring temperatures. Power consumption: medium.
  • Light (physical; raw values; underlying sensor: photometer): This measures the ambient light level (illumination). Unit: lx. It can be used to dim the screen brightness of the phone. Power consumption: low.
  • Barometer (physical; raw values; underlying sensor: barometer): This measures the ambient air pressure. Unit: hPa (mbar). It can be used to measure height relative to sea level. Power consumption: medium.
  • Relative Humidity (physical; raw values; underlying sensor: relative humidity sensor): This measures the relative ambient humidity as a percentage. Unit: %. It can be used to calculate the dew point, and the absolute and relative humidity. Power consumption: medium.
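The barometer's "height relative to sea level" usage relies on the international barometric formula, which is also what Android's SensorManager.getAltitude() implements. Here is a standalone sketch of that formula, assuming the standard-atmosphere sea-level pressure of 1013.25 hPa as the reference:

```java
public class AltitudeFromPressure {
    // Standard-atmosphere pressure at sea level, in hPa.
    public static final float PRESSURE_STANDARD_ATMOSPHERE = 1013.25f;

    // International barometric formula: altitude in meters for a measured
    // pressure p (hPa), given the sea-level reference pressure p0 (hPa).
    public static float getAltitude(float p0, float p) {
        return 44330.0f * (1.0f - (float) Math.pow(p / p0, 1.0f / 5.255f));
    }
}
```

At p = p0 the altitude is 0 m; a reading of 900 hPa against the standard reference corresponds to roughly 990 m above sea level.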

Sensors' coordinate system

Most sensors use the standard 3-axis coordinate system to represent their values. This coordinate system is similar to the one used to measure the length, breadth, and height of any 3D object in space, differing only in its frame of reference and the orientation of the three axes. As depicted in the following figure, the origin of this coordinate system lies at the center of the screen. When the device is in its default orientation (generally portrait mode), the x axis runs horizontally, with positive values to the right of the origin and negative values to the left. Similarly, the y axis runs vertically, with positive values above the origin and negative values below it, and the z axis comes out of the phone screen, with positive values in front of the screen and negative values behind it.

Sensors' coordinate system

This x, y, and z axis orientation holds for all devices whose default orientation is portrait mode, as shown in the previous figure. However, for some devices, especially tablets, the default orientation is landscape mode, in which case the orientations of the x and y axes are swapped; the z axis orientation remains the same. So, before making any assumptions about the orientation of an axis, it's good practice to confirm the default mode of the device. In this coordinate system, we always use the device's frame as the point of reference: the device coordinate system is never changed or swapped, even when the phone is moved or rotated in any direction. OpenGL (the graphics library) uses the same coordinate system and rules to define its values.

Some position sensors and their methods use a coordinate system that is relative to the world's frame of reference, as opposed to the device's frame of reference. These sensors and methods return data that represents the device motion or device position relative to the earth. The Orientation Sensor, Rotation Vector Sensor, and getOrientation() method use the world's frame of reference coordinate system, while all the other position, motion, and environmental sensors use the device's frame of reference coordinate system.

Android Sensor Stack

The following figure represents the layers in the Android Sensor Stack. Each layer is responsible for a specific task and for communicating with the next layer. The topmost layer consists of Android apps, which are the consumers of sensor data. The second layer is the Android SDK, through which Android applications access the sensors. The Android SDK contains APIs to list the available sensors, to register to a sensor, and for all the other sensor functionality. The third layer is the Android framework, which is in charge of linking multiple applications to a single HAL client. The framework consists of various components to provide simultaneous access to multiple applications; it is discussed in detail in the next section. The fourth layer is the HAL (the sensors' hardware abstraction layer), which provides the interface between the hardware drivers and the Android framework. It consists of one HAL interface (sensors.h) and one HAL implementation, which we refer to as sensors.cpp. The HAL interface is defined by Android and the AOSP (Android Open Source Project) contributors, and the implementation is provided by the manufacturer of the device. The sensor drivers form the fifth layer of the stack, and they are responsible for interacting with the physical devices.

In some cases, the HAL implementation and the drivers are the same software entity, while in other cases, the hardware integrator requests the sensor chip manufacturers to provide the drivers. The sensor hub is the sixth, optional layer of the stack. It generally consists of a separate, dedicated chip for performing low-level computation at low power while the application processor is in suspended mode. It is generally used for sensor batching and for providing a hardware FIFO queue (discussed in detail in the Wake locks, wakeup sensors, and FIFO queue section of Chapter 4, Light and Proximity Sensors). The final, seventh layer consists of the physical hardware sensors. They are mostly made up of MEMS silicon chips, and they do the real measuring work.

Android Sensor Stack

Components of the sensor framework

Android provides methods, classes, and interfaces for accessing the sensors available on an Android device and their data. These are collectively referred to as the sensor framework and are part of the android.hardware package. The framework consists of four major components: SensorManager, Sensor, SensorEvent, and SensorEventListener. The entry point to the framework is the SensorManager class, which allows an app to request sensor information and register to receive sensor data. Once registered, sensor data values are sent to a SensorEventListener interface in the form of a SensorEvent object that contains the information produced by a given sensor. Let's look at each component in detail.

SensorManager

SensorManager is the class that gives your app access to the sensors. It creates an instance of the system sensor service, which provides various APIs to access sensor information on the device. It exposes the methods that list the available and default sensors on the device. This class also provides several constants that are used to report sensor accuracy, to set the sampling period, and to calibrate sensors. One of the important tasks of this class is to register and unregister sensor event listeners for accessing a particular sensor.

SensorEventListener

SensorEventListener is the interface that provides two callbacks to receive sensor notifications (sensor events). onSensorChanged() is the first method of the interface, and it is called whenever there is any change in the sensor values. The change in sensor values is communicated through the SensorEvent object passed as a parameter to this method. onAccuracyChanged() is the second method, and it is called whenever there is a change in the accuracy of the sensor values. The sensor object and the newly reported accuracy (an integer) are passed as parameters to this method. SensorManager supports four accuracy integer constants. They are as follows:

  • SENSOR_STATUS_ACCURACY_HIGH
  • SENSOR_STATUS_ACCURACY_MEDIUM
  • SENSOR_STATUS_ACCURACY_LOW
  • SENSOR_STATUS_ACCURACY_UNRELIABLE
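The two callbacks cooperate as sketched below. Since the Android framework classes aren't available outside a device build, this example uses a tiny stand-in class in place of SensorEvent, and the constant values mirror SensorManager's documented accuracy constants; treat the stand-in as an illustrative assumption, not the real API:

```java
public class ListenerSketch {
    // Stand-in for android.hardware.SensorEvent (the real event also
    // carries a timestamp, an accuracy value, and the source Sensor).
    public static class FakeSensorEvent {
        public final float[] values;
        public FakeSensorEvent(float... values) { this.values = values; }
    }

    // Values mirroring SensorManager's accuracy constants.
    public static final int SENSOR_STATUS_ACCURACY_HIGH = 3;
    public static final int SENSOR_STATUS_ACCURACY_UNRELIABLE = 0;

    private int currentAccuracy = SENSOR_STATUS_ACCURACY_HIGH;
    private float lastMagnitude;

    // Mirrors onSensorChanged(SensorEvent): called for every new reading.
    public void onSensorChanged(FakeSensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        lastMagnitude = (float) Math.sqrt(x * x + y * y + z * z);
    }

    // Mirrors onAccuracyChanged(Sensor, int): remember the new accuracy
    // so later readings can be discarded when they become unreliable.
    public void onAccuracyChanged(int accuracy) {
        currentAccuracy = accuracy;
    }

    public boolean isReliable() { return currentAccuracy != SENSOR_STATUS_ACCURACY_UNRELIABLE; }
    public float lastMagnitude() { return lastMagnitude; }
}
```

The pattern of caching the latest accuracy in onAccuracyChanged() and filtering readings against it is a common way to use these two callbacks together.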

Sensor

Sensor is the class that is used to create an instance of a specific sensor. This class provides various methods that let you determine a sensor's capabilities:

  • Maximum Range
  • Minimum Delay
  • Name
  • Power
  • Resolution
  • Reporting Mode
  • Type
  • Vendor
  • Version
  • isWakeUp Sensor

We will discuss each capability and method in detail in the Time for action - knowing the individual sensor capability section of Chapter 2, Playing with Sensors.

SensorEvent

SensorEvent is a special kind of class that is used by the operating system to report changes in the sensor values to the listeners. This SensorEvent object contains the following four elements:

  • values[]: This is a multidimensional array that holds the sensor values
  • timestamp: This refers to the time in nanoseconds at which the event happened
  • accuracy: This is one of the four accuracy integer constants
  • sensor: This is the Sensor object that generated the data

The following class diagram depicts the important methods and variables for the four key components of the Sensor Framework:

SensorEvent

Sensor's sampling period, power, and battery consumption

When registering an event listener, you can suggest a sampling period, or delay, between sensor event values, in microseconds. This sampling period is only a hint to the operating system to send the sensor values at the suggested rate via the onSensorChanged() method. The operating system might choose a bigger delay, depending on the load on the processor, which is why it is discouraged to build time-sensitive logic that relies on the delay between sensor events.

You can only specify an absolute delay from Android 3.0 (API Level 11) onward. Prior to this version, you could only use the following four constants supported by the platform:

  • SENSOR_DELAY_FASTEST: This has a default value of 0 microseconds. It is not recommended, as it multiplies the CPU cycles used and drains the battery much faster.
  • SENSOR_DELAY_GAME: This has a default value of 20,000 microseconds. It is only recommended for games that need the highest degree of precision and accuracy.
  • SENSOR_DELAY_UI: This has a default value of 60,000 microseconds and is recommended for most cases.
  • SENSOR_DELAY_NORMAL: This has a default value of 200,000 microseconds and is used to reduce extra CPU cycles and save battery.

It's the choice of the developer to either use the delay constants or specify their own delay value. Power consumption and the degree of precision are the two important factors to consider before deciding the right sampling period. The power consumption of any sensor can be checked via the getPower() method of the sensor object, which returns the power in mA. Among the physical sensors, the accelerometer is the most power efficient and has the least battery consumption. The gyroscope and magnetometer come after the accelerometer with regard to power efficiency and battery consumption.

You will often hear the terms delay and sampling period used interchangeably because they mean the same thing. There is another term, sampling frequency, which is the inverse of the sampling period (in seconds) and is measured in hertz (Hz). For example, if you use a sampling period of 60,000 microseconds for a sensor, the sampling frequency is 16.66 Hz. The conversion is a two-step process: first, convert the time into seconds (1 second is 10^6 microseconds, so 60,000 microseconds is 0.06 seconds); then take the inverse, 1/0.06 = 16.66 Hz.
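The worked conversion above can be captured in a small helper:

```java
public class SamplingMath {
    // Convert a sampling period in microseconds to a frequency in Hz:
    // first convert to seconds (1 s = 10^6 us), then take the inverse.
    public static double periodMicrosToHz(long periodMicros) {
        double periodSeconds = periodMicros / 1_000_000.0;
        return 1.0 / periodSeconds;
    }
}
```

For example, periodMicrosToHz(60_000) returns roughly 16.66 Hz, matching the calculation above, and SENSOR_DELAY_GAME's 20,000 microseconds corresponds to 50 Hz.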

The reporting modes of sensors

Sensors can generate events in different ways, called reporting modes. Each sensor has a particular reporting mode, which is an integer constant of the Sensor class and can be obtained using the getReportingMode() method of the Sensor object. Knowing the reporting mode of a sensor helps developers write efficient logic. Reporting modes fall into the following four types:

  • Continuous: In the continuous reporting mode, sensor events are generated at a constant rate defined by the sampling period, which is set when registering the listener for the sensor. The accelerometer and gyroscope are examples of sensors that use the continuous reporting mode.
  • On change: In the on-change reporting mode, sensor events are generated only if the measured values have changed from the last known values. The step counter, proximity, and heart rate sensors are examples of sensors that use the on-change reporting mode.
  • One shot: The one-shot reporting mode is based on the fire-and-forget concept: the sensor is triggered only once in the entire duration of the event. The significant motion sensor uses the one-shot reporting mode; it fires only once, when it detects the start of significant motion caused by walking, running, or driving.
  • Special trigger: The special trigger is fired on each occurrence of a particular event. Upon detection of the event, the sensor values are generated and passed to the listener; the sampling period is ignored in this case. The step detector sensor is an example of the special trigger reporting mode, firing on every step taken.
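The difference between the continuous and on-change modes can be emulated in a few lines: a continuous source would forward every sample, while an on-change source forwards a sample only when it differs from the last delivered value. This is a conceptual sketch of the behavior, not how the sensor HAL actually implements it:

```java
import java.util.ArrayList;
import java.util.List;

public class OnChangeFilter {
    private Float lastDelivered;  // null until the first event is delivered
    private final List<Float> delivered = new ArrayList<>();

    // On-change mode: generate an event only if the measured value
    // differs from the last known (delivered) value.
    public void submit(float value) {
        if (lastDelivered == null || value != lastDelivered) {
            delivered.add(value);
            lastDelivered = value;
        }
    }

    public List<Float> delivered() { return delivered; }
}
```

Feeding the samples 1, 1, 1, 2, 2, 3 through this filter delivers only the three distinct transitions 1, 2, 3, which is why on-change sensors such as proximity are so cheap to listen to.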

Dealing with specific sensor configuration

There might be scenarios in which certain features of your application depend on a specific sensor, and that sensor is not present on the device. In such cases, a good option is to either turn off the dependent feature or prevent the user from installing the application. Let's explore each option in detail.

Checking the availability of the sensor at runtime

If you have a weather utility app that uses the phone's pressure sensor to check the atmospheric pressure, it's not a good idea to use the sensor directly. Many Android phones don't have a pressure sensor. If such cases are not handled properly, your application might even crash, which is a bad user experience.

It's always recommended to check the availability of a sensor before using it in the application. The following code snippet shows how to check the availability of the sensor:

private SensorManager mSensorManager;
...
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
if (mSensorManager.getDefaultSensor(Sensor.TYPE_PRESSURE) != null) {
  // Success! There's a pressure sensor.
} else {
  // Failure! No pressure sensor.
}

Declaring the sensor as a mandatory feature

If measuring atmospheric pressure using the phone's pressure sensor is the main feature of your application, then you may not want to support devices that don't have a pressure sensor. The Android platform supports this functionality through uses-feature filters declared in the AndroidManifest.xml file:

<uses-feature android:name="android.hardware.sensor.barometer" android:required="true" />

This code snippet informs the Android platform that the pressure sensor is required for this app to function. Google Play uses this uses-feature to filter out those devices that don't have the pressure sensor in them, and hence your app is only installed on the supported devices. The sensors that are supported by uses-feature are the accelerometer, gyroscope, light, barometer (pressure), compass (geomagnetic field), and proximity sensors.

If your application uses a sensor for some feature but can still run without that sensor by turning off that feature, then it's advisable to declare the sensor in uses-feature but set the required value to false (android:required="false"). This informs the operating system that your application uses that sensor but can still function without it. It then becomes the developer's responsibility to check the availability of the sensor at runtime.

Sensor availability based on the Android API level

A wide variety of sensors is supported on Android devices. As Android evolved over time, new sensors were added and some old, inefficient sensors were removed. With each newer version of Android, the sensors got better and more accurate, and the list of supported sensors got bigger. Most apps have to support older versions of Android to target a wider audience, but at the same time, not all sensors are supported by older versions of Android. It's a tradeoff between supporting older versions of Android and using the latest, more advanced sensors that are only available in newer versions.

The following list provides sensor availability based on the Android version and API level. Four major platforms are used to illustrate availability, as the major changes were made in these platforms only: Android 6.0 (API level 23), Android 4.0 (API level 14), Android 2.3 (API level 9), and Android 2.2 (API level 8):

  • Accelerometer: available at all four API levels
  • Ambient temperature: available at API levels 14 and 23; not available at API levels 8 and 9
  • Gravity: available at API levels 9, 14, and 23; not available at API level 8
  • Gyroscope: available at API levels 9, 14, and 23; not available at API level 8
  • Light: available at all four API levels
  • Linear acceleration: available at API levels 9, 14, and 23; not available at API level 8
  • Magnetic field: available at all four API levels
  • Orientation: deprecated at all four API levels
  • Pressure: available at API levels 9, 14, and 23; not available at API level 8
  • Proximity: available at all four API levels
  • Relative humidity: available at API levels 14 and 23; not available at API levels 8 and 9
  • Rotation vector: available at API levels 9, 14, and 23; not available at API level 8
  • Step detector: available at API level 23; not available at API levels 8, 9, and 14
  • Step counter: available at API level 23; not available at API levels 8, 9, and 14
  • Temperature: available at API levels 8 and 9; deprecated at API levels 14 and 23

Best practice for accessing sensors

Android devices are manufactured by different OEMs (original equipment manufacturers) and come in various configurations. Each OEM is free to support its own set of sensors, which in turn come from different vendors. This creates the problem of device fragmentation, which is further complicated by the addition and deprecation of sensors across Android API levels. The following best practices will help you deal with this device fragmentation and avoid common pitfalls and mistakes:

  • Before using the sensor coordinate system, confirm the default orientation mode of the device and check the orientation of the x and y axes.
  • Check the availability, range, minimum delay, reporting mode, and resolution of a sensor before using it.
  • Before selecting the sampling period of any sensor, check its power consumption, and keep your application's precision and accuracy needs in mind. It's recommended that you select one of the constants given by the operating system.
  • Do not block or do heavy processing in the onSensorChanged() method. Your app might miss callbacks or go into the ANR (Application Not Responding) state, and in the worst cases it might even crash if this callback is blocked.
  • Every registration of an event listener should be paired with the unregistration of the same listener, done at the right time and place (more on this in the next chapter).
  • Avoid using deprecated sensors and any deprecated APIs.
  • Never write application logic based on the delay between sensor events. Always use the timestamp from the sensor event for your time-related calculations.
  • If a sensor is mandatory for your application to function, use the uses-feature filter in the AndroidManifest.xml file and set the required value to true.
  • Check your application and its sensor behavior on more than one device, as the sensor values and ranges may vary between devices.

Summary

We looked at the important concepts of sensors: their types, values, and common uses. The best practices discussed in this chapter will save you from the common errors and mistakes that developers make while writing code for sensors. It is advisable to give a second thought to selecting the right sampling period for a sensor before using it in your code.

This chapter prepared you to dive deep into the Android world of sensors. In the next chapter, we will take a closer look at the classes, interfaces, and methods for accessing sensors, and we will also start writing the code for sensors.


Key benefits

  • Get a thorough understanding of the fundamentals and framework of Android sensors.
  • Acquire knowledge of advanced sensor programming, and learn how to connect and use sensors in external devices such as the Android Watch, Polar heart rate monitors, Adidas speed cells, and so on.
  • Learn from real-world sensor-based applications such as the Pedometer app to detect daily steps, the Driving app to detect driving events, and the Professional Fitness tracker app to track heart rate, weight, daily steps, calories burned, and so on.

Description

Android phones available in today’s market have a wide variety of powerful and highly precise sensors. Interesting applications can be built with them such as a local weather app using weather sensors, analyzing risky driving behavior using motion sensors, a fitness tracker using step-counter sensors, and so on. Sensors in external devices such as Android Watch, Body Analyzer & Weight Machine, Running Speed Cell, and so on can also be connected and used from your Android app running on your phone. Moving further, this book will provide the skills required to use sensors in your Android applications. It will walk you through all the fundamentals of sensors and will provide a thorough understanding of the Android Sensor Framework. You will also get to learn how to write code for the supportive infrastructure such as background services, scheduled and long running background threads, and databases for saving sensor data. Additionally, you will learn how to connect and use sensors in external devices from your Android app using the Google Fit platform. By the end of the book, you will be well versed in the use of Android sensors and programming to build interactive applications.

Who is this book for?

This book is targeted at Android developers who want to get a good understanding of sensors and write sensor-based applications, or who want to enhance their existing applications with additional sensor functionality. A basic knowledge of Android development is required.

What you will learn

  • Learn about sensor fundamentals, the different types of sensors, and the sensor coordinate system
  • Understand the various classes, callbacks, and APIs of the Android sensor framework
  • Check all the available sensors on an Android device and learn their individual capabilities, for example, their range of values, power consumption, and so on
  • Implement sensor fusion using two or more sensors together, and learn to compensate for the weakness of one sensor by using the strength of another
  • Build a variety of sensor-based, real-world applications, such as Weather, Pedometer, Compass, Driving Event Detection, Fitness Tracker, and so on
  • Get to know about wake-up and non-wake-up sensors, wake locks, and how to use sensor batch processing along with the sensor hardware FIFO queue
  • Develop battery- and processor-efficient algorithms using raw sensor data to solve real-world problems
  • Connect to a variety of remote sensors, such as body weight measurement and body fat percentage measurement, using the Google Fit platform from your Android app
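One of the bullets above, checking the available sensors on a device and their capabilities, is done on Android through the SensorManager class, which only runs on a device or emulator. The plain-Java sketch below mimics that enumeration pattern with hypothetical `FakeSensor` and `FakeSensorManager` stand-ins for `android.hardware.Sensor` and `android.hardware.SensorManager`; all names and values here are illustrative, not real framework output.

```java
import java.util.Arrays;
import java.util.List;

// Plain-Java sketch of Android's sensor-enumeration pattern.
// FakeSensor and FakeSensorManager are illustrative stand-ins for
// android.hardware.Sensor and android.hardware.SensorManager.
public class SensorListSketch {

    static class FakeSensor {
        final String name;
        final float maximumRange; // sensor's maximum value, in its own units
        final float power;        // current draw in mA, as Android reports it

        FakeSensor(String name, float maximumRange, float power) {
            this.name = name;
            this.maximumRange = maximumRange;
            this.power = power;
        }
    }

    static class FakeSensorManager {
        // On a real device this list comes from the hardware; here it is hard-coded.
        List<FakeSensor> getSensorList() {
            return Arrays.asList(
                    new FakeSensor("Accelerometer", 39.2f, 0.23f),
                    new FakeSensor("Gyroscope", 34.9f, 6.1f),
                    new FakeSensor("Light", 10000f, 0.75f));
        }
    }

    public static void main(String[] args) {
        FakeSensorManager manager = new FakeSensorManager();
        // Iterate over every available sensor and print its capabilities.
        for (FakeSensor s : manager.getSensorList()) {
            System.out.printf("%s: range=%.1f, power=%.2f mA%n",
                    s.name, s.maximumRange, s.power);
        }
    }
}
```

On a real device, the equivalent calls are `getSystemService(Context.SENSOR_SERVICE)` to obtain a `SensorManager` and `sensorManager.getSensorList(Sensor.TYPE_ALL)` to enumerate sensors, each of which exposes methods such as `getMaximumRange()` and `getPower()`; the iterate-and-inspect pattern is the same as above.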

Product Details

Publication date : Apr 29, 2016
Length: 194 pages
Edition : 1st
Language : English
ISBN-13 : 9781785284663





Table of Contents

7 Chapters
1. Sensor Fundamentals
2. Playing with Sensors
3. The Environmental Sensors – The Weather Utility App
4. The Light and Proximity Sensors
5. The Motion, Position, and Fingerprint Sensors
6. The Step Counter and Detector Sensors – The Pedometer App
7. The Google Fit Platform and APIs – The Fitness Tracker App

Customer reviews

Rating distribution
4.8 out of 5
(4 Ratings)
5 star 75%
4 star 25%
3 star 0%
2 star 0%
1 star 0%
Sagar Ahuja, May 05, 2016 (5/5)
Quite useful book with handful examples..
Amazon Verified review

Ankita Ahuja, May 06, 2016 (5/5)
Reached on the third chapter, so far it is pretty nice and systematic. I like the examples and full explanation of the source code, as it help beginners like me to understand and connect the pieces together.
Amazon Verified review

Akhilesh Mani, May 05, 2016 (5/5)
This book offers a really complete coverage of the Android Sensors. I am still reading through it, and the learning curve for me is rather steep, even though I'm an experienced programmer.
Amazon Verified review

Paul in Idaho, Feb 02, 2019 (4/5)
This is a good book for someone who's familiar with Android programming but is just starting out with sensors. The author leaves out a few details that a beginner would have to search out. The example applications could be a little more focused.
Amazon Verified review

FAQs

How do I buy and download an eBook?

Where there is an eBook version of a title available, you can buy it from the book details for that title. Add either the standalone eBook or the eBook and print book bundle to your shopping cart. Your eBook will show in your cart as a product on its own. After completing checkout and payment in the normal way, you will receive your receipt on the screen containing a link to a personalised PDF download file. This link will remain active for 30 days. You can download backup copies of the file by logging in to your account at any time.

If you already have Adobe reader installed, then clicking on the link will download and open the PDF file directly. If you don't, then save the PDF file on your machine and download the Reader to view it.

Please Note: Packt eBooks are non-returnable and non-refundable.

Packt eBook and Licensing

When you buy an eBook from Packt Publishing, completing your purchase means you accept the terms of our licence agreement. Please read the full text of the agreement. In it, we have tried to balance the reader's need for the eBook to be usable with our need to protect our rights as publishers and the rights of our authors. In summary, the agreement says:

  • You may make copies of your eBook for your own use onto any machine
  • You may not pass copies of the eBook on to anyone else
How can I make a purchase on your website?

If you want to purchase a video course, eBook, or bundle (print + eBook), please follow the steps below:

  1. Register on our website using your email address and a password.
  2. Search for the title by name or ISBN using the search option.
  3. Select the title you want to purchase.
  4. Choose the format you wish to purchase the title in; if you order the print book, you get a free eBook copy of the same title.
  5. Proceed with the checkout process (payment can be made using a credit card, debit card, or PayPal).
Where can I access support around an eBook?
  • If you experience a problem using or installing Adobe Reader, contact Adobe directly.
  • To view the errata for the book, see www.packtpub.com/support and view the pages for the title you have.
  • To view your account details or to download a new copy of the book go to www.packtpub.com/account
  • To contact us directly if a problem is not resolved, use www.packtpub.com/contact-us
What eBook formats do Packt support?

Our eBooks are currently available in a variety of formats such as PDF and ePub. In the future this may well change with trends and developments in technology, but please note that our PDFs are not in the Adobe eBook Reader format, which has greater restrictions on security.

You will need to use Adobe Reader v9 or later in order to read Packt's PDF eBooks.

What are the benefits of eBooks?
  • You can get the information you need immediately
  • You can easily take them with you on a laptop
  • You can download them an unlimited number of times
  • You can print them out
  • They are copy-paste enabled
  • They are searchable
  • There is no password protection
  • They are lower in price than print editions
  • They save resources and space
What is an eBook?

Packt eBooks are a complete electronic version of the print edition, available in PDF and ePub formats. Every piece of content down to the page numbering is the same. Because we save the costs of printing and shipping the book to you, we are able to offer eBooks at a lower cost than print editions.

When you have purchased an eBook, simply log in to your account and click on the link in Your Download Area. We recommend saving the file to your hard drive before opening it.

For optimal viewing of our eBooks, we recommend you download and install the free Adobe Reader version 9.