TensorFlow models can also be used in applications running on mobile and embedded platforms. TensorFlow Lite and TensorFlow Mobile are two flavors of TensorFlow for resource-constrained mobile devices. TensorFlow Lite supports only a subset of the functionality offered by TensorFlow Mobile, but it delivers better performance thanks to its smaller binary size and fewer dependencies.
To integrate TensorFlow into your application, first train a model using the techniques we cover throughout the book, and then save the model. The saved model can then be used for inference and prediction in the mobile application.
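As a minimal sketch of this train-save-convert workflow, assuming a recent TensorFlow 2.x release and its `tf.lite.TFLiteConverter` API (the exact save-and-convert calls differ across TensorFlow versions, and the tiny model here is purely illustrative):

```python
import tensorflow as tf

# Build and compile a trivial model for illustration.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(3, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
# model.fit(x_train, y_train, epochs=5)  # train on your own data

# Save the trained model in the SavedModel format.
tf.saved_model.save(model, 'saved_model_dir')

# Convert the saved model into a TensorFlow Lite flatbuffer
# that can be bundled with a mobile app for on-device inference.
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model_dir')
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
```

The resulting `model.tflite` file is what gets shipped with the mobile app; the app loads it with the platform's TensorFlow Lite interpreter to run predictions on the device.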
To learn how to use TensorFlow models on mobile devices, in this chapter we cover the following topics:
- TensorFlow on mobile platforms
- TF Mobile in Android apps
- TF Mobile demo on Android
- TF Mobile demo on iOS
- TensorFlow...