Machine learning is on the cusp of its next wave, in which we aim to make ML ubiquitous in everyday life. Running models directly on a device offers several advantages, such as offline access and data privacy.
In this chapter, we looked at TensorFlow Lite, a library from Google that has been optimized for deploying ML models on mobile and embedded devices. We studied the architecture of TensorFlow Lite, in which a trained TensorFlow model is converted into the .tflite format, which is designed for fast, low-memory inference on the device. TensorFlow Lite also supports multiple platforms, such as Android, iOS, Linux, and Raspberry Pi.
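To give a feel for what on-device inference looks like, here is a minimal sketch of loading a converted .tflite file and running it with the TensorFlow Lite Interpreter from Python; the model_path and the all-zeros input are hypothetical placeholders, and on Android or iOS the platform's equivalent Interpreter API would be used instead:

```python
import numpy as np
import tensorflow as tf

# Load a previously converted .tflite model (file name is hypothetical).
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape and run inference.
dummy_input = np.zeros(input_details[0]["shape"], dtype=np.float32)
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()

prediction = interpreter.get_tensor(output_details[0]["index"])
print(prediction)
```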
Next, we used the MNIST handwritten digit dataset to train a deep learning model. We then followed the steps needed to convert the trained model into the .tflite format (a condensed sketch of this workflow appears after the list). The steps are as follows:
- Froze the graph with variables converted to constants...
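As a rough illustration of the train-and-convert workflow summarized above, the sketch below trains a small Keras classifier on MNIST and converts it with tf.lite.TFLiteConverter. Note that this uses the TF 2.x converter API, where freezing variables to constants is handled internally by the converter, whereas the chapter's steps perform graph freezing explicitly; the model architecture and output file name here are assumptions, not the chapter's exact model.

```python
import tensorflow as tf

# Load and normalize the MNIST handwritten digit dataset.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small dense classifier for the 10 digit classes.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))

# Convert the trained model to .tflite; variable-to-constant freezing
# happens inside the converter in TF 2.x.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("mnist.tflite", "wb") as f:
    f.write(converter.convert())
```

The resulting mnist.tflite flatbuffer is what gets bundled with a mobile or embedded application and loaded by the Interpreter shown earlier.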