On-device inference with TensorFlow Lite for Microcontrollers
Here we are, ready to dive into our first ML application on microcontrollers.
This recipe will guide us through deploying the trained model using TensorFlow Lite for Microcontrollers (tflite-micro) on the Arduino Nano and Raspberry Pi Pico.
Getting ready
Tflite-micro is a component of TensorFlow Lite designed by Google and the open-source community specifically to run ML models on microcontrollers and other devices with only a few kilobytes of memory.
Theoretically, nothing prevents you from using tflite-micro to run ML models on your laptop. However, it may not perform well since tflite-micro is optimized for low-resource devices such as microcontrollers.
Running a model with TensorFlow Lite or tflite-micro typically consists of the following steps:
- Loading the model: We load the weights and network architecture stored in the TensorFlow Lite model.
- Preparing the input data: We convert the...
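These steps map onto concrete tflite-micro API calls. The following is a minimal C++ sketch, not the recipe's final code: the model array name (`g_model`), the tensor arena size, and the fully-connected-only op resolver are assumptions for illustration, and would need to match your exported model.

```cpp
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

// Model exported as a C byte array (hypothetical name).
extern const unsigned char g_model[];

// Scratch memory for input, output, and intermediate tensors.
// The size is model-dependent; 8 KB here is a placeholder.
constexpr int kTensorArenaSize = 8 * 1024;
alignas(16) static uint8_t tensor_arena[kTensorArenaSize];

void run_inference(const float* features, int num_features) {
  // 1. Load the model: parse weights and architecture from the flatbuffer.
  const tflite::Model* model = tflite::GetModel(g_model);

  // Register only the operators the model actually uses
  // (a single fully-connected layer is assumed here).
  static tflite::MicroMutableOpResolver<1> resolver;
  resolver.AddFullyConnected();

  static tflite::MicroInterpreter interpreter(model, resolver,
                                              tensor_arena, kTensorArenaSize);
  interpreter.AllocateTensors();

  // 2. Prepare the input data: copy the features into the input tensor.
  TfLiteTensor* input = interpreter.input(0);
  for (int i = 0; i < num_features; ++i) {
    input->data.f[i] = features[i];
  }

  // Run inference and read back the result.
  interpreter.Invoke();
  TfLiteTensor* output = interpreter.output(0);
  float prediction = output->data.f[0];
  (void)prediction;  // e.g., print over serial on the Arduino Nano or Pico
}
```

The static allocations matter on microcontrollers: tflite-micro performs no dynamic memory allocation at inference time, so all tensor memory must come from the arena you provide up front.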