TensorFlow Lite is a framework for running TensorFlow models on mobile and embedded devices, with support for Android, iOS, and Raspberry Pi. Unlike Core ML on iOS, it is not a native system library but an external dependency that must be added to your app.
Whereas Core ML is optimized for iOS device hardware, TensorFlow Lite performance may vary from device to device. On some Android devices, it can use the GPU to speed up inference.
To use TensorFlow Lite in our example application, we will first convert our model to the library's format using the TensorFlow Lite converter.
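The conversion step can be sketched as follows. This is a minimal illustration assuming a Keras model; the small `Sequential` model here is a stand-in for our example application's actual model, which would be loaded instead:

```python
import tensorflow as tf

# Stand-in Keras model for illustration; in the example application
# you would load your trained model here instead.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# Convert the Keras model to the TensorFlow Lite flat-buffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Write the converted model to disk so it can be bundled with the app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `.tflite` file is then shipped as an asset inside the Android or iOS app, where the TensorFlow Lite interpreter loads it for on-device inference.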