The main version of TensorFlow is designed for Windows, Linux, and Mac computers. To run on other platforms, a different version of TensorFlow is necessary. TensorFlow Lite is designed to run model predictions (inference) on mobile phones and embedded devices. It consists of a converter, which transforms TensorFlow models into the required .tflite format, and an interpreter, which can be installed on mobile devices to run inference.
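As a quick illustration, the following sketch converts a small Keras model with the TFLite converter and then loads the result with the TFLite interpreter; the tiny model is a hypothetical placeholder, and on an actual phone the interpreter would run through the TensorFlow Lite runtime rather than the full TensorFlow package:

```python
import tensorflow as tf

# A placeholder Keras model standing in for a real, trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,)),
])

# Convert the model to the .tflite format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# Load the converted model with the TFLite interpreter and prepare it for inference.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
```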
More recently, TensorFlow.js (also referred to as tfjs) was developed to bring deep learning to almost any web browser. It does not require any installation on the user's side and can make use of the device's GPU for acceleration when one is available. We detail the use of TensorFlow Lite and TensorFlow.js in Chapter 9, Optimizing Models and Deploying on Mobile Devices.
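As a brief sketch, a Keras model can be exported to the TensorFlow.js web format from Python, assuming the tensorflowjs pip package is installed; the MobileNetV2 model below is only an arbitrary placeholder:

```python
import tensorflow as tf
import tensorflowjs as tfjs

# Any Keras model would do; an untrained MobileNetV2 serves as a placeholder here.
model = tf.keras.applications.MobileNetV2(weights=None)

# Export to the TensorFlow.js format (a model.json file plus binary weight shards),
# which a web page can then load and run directly in the browser.
tfjs.converters.save_keras_model(model, "web_model/")
```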