TensorFlow Lite Developer Preview is Here

  • 3 min read
  • 15 Nov 2017

Team TensorFlow has announced the developer preview of TensorFlow Lite, a lightweight version of TensorFlow for mobile and embedded devices, first unveiled at the I/O developer conference.

TensorFlow has been a popular framework, grabbing everyone's attention since its inception. Its adoption spans everything from enormous server racks to tiny IoT (Internet of Things) devices, and now it is mobile and embedded devices' turn. Since TensorFlow Lite made its debut in May, several competitors have also released their own take on AI for mobile; Apple's Core ML and the cloud service from Clarifai are two popular examples.

TensorFlow Lite is available for both Android and iOS devices.

TensorFlow Lite is designed to be:

  • Lightweight: enables inference of on-device machine learning models with a small binary size, allowing fast initialization and startup.
  • Fast: model loading times are dramatically improved, and hardware acceleration is supported.
  • Cross-platform: includes a runtime tailor-made to run on many different platforms, starting with Android and iOS.

Recently, there has been an increase in the number of mobile devices that use custom-built hardware to carry out ML workloads efficiently. Keeping this in mind, TensorFlow Lite also supports the Android Neural Networks API, so that it can take advantage of these new accelerators.

Another feature of TensorFlow Lite is that when accelerator hardware is not available, it falls back to optimized CPU execution. This ensures that your models still run fast on a large set of devices and keeps inference latency low for on-device ML models.

Let’s now have a look at the lightweight architecture:

[Image: TensorFlow Lite architecture diagram]

Source: https://www.tensorflow.org/mobile/tflite/


Starting from the top and moving down, the architecture consists of:

  • A trained TensorFlow model, saved on disk.
  • A TensorFlow Lite Converter, a program that converts the TensorFlow model into the TensorFlow Lite format (a conversion sketch follows this list).
  • A TensorFlow Lite Model File, a file format based on FlatBuffers that is optimized for maximum speed and minimum size.
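
To make the conversion step concrete, here is a minimal sketch of the pipeline in Python. It uses the tf.lite.TFLiteConverter API from later TensorFlow releases; the developer preview exposed conversion differently, so treat the exact entry point, along with the ./saved_model and model.tflite paths, as assumptions for illustration.

```python
import tensorflow as tf

# Assumption: a trained model has already been exported to disk as a
# SavedModel under ./saved_model (step 1 of the pipeline above).
converter = tf.lite.TFLiteConverter.from_saved_model("./saved_model")

# Step 2: convert the TensorFlow graph to the TensorFlow Lite format.
tflite_model = converter.convert()

# Step 3: write out the FlatBuffers-based model file, ready to be
# bundled with a mobile app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```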

Further down the architecture, one can see how the TensorFlow Lite model file is deployed onto Android and iOS applications. Within each mobile application there is a C++ API and an Interpreter, plus, on Android, a Java API that acts as a convenience wrapper around the C++ API. Developers can also choose to implement custom kernels with the C++ API, which the Interpreter can then use.
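
On the device itself inference runs through the Java or C++ API, but TensorFlow also ships a Python interpreter that mirrors them closely, which makes for a compact illustration. A minimal sketch, assuming a model.tflite file produced as above (and, again, using the later tf.lite module name rather than the developer preview's):

```python
import numpy as np
import tensorflow as tf

# Load the converted model file and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input with the shape and dtype the model expects.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)

# Run inference and read back the result.
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
print(result.shape)
```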

TensorFlow Lite also comes with support for a number of models trained and optimized for mobile devices:

  • MobileNet: a vision model able to identify objects across 1,000 different classes, designed specifically for efficient execution on mobile and embedded devices.
  • Inception v3: an image recognition model similar in functionality to MobileNet; though larger in size, it offers higher accuracy.
  • Smart Reply: an on-device conversational model that provides one-touch replies to incoming chat messages. Many Android Wear devices offer this feature within their messaging apps.

Both Inception v3 and MobileNet are trained on the ImageNet dataset, so either model can easily be retrained on your own image dataset via transfer learning, as sketched below.
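
As a rough illustration of that retraining step, here is a minimal transfer-learning sketch in modern tf.keras (not the 2017-era retraining scripts); the data/ directory layout, image size, and hyperparameters are all assumptions:

```python
import tensorflow as tf

# Assumption: images are organised on disk as data/<class_name>/*.jpg.
train = tf.keras.utils.image_dataset_from_directory(
    "data", image_size=(224, 224), batch_size=32)

# Reuse the ImageNet-trained MobileNet as a frozen feature extractor.
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3), include_top=False,
    weights="imagenet", pooling="avg")
base.trainable = False

# Train only a small classification head on the new classes.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNet expects [-1, 1]
    base,
    tf.keras.layers.Dense(len(train.class_names), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train, epochs=5)
```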

TensorFlow already has a TensorFlow Mobile API that supports mobile and embedded deployment of models. The obvious question then is, why TensorFlow Lite? Well, team TensorFlow’s answer to this on their official blog post is, “Going forward, TensorFlow Lite should be seen as the evolution of TensorFlow Mobile, and as it matures it will become the recommended solution for deploying models on mobile and embedded devices. With this announcement, TensorFlow Lite is made available as a developer preview, and TensorFlow Mobile is still there to support production apps.”

For more information on TensorFlow Lite, you can visit the official documentation at https://www.tensorflow.org/mobile/tflite/.