Packaging Deep RL agents for mobile and IoT devices using TensorFlow Lite
This recipe shows how you can leverage the open source TensorFlow Lite (TFLite) framework to serve your Deep RL agents on mobile, IoT, and embedded devices. We will implement a complete script to build, train, and export an agent model that you can load onto a mobile or embedded device. We will explore two methods for generating the TFLite model for our agent. The first method saves and exports the agent model in TensorFlow's SavedModel format and then converts it with the command-line converter. The second method uses the Python API to generate the TFLite model directly.
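As a preview of the two conversion routes, the following sketch assumes a TensorFlow 2.x environment and uses a small placeholder tf.keras network in place of the agent model that we will actually build and train later in this recipe; the directory name trained_agent and the output file agent.tflite are illustrative, not the names used by the book's script.

```python
import tensorflow as tf

# Placeholder for the trained agent/policy network built later in this recipe.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(4),  # e.g., one logit per discrete action
])

# Export the model in TensorFlow's SavedModel format (TF 2.x default).
model.save("trained_agent")

# Method 1: convert the SavedModel with the command-line converter, e.g.:
#   tflite_convert --saved_model_dir=trained_agent --output_file=agent.tflite

# Method 2: convert the SavedModel directly using the Python API.
converter = tf.lite.TFLiteConverter.from_saved_model("trained_agent")
tflite_model = converter.convert()
with open("agent.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting agent.tflite file is what you would bundle with a mobile or embedded application and load with the TFLite interpreter on the device.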
Let's get started!
Getting ready
To complete this recipe, you will first need to activate the tf2rl-cookbook Python/conda virtual environment. Make sure to update the environment to match the latest conda environment specification file (tfrl-cookbook.yml) in the cookbook's code repo. If the following imports...