TinyML Cookbook

Combine artificial intelligence and ultra-low-power embedded devices to make the world smarter

Product type: Paperback
Published: April 2022
Publisher: Packt
ISBN-13: 9781801814973
Length: 344 pages
Edition: 1st
Author: Gian Marco Iodice
Table of Contents

Preface
Chapter 1: Getting Started with TinyML
Chapter 2: Prototyping with Microcontrollers
Chapter 3: Building a Weather Station with TensorFlow Lite for Microcontrollers
Chapter 4: Voice Controlling LEDs with Edge Impulse
Chapter 5: Indoor Scene Classification with TensorFlow Lite for Microcontrollers and the Arduino Nano
Chapter 6: Building a Gesture-Based Interface for YouTube Playback
Chapter 7: Running a Tiny CIFAR-10 Model on a Virtual Platform with the Zephyr OS
Chapter 8: Toward the Next TinyML Generation with microNPU
Other Books You May Enjoy

What this book covers

Chapter 1, Getting Started with TinyML, provides an overview of TinyML, presenting the opportunities and challenges of bringing ML to extremely low-power microcontrollers. This chapter focuses on the fundamental elements behind ML, power consumption, and microcontrollers that make TinyML unique and different from conventional ML in the cloud, on desktops, or even on smartphones.

Chapter 2, Prototyping with Microcontrollers, presents concise and straightforward recipes for the relevant microcontroller programming basics. We will cover code debugging and how to transmit data to the Arduino serial monitor. After that, we will discover how to program GPIO peripherals with the Arm Mbed API and use a breadboard to connect external components, such as LEDs and push-buttons. In the end, we will see how to power the Arduino Nano 33 BLE Sense and the Raspberry Pi Pico with batteries.
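
On the host side, the serial monitor can be replaced by a few lines of Python with pyserial. The sketch below is only an illustrative companion to this chapter, not code from the book: the port name /dev/ttyACM0 and the assumption that the board prints text with Serial.println() are hypothetical.

import serial  # pyserial

# Hypothetical port name: on Linux, the Arduino Nano 33 BLE Sense and the
# Raspberry Pi Pico typically enumerate as /dev/ttyACM0 (COMx on Windows).
with serial.Serial("/dev/ttyACM0", baudrate=115200, timeout=1) as ser:
    for _ in range(20):
        line = ser.readline().decode("utf-8", errors="ignore").strip()
        if line:
            print(line)  # echo whatever the sketch sends with Serial.println()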

Chapter 3, Building a Weather Station with TensorFlow Lite for Microcontrollers, guides you through all the development stages of a TensorFlow-based application for microcontrollers and teaches you how to acquire temperature and humidity sensor data. The application developed in the chapter is an ML-based weather station for snow forecasts.

In the first part, we will focus on dataset preparation by acquiring historical weather data from WorldWeatherOnline. After that, we will present the relevant basics to train and test a model with TensorFlow. In the end, we will deploy the model on the Arduino Nano 33 BLE Sense and the Raspberry Pi Pico with TensorFlow Lite for Microcontrollers.
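
As a rough idea of the TensorFlow side of this chapter, here is a minimal Keras sketch for a binary snow/no-snow classifier fed with scaled temperature and humidity values. The placeholder data and layer sizes are illustrative assumptions, not the book's exact model.

import numpy as np
import tensorflow as tf

# Placeholder dataset: scaled (temperature, humidity) pairs and snow/no-snow labels.
# In the chapter, these would come from the WorldWeatherOnline historical data.
x_train = np.random.rand(1000, 2).astype("float32")
y_train = np.random.randint(0, 2, size=(1000, 1)).astype("float32")

# Tiny fully connected network, small enough for a microcontroller
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(12, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=20, batch_size=32, validation_split=0.2)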

Chapter 4, Voice Controlling LEDs with Edge Impulse, shows how to develop an end-to-end keyword spotting (KWS) application with Edge Impulse and get familiar with audio data acquisition and analog-to-digital converter (ADC) peripherals. The application considered for this chapter uses voice commands to control the LED color (red, green, and blue) and the number of times it blinks (one, two, or three).

In the first part, we will focus on the dataset preparation, showing how to acquire audio data with a mobile phone. After that, we will design a model using the mel-frequency cepstral coefficient (MFCC) features and optimize the performance with EON Tuner. In the end, we will finalize the KWS application on the Arduino Nano 33 BLE Sense and the Raspberry Pi Pico.
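
To give a feel for what MFCC features are, independently of Edge Impulse's own implementation, here is a small librosa sketch computed on a synthetic one-second tone. The frame settings are illustrative assumptions, not the values used in the chapter.

import numpy as np
import librosa

sr = 16000  # 16 kHz sampling rate, common for keyword spotting
t = np.linspace(0, 1.0, sr, endpoint=False)
y = 0.5 * np.sin(2 * np.pi * 440.0 * t).astype(np.float32)  # placeholder tone instead of a recorded keyword

# 13 MFCCs per frame, ~32 ms analysis windows with 50% overlap (illustrative settings)
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13, n_fft=512, hop_length=256)
print(mfcc.shape)  # (13, number_of_frames): the 2D feature map fed to the classifier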

Chapter 5, Indoor Scene Classification with TensorFlow Lite for Microcontrollers and the Arduino Nano, aims to show you how to apply transfer learning with TensorFlow and get familiar with best practices for using a camera module with a microcontroller. For the purpose of this chapter, we will develop an application to recognize indoor environments with the Arduino Nano 33 BLE Sense and the OV7670 camera module.

In the first part, we will see how to acquire images from the OV7670 camera module. After that, we will focus on the model design, applying transfer learning with Keras to recognize kitchens and bathrooms. In the end, we will deploy the quantized TensorFlow Lite model on the Arduino Nano 33 BLE Sense with the help of TensorFlow Lite for Microcontrollers.
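
A minimal transfer learning sketch in Keras is shown below, assuming a MobileNetV2 backbone at 96x96 resolution with a frozen base and a two-class head; the book's exact backbone, resolution, and training setup may differ.

import tensorflow as tf

# Small ImageNet-pretrained backbone, frozen so that only the new head is trained
base = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), alpha=0.35, include_top=False, weights="imagenet"
)
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(2, activation="softmax"),  # kitchen vs. bathroom
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()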

Chapter 6, Building a Gesture-Based Interface for YouTube Playback, develops an end-to-end gesture recognition application with Edge Impulse and the Raspberry Pi Pico to get you acquainted with inertial sensors, teach you how to use I2C peripherals, and show you how to write a multithreaded application in Arm Mbed OS.

In the first part, we will collect the accelerometer data through the Edge Impulse data forwarder to prepare the dataset. After that, we will design a model using features in the frequency domain to recognize three gestures. In the end, we will deploy the application on the Raspberry Pi Pico and implement a Python program with the PyAutoGUI library to build a touchless interface for YouTube video playback.
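
The host-side part of the touchless interface can be sketched in a few lines of Python. The serial port name, the gesture labels, and the label-to-shortcut mapping below are assumptions for illustration; the mapping relies on YouTube's standard k/j/l keyboard shortcuts.

import pyautogui
import serial  # pyserial

# Hypothetical port and gesture labels: the Pico is assumed to print one label per line
ser = serial.Serial("/dev/ttyACM0", baudrate=115200, timeout=1)
KEYMAP = {"pause": "k", "backward": "j", "forward": "l"}  # YouTube playback shortcuts

while True:
    label = ser.readline().decode("utf-8", errors="ignore").strip()
    if label in KEYMAP:
        pyautogui.press(KEYMAP[label])  # send the shortcut to the focused YouTube tab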

Chapter 7, Running a Tiny CIFAR-10 Model on a Virtual Platform with the Zephyr OS, provides best practices to build tiny models for memory-constrained microcontrollers. In this chapter, we will be designing a model for the CIFAR-10 image classification dataset on a virtual Arm Cortex-M3-based microcontroller.

In the first part, we will install Zephyr, the primary framework used in this chapter to accomplish our task. After that, we will design a tiny quantized CIFAR-10 model with TensorFlow. This model will fit on a microcontroller with only 256 KB of program memory and 64 KB of RAM. In the end, we will build an image classification application with TensorFlow Lite for Microcontrollers and the Zephyr OS and run it on a virtual platform using Quick Emulator (QEMU).
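
The full-integer quantization step can be summarized with the standard TensorFlow Lite converter flow sketched below. The placeholder network and random calibration images stand in for the chapter's trained CIFAR-10 model and real calibration samples.

import numpy as np
import tensorflow as tf

# Placeholder network and calibration data standing in for the trained CIFAR-10 model
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
calibration_images = np.random.rand(200, 32, 32, 3).astype("float32")

def representative_dataset():
    # The converter uses these samples to calibrate the int8 quantization ranges
    for image in calibration_images:
        yield [image[None, ...]]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("cifar10_int8.tflite", "wb") as f:
    f.write(converter.convert())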

Chapter 8, Toward the Next TinyML Generation with microNPU, helps familiarize you with microNPUs, a new class of processors designed for ML workloads on edge devices. In this chapter, we will be running a quantized CIFAR-10 model on a virtual Arm Ethos-U55 microNPU with the help of TVM.

In the first part, we will learn how the Arm Ethos-U55 microNPU works and install the software dependencies to build and run the model on the Arm Corstone-300 fixed virtual platform. After that, we will use the TVM compiler to convert a pretrained TensorFlow Lite model to C code. In the end, we will show how to compile and deploy the code generated by TVM on the Arm Corstone-300 to perform inference with the Arm Ethos-U55 microNPU.
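
As a rough sketch of the TVM import step only, the snippet below loads a quantized TFLite model into Relay and compiles it to C source. The file name, input name, shape, and dtype are assumptions, and the additional configuration needed to offload operators to the Ethos-U55 and run on Corstone-300 is not shown here.

import tflite  # flatbuffer bindings from the 'tflite' pip package
import tvm
from tvm import relay

# Hypothetical path to a pretrained, int8-quantized CIFAR-10 model
with open("cifar10_int8.tflite", "rb") as f:
    tflite_model = tflite.Model.GetRootAsModel(f.read(), 0)

# Import into Relay; the input tensor name, shape, and dtype are assumptions
mod, params = relay.frontend.from_tflite(
    tflite_model,
    shape_dict={"input": (1, 32, 32, 3)},
    dtype_dict={"input": "int8"},
)

# Compile to plain C source; offloading to the Arm Ethos-U55 requires TVM's
# microNPU support and the Corstone-300 toolchain, which are beyond this sketch
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target="c", params=params)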
