Summary
The recipes presented in this chapter demonstrated how to build an end-to-end gesture recognition application with Edge Impulse and the Raspberry Pi Pico.
Initially, we learned how to connect an IMU sensor to the microcontroller over the I2C communication protocol and use the Mbed OS API to read the accelerometer measurements.
Afterward, we delved into dataset preparation, model design, and model training with Edge Impulse. Here, we introduced spectral features, the key ingredient for obtaining a compact gesture recognition model. Then, we implemented a multithreaded program with Arm Mbed OS to run model inference concurrently with accelerometer data acquisition.
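On the device, one Mbed OS thread samples the accelerometer while another runs inference on the buffered data. The same producer-consumer pattern can be sketched in Python with the standard `threading` and `queue` modules for illustration (the function names and fake readings here are ours, not from the chapter's Mbed OS code):

```python
import queue
import threading

samples = queue.Queue(maxsize=64)   # buffer between the sampler and inference threads

def sampler(n):
    # Stand-in for the I2C accelerometer read loop on the sampling thread.
    for i in range(n):
        samples.put((i, i, i))      # fake (x, y, z) reading
    samples.put(None)               # sentinel: no more data

def inference():
    # Stand-in for the inference loop: consume readings as they arrive.
    count = 0
    while samples.get() is not None:
        count += 1                  # the real program would run the classifier here
    return count

t = threading.Thread(target=sampler, args=(100,))
t.start()
processed = inference()
t.join()
```

The bounded queue decouples the two loops: sampling never blocks on a slow inference pass unless the buffer fills, which mirrors the mailbox-style hand-off used between Mbed OS threads.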
Finally, we concluded the project by developing a Python script that leverages the PyAutoGUI library to simulate keyboard input and control YouTube playback.
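The heart of such a script is a mapping from recognized gesture labels to YouTube keyboard shortcuts, dispatched through `pyautogui.press`. A minimal sketch follows; the gesture names and the `send_key_for_gesture` helper are illustrative assumptions, not the chapter's exact code:

```python
# Hypothetical mapping from gesture labels to YouTube keyboard shortcuts.
GESTURE_TO_KEY = {
    "circle": "k",   # play/pause
    "cross": "m",    # mute/unmute
}

def send_key_for_gesture(label, press=None):
    """Look up the shortcut for a gesture and press it.

    `press` defaults to pyautogui.press; it is injectable so the
    mapping logic can be exercised without a display attached."""
    key = GESTURE_TO_KEY.get(label)
    if key is None:
        return None                 # unrecognized gesture: do nothing
    if press is None:
        import pyautogui            # deferred import; requires a GUI session
        press = pyautogui.press
    press(key)
    return key
```

In the full application, the label would arrive over the serial port from the Pico, and the script would call `send_key_for_gesture` on each classification result.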
In this chapter, we began discussing how to build compact ML models...