This chapter was the glue between the previous two chapters. We began by taking a deep dive into how our Node server communicates with native libraries. You learned about the asynchronous model that Node uses to make I/O operations more efficient. This model is important to understand because it carries forward into the coming chapters as well.
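As a quick reminder of that model, here is a minimal sketch, not code from this chapter; the file path and log messages are only placeholders for illustration:

```js
// Minimal illustration of Node's asynchronous I/O model: the read is handed
// off to libuv's thread pool, and the callback runs once the data is ready,
// so the event loop stays free to handle other work in the meantime.
const fs = require('fs');

fs.readFile('/etc/hostname', 'utf8', (err, data) => {
  if (err) {
    console.error('read failed:', err);
    return;
  }
  console.log('hostname:', data.trim());
});

// This line runs immediately, before the callback above fires.
console.log('read scheduled, event loop continues');
```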
We then integrated the sensor API from the previous chapter with our Node server and saw some of its limitations due to the hardware and its sampling rate. Caching turned out to be a convenient and effective way to work around these limitations, as sketched below. The kind of local in-memory caching used in this chapter, however, would not be effective in a system with multiple Raspberry Pi boards and sensors working together. In the upcoming chapters, you will learn about a more production-grade model to store and read...
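The caching pattern itself boils down to remembering the last reading and serving it until it goes stale. The sketch below is hedged: `readSensor` is a hypothetical stand-in for the sensor API from the previous chapter, and the two-second freshness window is an assumption, not a value from this chapter.

```js
// Hedged sketch of the in-memory caching idea discussed in this chapter.
// `readSensor` is a hypothetical function; the 2-second TTL is an assumption.
const CACHE_TTL_MS = 2000; // assume the sensor cannot be sampled faster than this

let cached = null; // last reading returned by the sensor
let cachedAt = 0;  // timestamp of that reading

async function getReading(readSensor) {
  const now = Date.now();
  // Serve the cached value while it is still fresh, instead of hitting the
  // hardware again and running into its sampling-rate limit.
  if (cached !== null && now - cachedAt < CACHE_TTL_MS) {
    return cached;
  }
  cached = await readSensor();
  cachedAt = now;
  return cached;
}
```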