Caching results with Joblib
If your model takes time to make predictions, it may be worth caching the results: if the prediction for a particular input has already been computed, it makes sense to return the result we saved on disk rather than running the computation again. In this section, we’ll learn how to do this with the help of Joblib.
Joblib provides a very convenient and easy-to-use tool for this, so the implementation is quite straightforward. The main question will be whether we should choose standard or async functions to implement the endpoints and dependencies, which will give us an opportunity to explain some of FastAPI’s technical details in more depth.
We’ll build upon the example we provided in the previous section. The first thing we must do is instantiate the Joblib Memory class, which is the helper for caching function results. Then, we can add a decorator to the functions we want to cache. You can see this in the following example...
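The pattern described above can be sketched as follows. This is a minimal, standalone illustration of the Joblib caching mechanism, not the book’s actual example: the cache location and the `slow_prediction` function are placeholders standing in for a real model call.

```python
from joblib import Memory

# Instantiate Memory, pointing it at a directory where cached
# results will be stored ("cache" is an arbitrary choice here).
memory = Memory(location="cache", verbose=0)


@memory.cache
def slow_prediction(x: float) -> float:
    # Stand-in for an expensive model inference call.
    return x * 2


# The first call runs the computation and persists the result on
# disk; subsequent calls with the same argument load it from the
# cache instead of recomputing.
print(slow_prediction(21.0))
print(slow_prediction(21.0))
```

Because the cache lives on disk, it also survives application restarts, which is especially useful for a web server that may be redeployed frequently.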