With the help of distributed caching techniques, we can improve the scalability of our RESTful web services (web APIs). A distributed cache is stored across multiple nodes of a cluster and is shared by all application instances. It enhances a web service's throughput because repeated reads are served from the cache, avoiding I/O trips to the underlying data store.
This approach has the following advantages:
- Clients get consistent results, regardless of which application node serves the request
- The distributed cache runs as a separate remote process and is backed by a persistent store, so an application server restart or failure does not affect the cached data
- The source data store receives fewer requests
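As a minimal sketch of how this can look in practice, the following assumes a Spring Boot service with the cache abstraction and a Redis-backed cache (for example, `spring-boot-starter-cache` plus `spring-boot-starter-data-redis`) configured as the distributed store. The `ProductService`, `ProductRepository`, and cache name `products` are hypothetical and only for illustration; they are not part of the original text.

```java
import java.io.Serializable;

import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Service;

@Configuration
@EnableCaching // Enables Spring's cache abstraction; with a Redis cache manager,
               // entries live in the shared Redis cluster rather than in each
               // application node's local memory.
class CacheConfig {
}

@Service
class ProductService {

    private final ProductRepository repository;

    ProductService(ProductRepository repository) {
        this.repository = repository;
    }

    // The first call for a given id hits the data store; subsequent calls from
    // ANY node are served from the distributed cache, so the source data store
    // receives fewer requests.
    @Cacheable(cacheNames = "products", key = "#id")
    public Product findById(long id) {
        return repository.findById(id);
    }

    // Evict the entry on update so every node continues to see the same result.
    @CacheEvict(cacheNames = "products", key = "#product.id")
    public Product update(Product product) {
        return repository.save(product);
    }
}

// Minimal supporting types so the sketch is self-contained.
record Product(long id, String name) implements Serializable {
}

interface ProductRepository {
    Product findById(long id);
    Product save(Product product);
}
```

Because the cache lives in its own Redis process rather than inside the application's JVM, restarting the web service leaves the cached entries intact, which is the behavior described in the list above.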