Distributed caching
A distributed cache, or global cache, is a single instance or a group of cache servers sitting on a dedicated network. When an application hits the distributed cache and the data for its request is not there, the request falls through to the database to query the data; otherwise, the distributed cache simply responds with the data the application needs.
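The read path described above is the classic cache-aside pattern. Here is a minimal sketch of it using the redis-py client; the Redis host, key naming, TTL, and the query_database() helper are illustrative assumptions rather than part of any particular application:

```python
# Cache-aside sketch: check the distributed cache first, fall back to the
# database on a miss, and write the result back for the next caller.
import json
import redis

cache = redis.Redis(host="cache.internal", port=6379)  # shared cache server (assumed address)

def query_database(user_id: int) -> dict:
    # Placeholder for the real database query.
    return {"id": user_id, "name": "example"}

def get_user(user_id: int) -> dict:
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        # Cache hit: the distributed cache answers without touching the database.
        return json.loads(cached)
    # Cache miss: query the database, then populate the cache.
    user = query_database(user_id)
    cache.set(key, json.dumps(user), ex=300)  # expire after 5 minutes
    return user
```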
Here's a diagram of two servers sharing the same Redis instance for distributed caching:
The preceding diagram shows requests from two servers hitting a Redis cache first, before deciding whether they need to query the database.
What happens if one of your services crashes? Nothing much, because every instance queries the distributed cache anyway. The cached data lives outside the individual services, so a crash does not take it down, and because the cache is shared by all instances, it keeps that data consistent between them. We can offload that state, and the headache of managing it, to the distributed cache.
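A short sketch of why a single service crash is harmless in this setup: the state lives in Redis, not in the application process, so another instance can pick up where the crashed one left off. The host name and keys below are assumptions for illustration:

```python
import redis

cache = redis.Redis(host="cache.internal", port=6379)

# Server A writes a session entry, then (hypothetically) crashes.
cache.set("session:42", "alice", ex=3600)

# Server B, a separate process with its own connection, still finds the entry,
# because the data never lived in server A's memory.
assert cache.get("session:42") == b"alice"
```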