Optimizing time series data streams with downsampling
In the previous recipe, we learned how ILM automates the process of moving older data to less expensive storage tiers, reducing costs without affecting data availability. In this recipe, we will explore how to use downsampling to reduce the granularity of data by aggregating it over larger time intervals, further cutting both storage and operational costs.
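To make this concrete, the following is a minimal sketch of an ILM policy that triggers downsampling, assuming Elasticsearch 8.5 or later (where ILM exposes a downsample action); the policy name, rollover age, and one-hour interval are illustrative placeholders, not the settings used later in this recipe:

PUT _ilm/policy/downsampling-demo-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          // Downsampling only runs on indices that are no longer being
          // written to, so the hot phase rolls the write index over first
          "rollover": { "max_age": "1d" }
        }
      },
      "warm": {
        "min_age": "2d",
        "actions": {
          // Replace the raw documents with one aggregated document
          // per time series per one-hour bucket
          "downsample": { "fixed_interval": "1h" }
        }
      }
    }
  }
}

Note that downsampling only applies to data streams in time series (TSDS) mode; for gauge metrics, each downsampled document keeps the min, max, sum, and value_count of its interval, which is why aggregations over the coarser data remain meaningful.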
Getting ready
Make sure you have completed the recipes in Chapter 10.
We will apply downsampling to the metrics data ingested there.
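As a quick sanity check, you can confirm that the metrics data from Chapter 10 is present before continuing. The request below is a sketch that assumes the data was ingested into standard metrics-* data streams; adjust the pattern to match your setup:

GET _data_stream/metrics-*

The response lists each matching data stream; entries backed by time series indices should also include a time_series section with the temporal ranges they cover. Downsampling requires the data stream to be in time series mode.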
How to do it…
In this recipe, you will learn how to configure downsampling with ILM policies, verify that downsampling is triggered correctly, and examine the effects of these operations on your data, both at the document level and in how the data is visualized:
- Go to Kibana | Stack Management | Index Lifecycle Policies, and type metric in the search bar. Then toggle on Include managed system...