Summary
In this chapter, we highlighted the significance of data normalization, a practice that is particularly fundamental in multi-vendor environments with diverse data sources. The need for consistent data representation underscores the relevance of initiatives such as OpenTelemetry and the Elastic Common Schema as observability data models. Telegraf, with its wide range of input and processor plugins, plays a vital role in this area, especially in the network industry. We delved into how Telegraf achieves data normalization through its processor plugins, and we ran practical labs that guided us in shaping metrics to fit a hypothetical desired metric model.
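As a brief reminder of the style of configuration used in those labs, the following is a minimal sketch of a Telegraf rename processor that maps vendor-specific names onto a common metric model; the measurement and tag names here are illustrative assumptions, not the exact ones from the chapter:

```toml
# Illustrative Telegraf processor: rename a measurement and a tag so that
# metrics from different vendors share one consistent naming scheme.
[[processors.rename]]
  # Rename the measurement produced by an SNMP input (hypothetical name)
  [[processors.rename.replace]]
    measurement = "snmp_interface"
    dest = "interface"

  # Rename a vendor-flavored tag to the desired common tag key
  [[processors.rename.replace]]
    tag = "ifDescr"
    dest = "interface_name"
```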
Similarly, we manipulated log messages with Logstash and introduced filter plugins to help us normalize this type of data. We took a deeper dive into the Grok filter plugin, which gives us an excellent way to extract structured data from unstructured log message text.
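To recall the shape of such a filter, here is a minimal Grok sketch that parses a syslog-like line into named fields; the pattern and field names are illustrative assumptions rather than the exact ones used in the chapter's labs:

```
# Illustrative Logstash filter: extract a timestamp, device name, and the
# remaining text from a syslog-style message into structured fields.
filter {
  grok {
    match => {
      "message" => "%{SYSLOGTIMESTAMP:timestamp} %{HOSTNAME:device} %{GREEDYDATA:event_message}"
    }
  }
}
```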
Next, we delved into data enrichment, where...