In the previous section, we covered how to get data into Elasticsearch using different Beats. They are easy to install and configure, and once they are running you can start receiving data from the server. Sometimes, however, we need to do more than configure a single-purpose Beat that sits on a server and ships data to an Elasticsearch cluster, and that is where Logstash comes in. Logstash is a data pipeline in which we configure an input to read data from multiple types of data sources, such as files, databases, CSV, or Kafka, and an output to send data to different destinations, such as files, databases, Kafka, or Elasticsearch. Another important feature of Logstash is the filter, which we can use to transform the input data before sending it to the output.
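For instance, here is a minimal sketch of a complete pipeline, assuming a local Elasticsearch node at localhost:9200; file, grok, and elasticsearch are standard Logstash plugins, but the log file path, grok pattern, and index name are hypothetical values chosen for illustration rather than values from this chapter:

input
{
    file
    {
        path => "/var/log/syslog"              # hypothetical source log file
        start_position => "beginning"          # read the file from the start
    }
}
filter
{
    grok
    {
        # parse each line with the built-in SYSLOGLINE pattern
        match => { "message" => "%{SYSLOGLINE}" }
    }
}
output
{
    elasticsearch
    {
        hosts => ["localhost:9200"]            # assumes a local Elasticsearch node
        index => "syslog-%{+YYYY.MM.dd}"       # illustrative daily index name
    }
}

With input, filter, and output in mind, let's check out the general Logstash configuration format: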
input
{
...