Most of the time, data such as log files is written so that humans can easily understand what the events mean. This kind of data is unstructured: machines can't easily index the events because they don't share a common structure or format. Take syslog and Apache logs, for example. Each records different types of events, and neither follows the other's format, which becomes a problem for an indexing system. That's where Logstash comes in.
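To see the problem, compare a syslog entry with an Apache access log entry: the fields, delimiters, and ordering are entirely different. The two lines below are made-up samples for illustration:

```
# A BSD-style syslog line (illustrative sample)
Mar  3 10:15:01 web01 sshd[1234]: Accepted publickey for admin from 10.0.0.5 port 51122

# An Apache access log line in combined format (illustrative sample)
10.0.0.5 - - [03/Mar/2024:10:15:01 +0000] "GET /index.html HTTP/1.1" 200 2326 "-" "Mozilla/5.0"
```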
Logstash is a data processing pipeline that can receive data from several sources simultaneously, transform it by parsing it into a structured form, and then ship it to Elasticsearch as indexed, easily searchable data.
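That flow maps directly onto the three sections of a Logstash pipeline configuration: input, filter, and output. The following is a minimal sketch; the log file path, index name, and Elasticsearch address are assumptions for illustration:

```
# minimal-pipeline.conf -- a minimal Logstash pipeline sketch
input {
  # Read events from a log file (the path is a hypothetical example)
  file {
    path => "/var/log/apache2/access.log"
    start_position => "beginning"
  }
}

filter {
  # Transformations (parsing, enrichment) go here; see the Grok example below
}

output {
  # Ship the structured events to Elasticsearch (assumed to run locally)
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weblogs"
  }
}
```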
One of the main features of Logstash is the vast number of plugins available, such as the Grok filter, which allow great flexibility in the types of data that can be parsed.
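For example, a Grok filter can turn a raw Apache access log line, like the sample shown earlier, into named fields using the COMBINEDAPACHELOG pattern that ships with Logstash. A sketch of the filter section:

```
filter {
  # Parse each raw Apache line into named fields such as clientip,
  # verb, request, response, and bytes using a built-in Grok pattern
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
```

Once parsed this way, Elasticsearch can index each field individually, so searches such as "all responses with status 404" become straightforward.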