Moving data from Kafka to Elastic with Logstash
Logstash is a tool from Elastic (http://www.elastic.co/). It simplifies shipping logs from virtually any source into Elasticsearch, and it centralizes data processing, normalizing schemas and formats across different data types. This recipe shows how to use Logstash to read data from Kafka and push it into Elasticsearch.
Getting ready
Have a Kafka cluster up and running. To install Elasticsearch, follow the instructions on this page: https://www.elastic.co/guide/en/elasticsearch/reference/current/_installation.html.
To install Logstash, follow the instructions on this page: https://www.elastic.co/guide/en/logstash/current/installing-logstash.html.
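Before continuing, it can help to confirm that both services respond and that the topic used in this recipe exists. The following is a minimal sketch, assuming Kafka and Elasticsearch run locally on their default ports, Elasticsearch security (TLS) is disabled, and the topic name source-topic matches the configuration used below:

    # Create the topic Logstash will consume from (Kafka 2.2+ syntax)
    bin/kafka-topics.sh --create \
      --bootstrap-server localhost:9092 \
      --topic source-topic \
      --partitions 1 --replication-factor 1

    # Confirm Elasticsearch answers on its default HTTP port
    curl http://localhost:9200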
How to do it...
To read data from Kafka and write it into Elasticsearch with Logstash:
- Write a file named kafkalogstash.conf with this content:
    input {
      kafka {
        # Read events from the Kafka topic
        bootstrap_servers => "localhost:9092"
        topics => ["source-topic"]
      }
    }
    output {
      elasticsearch {
        # Write events to the local Elasticsearch cluster
        hosts => ["localhost:9200"]
      }
    }
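To try the pipeline end to end, start Logstash with this file, produce a few messages to the source topic, and then search Elasticsearch. This is a minimal sketch, run from the Logstash and Kafka installation directories respectively; the exact index or data stream the events land in depends on the Logstash defaults for your version:

    # Start Logstash with the pipeline defined above
    bin/logstash -f kafkalogstash.conf

    # In another terminal, publish test messages to the source topic
    bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic source-topic

    # Search across all indices to confirm the events arrived
    curl 'http://localhost:9200/_search?pretty'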