Putting data to Elasticsearch
Now that we have configured Logstash to consume data from a CSV file, and to parse and process it according to the data types involved, we need to put the data into Elasticsearch so that the different fields are indexed and can later be consumed via the Kibana interface.
We will use Logstash's output plugin of the elasticsearch type. A typical elasticsearch plugin configuration looks like this:
output {
  elasticsearch {
    action => # string (optional), default: "index"
    cluster => # string (optional)
    host => # string (optional)
    document_id => # string (optional), default: nil
    index => # string (optional), default: "logstash-%{+YYYY.MM.dd}"
    index_type => # string (optional)
    port => # string (optional)
    protocol => # string, one of ["node", "transport", "http"] (optional)
  }
}
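To make the template concrete, here is a minimal sketch of an elasticsearch output for a pipeline like this one. The host and protocol values are assumptions for illustration, not values taken from the text; the index pattern shown is the plugin's documented default.

```conf
output {
  elasticsearch {
    host => "localhost"                  # assumed: an Elasticsearch node reachable from Logstash
    protocol => "http"                   # connect over the HTTP interface
    index => "logstash-%{+YYYY.MM.dd}"   # default: one index per day, e.g. logstash-2015.06.01
  }
}
```

With this in place, each parsed CSV event is indexed into the daily `logstash-*` index, where Kibana can discover and query it.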
action: This specifies what action to perform on incoming...