Mastering Kibana 6.x

Mastering Kibana 6.x: Visualize your Elastic Stack data with histograms, maps, charts, and graphs

eBook
$9.99 $35.99
Paperback
$43.99

What do you get with Print?

  • Instant access to your digital eBook copy whilst your print order is shipped
  • Paperback book shipped to your preferred address
  • Download this book in EPUB and PDF formats
  • Access this title in our online reader with advanced features
  • DRM FREE - read whenever, wherever, and however you want

Mastering Kibana 6.x

Revising the ELK Stack

Although this book is about Kibana, it makes little sense without an understanding of the complete Elastic Stack (ELK Stack), which includes Elasticsearch, Kibana, Logstash, and Beats. In this chapter, you are going to learn the basic concepts of each of these components, how to install them, and their use cases. We cannot use Kibana to its full strength unless we know how to get proper data, filter it, and store it in a format that we can easily use in Kibana.

Elasticsearch is a search engine built on top of Apache Lucene, which is mainly used for storing schemaless data and searching it quickly. Logstash is a data pipeline that can take data from practically any source and send it to any destination; we can also filter that data as per our requirements. Beats are single-purpose data shippers that run on individual servers and send data to a Logstash server or directly to an Elasticsearch server. Finally, Kibana uses the data stored in Elasticsearch to create beautiful dashboards using different types of visualization, such as graphs, charts, histograms, word tags, and data tables.

In this chapter, we will be covering the following topics:

  • What is ELK Stack?
  • The installation of Elasticsearch, Logstash, Kibana, and Beats
  • ELK use cases

What is ELK Stack?

ELK Stack is a stack of three different open source tools: Elasticsearch, Logstash, and Kibana. Elasticsearch is a search engine developed on top of Apache Lucene. Logstash is basically used for data pipelining, where we can get data from any data source as an input, transform it if required, and send it to any destination as an output; in general, we use Logstash to push data into Elasticsearch. Kibana is a dashboarding and visualization tool that can be configured with Elasticsearch to generate charts, graphs, and dashboards using our data.

We can use ELK Stack for different use cases, the most common being log analysis. Other than that, we can use it for business intelligence, application security and compliance, web analytics, fraud management, and so on.

In the following subsections, we are going to be looking at ELK Stack's components.

Elasticsearch

Elasticsearch is a full text search engine that can be used as a NoSQL database and as an analytics engine. It is easy to scale, schemaless, and near real time, and it provides a RESTful interface for different operations. It stores data in inverted indexes, which is what makes searches fast. There are different language clients available for Elasticsearch, as follows:

  • Java
  • PHP
  • Perl
  • Python
  • .NET
  • Ruby
  • JavaScript
  • Groovy

The basic components of Elasticsearch are as follows:

  • Cluster
  • Node
  • Index
  • Type
  • Document
  • Shard
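
To get a feel for the RESTful interface and how these components fit together, the following is a minimal sketch that indexes a document into a hypothetical index called books and then searches it (it assumes Elasticsearch is running locally on the default port, 9200):

curl -X PUT "http://localhost:9200/books/book/1" -H 'Content-Type: application/json' -d '
{
  "title": "Mastering Kibana 6.x",
  "topic": "visualization"
}'

curl -X GET "http://localhost:9200/books/_search?q=title:kibana"

Here, books is the index, book is the type, and 1 is the document ID; the cluster spreads the index across shards held by its nodes.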

Logstash

Logstash is basically used for data pipelining, through which we can take input from different sources and send output to different destinations. Using Logstash, we can clean the data through filter options and mutate the input data before sending it to the output. Logstash has different plugins to handle different applications; for example, for MySQL or any other relational database connection, there is a JDBC input plugin through which we can connect to a MySQL server, run queries, and take the table data as the input in Logstash. For Elasticsearch, there is an output plugin in Logstash that lets us seamlessly transfer data from Logstash to Elasticsearch.

To run Logstash, we need to install it and edit the configuration file, logstash.conf, which consists of input, filter, and output sections. We tell Logstash where it should get its input from through the input block, what it should do with that input through the filter block, and where it should send the output through the output block. In the following example, I am reading an Apache access log and sending the output to Elasticsearch:

input {
  file {
    path => "/var/log/apache2/access.log"
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => "http://127.0.0.1:9200"
    index => "logs_apache"
    document_type => "logs"
  }
}

The input block uses the file input plugin with its path set to /var/log/apache2/access.log, which is Apache's access log file; this means Logstash will read that file as its input. The filter block uses the grok filter, which converts unstructured data into structured data by parsing it.

There are different patterns that we can apply in a Logstash grok filter. Here, we are parsing Apache logs, but we can extract different things, such as email addresses, IP addresses, and dates, as the following sketch shows.
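
The following hypothetical filter, for example, pulls a timestamp, an IP address, an email address, and an action word out of a custom log line (the field names client_ip, user_email, and action are just illustrative; only the grok pattern names are standard):

filter {
  grok {
    # Example log line: "2018-07-31T10:15:00Z 192.168.1.10 user@example.com login"
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{IP:client_ip} %{EMAILADDRESS:user_email} %{WORD:action}" }
  }
}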

Kibana

Kibana is the open source dashboarding component of ELK Stack, and it is a very good tool for creating different visualizations, charts, maps, and histograms; by combining different visualizations, we can create dashboards. Because it is part of ELK Stack, it can read Elasticsearch data with no extra effort, and it does not require any programming skills. Kibana has a beautiful UI for creating all of these visualization types and assembling them into dashboards.

When we use Beats, Kibana provides us with inbuilt dashboards containing multiple visualizations, such as CPU usage and memory usage, which we can customize to create a useful dashboard.

Beats

Beats are basically data shippers, each built to do a single-purpose job. For example, Metricbeat is used to collect metrics such as memory usage, CPU usage, and disk space, whereas Filebeat is used to send file data such as logs. Beats can be installed as agents on different servers to send data from different sources to a central Logstash or Elasticsearch cluster. They are written in Go, they work cross-platform, and they are lightweight by design. Before Beats, it was very difficult to get data from different machines, as there was no single-purpose data shipper and we had to do some tweaking to get the desired data from servers.

For example, if I am running a web application on the Apache web server and want to run it smoothly, then there are two things that need to be monitored—first, all of the errors from the application, and second, the server's performance, such as memory usage, CPU usage, and disk space. So, in order to collect this information, we need to install the following two Beats on our machine:

  • Filebeat: This is used to collect log data from the Apache web server incrementally. Filebeat runs on the server and periodically checks for changes in the Apache log file; whenever there is a change, it sends the new log lines to Logstash. Logstash receives the data, applies its filter to find the errors, and then saves the filtered data into Elasticsearch. A minimal Filebeat configuration sketch for this scenario appears at the end of this section.
  • Metricbeat: This is used to collect server metrics such as memory usage, CPU usage, and disk space. Because Metricbeat sends a predefined set of metrics, there is no need to modify anything; that is why it sends data directly to Elasticsearch instead of sending it to Logstash first.

To visualize this data, we can use Kibana to create meaningful dashboards through which we can get complete control of our data.
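
The Filebeat configuration sketch mentioned above could look as follows, assuming Filebeat 6.2 and a Logstash instance listening on localhost:5044 (adjust the log path and host for your environment):

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/apache2/access.log

output.logstash:
  hosts: ["localhost:5044"]

On the Logstash side, a beats input plugin listening on port 5044 would receive these events before the grok filter and Elasticsearch output shown earlier.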

Installing the ELK Stack

For a complete installation of ELK Stack, we first need to install individual components that are explained one by one in the following sections.

Elasticsearch

Elasticsearch 6.0 and later requires at least Java 8. Before you proceed with the installation of Elasticsearch, please check which version of Java is present on your system by executing the following commands:

java -version
echo $JAVA_HOME

Once Java is set up, we can go ahead and install and run Elasticsearch. You can find the binaries at www.elastic.co/downloads.

Installing Elasticsearch using a TAR file

First, we will download the Elasticsearch 6.1.3 tar.gz archive, as shown in the following code block:

curl -L -O https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.1.3.tar.gz

Then, extract it as follows:

tar -xvf elasticsearch-6.1.3.tar.gz

You will then see that a bunch of files and folders have been created. We can now proceed to the bin directory, as follows:

cd elasticsearch-6.1.3/bin

We are now ready to start our node, which forms a single-node cluster:

./elasticsearch

Installing Elasticsearch with Homebrew

You can also install Elasticsearch on macOS through Homebrew, as follows:

brew install elasticsearch

Installing Elasticsearch with MSI Windows Installer

Windows users are recommended to use the MSI Installer package. This package includes a graphical user interface (GUI) that guides the users through the installation process.

First, download the Elasticsearch 6.1.3 MSI from https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.1.3.msi.

Launch the GUI by double-clicking on the downloaded file. On the first screen, select the deployment directories.

Installing Elasticsearch with the Debian package

On Debian, before you can proceed with the installation process, you may need to install the apt-transport-https package first:

sudo apt-get install apt-transport-https

Download and install the public signing key:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

Save the repository definition to /etc/apt/sources.list.d/elastic-6.x.list:

echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

You can then install the Elasticsearch Debian package with the following command:

sudo apt-get update && sudo apt-get install elasticsearch

Installing Elasticsearch with the RPM package

Download and install the public signing key:

rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch

Create a file named elasticsearch.repo in the /etc/yum.repos.d/ directory for Red Hat-based distributions or in the /etc/zypp/repos.d/ directory for openSUSE-based distributions, containing the following code:

[elasticsearch-6.x]
name=Elasticsearch repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

Your repository is now ready for use. You can now install Elasticsearch with one of the following commands:

You can use yum on CentOS and older Red Hat-based distributions:

sudo yum install elasticsearch

You can use dnf on Fedora and other newer Red Hat distributions:

sudo dnf install elasticsearch

You can use zypper on openSUSE-based distributions:

sudo zypper install elasticsearch

Elasticsearch can be started and stopped using the service command:

sudo -i service elasticsearch start
sudo -i service elasticsearch stop
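
Once the service is running, you can verify that the node is responding by querying the root endpoint (assuming the default HTTP port, 9200):

curl http://localhost:9200

If Elasticsearch is up, it answers with a small JSON document containing the node name, cluster name, and version information.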

Logstash

Logstash requires at least Java 8. Before you go ahead with the installation of Logstash, please check the version of Java on your system by running the following commands:

java -version
echo $JAVA_HOME

Using apt package repositories

Download and install the public signing key:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

You may need to install the apt-transport-https package on Debian before proceeding, as follows:

sudo apt-get install apt-transport-https

Save the repository definition to /etc/apt/sources.list.d/elastic-6.x.list, as follows:

echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

Run sudo apt-get update and the repository will be ready for use. You can install it using the following code:

sudo apt-get update && sudo apt-get install logstash

Using yum package repositories

Download and install the public signing key:

rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch

Create a file with a .repo suffix (for example, logstash.repo) in your /etc/yum.repos.d/ directory, containing the following code:

[logstash-6.x]
name=Elastic repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

Your repository is now ready for use. You can install Logstash using the following command:

sudo yum install logstash
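
After installation, you can validate a pipeline file such as the Apache example from earlier before running it for real. This sketch assumes a package install, where the Logstash binary lives under /usr/share/logstash, and uses apache.conf as a hypothetical name for that configuration file:

# Check the configuration syntax and exit
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/apache.conf --config.test_and_exit

# Run the pipeline in the foreground
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/apache.conf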

Kibana

Starting with version 6.0.0, Kibana only supports 64-bit operating systems.

Installing Kibana using .tar.gz

The Linux archive for Kibana v6.1.3 can be downloaded and installed as follows:

wget https://artifacts.elastic.co/downloads/kibana/kibana-6.1.3-linux-x86_64.tar.gz

Compare the SHA produced by sha1sum or shasum with the published SHA:

sha1sum kibana-6.1.3-linux-x86_64.tar.gz

Then, extract the archive and change into the extracted directory, which is known as $KIBANA_HOME:

tar -xzf kibana-6.1.3-linux-x86_64.tar.gz
cd kibana-6.1.3-linux-x86_64/

Installing Kibana using the Debian package

Download and install the public signing key:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

You may need to install the apt-transport-https package on Debian before proceeding:

sudo apt-get install apt-transport-https

Save the repository definition to /etc/apt/sources.list.d/elastic-6.x.list:

echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

You can install the Kibana Debian package with the following:

sudo apt-get update && sudo apt-get install kibana

Installing Kibana using rpm

Download and install the public signing key, as follows:

rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch

Create a file named kibana.repo in the /etc/yum.repos.d/ directory for Red Hat-based distributions, or in the /etc/zypp/repos.d/ directory for openSUSE-based distributions, containing the following code:

[kibana-6.x]
name=Kibana repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

Your repository is now ready for use. You can now install Kibana with one of the following commands:

  • You can use yum on CentOS and older Red Hat-based distributions:
sudo yum install kibana
  • You can use dnf on Fedora and other newer Red Hat distributions:
sudo dnf install kibana
  • You can use zypper on openSUSE-based distributions:
sudo zypper install kibana

Installing Kibana on Windows

Download the .zip Windows archive for Kibana v6.1.3 from https://artifacts.elastic.co/downloads/kibana/kibana-6.1.3-windows-x86_64.zip.

Unzipping it will create a folder named kibana-6.1.3-windows-x86_64, which we will refer to as $KIBANA_HOME. In your Terminal, CD to the $KIBANA_HOME directory; for instance:

CD c:\kibana-6.1.3-windows-x86_64

Kibana can be started from the command line as follows:

.\bin\kibana
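
Whichever installation method you use, Kibana reads its settings from its kibana.yml configuration file (config/kibana.yml in an archive install, or /etc/kibana/kibana.yml for package installs). A minimal sketch, assuming Elasticsearch is running locally on port 9200 and keeping Kibana's default port:

server.port: 5601
server.host: "localhost"
elasticsearch.url: "http://localhost:9200"

Once Kibana has started, the UI is available in a browser at http://localhost:5601.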

Beats

After installing and configuring the ELK Stack, you need to install and configure your Beats.

Each Beat is a separately installable product. To get up and running quickly with a Beat, see the getting started information for your Beat:

  • Packetbeat
  • Metricbeat
  • Filebeat
  • Winlogbeat
  • Heartbeat

Packetbeat

The value of a network packet analytics system such as Packetbeat can be best understood by trying it on your traffic.

To download and install Packetbeat, use the commands that work with your system (deb for Debian/Ubuntu, rpm for Red Hat/CentOS/Fedora, macOS for OS X, Docker for any Docker platform, and win for Windows):

  • Ubuntu:
sudo apt-get install libpcap0.8
curl -L -O https://artifacts.elastic.co/downloads/beats/packetbeat/packetbeat-6.2.1-amd64.deb
sudo dpkg -i packetbeat-6.2.1-amd64.deb
  • Red Hat:
sudo yum install libpcap
curl -L -O https://artifacts.elastic.co/downloads/beats/packetbeat/packetbeat-6.2.1-x86_64.rpm
sudo rpm -vi packetbeat-6.2.1-x86_64.rpm
  • macOS:
curl -L -O https://artifacts.elastic.co/downloads/beats/packetbeat/packetbeat-6.2.1-darwin-x86_64.tar.gz
tar xzvf packetbeat-6.2.1-darwin-x86_64.tar.gz
  • Windows:
    1. Download and install WinPcap from the WinPcap website. WinPcap is a library that uses a driver to enable packet capturing.
    2. Download the Packetbeat Windows ZIP file from the downloads page.
    3. Extract the contents of the ZIP file into C:\Program Files.
    4. Rename the packetbeat-<version>-windows directory to Packetbeat.
    5. Open a PowerShell prompt as an administrator (right-click the PowerShell icon and select Run as administrator). If you are running Windows XP, you may need to download and install PowerShell.
    6. From the PowerShell prompt, run the following commands to install Packetbeat as a Windows service:
PS > cd 'C:\Program Files\Packetbeat'
PS C:\Program Files\Packetbeat> .\install-service-packetbeat.ps1

Before starting Packetbeat, you should look at the configuration options in the configuration file; for example, C:\Program Files\Packetbeat\packetbeat.yml or /etc/packetbeat/packetbeat.yml.
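
As a point of reference, a minimal packetbeat.yml sketch that sniffs HTTP traffic on port 80 and ships the events straight to a local Elasticsearch node might look as follows (the interface setting and port are assumptions; adjust them for your host):

packetbeat.interfaces.device: any

packetbeat.protocols:
- type: http
  ports: [80]

output.elasticsearch:
  hosts: ["localhost:9200"]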

Metricbeat

Metricbeat should be installed as close as possible to the service that needs to be monitored. For example, if there are four servers running MySQL, it's strongly recommended that you run Metricbeat on each of those servers. This gives Metricbeat access to your service from localhost, which does not cause any additional network traffic and allows Metricbeat to keep collecting metrics even when there are network problems. Metrics from multiple Metricbeat instances are combined on the Elasticsearch server.

To download and install Metricbeat, use the commands that work with your system (deb for Debian/Ubuntu, rpm for Red Hat/CentOS/Fedora, macOS for OS X, Docker for any Docker platform, and win for Windows), as follows:

  • Ubuntu:
curl -L -O https://artifacts.elastic.co/downloads/beats/metricbeat/metricbeat-6.2.1-amd64.deb
sudo dpkg -i metricbeat-6.2.1-amd64.deb
  • Red Hat:
curl -L -O https://artifacts.elastic.co/downloads/beats/metricbeat/metricbeat-6.2.1-x86_64.rpm
sudo rpm -vi metricbeat-6.2.1-x86_64.rpm
  • macOS:
curl -L -O https://artifacts.elastic.co/downloads/beats/metricbeat/metricbeat-6.2.1-darwin-x86_64.tar.gz
tar xzvf metricbeat-6.2.1-darwin-x86_64.tar.gz
  • Windows:
    1. Download the Metricbeat Windows ZIP file from the downloads page.
    2. Extract the contents of the ZIP file into C:\Program Files.
    3. Rename the metricbeat-<version>-windows directory to Metricbeat.
    4. Open a PowerShell prompt as an administrator (right-click the PowerShell icon and select Run as administrator). If you are running Windows XP, you may need to download and install PowerShell.
    5. From the PowerShell prompt, run the following commands to install Metricbeat as a Windows service:
PS > cd 'C:\Program Files\Metricbeat'
PS C:\Program Files\Metricbeat> .\install-service-metricbeat.ps1

Before starting Metricbeat, you should look at the configuration options in the configuration file; for example, C:\Program Files\Metricbeat\metricbeat.yml.
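
For reference, a minimal metricbeat.yml sketch that collects a few system metricsets every 10 seconds and sends them straight to a local Elasticsearch node (the host and period are assumptions):

metricbeat.modules:
- module: system
  metricsets: ["cpu", "memory", "filesystem"]
  period: 10s

output.elasticsearch:
  hosts: ["localhost:9200"]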

Filebeat

To download and install Filebeat, use the commands that work with your system (deb for Debian/Ubuntu, rpm for Red Hat/CentOS/Fedora, macOS for OS X, Docker for any Docker platform, and win for Windows), as follows:

  • Ubuntu:
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.2.1-amd64.deb
sudo dpkg -i filebeat-6.2.1-amd64.deb
  • Red Hat:
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.2.1-x86_64.rpm
sudo rpm -vi filebeat-6.2.1-x86_64.rpm
  • macOS:
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.2.1-darwin-x86_64.tar.gz
tar xzvf filebeat-6.2.1-darwin-x86_64.tar.gz
  • Windows:
    1. Download the Filebeat Windows ZIP file from the downloads page.
    2. Extract the contents of the ZIP file into C:\Program Files.
    3. Rename the filebeat-<version>-windows directory to Filebeat.
    4. Open a PowerShell prompt as an administrator (right-click the PowerShell icon and select Run as administrator). If you are running Windows XP, you may need to download and install PowerShell.
    5. From the PowerShell prompt, run the following commands to install Filebeat as a Windows service:
PS > cd 'C:\Program Files\Filebeat'
PS C:\Program Files\Filebeat> .\install-service-filebeat.ps1
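
On Linux, after installing Filebeat from the deb or rpm package and editing /etc/filebeat/filebeat.yml (see the configuration sketch in the Beats section earlier), you can enable and start it as a service; a sketch assuming a systemd-based distribution:

sudo systemctl enable filebeat
sudo systemctl start filebeat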

Winlogbeat

In order to install Winlogbeat, we need to follow these steps:

  1. Download the Winlogbeat ZIP file from the downloads page.
  2. Extract the contents into C:\Program Files.
  3. Rename the winlogbeat-<version> directory to Winlogbeat.
  4. Open a PowerShell prompt as an administrator (right-click on the PowerShell icon and select Run as administrator). If you are running Windows XP, you may need to download and install PowerShell.
  5. From the PowerShell prompt, run the following commands to install the service:
PS C:\Users\Administrator> cd 'C:\Program Files\Winlogbeat'
PS C:\Program Files\Winlogbeat> .\install-service-winlogbeat.ps1
Security warning: Only run scripts that you trust. Although scripts from the internet can be useful, they can potentially harm your computer. If you trust the script, use Unblock-File to allow the script to run without this warning message:
Do you want to run
C:\Program Files\Winlogbeat\install-service-winlogbeat.ps1?
[D] Do not run [R] Run once [S] Suspend [?] Help (default is "D"): R

Status Name DisplayName
------ ---- -----------
Stopped winlogbeat winlogbeat

Before starting Winlogbeat, you should look at the configuration options in the configuration file; for example, C:\Program Files\Winlogbeat\winlogbeat.yml. There is also a full example configuration file named winlogbeat.reference.yml.

Heartbeat

Unlike most Beats, which we install on edge nodes, we typically install Heartbeat as part of a monitoring service that runs on a separate machine and possibly even outside of the network where the services that you want to monitor are running.

To download and install Heartbeat, use the commands that work with your system (deb for Debian/Ubuntu, rpm for Red Hat/CentOS/Fedora, macOS for OS X, Docker for any Docker platform, and win for Windows):

  • Ubuntu:
curl -L -O https://artifacts.elastic.co/downloads/beats/heartbeat/heartbeat-6.2.1-amd64.deb
sudo dpkg -i heartbeat-6.2.1-amd64.deb
  • Red Hat:
curl -L -O https://artifacts.elastic.co/downloads/beats/heartbeat/heartbeat-6.2.1-x86_64.rpm
sudo rpm -vi heartbeat-6.2.1-x86_64.rpm
  • macOS:
curl -L -O https://artifacts.elastic.co/downloads/beats/heartbeat/heartbeat-6.2.1-darwin-x86_64.tar.gz
tar xzvf heartbeat-6.2.1-darwin-x86_64.tar.gz
  • Windows:
    1. Download the Heartbeat Windows ZIP file from the downloads page.
    2. Extract the contents of the ZIP file into C:\Program Files.
    3. Rename the heartbeat-<version>-windows directory to Heartbeat.
    4. Open a PowerShell prompt as an administrator (right-click the PowerShell icon and select Run as administrator). If you are running Windows XP, you may need to download and install PowerShell.
    5. From the PowerShell prompt, run the following commands to install Heartbeat as a Windows service:
PS > cd 'C:\Program Files\Heartbeat'
PS C:\Program Files\Heartbeat> .\install-service-heartbeat.ps1

 

Before starting Heartbeat, you should look at the configuration options in the configuration file; for example, C:\Program Files\Heartbeat\heartbeat.yml or /etc/heartbeat/heartbeat.yml.
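
As with the other Beats, the configuration is a small YAML file; a minimal sketch that checks a hypothetical HTTP endpoint every 10 seconds and reports the results to a local Elasticsearch node:

heartbeat.monitors:
- type: http
  urls: ["http://localhost:9200"]
  schedule: '@every 10s'

output.elasticsearch:
  hosts: ["localhost:9200"]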

ELK use cases

ELK Stack has many different use cases, but here we are only going to discuss some of them.

Log management

In any large organization, there will be different servers running different sets of applications. In this case, we need different teams for different applications, whose task is to explore the log files when debugging any issue. This is not an easy task, as log formats are rarely user friendly. And that is just a single application; what happens if we ask a team to monitor many different applications, built with different technologies, each with a log format very different from the others? The answer is simple: the team has to dig through all the logs from the different servers, and they will spend days and nights finding the issue.

ELK Stack is very useful in these situations, and it can solve this problem easily. First of all, we set up a central Elasticsearch cluster to collect all the different logs. Next, we configure Logstash per application log so that we can transform the different log formats arriving from the different application servers. Logstash outputs this data into Elasticsearch for storage, where we can explore, search, and update it. Finally, Kibana can be used to display graphical dashboards on top of Elasticsearch.

Using this setup, anyone can get complete control of all the logs coming from different sources. We can also use Kibana to alert us to any issues in the log files, so that users learn about problems without having to drill down into the data themselves.

Many organizations use ELK for their log management because it is open source software that can easily be set up to monitor different types of logs on a single screen. Not only can we monitor all of our logs in one place, but we can also get alerts if something goes wrong in the logs.

Security monitoring and alerting

Security monitoring and alerting is a very important use case of ELK Stack. Application security is vital, and any security breach is costly; breaches are becoming more common and, most importantly, more targeted. Although enterprises regularly try to improve their security measures, hackers still succeed in penetrating the security layers. It is therefore essential for any enterprise to detect security attacks on its servers, and not only to detect them but also to raise alerts so that immediate action can be taken to mitigate the losses. Using ELK Stack, we can monitor various things, such as unusual server requests and suspicious traffic. We can gather security-related log information that security teams can monitor and check for alerts to the system.

This way, security teams can protect the enterprise from attackers who would otherwise go unnoticed for a long time. ELK Stack provides a way to gain insight and make an attacker's life more difficult. These logs are also very useful for post-attack analysis; for example, for finding out the time of the attack and the method used. We can understand the activities the attacker performed, and this information shows us how to close that loophole. In this way, ELK Stack is useful both for preventing attacks and for healing and hardening after an attack.

Web scraping

In ELK Stack, we have different tools to grab data from remote servers. In a traditional Relational Database Management System (RDBMS), it is quite difficult to save this type of data because it is not structured, so we either have to clean the data manually or drop part of it to fit it into the table schema. In the case of Elasticsearch, its schemaless behavior gives us the leverage to push data from any source. It not only holds that data but also lets us search it and play with it. An example of web scraping using ELK Stack is a Twitter-to-Elasticsearch connector, which allows us to specify hashtags and grab all the tweets that use them. After grabbing those tweets, we can search, visualize, and analyze them in Kibana.

E-commerce search solutions

Many of the top e-commerce websites, such as eBay, use Elasticsearch for their product search pages. The main reasons are Elasticsearch's strengths in full-text searching, filters, facets, and aggregations, its fast response times, and the ease with which it collects analytics information. Users can easily drill down to a product set from which they can select the product they want. That is just one side of the picture, where we improve the user's experience. On the other side, we can take the same data and, using Kibana, monitor trends, analyze the data, and much more.

There is big competition among e-commerce companies to attract more and more customers. Understanding the shopping behavior of their customers is very important, as it lets e-commerce companies target users with products that they have liked or are likely to like. This is business intelligence, and using ELK Stack they can achieve it.

Full text search

ELK Stack's core competency is full text search. It is powerful and flexible, and it provides features such as fuzzy search, conditional searching, and natural language searching, so we can decide which type of searching our requirements call for. We can use ELK Stack's full text search capabilities for product search, autocomplete features, searching text in emails, and so on.

Visualizing data

Kibana is an easy-to-use visualization tool that provides a rich feature set for creating beautiful charts (such as pie charts, bar charts, and stacked charts), histograms, geo maps, word tags, data tables, and so on. Visualizing data is always beneficial for an organization, as it helps top management make decisions with ease. We can also easily track unusual trends and find outliers without digging into the raw data. We can even create dashboards for an existing web-based application simply by pushing the application data into Elasticsearch and then using Kibana to build dashboards on top of it. This way, we plug an additional dimension into the application and start monitoring it without putting any extra load on the application itself.

Summary

In this chapter, we covered the basics of ELK Stack and the characteristics of its components. We explained how we can use Beats to send log data, file data, and system metrics to Logstash or Elasticsearch, and how Logstash can be configured as a pipeline that modifies the data format before sending the output to Elasticsearch. Elasticsearch is a search engine built on top of Lucene; it stores data and provides full text search over it. Kibana can be configured to read Elasticsearch data and create visualizations and dashboards, which we can embed in existing web pages and use for decision-making.

Then, we discussed different use cases of ELK Stack. The first was log management, the primary use case of ELK Stack and the one that made it famous: we capture logs from different servers and sources, modify them through Logstash, and store them in a central Elasticsearch cluster, and Kibana is used to create meaningful graphical visualizations and dashboards from the Elasticsearch data. Finally, we discussed security monitoring and alerting, where ELK Stack can be quite helpful. Security is a very important aspect of any software, and it is often the most neglected part of development and monitoring; using ELK Stack, we can spot any security threat.


Key benefits

  • Explore visualizations and perform histogram, stats, and map analytics
  • Unleash X-Pack and Timelion, and learn alerting, monitoring, and reporting features
  • Manage dashboards with Beats and create machine learning jobs for faster analytics

Description

Kibana is one of the popular tools among data enthusiasts for slicing and dicing large datasets and uncovering Business Intelligence (BI) with the help of its rich and powerful visualizations. To begin with, Mastering Kibana 6.x quickly introduces you to the features of Kibana 6.x, before teaching you how to create smart dashboards in no time. You will explore metric analytics and graph exploration, followed by understanding how to quickly customize Kibana dashboards. In addition to this, you will learn advanced analytics such as maps, hits, and list analytics. All this will help you enhance your skills in running and comparing multiple queries and filters, influencing your data visualization skills at scale. With Kibana’s Timelion feature, you can analyze time series data with histograms and stats analytics. By the end of this book, you will have created a speedy machine learning job using X-Pack capabilities.

Who is this book for?

Mastering Kibana 6.x is for you if you are a big data engineer, DevOps engineer, or data scientist aspiring to go beyond data visualization at scale and gain maximum insight from your large datasets. Basic knowledge of the Elastic Stack will be an added advantage, although it is not mandatory.

What you will learn

  • Create unique dashboards with various intuitive data visualizations
  • Visualize Timelion expressions with added histograms and stats analytics
  • Integrate X-Pack with your Elastic Stack in simple steps
  • Extract data from Elasticsearch for advanced analysis and anomaly detection using dashboards
  • Build dashboards from web applications for application logs
  • Create monitoring and alerting dashboards using Beats

Product Details

Publication date: Jul 31, 2018
Length: 376 pages
Edition: 1st
Language: English
ISBN-13: 9781788831031




Table of Contents

15 Chapters

  1. Revising the ELK Stack
  2. Setting Up and Customizing the Kibana Dashboard
  3. Exploring Your Data
  4. Visualizing the Data
  5. Dashboarding to Showcase Key Performance Indicators
  6. Handling Time Series Data with Timelion
  7. Interact with Your Data Using Dev Tools
  8. Tweaking Your Configuration with Kibana Management
  9. Understanding X-Pack Features
  10. Machine Learning with Kibana
  11. Create Super Cool Dashboard from a Web Application
  12. Different Use Cases of Kibana
  13. Creating Monitoring Dashboards Using Beats
  14. Best Practices
  15. Other Books You May Enjoy

Customer reviews

Rating: 5.0 out of 5 (1 rating)
5 star: 100%, 4 star: 0%, 3 star: 0%, 2 star: 0%, 1 star: 0%

suneet srivastava, Sep 07, 2018 (5 stars, Amazon Verified review):
Excellent work by the author....this book is extremely useful for data management and presentation tool ....the best part I found in this book is the easy language by the Indian author and he is explained the concept in very structured manner

FAQs

What is the delivery time and cost of the print book?

Shipping Details

USA:


Economy: Delivery to most addresses in the US within 10-15 business days

Premium: Trackable Delivery to most addresses in the US within 3-8 business days

UK:

Economy: Delivery to most addresses in the U.K. within 7-9 business days.
Shipments are not trackable

Premium: Trackable delivery to most addresses in the U.K. within 3-4 business days!
Add one extra business day for deliveries to Northern Ireland and Scottish Highlands and islands

EU:

Premium: Trackable delivery to most EU destinations within 4-9 business days.

Australia:

Economy: Can deliver to P. O. Boxes and private residences.
Trackable service with delivery to addresses in Australia only.
Delivery time ranges from 7-9 business days for VIC and 8-10 business days for Interstate metro
Delivery time is up to 15 business days for remote areas of WA, NT & QLD.

Premium: Delivery to addresses in Australia only
Trackable delivery to most P. O. Boxes and private residences in Australia within 4-5 days based on the distance to a destination following dispatch.

India:

Premium: Delivery to most Indian addresses within 5-6 business days

Rest of the World:

Premium: Countries in the American continent: Trackable delivery to most countries within 4-7 business days

Asia:

Premium: Delivery to most Asian addresses within 5-9 business days

Disclaimer:
All orders received before 5 PM U.K time would start printing from the next business day. So the estimated delivery times start from the next day as well. Orders received after 5 PM U.K time (in our internal systems) on a business day or anytime on the weekend will begin printing the second to next business day. For example, an order placed at 11 AM today will begin printing tomorrow, whereas an order placed at 9 PM tonight will begin printing the day after tomorrow.


Unfortunately, due to several restrictions, we are unable to ship to the following countries:

  1. Afghanistan
  2. American Samoa
  3. Belarus
  4. Brunei Darussalam
  5. Central African Republic
  6. The Democratic Republic of Congo
  7. Eritrea
  8. Guinea-bissau
  9. Iran
  10. Lebanon
  11. Libyan Arab Jamahiriya
  12. Somalia
  13. Sudan
  14. Russian Federation
  15. Syrian Arab Republic
  16. Ukraine
  17. Venezuela
What is a customs duty/charge?

Customs duties are charges levied on goods when they cross international borders. They are a tax imposed on imported goods. These duties are charged by special authorities and bodies created by local governments and are meant to protect local industries, economies, and businesses.

Do I have to pay customs charges for the print book order?

Orders shipped to the countries listed under EU27 will not incur customs charges; these are paid by Packt as part of the order.

List of EU27 countries: www.gov.uk/eu-eea

For shipments to recipient countries outside the EU27, a customs duty or localized taxes may be applicable and would be charged by the recipient country; these duties must be paid by the customer and are not included in the shipping charges applied to the order.

How do I know my customs duty charges?

The amount of duty payable varies greatly depending on the imported goods, the country of origin and several other factors like the total invoice amount or dimensions like weight, and other such criteria applicable in your country.

For example:

  • If you live in Mexico, and the declared value of your ordered items is over $ 50, for you to receive a package, you will have to pay additional import tax of 19% which will be $ 9.50 to the courier service.
  • Whereas if you live in Turkey, and the declared value of your ordered items is over € 22, for you to receive a package, you will have to pay additional import tax of 18% which will be € 3.96 to the courier service.
How can I cancel my order?

Cancellation Policy for Published Printed Books:

You can cancel any order within 1 hour of placing it. Simply contact customercare@packt.com with your order details or payment transaction ID. If your order has already started the shipment process, we will do our best to stop it. However, if it is already on its way to you, then once you receive it you can contact us at customercare@packt.com using the returns and refund process.

Please understand that Packt Publishing cannot provide refunds or cancel any order except for the cases described in our Return Policy (that is, where Packt Publishing agrees to replace your printed book because it arrives damaged or with a material defect); otherwise, Packt Publishing will not accept returns.

What is your returns and refunds policy?

Return Policy:

We want you to be happy with your purchase from Packtpub.com. We will not hassle you with returning print books to us. If the print book you receive from us is incorrect, damaged, doesn't work, or is unacceptably late, please contact our Customer Relations Team at customercare@packt.com with the order number and issue details, as explained below:

  1. If you ordered (eBook, Video or Print Book) incorrectly or accidentally, please contact Customer Relations Team on customercare@packt.com within one hour of placing the order and we will replace/refund you the item cost.
  2. Sadly, if your eBook or Video file is faulty, or a fault occurs while the eBook or Video is being made available to you (that is, during download), you should contact the Customer Relations Team within 14 days of purchase at customercare@packt.com, who will be able to resolve this issue for you.
  3. You will have a choice of replacement or refund of the problem items (damaged, defective, or incorrect).
  4. Once Customer Care Team confirms that you will be refunded, you should receive the refund within 10 to 12 working days.
  5. If you are only requesting a refund of one book from a multiple order, then we will refund you the appropriate single item.
  6. Where the items were shipped under a free shipping offer, there will be no shipping costs to refund.

On the off chance that your printed book arrives damaged or with a material defect, contact our Customer Relations Team at customercare@packt.com within 14 days of receipt of the book with appropriate evidence of the damage, and we will work with you to secure a replacement copy if necessary. Please note that each printed book you order from us is individually made by Packt's professional book-printing partner on a print-on-demand basis.

What tax is charged?

Currently, no tax is charged on the purchase of any print book (subject to change based on the laws and regulations). A localized VAT fee is charged only to our European and UK customers on eBooks, Video and subscriptions that they buy. GST is charged to Indian customers for eBooks and video purchases.

What payment methods can I use?

You can pay with the following card types:

  1. Visa Debit
  2. Visa Credit
  3. MasterCard
  4. PayPal