Splunk Operational Intelligence Cookbook

Over 80 recipes for transforming your data into business-critical insights using Splunk


Data onboarding – defining field extractions

Splunk ships with many built-in features, including knowledge of several common source types, which allows it to automatically identify many of the fields in your data. By default, Splunk also extracts any key-value pairs present in the log data, as well as all of the fields in JSON-formatted logs. However, fields in raw log data often cannot be interpreted out of the box, and this knowledge must be provided to Splunk to make those fields easily searchable.
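
As a quick illustration of the automatic key-value extraction (the event text below is hypothetical, not taken from the sample data), an event such as the following would surface user, action, and status as searchable fields without any configuration:

2018-05-21 14:03:12 user=jsmith action=login status=success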

The sample data that we will be using in subsequent chapters contains values that we want Splunk to present as fields. Much of the raw log data contains key-value pairs that Splunk will extract automatically, but there is one value, the page response time, that Splunk does not know how to extract on its own. To surface it, we will add a custom field extraction that tells Splunk how to pull the value out of each event.
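
To make the target concrete, a hypothetical access_combined event is shown below, with the page response time as the final value after the quoted request, referrer, user agent, and cookie fields; your actual sample events may differ slightly:

192.168.1.10 - - [10/May/2018:12:00:00 -0400] "GET /checkout HTTP/1.1" 200 1532 "http://www.example.com/cart" "Mozilla/5.0" "JSESSIONID=SD5SL9FF2ADFF4958" 1024

The extraction regex used later in this recipe works by skipping past the eighth double quote in the event and then capturing whatever follows the next run of whitespace as the response field (1024 in this hypothetical line).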

Getting ready

To step through this recipe, you will need a running Splunk server with the operational intelligence sample data loaded. No other prerequisites are required.

How to do it...

Follow these steps to add a custom field extraction for the response field:

  1. Log in to your Splunk server.
  2. In the top right-hand corner, click on the Settings menu and then click on the Fields link.
  3. Click on the Field extractions link.
  4. Click on New.
  5. In the Destination app field, select the search app, and in the Name field, enter response. Set the Apply to dropdown to sourcetype and the named field to access_combined. Set the Type dropdown to Inline, and in the Extraction/Transform field, carefully enter the (?i)^(?:[^"]*"){8}\s+(?P<response>.+) regex.
  6. Click on Save.
  7. On the Field extractions listing page, find the recently added extraction, and in the Sharing column, click on the Permissions link.
  8. Update the Object should appear in setting to All apps. In the Permissions section, check Everyone in the Read column and admin in the Write column. Then, click on Save.
  9. Navigate to the Splunk search screen and enter the following search over the Last 60 minutes time range:
index=main sourcetype=access_combined
  10. You should now see a field called response extracted on the left-hand side of the search screen, under the Interesting Fields section. A short example of using the new field follows these steps.
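
With the extraction in place, the new field can be used like any other field at search time. For example, the following search (a sketch; the exact values and units depend on your sample data) charts the average response time over time:

index=main sourcetype=access_combined | timechart span=5m avg(response) AS avg_response

Because this is a search-time extraction, the field is computed when the search runs and does not require the data to be reindexed.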

How it works...

All field extractions are maintained in the props.conf and transforms.conf configuration files. The stanzas in props.conf include an extraction class that leverages regular expressions to extract field names and/or values to be used at search time. The transforms.conf file goes further and can be leveraged for more advanced extractions, such as reusing or sharing extractions over multiple sources, source types, or hosts.
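
For reference, the inline extraction created through the web interface in this recipe ends up, roughly, as an EXTRACT attribute under the source type stanza in props.conf (the exact file location and attribute name may vary by app and Splunk version):

[access_combined]
EXTRACT-response = (?i)^(?:[^"]*"){8}\s+(?P<response>.+)

A transforms.conf-based equivalent would instead reference a named stanza, for example REPORT-response = response_extraction in props.conf paired with a [response_extraction] stanza in transforms.conf holding the same REGEX, which is how an extraction can be shared across multiple sources, source types, or hosts.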

See also

  • The Loading the sample data for this book recipe
  • The Data onboarding – defining event types and tags recipe