Feeding data into Dashing
As we've already covered, Dashing uses a series of scheduled jobs, written in Ruby, to collect the data we are interested in. The scheduling is controlled by a library called rufus-scheduler, which allows great flexibility in when and how jobs are run: you could have a lightweight job that scrapes data from a public API every five seconds, and another job that runs every 30 minutes and performs a heavy query against a database.
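To make that concrete, here is a minimal sketch of two Dashing jobs on different schedules. It assumes the standard Dashing job environment, where SCHEDULER is a rufus-scheduler instance and send_event pushes data to a named widget; the URL, widget IDs, and fetch logic shown here are hypothetical placeholders.

require 'net/http'
require 'json'

# Lightweight job: scrape a public API every five seconds.
SCHEDULER.every '5s', first_in: 0 do
  response = Net::HTTP.get(URI('https://api.example.com/stats'))
  stats = JSON.parse(response)
  send_event('api_stats', current: stats['value'])
end

# Heavier job: run an expensive database query only every 30 minutes.
SCHEDULER.every '30m' do
  # Placeholder for a slow query against a real database.
  row_count = 42
  send_event('db_stats', current: row_count)
end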
We're going to create a single job called puppet.rb, and this Ruby code is going to perform the following actions (sketched in code after the list):
1. Gather metrics using PuppetDB's metrics endpoint.
2. Gather a list of nodes using PuppetDB's nodes endpoint.
3. Use the nodes gathered to fetch counts of events that have occurred in the past 30 minutes using PuppetDB's event-counts endpoint.
4. Parse the events data to display the state of our hosts.
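Before we build puppet.rb itself, the following is a rough sketch of those four calls. It assumes PuppetDB's v3 query API on localhost:8080; the endpoint paths, the mbean name, and the query parameters all vary between PuppetDB versions, so treat this as an outline rather than the finished job.

require 'net/http'
require 'json'
require 'time'
require 'uri'

PUPPETDB = 'http://localhost:8080'

# Small helper: GET a PuppetDB endpoint and parse the JSON response.
def puppetdb_get(path, params = {})
  uri = URI("#{PUPPETDB}#{path}")
  uri.query = URI.encode_www_form(params) unless params.empty?
  JSON.parse(Net::HTTP.get(uri))
end

# 1. Gather metrics (this mbean name is one example; yours may differ).
population = puppetdb_get(
  '/v3/metrics/mbean/com.puppetlabs.puppetdb.query.population:type=default,name=num-nodes'
)

# 2. Gather the list of nodes.
nodes = puppetdb_get('/v3/nodes')

# 3. Count events from the past 30 minutes, summarised per node.
since = (Time.now - (30 * 60)).utc.iso8601
event_query = ['>', 'timestamp', since].to_json
counts = puppetdb_get('/v3/event-counts',
                      'query'        => event_query,
                      'summarize-by' => 'certname')

# 4. Parse the counts into a per-host state for the dashboard widgets
#    (field names here follow the v3 event-counts response format).
counts.each do |c|
  puts "#{c['subject']['title']}: #{c['failures']} failures"
end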
As you can see, we're taking the knowledge that we've gained with PuppetDB over the past two...