Reporting

How would I go about using Splunk to analyze new builds before we push them?

jeffrey_steiner
New Member

My team is looking to use Splunk to analyze new builds before we push them, checking for any significant changes in total boot time, individual job times, I/O throughput, etc. We'd then like to compare the data it produces against historical averages and report any deviations from the norm. The constraint is that the process will be automated, so everything needs to be done through the CLI on a machine running Arch Linux. If anyone could provide some help in getting to that point, it would be appreciated.
Thanks!


aljohnson_splun
Splunk Employee

Hi @jeffrey_steiner

Here are a few steps you could follow:

1.) Install Splunk.
On the download page, you'll see there is an option for Linux and, in the top right, a link to use wget to download the tar.gz file. You can script the installation if you want. Once you've installed Splunk, you'll have access to the web interface (by default on port 8000).
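
For example, a scripted install might look roughly like this - treat it as a sketch: the URL below is a placeholder (copy the real wget link from the download page) and /opt is just an example install location:

    # download the tarball (copy the actual wget URL from the download page)
    wget -O splunk.tgz 'https://download.splunk.com/<copied-from-the-download-page>'
    # unpack to /opt/splunk and start Splunk without any interactive prompts
    tar -xzf splunk.tgz -C /opt
    /opt/splunk/bin/splunk start --accept-license --answer-yes --no-prompt
    # optionally have Splunk start at boot
    /opt/splunk/bin/splunk enable boot-start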

2.) Get the data into Splunk
You could log the data to files, then monitor and index those files with Splunk. You could also send the information over TCP/UDP/HTTP, install a forwarder to forward the data, or use scripted or modular inputs. Basically, the data has to get into Splunk one way or another, and exactly how you do that is up to you. Which approach you take will likely depend on how much data you have, how fast it's being created, and the process that is generating it. Thankfully, we have lots of documentation on the various ways you can get data in; the Getting Data In manual is a good starting point. You can set all of this up from the CLI if you don't want to use the web interface to add inputs.
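
As a rough sketch of the file-monitoring route from the CLI - assuming, purely for illustration, that your build process writes metrics to files under /var/log/buildmetrics and that you want them in an index called "builds" (the path, index name, sourcetype, and credentials are all made up):

    # create an index for the build data (the name is just an example)
    /opt/splunk/bin/splunk add index builds -auth admin:changeme
    # monitor the directory your build scripts log into
    /opt/splunk/bin/splunk add monitor /var/log/buildmetrics -index builds -sourcetype build_metrics -auth admin:changeme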

3.) Write the searches
It sounds like you are interested in a few different things, e.g. historical averages, deviations from the norm, etc. You'll need to write a number of searches using Splunk's Search Processing Language (SPL). Once you have the queries written, you can quickly save them as reports, turn them into dashboard panels, and schedule them to run at regular intervals. If you haven't written any searches with SPL before, the Search Manual has a lot of good information. I doubt that you'll be doing this part from the CLI, but, hypothetically, you could, although you wouldn't get any of our awesome visualizations 😞
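
To give you an idea of the kind of search you might write - a sketch only, since the index, sourcetype, and field names below are assumptions about your data rather than anything real - this flags any build whose boot time is more than two standard deviations from the historical mean:

    index=builds sourcetype=build_metrics
    | eventstats avg(boot_time) AS mean_boot, stdev(boot_time) AS sd_boot
    | eval z = (boot_time - mean_boot) / sd_boot
    | where abs(z) > 2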

If this is all a bit much, I'd suggest taking some training from Splunk Education - instructors (including myself) cover this content in the Using Splunk, Searching and Reporting, and Administration classes. Best of luck!

jeffrey_steiner
New Member

Hi, thanks for the answer!

In the last part you mentioned:

I doubt that you'll be doing this part from the CLI, but, hypothetically, you could

I was hoping I could find more information on how to go about using Splunk in this particular way. We are limited to this option because the machine on which Splunk is installed has no desktop environment, so we can't use the Splunk dashboard.


aljohnson_splun
Splunk Employee

Hi @jeffrey_steiner - yes, you can run searches from the command line or dispatch them via the REST API, and you can also retrieve the results via the REST API, possibly using one of our various SDKs for either of these tasks.
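
Just to give you a flavor of the CLI side (the credentials and the query below are placeholders - substitute your own), a search run from the command line prints its results straight to stdout:

    # run a search from the command line and print the results as CSV
    /opt/splunk/bin/splunk search 'index=builds sourcetype=build_metrics | stats avg(boot_time) AS mean_boot, stdev(boot_time) AS sd_boot' -auth admin:changeme -maxout 0 -output csv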

When you say you don't have a "desktop environment", does that also mean that you cannot run a web server from that machine? That is what Splunk Web is - a service that exposes a UI over HTTP. It's really quite crucial to getting full value out of Splunk. If you cannot run Splunk Web at all, maybe you can consider running Splunk Web on a different machine, which you treat as a search head, while the original machine is treated as an indexer. Splunk Web really is quite important - you don't get any visualizations from Splunk without it!
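
For what it's worth, pointing a separate search head at the original machine is a one-liner run on the search-head box - a rough sketch only, since the exact flags vary a bit between Splunk versions (see the Distributed Search manual) and the host and credentials below are placeholders:

    # run this on the search head; the indexer is the original machine
    /opt/splunk/bin/splunk add search-server https://<indexer-host>:8089 -auth admin:changeme -remoteUsername admin -remotePassword <indexer-admin-password>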

jeffrey_steiner
New Member

You are correct, we are unable to run a web server on the machine in question. That is okay though; we're not looking to get human-readable visualizations of the data, we just want to set it up so that it can notify us of any significant deviations in our data. Do you have a link to some documentation showing how to run a search through the API and retrieve the results that way?


aljohnson_splun
Splunk Employee

You can check out the REST API: http://dev.splunk.com/restapi

The SDKs: http://dev.splunk.com/sdks

As well as running from the CLI: http://docs.splunk.com/Documentation/Splunk/6.4.1/Admin/CLIadmincommands#Splunk_CLI_command_syntax
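
For a rough idea of the REST route (the host, credentials, and search string below are all placeholders), you can dispatch a job and then fetch its results as JSON once it completes:

    # create a search job; the response contains a search id (<sid>)
    curl -k -u admin:changeme https://localhost:8089/services/search/jobs -d search="search index=builds | stats avg(boot_time)"
    # once the job is done, fetch the results (substitute the <sid> from the previous response)
    curl -k -u admin:changeme --get https://localhost:8089/services/search/jobs/<sid>/results -d output_mode=json
    # or stream everything back in a single blocking call
    curl -k -u admin:changeme https://localhost:8089/services/search/jobs/export -d search="search index=builds | stats avg(boot_time)" -d output_mode=json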

Also, please mark the answer above as accepted if you feel it solved your issue 🙂
