All Apps and Add-ons

Jenkins integration

msn2507
Path Finder

Has anyone successfully monitored Jenkins jobs through Splunk? If so, can you please share your knowledge? Thanks!


Flynt
Splunk Employee

Here is a quick Python script as an example of how to query this data and return it to Splunk.

DISCLAIMER - This is NOT an officially supported Splunk function; it is merely an example of how I get Jenkins data into Splunk for my own use.

import requests, json
import collections
import splunk.Intersplunk as si
import time
import re
import sys


# replace YOURJENKINSURL with your Jenkins location 
baseurl="http://YOURJENKINSURL:8080/"
# grab the arguments to be passed to the API
args=sys.argv[1:]
# if no arguments are given, list all jobs
if len(sys.argv)==1:
   url = baseurl+"api/json?depth=1"
# else handle projects, jobs, and builds
elif "project" in sys.argv:
   project=sys.argv[sys.argv.index("project")+1]
   if "job" in sys.argv:
      if "build" in sys.argv:
         job=sys.argv[sys.argv.index("job")+1]
         build=sys.argv[sys.argv.index("build")+1]
         url = baseurl+"job/"+project+"/job/"+job+"/"+build+"/api/json"
      else:
         job=sys.argv[sys.argv.index("job")+1]
         url = baseurl+"job/"+project+"/job/"+job+"/api/json"
   else:
       url = baseurl+"job/"+project+"/api/json"
else:
     url = baseurl+sys.argv[1]
# send request, don't verify SSL
r = requests.get(url,verify=False)
# if additional args weren't specified, just give the jobs otherwise use whole payload 
if  len(sys.argv)==1:
  jdata=r.json()['jobs']
else:
  jdata=r.json()

# quick and dirty conversion from dicts to json
def convert(data):
    if isinstance(data, basestring):
        return str(data)
    elif isinstance(data, collections.Mapping):
        return dict(map(convert, data.iteritems()))
    elif isinstance(data, collections.Iterable):
        return type(data)(map(convert, data))
    else:
        return data
class mydict(dict):
        def __str__(self):
            return json.dumps(self)

# initialize our results and rows
results=[]
row={}
# if no args given just deliver the jobs
if  len(sys.argv)==1:
   for i in jdata:
       # this sets Splunk's _time as the time of the script run
       row["_time"]=time.time()
       # arbitrary Splunk host field assignment, change as desired
       row["host"]="Jenkins"
       # arbitrary Splunk source field assignment, change as desired
       row["source"]="Jenkins"
       # arbitrary Splunk sourcetype field assignment, change as desired
       row["sourcetype"]="Jenkins" 
       # create an _raw field (to show as events in Splunk) using our dict to json conversion
       row["_raw"]=mydict(convert(i))
       # add row to result
       results.append(row)
       # clear out the row since we've appended it. We want this empty so we don't end up with just the last row of data
       row={}
   # when the results are complete, stream them on to Splunk
   si.outputStreamResults(results)
else:
  # if we have args, parse accordingly
  try:
     # this sets Splunk's _time as the time of the script run
     row["_time"]=time.time()
     # arbitrary Splunk host field assignment, change as desired
     row["host"]="Jenkins"
     # arbitrary Splunk source field assignment, change as desired
     row["source"]="Jenkins"
     # arbitrary Splunk sourcetype field assignment, change as desired
     row["sourcetype"]="Jenkins"
     # assign fields based on args
     if "project" in sys.argv:
         row["project"]=project
     if "job" in sys.argv:
         row["job"]=job
     if "build" in sys.argv:
         row["build"]=build
     # create an _raw field (to show as events in Splunk) using our dict to json conversion
     row["_raw"]=mydict(convert(jdata))
     # add row to result      
     results.append(row)
     # clear out the row since we've appended it. We want this empty so we don't end up with just the last row of data
     row={}
     # when the results are complete, stream them on to Splunk
     si.outputStreamResults(results)
  # some very minor error handling that will return an error event back to Splunk if the API call fails
  except Exception, e:
        import traceback
        results=[]
        row={}
        stack = traceback.format_exc()
        row['_time'] = int(time.time())
        row['error'] = str(e)
        results.append(row)
        si.outputStreamResults(results)
        # write the traceback to stderr (it shows up in search.log) rather than calling an undefined logger
        sys.stderr.write(str(e) + ". Traceback: " + str(stack) + "\n")

Save the above as a Python script (for example, jenkins.py), then create a commands.conf wherever you put the command (it could be in its own app, or what have you). Wherever you put it, make sure a commands.conf exists in the "local" directory of the app that holds the script.
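For illustration only, assuming a dedicated app named jenkins_connector (the app name is just a placeholder), the layout might look like this:

$SPLUNK_HOME/etc/apps/jenkins_connector/
    bin/jenkins.py          # the script above
    local/commands.conf     # registers the custom search command
    local/props.conf        # JSON extraction settings (shown further down)

The commands.conf itself -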

# identify the command name
[jenkins]
# tell Splunk what the script name is on disk
filename = jenkins.py
# we're making our own time fields, so just ignore the order that the events come in 
overrides_timeorder = true
# some values may be multivalues, we should support it
supports_multivalues = true
# streaming should be set to true since we're using InterSplunks outputStreamResults
streaming = true
# it's a generating command
generating = true
# send the sessionKey to the command when it's run (probably extraneous here)
passauth = true
# don't try to run this distributed (if you set local to false, you will run the same command over all search heads getting duplicate data!)
local = true

And finally, you'll want a props.conf in the "local" directory of the app you put the command in. This helps Splunk know how to extract the fields returned.

# identify the sourcetype (IMPORTANT! if you changed the sourcetype in the script, make sure you change it here too!)
[Jenkins]
# use json extractions 
INDEXED_EXTRACTIONS = json
# ignore kv_mode
KV_MODE = none

You'll want to restart Splunk after you've gotten all the files in place. (You ARE using a test system for this first, right? Don't dev in prod! 😉)
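On a typical Linux install (same path as the example further down), that's just:

/opt/splunk/bin/splunk restart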

Search syntax for Jenkins -

| jenkins project [project name] job [job name] build [build name]{/testReport} | spath

You can do each level individually if you like -

| jenkins project [project name] | spath
| jenkins project [project name] job [job name] | spath
| jenkins project [project name] job [job name] build [build name] | spath

/testReport is case sensitive and must immediately follow the build name.

Note that all data is returned as JSON, so it's highly recommended to use | spath for field extraction.
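For example, a full pipeline might look like this (MyFolder, MyJob, and 42 are placeholders; result, duration, and fullDisplayName are fields the Jenkins build JSON typically exposes):

| jenkins project MyFolder job MyJob build 42 | spath | table fullDisplayName result duration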

If it's not working as expected, try a couple of things first -

Go to the Jenkins API URI -

http://YOURJENKINS:8080/api/json?depth=1
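or fetch it from a shell, for example:

curl "http://YOURJENKINS:8080/api/json?depth=1"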

Run the command manually from the command line (shell) using Splunk's python interpreter.

/opt/splunk/bin/splunk cmd python jenkins.py
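You can also pass the same arguments the search command takes, for example (MyFolder, MyJob, and 42 are placeholders):

/opt/splunk/bin/splunk cmd python jenkins.py project MyFolder job MyJob build 42

If that prints results containing your Jenkins JSON, the script side is working and it's the Splunk configuration you want to look at next.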

dougmair
Explorer

Thanks Flynt.
I see there's now a Jenkins app released. Have asked for it to be installed on Splunk Cloud - just waiting for it to be vetted.

Flynt
Splunk Employee

Excellent, I wholeheartedly recommend the Supported Jenkins app over this. If anything, this is more of an example of how to get data in.


Flynt
Splunk Employee

Using Python and the Jenkins API (https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API), this can be accomplished fairly easily. I've recently whipped up a quick-and-dirty connector using the API that lets you see projects/jobs/builds/test reports etc. from Jenkins very quickly.

Basically, this is the flow:

  • using requests in Python, send an API call to Jenkins (I've made a few user-configurable arguments here for which projects/jobs/builds the user would like to see, and the script constructs the appropriate URL to pass to the API)
  • read the results in JSON format
  • convert the unicode in the results to strings
  • return the converted JSON to Splunk as events
  • edit props.conf so the events are seen correctly as JSON

The code itself is less than 90 lines. If there is interest, I'd be happy to whip up a quick blog with notation on what each function is doing and how to change it to be used as a scripted input or a generating custom search command.
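In the meantime, a bare-bones sketch of the first two steps (the host is the same placeholder as in the full script above; jobs, name, and color are standard keys in the Jenkins JSON API response):

import requests

# list every job Jenkins knows about, with its current status colour
r = requests.get("http://YOURJENKINSURL:8080/api/json?depth=1", verify=False)
for job in r.json()["jobs"]:
    print("%s %s" % (job["name"], job.get("color")))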

dougmair
Explorer

Hi,

I'm also interested in this. Can you send me a copy?

Thanks,
Doug.


doubleshifter
Engager

Hi,

Could you also please send me a copy of the script?

Thanks!

Wendel


mmisciagna_splu
Splunk Employee
Splunk Employee

Hi,

Can you please send me the script?

Thanks!

Martin


knielsen
Contributor

Hi,

I'd love to see what you did as well. We're setting up quite a bit of Jenkins infrastructure, and blindly throwing all logfiles into Splunk would be way too much information.

Thanks,
Kai.


tattersp
Explorer

Hello

I also would love to see more details on how you configured this.

Thanks in advance for any help.


sreekarpeddi
New Member

Hi

Could you please share the script with us?
