Splunk Search

How to create a custom generating search command?

siu
Observer

So I have a Python script called Analysis.py.

Normally I would run it locally like this: Analysis.py <filepath>, so as an example: Analysis.py D:/Temp/temp.txt.

What this Python script does is generate a CSV file.

What I would like to do is build a dashboard in Splunk that visualizes this CSV file, e.g. as a line chart or some bar graphs. However, this Python script runs with a <filepath> argument.
Also, this dashboard would accept a custom input: the user will enter the <filepath> argument and the dashboard will show the results accordingly, visualized in a line chart for example.

How can I write a Splunk custom search command so that I can create this dashboard?


woodcock
Esteemed Legend

There are a couple of easy ways to do this (and the "right" way which others have already posted).

The simplest way is to just write the output file on your search head to "$SPLUNK_HOME/etc/apps/<your app here>/lookups" and then reference it there.
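A minimal sketch of that simple approach (the app name, the lookup file name, and the assumption that Analysis.py writes its CSV next to the input file are all hypothetical; adjust them to your setup):

```python
import os
import shutil
import subprocess
import sys

# Hypothetical locations -- substitute your own app name and script path.
SPLUNK_HOME = os.environ.get("SPLUNK_HOME", "/opt/splunk")
LOOKUPS_DIR = os.path.join(SPLUNK_HOME, "etc", "apps", "my_app", "lookups")


def run_analysis(filepath):
    """Run Analysis.py on `filepath`, then copy its CSV output into the
    app's lookups directory so a search can read it with | inputlookup.
    Assumes Analysis.py writes its CSV next to the input file."""
    subprocess.check_call([sys.executable, "Analysis.py", filepath])
    csv_out = os.path.splitext(filepath)[0] + ".csv"
    shutil.copy(csv_out, os.path.join(LOOKUPS_DIR, "analysis_results.csv"))
```

Once the file is sitting in the lookups directory, a dashboard panel can read it with something like `| inputlookup analysis_results.csv` and chart the fields.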

Another more complex way is from my (unaccepted) answer here, quoted verbatim:
https://answers.splunk.com/answers/41949/passing-search-results-to-external-python-script.html

This isn't the "right" way to do this, but it is the most expedient and it allows you to recycle your traditional alert script for use in regular searches, too. This example assumes that you will be passing 2 fields to the script: src_ip and host; you will need to adjust slightly for the fields that you will be using.

Insert this code into your MyAlertScript.py right before your existing code that accesses the results.csv.gz file in argv[8]:

# In order to facilitate Event Workflow Actions using runshellscript,
# we will hijack the arguments in one special case as follows:
# 1 (sys.argv[1]) = '1'
# 2 (sys.argv[2]) = '"<src_ip>","<host>"'
# 3 (sys.argv[3]) = 'Hack'
# 4 (sys.argv[4]) = 'to'
# 5 (sys.argv[5]) = 'run'
# 6 (sys.argv[6]) = 'from'
# 7 (sys.argv[7]) = 'runshellscript'
# 8 (sys.argv[8]) = * <- DO NOT CHECK because Splunk modifies this on the way in
# If the arguments are in this format, we will pull the data directly out of
# the 2nd argument instead of out of the results file.
specialCase = False  # initialize to False
if ((sys.argv[1] == '1') and (sys.argv[3] == 'Hack') and
        (sys.argv[4] == 'to') and (sys.argv[5] == 'run') and
        (sys.argv[6] == 'from') and (sys.argv[7] == 'runshellscript')):
    print("SPECIAL CASE!\n")
    specialCase = True
    # Build a fake results file; make sure we delete it at the end!
    fnz = workdir + 'results.csv.gz'
    with gzip.open(fnz, 'wb') as OFH:
        OFH.write('src_ip,host\n')
        OFH.write(sys.argv[2])
    sys.argv[8] = fnz
    # print("SPECIAL CASE: ARGV8=<" + sys.argv[8] + "> fnz=<" + fnz + ">\n")

Then at the bottom, add this, too:

if specialCase:
    os.remove(sys.argv[8])  # delete the fake zip file we made

Next you need a macro like this (to abstract away the trickery):

[MyScript]
definition = table src_ip host\
| map maxsearches=5000 search="|runshellscript MyAlertScript.py 1 \"\\\"\\\"$src_ip$\\\",\\\"$host$\\\"\\\"\" Hack to run from runshellscript 8"
iseval = 0

Now, to use it, you just do this:

My Search To Get Events With src_ip And host Here | `MyScript`

SPECIAL WARNING! This will not scale nicely if you pass a large number of results to the script, because of the map command, but it works GREAT for small numbers of events.


siu
Observer

hi @woodcock @diogofgm, is it possible to elaborate?


siu
Observer

Ahh I see, thanks guys.

For my custom generating search command, would this work?

#!/usr/bin/env python

import sys
import os

sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", "lib"))
from splunklib.searchcommands import \
    dispatch, GeneratingCommand, Configuration, Option, validators

@Configuration()
class MyCustomSearchGeneratingCommand(GeneratingCommand):
    filename = Option(require=True)

    def generate(self):
        filename = self.filename
        # Put your event-generating code here; yield one dict per event

        # To connect with Splunk, use the instantiated service object, which is
        # created using the server-uri and other meta details and can be
        # accessed as shown below
        # Example:
        #    service = self.service

        pass

dispatch(MyCustomSearchGeneratingCommand, sys.argv, sys.stdin, sys.stdout, __name__)


After I have this, I would then register the Python script that needs the file path argument in commands.conf.
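A commands.conf stanza for that might look roughly like this sketch (the stanza name and filename are assumptions; match them to your actual command name and script, which lives in your app's bin directory):

```
[mycustomsearchgeneratingcommand]
filename = mycustomsearchgeneratingcommand.py
chunked = true
python.version = python3
```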

And then run the search command using
| mycustomsearchgeneratingcommand filename="D:\Temp\Temp.txt"

And if that generates the CSV file output, then I will test it with user input, creating a field in the dashboard

and using, as you guys suggested:
| mycustomsearchgeneratingcommand filename=$field_token$
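To make the command actually emit events, the generate() body could run the external script and then stream the CSV rows back as events. The sketch below splits that into two helpers; the Analysis.py invocation and its output location are assumptions:

```python
import csv
import subprocess
import sys


def run_analysis(filename):
    # Hypothetical: assumes Analysis.py sits next to this command script
    # and writes its CSV output to a predictable location.
    subprocess.check_call([sys.executable, "Analysis.py", filename])


def rows_as_events(csv_path):
    """Yield one dict per CSV row; inside generate(), each yielded dict
    becomes one Splunk event and its keys become fields."""
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            yield row
```

Inside generate() you would call run_analysis(self.filename) and then `yield from rows_as_events(...)` with whatever CSV path Analysis.py produces.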


diogofgm
SplunkTrust

Check the Splunk dev portal. It has some examples of custom commands.

https://dev.splunk.com/enterprise/docs/devtools/customsearchcommands/

Also, Splunk's GitHub has some other examples of custom commands here:

https://github.com/splunk/splunk-app-examples/blob/master/custom_search_commands/python/generatingse...

In this example you can replace the count with a path to do something along the lines of:

| mycustomcommand path="D:/Temp/temp.txt"
------------
Hope I was able to help you. If so, some karma would be appreciated.

siu
Observer

Yes, it helps, but when I have that path=...

I want it to be a user input in the dashboard.

How can I do so?


diogofgm
SplunkTrust

Just place a text input in the dashboard and use normal dashboard tokens. Then you can make your search something like this:

| mycustomcommand path=$path_token$
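A minimal Simple XML sketch of that (the panel layout and token name are placeholders; match the token to whatever you reference in the search):

```xml
<form>
  <fieldset>
    <input type="text" token="path_token">
      <label>File path</label>
    </input>
  </fieldset>
  <row>
    <panel>
      <chart>
        <search>
          <query>| mycustomcommand path=$path_token$</query>
        </search>
      </chart>
    </panel>
  </row>
</form>
```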

 

------------
Hope I was able to help you. If so, some karma would be appreciated.