Hello,
I am writing a custom search command in Python which should accept | table c1 c2 ... cn as input, call my web service to decide whether each row of the table is anomalous, and append a column "anomaly" set to True/False.
Trial-and-error debugging through logging alone is cumbersome.
Is there a way to pass the input table to my Python script from the command line so I can debug it properly?
I took the Splunk script example as a starting point, and I can see it reads sys.stdin, so it must be possible to construct command-line input that imitates my table ...
How would I do this?
Can anyone provide an example?
My code so far looks as below.
Kind Regards,
Kamil
import sys
sys.path.append("/usr/local/lib/python3.5/site-packages/splunk_sdk-1.6.6-py2.7.egg")

from splunklib.searchcommands import dispatch, StreamingCommand, Configuration
# Python 3: urllib2 was split into urllib.request and urllib.error
import urllib.request
import urllib.error
import json
import pandas as pd


def classifier_bwp_ls5923_v1_lr(df_input, host):
    df_input.insert(loc=1, column='host', value=host)
    # .values instead of the deprecated DataFrame.as_matrix()
    data = json.dumps({"data": df_input.values.tolist()})
    body = data.encode('utf-8')
    url = 'url ...'
    headers = {'Content-Type': 'application/json'}
    req = urllib.request.Request(url, body, headers)
    try:
        response = urllib.request.urlopen(req)
        return response.read().decode('utf-8')
    except urllib.error.HTTPError as error:
        print("The request failed with status code: " + str(error.code))
        print(error.info())
        print(json.loads(error.read()))
        return str(error.code)


@Configuration()
class MyCommand(StreamingCommand):

    def stream(self, records):
        for record in records:
            # .... xgboost_prediction = classifier_bwp_ls5923_v1_lr(splunk table input ...., host)
            record['anomaly'] = 'False'
            yield record


if __name__ == "__main__":
    dispatch(MyCommand, sys.argv, sys.stdin, sys.stdout, __name__)
The Splunk SDK for Python is a good solution for this.
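For debugging without Splunk in the loop, note that a streaming command's stream() method just consumes an iterable of dicts (one per event) and yields dicts back, so you can feed it rows parsed from CSV text (or sys.stdin) and inspect the output directly. A minimal sketch, with a stand-in stream function mirroring the command above (no splunklib required; the column names c1/c2 are made up for illustration):

```python
import csv
import io

# Stand-in for MyCommand.stream(): same signature, same placeholder logic.
def stream(records):
    for record in records:
        record['anomaly'] = 'False'  # placeholder, as in the original command
        yield record

# Imitate "| table c1 c2" piped in from the command line as CSV text;
# with real CLI input you would use csv.DictReader(sys.stdin) instead.
fake_stdin = io.StringIO("c1,c2\n1,2\n3,4\n")
rows = list(stream(csv.DictReader(fake_stdin)))
for row in rows:
    print(row)
```

Once the logic behaves on such hand-made rows, you can wire the same stream body back into the dispatched command unchanged.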