In the main Splunk interface, I can filter down on a saved search like this:
| savedsearch "my_search" | search title="Elite Baller" person="me" | table *
This will run my saved search, "my_search", and then filter down the results further.
I'm trying to do the same thing in the SDK (Python), using the same saved search. But it's not working for me! I've tried many iterations:
# Load the saved search
saved_search = self.connect().saved_searches['my_search']

# Dispatch a job (I've also tried grabbing the most recent run from history)
job = saved_search.dispatch()
# job = saved_search.history()[-1]

# Poll the API until the job finishes
if job['isDone'] == '1':
    # Here's where I'm trying to filter down the search further
    search_items = {'title': 'Elite Baller', 'person': 'me'}
    job_kwargs = {}
    search = ''
    if len(search_items) > 0:
        search += 'search '
        for key, val in search_items.items():
            search += '%s%%3D%s ' % (key, val)  # %3D is a URL-encoded '='
        search = search[:-1]  # trim trailing space
        job_kwargs['search'] = search

    # Fetch the results
    job_result = job.results(**job_kwargs)
    reader = results.ResultsReader(job_result)
    # ... iterate over reader ... #
Anyway, I've tried every which way to build an argument list string and place it into job_kwargs, but no such luck.
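For what it's worth, here's a distilled version of the string-building I'm attempting, using a literal `=` instead of `%3D` (on the assumption that the SDK URL-encodes request parameters itself; `build_filter` is just a throwaway helper, not SDK API):

```python
def build_filter(search_items):
    """Build a Splunk post-process 'search' clause from a dict of field filters."""
    if not search_items:
        return ''
    # Quote each value and join the clauses with spaces.
    return 'search ' + ' '.join('%s="%s"' % (k, v) for k, v in search_items.items())

print(build_filter({'title': 'Elite Baller', 'person': 'me'}))
# search title="Elite Baller" person="me"
```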
Am I missing something simple?
As far as I know, you can't post-process a finished job through the Python SDK. But you can run the whole pipeline, filter included, as a single search:
import splunklib.client as client

# Connect to Splunk; note the app context
sp_con = client.connect(username='admin', password='password', host='127.0.0.1',
                        scheme='https', port='8089', app='appname')

# Run the saved search and the post-process filter as one query
query = """
| savedsearch "my_search"
| search title="Elite Baller" person="me"
"""

earliest = an_epoch_time
latest = an_epoch_time

rr = sp_con.jobs.oneshot(query, count=0, earliest_time=earliest,
                         latest_time=latest)
Remember to give the correct app and username to your connection, or your saved search will not be visible to the script.