Splunk Dev

Python SDK save search to csv

to914868
New Member

I want to use splunklib to run a one-off Splunk query and save it to csv.
I'm testing with a small query (a single visitId) of 8 events only.
The result is returned immediately in Splunk UI but I have problems getting the result from the python-sdk.

My problems with splunklib are:
- service.jobs.export() never completes because it keeps streaming the same 8 event results over and over again
- service.jobs.oneshot() never finishes and returns no result

I tried adding the search parameter "preview": False, i.e.

kwargs_export = { "search_mode": "normal", "preview": False }
rr = results.ResultsReader(service.jobs.export(query, **kwargs_export))

The only effect is that neither option returns anything anymore, since the queries are not completing.

import splunklib.client as client
import splunklib.results as results
service = client.connect(
    host=HOST,
    port=8089,
    username=USERNAME,
    password=PWD)

query = """search index=xxx application="xxx" sourcetype=xxx
| spath visitId | join type ..."""
rr = results.ResultsReader(service.jobs.export(query))

for item in rr:
    for key in item.keys():
        print(key, len(item[key]), item[key])
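As a side note, once the events come out of a ResultsReader loop like the one above as dicts, the standard csv module can flatten them to CSV. A minimal, self-contained sketch, with placeholder rows standing in for real Splunk results:

```python
import csv
import io

def results_to_csv(rows, fileobj):
    """Write an iterable of result dicts to fileobj as CSV.

    The plain dicts here stand in for the per-event dicts that
    splunklib's ResultsReader yields.
    """
    rows = iter(rows)
    try:
        first = next(rows)
    except StopIteration:
        return  # nothing to write
    writer = csv.DictWriter(fileobj, fieldnames=list(first.keys()))
    writer.writeheader()
    writer.writerow(first)
    for row in rows:
        writer.writerow(row)

# Stand-in for search results: two fake events.
sample = [
    {"visitId": "abc", "count": "3"},
    {"visitId": "def", "count": "5"},
]
buf = io.StringIO()
results_to_csv(sample, buf)
print(buf.getvalue())
```

Passing a real file object instead of the StringIO buffer writes the CSV to disk.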

I tried the same with oneshot

kwargs_oneshot = {'output_mode': 'csv',"search_mode": "normal"}
oneshotsearch_results = service.jobs.oneshot(query, **kwargs_oneshot)
f = open('myresults.csv', 'w')
f.write(oneshotsearch_results.read())

This creates a csv file, but it has no content at all. I think .read is deprecated.
Any suggestions ?
All I want is to save the query results to .csv ONCE using the library.
Thanks!

1 Solution

poete
Builder

Hello @to914868,

please add f.close() on the next line after f.write(oneshotsearch_results.read())

I think the content is not flushed to the file.
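To illustrate the point with plain Python (no Splunk needed): buffered writes may not reach the file on disk until it is closed, and a with block closes (and therefore flushes) automatically:

```python
import os
import tempfile

# Write into a temporary file; the with block guarantees close()
# (and therefore a flush) even if an exception is raised.
path = os.path.join(tempfile.mkdtemp(), "myresults.csv")
with open(path, "w") as f:
    f.write("visitId,count\nabc,3\n")

# After the with block, the data is on disk.
with open(path) as f:
    content = f.read()
print(content)
```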

@to914868, please accept this answer so that other users can find the answer to this question more easily.


to914868
New Member

Thanks @poete!

Here is what I used in the end

results_kwargs = {
 "earliest_time": "-40min",
 "latest_time": "now",
 "search_mode": "normal",
 "output_mode": "csv"
}
oneshotsearch_results = service.jobs.oneshot(query, **results_kwargs)
f = open('myresults.csv', 'w')
f.write(oneshotsearch_results.read())
f.close()
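One caveat, as an assumption about Python 3: there, splunklib's response object typically returns bytes from .read(), which a file opened in text mode ('w') rejects with a TypeError (the snippet above suggests Python 2, where this works). A minimal sketch of the binary-mode variant, with an in-memory bytes buffer standing in for the real response:

```python
import io
import os
import tempfile

# Stand-in for oneshotsearch_results: a file-like object yielding
# bytes, as splunklib's response reader does under Python 3 (assumption).
fake_response = io.BytesIO(b"visitId,count\r\nabc,3\r\n")

path = os.path.join(tempfile.mkdtemp(), "myresults.csv")
with open(path, "wb") as f:  # binary mode matches the bytes payload
    f.write(fake_response.read())

with open(path, "rb") as f:
    print(f.read().decode("utf-8"))
```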

pchp348
Explorer

This is working fine, but I could not fetch all the results into the csv. Kindly provide a solution for this question:
https://answers.splunk.com/answers/708529/export-to-csv-is-not-fetching-all-the-results-pyth.html?mi...


evuk
Engager

try:

kwargs_export = { "output_mode": "csv" }
rr = service.jobs.export(query, **kwargs_export)

for item in rr:
    print(item)

I think you shouldn't need to wrap the result in a ResultsReader, because it already is one.
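If the streamed chunks each repeat the CSV header (an assumption about the export payload, and a possible source of the repeated results described in the question), they can be stitched together while keeping the header only once. A self-contained sketch with fake chunks standing in for the stream:

```python
import io

def merge_csv_chunks(chunks):
    """Concatenate CSV text chunks, keeping only the first header line.

    Each chunk is assumed to start with the same header row, as a
    streaming export might repeat it per chunk.
    """
    out = io.StringIO()
    header = None
    for chunk in chunks:
        lines = chunk.splitlines()
        if not lines:
            continue
        if header is None:
            header = lines[0]
            out.write(header + "\n")
        for line in lines[1:]:
            out.write(line + "\n")
    return out.getvalue()

# Two fake chunks, each carrying the header again.
chunks = ["visitId,count\nabc,3\n", "visitId,count\ndef,5\n"]
print(merge_csv_chunks(chunks))
```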


kkrishnan_splun
Splunk Employee

This works.
