Dashboards & Visualizations

How to download the result of a search as CSV using the REST API or a curl command?

anooshac
Communicator

Hi All,

I want to download a search result as a CSV file into my local folder. Can anyone suggest some good methods and explain how I can do it?

I saw some examples using the curl command and the REST API, but wasn't able to fully understand them. Can anyone help me with this?


livehybrid
SplunkTrust

Hi @anooshac 

Please be aware that if you are using a Splunk version newer than 9.0.1, you should use the v2 endpoints rather than the un-versioned endpoints mentioned in older answers, which are now deprecated.

 Here is a sample cURL request to run a search and export the results as a CSV using Splunk's /services/search/v2/jobs/export endpoint:

curl -k -u "admin:yourpassword" \
     -X POST \
     https://splunk_server:8089/services/search/v2/jobs/export \
     -d search="search index=_internal | head 10" \
     -d output_mode=csv

Explanation of the Command:

  1. curl: Command-line tool used for making HTTP requests.

  2. -k: This flag allows connections to SSL sites without verifying certificates (useful if Splunk uses self-signed certificates). This should be omitted if you have a valid SSL certificate on your management port.

  3. -u "admin:yourpassword": Specifies the authentication credentials (username:password). Replace admin and yourpassword with your actual Splunk credentials. Alternatively you can use Token based auth

  4. -X POST: Specifies that this is a POST request, as required by Splunk's search export API.

  5. https://splunk_server:8089/services/search/v2/jobs/export: The Splunk API endpoint for executing a search and exporting results immediately.

    • Replace splunk_server with the actual Splunk host (e.g., splunk.yourcompany.com).

    • 8089 is the default Splunk REST API management port.

  6. -d search="search index=_internal | head 10": The actual Splunk search query.

    • index=_internal queries the internal Splunk logs.

    • head 10 limits the output to the first 10 results.

  7. -d output_mode=csv: Specifies that the output format should be in CSV (other options include JSON and XML).
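
If you go the token route, here is a rough sketch of the same request using a bearer token and writing the output to a local file. The token value and the results.csv filename are placeholders, and this assumes you have already created a Splunk authentication token:

# Same export as above, authenticating with a bearer token instead of
# username/password, and saving the CSV locally with -o.
curl -k \
     -H "Authorization: Bearer <your_token_here>" \
     https://splunk_server:8089/services/search/v2/jobs/export \
     -d search="search index=_internal | head 10" \
     -d output_mode=csv \
     -o results.csv

The -o flag writes the response straight to a file, which covers downloading the results into a local folder.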

Please let me know how you get on and consider accepting this answer or adding karma to this answer if it has helped.
Regards

Will


anooshac
Communicator

Hi @livehybrid, thanks for the reply. Is there any way that I can schedule this export? I have a tool which is scheduled to run every hour.


livehybrid
SplunkTrust

Hi @anooshac 

If you want to run this on a schedule then you might want to look at putting this into a Bash script and running it as a cron job.

Once you have a working cURL command, add it to a bash script, ensure the script is executable (chmod +x), and then add it to your user's cron (crontab -e).

To run hourly you would use a schedule like 1 * * * *, which would run at 1 minute past each hour.

This assumes you are running a Linux system.
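
As a minimal sketch, assuming the cURL command from my earlier answer works for you; the output directory, the credentials, and the earliest=-1h time window are placeholders to adapt:

#!/usr/bin/env bash
# hourly_export.sh - export the last hour of results to a timestamped CSV file.
# OUTPUT_DIR, the credentials, the host, and the search itself are placeholders.

OUTPUT_DIR="/path/to/local/folder"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)

curl -sk -u "admin:yourpassword" \
     https://splunk_server:8089/services/search/v2/jobs/export \
     -d search="search index=_internal earliest=-1h | head 10" \
     -d output_mode=csv \
     -o "${OUTPUT_DIR}/splunk_export_${TIMESTAMP}.csv"

The matching crontab entry (added via crontab -e) would then look something like:

1 * * * * /path/to/hourly_export.sh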

Please let me know how you get on and consider accepting this answer or adding karma to this answer if it has helped.
Regards

Will


kiran_panchavat
SplunkTrust

@anooshac 

Please check this documentation for a solution.

https://community.splunk.com/t5/Installation/What-is-a-good-way-to-export-data-from-Splunk-via-rest-... 

https://docs.splunk.com/Documentation/Splunk/9.4.0/Search/ExportdatausingRESTAPI 

https://stackoverflow.com/questions/67525334/splunk-data-export-using-api 

Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!

anooshac
Communicator

Hi @kiran_panchavat, thanks for the reply. Is it possible to automate this process? I have a tool which is scheduled to run every hour, so the data needs to be imported every hour.


kiran_panchavat
SplunkTrust

@anooshac 

1) Create the saved search.
2) Create a Python script to call the saved search you created. Then, save the results as CSV in the directory you want.
3) Schedule the Python script as a scripted input in Splunk (a shell sketch of the same flow is shown after this list).
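
The steps above suggest Python; as an illustration, here is a rough shell sketch of the same flow, matching the cURL examples elsewhere in this thread. The saved search name my_hourly_report, the credentials, the host, and the output directory are placeholders, and the savedsearch SPL command is used to run the saved search through the export endpoint. A Python script would follow the same sequence: authenticate, run the saved search, and write the response body to a .csv file.

#!/usr/bin/env bash
# export_saved_search.sh - run a saved search and save its results as CSV.
# SAVED_SEARCH and OUTPUT_DIR are placeholders; adapt the host and credentials too.

SAVED_SEARCH="my_hourly_report"
OUTPUT_DIR="/path/to/local/folder"

curl -sk -u "admin:yourpassword" \
     https://splunk_server:8089/services/search/v2/jobs/export \
     -d search="| savedsearch ${SAVED_SEARCH}" \
     -d output_mode=csv \
     -o "${OUTPUT_DIR}/${SAVED_SEARCH}_$(date +%Y%m%d_%H%M%S).csv"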


Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!

kiran_panchavat
SplunkTrust

@anooshac 

Have you seen this post?

export search results using curl - Splunk Community 

Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!