Getting Data In

How do I upload a CSV data file into Splunk using the REST API? Can someone provide the exact URI used to upload a CSV file? I was confused by the URIs provided by Splunk.

gopij
Engager

Hi, I am trying to upload a CSV data file to Splunk Enterprise through the REST API. There are a lot of URIs available for different operations;
can someone provide the exact URI that will upload a CSV file to Splunk?


dmarling
Builder

I know this question is about two years old, but I had to solve this issue for my team and I wanted to share the solution I came up with.

@harsmarvania57 is correct that there is no direct way to import a CSV with the API, but if you are feeling adventurous there is an alternative, provided the file is not extremely large. I only had PowerShell to work with when I created this, so I wrote a script that performs the following steps:

  1. Converts the CSV into JSON.
  2. Escapes any escapes (\) and escaped quotes (\") in the JSON elements (you'll see why later).
  3. Escapes any quotes in the JSON body.
  4. Takes the fully escaped JSON body and passes it into a Splunk search via the REST API and curl.
  5. Using makeresults, the search splits out the different rows, parses the JSON into the appropriate columns, and outputs the results to the lookup file. This is the reason we had to do all of the earlier escaping.

The PowerShell script is below. I have wrapped the things you need to change in curly brackets, with a label in between, like {change me}:

# Convert the CSV to JSON
$csvjson = import-csv "{absolute path to file}" | ConvertTo-Json
# Escape backslashes and already-escaped quotes in the JSON element values
$escapingescapedescapes = $csvjson -replace '\\\\', '\\\\\\'
$escapingescapedquotes = $escapingescapedescapes -replace '([^\\])\\"','$1\\\"'
# Escape the quotes that wrap the JSON keys and values
$fileescaped = $escapingescapedquotes -replace '([\n\r]\s+)"(.*)":(\s+)"(.*)"(,?[\n\r])','$1\"$2\":$3\"$4\"$5'
# Build the SPL search that splits the JSON back into rows and writes them to a lookup
$search = '| makeresults count=1 | fields - _time | eval data="'+$fileescaped+'" | eval data=trim(data, "[]") | rex field=data mode=sed "s/(\s+)\},/\1}█/g" | makemv data delim="█" | mvexpand data | eval data="[".data."]" | spath input=data | fields - data | rename "{}.*" as * | outputlookup {lookup file name goes here}'
# URL-encode the search and submit it as a oneshot job via the REST API
Add-Type -Assembly System.Web
$searchencoded = [System.Web.HttpUtility]::UrlEncode("$search")
curl.exe -k -u {credentials here} -X POST https://{put your domain here}/services/search/jobs -d exec_mode=oneshot -d output_mode=csv -d count=0 -d search="$searchencoded"
# Clean up variables so credentials and file contents do not linger in the session
Remove-Variable -Name csvjson
Remove-Variable -Name escapingescapedescapes
Remove-Variable -Name escapingescapedquotes
Remove-Variable -Name fileescaped
Remove-Variable -Name search
Remove-Variable -Name searchencoded

Four things need to be changed: {absolute path to file}, which should be the path to your CSV (e.g. C:\Users\dmarling\Desktop\test.csv); {lookup file name goes here}, which should be the name of the lookup file you are writing to (e.g. myTestLookup.csv); {credentials here}, which is where you put your Splunk login credentials (e.g. admin:P@ssW03d); and {put your domain here}, which should be the hostname or IP address and management port you are connecting to via curl (e.g. localhost:8089).
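
For illustration, with the example values above plugged in, the final curl call would look something like this (the credentials and host are only the sample values from this post):

curl.exe -k -u admin:P@ssW03d -X POST https://localhost:8089/services/search/jobs -d exec_mode=oneshot -d output_mode=csv -d count=0 -d search="$searchencoded"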

The above process will take a CSV that looks like this:
column1,column2,rownum
a,b,1
b,a,2
c,a,3
d,b,4
e,a,5
f,e,6
a,a,7
b,b,8
c,c,9
d,"d,""4\""""",10

and turn it into this:

| makeresults count=1 | fields - _time | eval data="[
    {
        \"rownum\":  \"1\",
        \"column1\":  \"a\",
        \"column2\":  \"b\"
    },
    {
        \"rownum\":  \"2\",
        \"column1\":  \"b\",
        \"column2\":  \"a\"
    },
    {
        \"rownum\":  \"3\",
        \"column1\":  \"c\",
        \"column2\":  \"a\"
    },
    {
        \"rownum\":  \"4\",
        \"column1\":  \"d\",
        \"column2\":  \"b\"
    },
    {
        \"rownum\":  \"5\",
        \"column1\":  \"e\",
        \"column2\":  \"a\"
    },
    {
        \"rownum\":  \"6\",
        \"column1\":  \"f\",
        \"column2\":  \"e\"
    },
    {
        \"rownum\":  \"7\",
        \"column1\":  \"a\",
        \"column2\":  \"a\"
    },
    {
        \"rownum\":  \"8\",
        \"column1\":  \"b\",
        \"column2\":  \"b\"
    },
    {
        \"rownum\":  \"9\",
        \"column1\":  \"c\",
        \"column2\":  \"c\"
    },
    {
        \"rownum\":  \"10\",
        \"column1\":  \"d\",
        \"column2\":  \"d,\\\"4\\\\\\\"\\\"\"
    }
]" | eval data=trim(data, "[]") | rex field=data mode=sed "s/(\s+)\},/\1}█/g" | makemv data delim="█" | mvexpand data | eval data="[".data."]" | spath input=data | fields - data | rename "{}.*" as * | outputlookup myTestLookup.csv

This process has some potential pitfalls, especially with very large CSV files, as you may run into memory constraints with mvexpand depending on the limits configured for your user. The process can be ported to Linux-based systems, but I unfortunately did not have access to one when I created it.

If this comment/answer was helpful, please upvote it. Thank you.

Ralf
Explorer

Hey dmarling,

I spotted your answer while searching Google for how to upload files to Splunk using a curl command.

In my case, I'd like to upload a JSON-formatted file, but I have no idea what the curl command has to look like. The normal usage for sending a single event that I found in the Splunk documentation looks as follows:

curl -k "https://mysplunkserver.example.com:8088/services/collector" \
    -H "Authorization: Splunk CF179AE4-3C99-45F5-A7CC-3284AA91CF67" \
    -d '{"event": "Hello, world!", "sourcetype": "manual"}'

Are you able to help me modify this to allow the upload of a JSON file?

Regards,
Ralf

harsmarvania57
Ultra Champion

Hi,

As far as I know, there is no easy way to upload a CSV file using the Splunk REST API. First you need to upload the lookup file into the Splunk staging area $SPLUNK_HOME/var/run/splunk/lookup_tmp, and then you can move that CSV file to the respective app.

Have a look at the POST method in the REST API doc: http://docs.splunk.com/Documentation/Splunk/7.2.0/RESTREF/RESTknowledge#data.2Flookup-table-files

Create a lookup table file by moving a file from the upload staging area into $SPLUNK_HOME
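
As a rough sketch of how that two-step flow might look with curl (the host, credentials, app, and file name here are only example values, and placing the file in the staging area assumes you have filesystem access to the search head):

# 1. Copy the CSV into the staging area on the Splunk server (example file name)
cp mylookup.csv $SPLUNK_HOME/var/run/splunk/lookup_tmp/
# 2. Register it as a lookup table file in an app (here the "search" app) via the REST API
curl -k -u admin:changeme https://localhost:8089/servicesNS/nobody/search/data/lookup-table-files \
    -d name=mylookup.csv \
    -d "eai:data=$SPLUNK_HOME/var/run/splunk/lookup_tmp/mylookup.csv"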