Getting Data In

NiFi GetSplunk import events

hanseMand
Observer

I'm trying to import a CSV file generated by the NiFi GetSplunk processor. It retrieves events from a Splunk instance, SPL-01, and stores them in a CSV file with the following header:

_serial,_time,source,sourcetype,host,index,splunk_server,_raw

I set INDEXED_EXTRACTIONS = csv when I import the CSV files on another Splunk instance, SPL-02.

If I just import the file, the host is set to SPL-02, but I want the host to be SPL-01. I got past this with the following transform:

[mysethost]
INGEST_EVAL = host=$field:host$
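
For reference, the transform is wired up from props.conf on SPL-02 roughly as follows (the sourcetype name nifi_csv is just a placeholder for whatever the import actually uses):

[nifi_csv]
INDEXED_EXTRACTIONS = csv
TRANSFORMS-sethost = mysethost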

 

Question 1:

That gives me the correct host name, set to SPL-01, but I still have an EXTRACTED_HOST field when I look at events in Splunk. I found the article below, where I got the idea to use $field:host$, but it also uses ":=" for assignment, which did not work for me, so I used "=" instead and then it worked. I also tried setting "$field:host$=null()", but that had no effect.

I found this article:

https://community.splunk.com/t5/Getting-Data-In/How-to-get-the-host-value-from-INDEXED-EXTRACTIONS-j...
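
Based on that article, I would have expected something like the following transform to both set the host and remove the leftover indexed field (extracted_host is the name Splunk gives a CSV column called host, since host itself is reserved metadata; the ":=" and null() usage here is taken from the article, not something I have gotten to work):

[mysethost]
INGEST_EVAL = host:=$field:host$, extracted_host:=null()

Is something like this the intended way to get rid of the EXTRACTED_HOST field?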

 

Question 2:

I have a problem getting the timestamp in from the _time field. I tried using TIMESTAMP_FIELDS in props.conf for this import, as follows:

  1. TIMESTAMP_FIELDS=_time (did not work)
  2. TIMESTAMP_FIELDS=$field:_time$ (did not work)

I then renamed the column in the header line from "_time" to "xtime", and then I could set the following in props.conf:

TIMESTAMP_FIELDS=xtime
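
For reference, the complete stanza that works for me after the rename looks roughly like this (again, nifi_csv is a placeholder, and the TIME_FORMAT is only an assumption that the column holds epoch time; it has to match whatever GetSplunk actually writes):

[nifi_csv]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = xtime
# assuming epoch seconds with milliseconds, e.g. 1700000000.123
TIME_FORMAT = %s.%3N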

How can I use the _time field directly?

 
