How to import this kind of CSV data?

nicolociraci
New Member

I have a CSV file like the one shown below. On my UF I've added the following props, but on the search heads the events are not parsed.

props.conf

[sourcetype]
FIELD_HEADER_REGEX=#LineAboveHeader\n(.*)
FIELD_DELIMITER=,

CSV example

#LineAboveHeader
"Header1","Header2","Header3","Header4"
"Field1", "Field2", "Field3", "Field4"
"Field1", "Field2", "Field3", "Field4"
"Field1", "Field2", "Field3", "Field4"

What I would like is for Splunk to read the header line, import the field names, and then create an event for each data line.
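
For what it's worth, these header settings (FIELD_HEADER_REGEX, FIELD_DELIMITER, PREAMBLE_REGEX, etc.) belong to Splunk's structured data handling and are usually paired with INDEXED_EXTRACTIONS. A minimal sketch, assuming indexed CSV extractions are acceptable here and that the stanza name matches the sourcetype set in inputs.conf:

# props.conf -- sketch only, not confirmed as the intended setup
[sourcetype]
INDEXED_EXTRACTIONS = csv
PREAMBLE_REGEX = ^#LineAboveHeader.*
FIELD_DELIMITER = ,
FIELD_QUOTE = "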

woodcock
Esteemed Legend

The configurations need to be deployed to the indexers (all of them) and the Splunk instances there restarted (all of them). Then verify only against events that have been indexed AFTER the restarts, by adding _index_earliest= to your search.
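
For example, a search along these lines looks only at events indexed after the restarts (the index and sourcetype names are placeholders; the field names are taken from the sample above):

index=your_index sourcetype=your_csv_sourcetype _index_earliest=-15m
| table _time Header1 Header2 Header3 Header4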

nicolociraci
New Member

I'm already aware of that, and it's not the issue.

koshyk
Super Champion

Is this CSV:
1. One-off reference data (a lookup)? OR
2. Streaming data, i.e. the "#LineAboveHeader" line appears only once for the entire data stream? OR
3. A batch file, so the header appears every time, once per file?

Your logic above works only if it's (3).
Try this on your heavy forwarder or indexers:

[sourcetype]
FIELD_HEADER_REGEX=^\s*("Header1.*)
PREAMBLE_REGEX = ^#LineAboveHeader.*
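
If Splunk seems to ignore the stanza completely, btool run on the instance holding the config shows which props.conf settings are actually applied (replace "sourcetype" with the real sourcetype name from inputs.conf):

$SPLUNK_HOME/bin/splunk btool props list sourcetype --debug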

nicolociraci
New Member

So, my comment went into review and never got posted. Anyway, I'm in option 3, but your solution is not working. Actually, no solution is working; it seems like Splunk is ignoring my config. Where should I put the config? So far I've only tried it on the indexer.

koshyk
Super Champion

Can you please post your actual props.conf and transforms.conf files with the real data? Then we can work it out and provide an answer. Is your installation an indexer cluster?
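
If it is a cluster, keep in mind that parsing-time props.conf is normally pushed from the cluster master rather than edited on each indexer by hand. A rough sketch, assuming a standard indexer cluster and a hypothetical app name:

# on the cluster master, e.g. in
# $SPLUNK_HOME/etc/master-apps/my_csv_parsing/local/props.conf
# then push the bundle to the peers:
$SPLUNK_HOME/bin/splunk apply cluster-bundle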

nicolociraci
New Member

I can't post the real data, but here is the props.conf (with the latest configuration) and censored data.

[MailboxAuditDisabled]
FIELD_HEADER_REGEX = ^\s*("Name.*)
PREAMBLE_REGEX = ^#TYPE.*

Data

#TYPE Selected.Microsoft.Exchange.Data.Directory.Management.Mailbox
"Name","Alias","ServerName","AuditEnabled"
"foo","bar","server-foo","False"
"foobar","bar2","server-foo2","False"
"foofoo","barbar","server-foo","False"
"foo3","bar2","server-foo3","False"

Note that the data is split into multiple events. My installation has two indexers (the configuration is deployed on both of them).

koshyk
Super Champion

The configuration below worked correctly for me:

# inputs.conf
[monitor:///tmp/csv/*.csv]
sourcetype=MailboxAuditDisabled
index=my_index

# props.conf
[MailboxAuditDisabled]
FIELD_HEADER_REGEX=^\s*("Name.*)
PREAMBLE_REGEX = ^#TYPE.*
FIELD_DELIMITER=,
FIELD_QUOTE="

Sample data kept in /tmp/csv/sample1.csv

#TYPE Selected.Micro.EX.data.xyz.mailbox
"Name","Alias","Servername","AuditEnabled"
"emp1","empA1","server1","false"
"emp2","empA2","server2","true"
"emp3","empA3","server3","false"

The props.conf needs to be on the indexers, and also on the heavy forwarders if you are using them.
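
Once data indexed after the change starts arriving, a quick way to confirm the header fields are extracted (index, sourcetype, and field names as in the sample above):

index=my_index sourcetype=MailboxAuditDisabled _index_earliest=-1h
| table _time Name Alias Servername AuditEnabled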

p_gurav
Champion

Add props.conf on the indexer, not on the universal forwarder.

nicolociraci
New Member

Does not work.
