Splunk Search

How to edit our props.conf and transforms.conf to parse a CSV file we are indexing in Hunk?

jwalzerpitt
Influencer

I am trying to configure the props and transforms conf files for logs that are in .csv format and that we're querying via a virtual index in Hunk.

My props.conf file is configured as follows:

[source::/LogCentral/Applications/Shibboleth_PRD/*]
sourcetype = shibboleth

[shibboleth]
REPORT-manual-shib = manual-shib

My transforms.conf file is configured as follows:

[manual-shib]
DELIMS = ","
FIELDS = username,mfa_service_code,date_requested,mfa_ip_address,mfa_value_returned,mfa_required,zip,latitude,longitude,timezone,country_code,country,city,isp,organization_name,as_name,region,region_name,audit_date_created,audit_created_by,audit_date_modified,audit_modified_by

I restarted the Hunk/Splunk service, but when I run a search (index=shibboleth), the fields aren't being extracted.

Any help on what I'm missing or fat-fingered in my configs would be greatly appreciated.

Thx


rdagan_splunk
Splunk Employee

[source::/LogCentral/Applications/Shibboleth_PRD/*]
should be
[source::/LogCentral/Applications/Shibboleth_PRD/...]
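For reference, here is the props.conf from your post with just that one change applied (a sketch; transforms.conf stays exactly as you posted it):

[source::/LogCentral/Applications/Shibboleth_PRD/...]
sourcetype = shibboleth

[shibboleth]
REPORT-manual-shib = manual-shib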


jwalzerpitt
Influencer

Thx for pointing out that error. I fixed the source stanza to use /... and re-ran the query, but the fields still aren't being extracted. I opened a case with Splunk, as all the sources I have (properly set to /...) are also no longer having fields automatically extracted. This issue seems to have arisen when I updated to version 6.4.2.

Thx


rdagan_splunk
Splunk Employee

Based on this link:
http://docs.splunk.com/Documentation/Hunk/6.4.1/HunkReleaseNotes/Releasenotes -- see ERP-1901: there was an issue with CSV extraction, but it was fixed in 6.4.1.

I am checking to see if that same fix is also in 6.4.2.
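
In the meantime, a quick sanity check (not a fix) is to confirm that the sourcetype is being assigned at all, with something like:

index=shibboleth | stats count by source, sourcetype

If sourcetype=shibboleth shows up but the fields still don't, the problem is on the search-time extraction side rather than in the source:: stanza.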


jwalzerpitt
Influencer

Thx for checking.

I have three other sources - Cisco ISE, Cisco ASA, and Windows Event Logs - whose fields were being extracted before I updated to version 6.4.2. For those three sources I was using the relevant add-ons: Splunk_CiscoISE (with Splunk_TA_cisco-ise), Splunk_TA_windows, and Splunk_TA_cisco-asa.

Here are the stanzas for those three from my props.conf file:

[source::/LogCentral/Applications/ISE_PRD/...]
sourcetype = cisco:ise:syslog
DATETIME_CONFIG =
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false

[source::/LogCentral/Applications/Firewall_PRD/...]
sourcetype = cisco:asa

[source::/LogCentral/WindowsEvent/PRD/...]
sourcetype = windows_snare_syslog

rdagan_splunk
Splunk Employee

I tried 6.4.2 with a CSV file, and Hunk did the right thing without any entry in props.conf: it automatically translated the CSV to a JSON visualization.


jwalzerpitt
Influencer

Thx for the update.

I tried extracting the fields and saved the extractions, restarted Splunk, and the fields still didn't extract.

Waiting to work with Splunk support...


ddrillic
Ultra Champion

The following speaks about it: Hunk - Conditional Record Format

What is the path for your configurations?
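
You can see which copies Splunk actually reads with btool (a sketch, assuming a standard $SPLUNK_HOME; adjust the path as needed):

$SPLUNK_HOME/bin/splunk btool props list shibboleth --debug
$SPLUNK_HOME/bin/splunk btool transforms list manual-shib --debug

The --debug flag prints the file each setting comes from, which makes it easy to spot a stanza sitting in an app or local directory that isn't being picked up.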


jwalzerpitt
Influencer

What configurations? props and transforms?

Thx


ddrillic
Ultra Champion

Right ; - )


jwalzerpitt
Influencer

I even modified my props.conf file as follows, and still no automated field extraction:

[source::/LogCentral/Applications/Shibboleth_PRD/*]
sourcetype = shibboleth
REPORT-manual-shib = manual-shib
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Custom
description = Comma-separated value format. Set header and other settings in "Delimited Settings"
pulldown_type = true
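
As a temporary check that the delimiter logic itself is sound independent of props/transforms, the same split can be done inline at search time, e.g. (a sketch covering only the first three fields):

index=shibboleth | rex field=_raw "^(?<username>[^,]+),(?<mfa_service_code>[^,]+),(?<date_requested>[^,]+)" | table username, mfa_service_code, date_requested

If that extracts correctly but the REPORT-based extraction doesn't, the CSV parsing isn't the problem; the conf files simply aren't being applied to the search.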