Getting Data In

Best way to define fields for a CSV with no headers

Contributor

I have a source which is CSV but has no headers. I'm trying to set up props.conf and transforms.conf to supply these rather than have to edit every CSV file to add them, but the darn thing won't work. Here is props.conf:

[winsec_csv]
TRANSFORMS-winsec = extract_winsec

transforms.conf:

[extract_winsec]
DELIMS = ","
REGEX = *
FIELDS = "logtype", "time", "logtype1", "eventcode", "status", "actiontype", "userid", "dc", "message"

Sample events:

SEC,06/05/2010 10:31:24,Security,540,Success,Logon/Logoff ,UNISUPER\MEPUTIL04$,MEPDOM02,Successful Network Logon:^User Name: MEPUTIL04$^ Domain: UNISUPER^Logon ID: (0x0 0x27048EFB)^ Logon Type: 3^Logon Process: Kerberos^ Authentication Package: Kerberos^Workstation Name: ^ Logon GUID: {4fff3638-0ad5-b2d7-7358-2ae3dd4509b3}^Caller User Name: -^ Caller Domain: -^Caller Logon ID: -^ Caller Process ID: -^Transited Services: -^ Source Network Address: 10.3.37.101^Source Port: 0^

SEC,06/05/2010 10:31:26,Security,540,Success,Logon/Logoff ,UNISUPER\WS-02241$,MEPDOM02,Successful Network Logon:^User Name: WS-02241$^ Domain: UNISUPER^Logon ID: (0x0 0x2704BA25)^ Logon Type: 3^Logon Process: Kerberos^ Authentication Package: Kerberos^Workstation Name: ^ Logon GUID: {268894b4-e2a0-3b99-b2a8-c446da5c7eca}^Caller User Name: -^ Caller Domain: -^Caller Logon ID: -^ Caller Process ID: -^Transited Services: -^ Source Network Address: 10.3.0.160^Source Port: 0^

I added the REGEX out of frustration because of the error 'REGEX must be specified'--I don't see why, since it shouldn't be needed with DELIMS. Now no data is being indexed at all, but I no longer get the error message...

Yes, this is the Windows security event log, but for complicated reasons I can't use it directly--I have to do it this way.

How do I do this? A generic solution would be useful for other scenarios anyway. The documentation makes this look straightforward--so why won't it work?


Splunk Employee

All you have to do is specify a sourcetype name that you will assign to the files and the fields:

[mysourcetypename]
FIELDS = field1, "second field", field3, field4, "field number 5", field6

The headers in the file actually get in the way.

cmeo, are you willing to index all the fields? Otherwise, you should put REPORT-winsec = extract_winsec in the [winsec_csv] stanza of props.conf instead of TRANSFORMS-winsec. TRANSFORMS- is for index-time transforms, which is why Splunk demands a REGEX; REPORT- applies the extraction at search time, where DELIMS works.
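For reference, here is a minimal sketch of the search-time setup being suggested. The stanza and field names come from the question above; treat the exact layout as illustrative:

```ini
# props.conf -- bind the extraction to the sourcetype at search time
[winsec_csv]
REPORT-winsec = extract_winsec

# transforms.conf -- delimiter-based extraction, no REGEX needed here
[extract_winsec]
DELIMS = ","
FIELDS = logtype, time, logtype1, eventcode, status, actiontype, userid, dc, message
```

With REPORT- the DELIMS/FIELDS pair is evaluated at search time, so the 'REGEX must be specified' complaint from the index-time TRANSFORMS- path should go away.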


Splunk Employee

Sigh. If you've got forwarded files it doesn't matter, but unfortunately to kill the auto-csv generation, you have to make a stanza for your source or sourcetype, set CHECK_FOR_HEADER = false and set priority = 101 (or some higher number) to cancel out a built-in rule that applies to any file name ending in .csv. You could also rename the file I guess. If it's already been generated, just delete the contents of the etc/apps/learned/local folder. (Don't disable the learned app. Only tears will come of that.)
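A sketch of the stanza described above (the source path and sourcetype name are made up for illustration--substitute your own):

```ini
# props.conf -- override the built-in rule that auto-learns headers
# for any file name ending in .csv
[source::/path/to/your/files/*.csv]
CHECK_FOR_HEADER = false
sourcetype = winsec_csv
priority = 101
```

The priority = 101 outranks the built-in .csv rule, and CHECK_FOR_HEADER = false stops Splunk from generating a learned sourcetype for the file.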


Contributor

Cracked it using another example in Answers. What was happening was that my definitions were conflicting with Splunk's own attempt in apps/learned.

So I suppose the only remaining question is: how do I prevent Splunk from attempting a learned configuration?


Contributor

This looks like a props.conf entry. I tried that--Splunk ignored it. Still no field names in the picker except two (CN, DC), which Splunk has somehow defined for itself.

Where does this entry go? Where did DC and CN come from, and how do I stop Splunk doing that?


Contributor

Sorry, the cut and paste mangled things. Every line in the sample events starts with SEC; the rest should be pretty obvious.
