Getting Data In

New CSV files are being seen, but not indexed

Builder

I am working on adding large CSV files into splunk. Here is an example csv file:

TimeStamp,Transport Overload,Core Overload,LS Overload,Connections Rejected,Connection Errors,Pending Msg Dropped,Supressed Retransmit,Response Retransmit,Req Dropped,Invalid Requests,Invalid Responses,Invalid Messages,ACL Demotions,Transaction Timeouts,Duplicate Transactions,Transport Errors,Transport App Errors,Trans Redundancy Errs,Challenge Not Found,Challenge Dropped,DNS Errors,No Target/Route,XML Errors,Core Trans Errors,Core App Errors,Exceeded License Cap,Missing Dialog,Expired Sessions,Terminated Sessions,Multiple OK Drops,B2BUA Trans Errors,B2BUA App Errors,B2BUA Redundancy Errs,Reg Rejects,Reg Cache Timeouts,Reg w/o Contacts,Forced UnRegister,Out of Map Ports,NMC Rejected,NMC Diverted,No Routes Found,Next Hop OOS,Anonymous Source,Invalid Trunk Group,Inb SA Constraints,Outb SA Constraints,Inb REG SA Constraint,Outb REG SA Constraint,LS Trans Errors,LS App Errors,LS Redundancy Errors,Media Overload,SDP Offer Errors,SDP Answer Errors,Drop Media Errors,Invalid SDP,Media Failure Drops,Media Exp Events,Early Media Exps,Exp Media Drops,Media Trans Errors,Media App Errors
1270047982,0,0,0,0,5,0,0,0,0,0,0,0,39,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,12,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
1270048282,0,0,0,0,5,0,0,0,0,0,0,0,52,0,0,0,0,0,0,15,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,12,0,0,45,0,0,0,0,0,0,0,0,0,0,0,0,0
1270048582,0,0,0,0,5,0,0,0,0,0,0,0,52,0,0,0,0,0,0,15,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,12,0,0,45,0,0,0,0,0,0,0,0,0,0,0,0,0

When the files are placed into the monitored directory, only one file out of about 580 is indexed. No subsequent files are indexed, even though the file count under Manager -> Data inputs keeps going up. I modified props.conf so that CHECK_METHOD is set to entire_md5, which hashes the entire file's contents rather than just the first/last 256 bytes. However, this has not changed Splunk's behavior.
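For background on why CHECK_METHOD matters here: by default, Splunk decides whether a file has already been seen by checksumming only its leading bytes. Since every one of these CSVs begins with the same long header row, those leading bytes are identical across files. A rough illustration of the idea in Python (the MD5 hashing here is a simplification for demonstration, not Splunk's actual CRC logic, and the file contents are made up):

```python
import hashlib

# Hypothetical illustration: two CSV files that share the same long
# header row but contain different data rows.
header = "TimeStamp," + ",".join(f"Counter{i}" for i in range(60)) + "\n"
file_a = header + "1270047982,0,0,5\n"
file_b = header + "1270048282,0,1,9\n"

def head_hash(content, n=256):
    """Hash only the first n bytes, loosely mimicking a head-of-file check."""
    return hashlib.md5(content.encode()[:n]).hexdigest()

# The header alone is longer than 256 bytes, so the leading-bytes
# hashes collide even though the files differ:
print(head_hash(file_a) == head_hash(file_b))  # True

# Whole-file hashes (the entire_md5 idea) do tell the files apart:
print(hashlib.md5(file_a.encode()).hexdigest() ==
      hashlib.md5(file_b.encode()).hexdigest())  # False
```

This is the failure mode that entire_md5 is meant to address, which makes it puzzling that the behavior did not change after the props.conf edit.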

We dropped some files containing gibberish data into the directory and Splunk did pick those up and index them. Does anyone have any ideas?


Splunk Employee

Do you see all your files if you check this:

https://localhost:8089/services/admin/inputstatus/TailingProcessor%3AFileStatus

Note: you may need to modify the port if you are not using the default management port.
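The endpoint above can also be queried from the command line. A minimal sketch, assuming the default management port of 8089 and placeholder credentials (replace them with your own):

```shell
# Query the TailingProcessor file status via Splunk's REST API.
# -k skips certificate verification (Splunk ships with a self-signed
# cert by default); admin:changeme is a placeholder, not a real login.
curl -k -u admin:changeme \
  "https://localhost:8089/services/admin/inputstatus/TailingProcessor%3AFileStatus"
```

An unauthenticated request to this endpoint will be rejected rather than returning the file list, so check the credentials first.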


Splunk Employee

Did you successfully authenticate?


Builder

When I try that URL, I get "call not properly authenticated".


Splunk Employee

What do you see if you search for...

index=_internal "TailingProcessor" "monitor:///var/log/some.log"

...where "monitor:///var/log/some.log" is replaced with the stanza name from your inputs.conf.
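For reference, a minimal sketch of what such a stanza might look like in inputs.conf (the path and sourcetype here are hypothetical placeholders, not taken from the poster's actual configuration):

```
# Hypothetical inputs.conf monitor stanza; the bracketed name is
# exactly the string to paste into the _internal search above.
[monitor:///var/log/some.log]
disabled = false
sourcetype = csv
```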


Builder

The result I get is:

10-11-2010 08:03:43.286 INFO TailingProcessor - Parsing configuration stanza: monitor:///var/splunk/p-cscf/batch/sip-errors.

