I have had a few issues ingesting data into the correct index. We are deploying an app from the deployment server, and this particular app has two clients. Initially, when I set this app up, I was ingesting data into our o365 index. The data comes from a team running a script that tracks all deleted files, and we were getting one line per event. At the time, my inputs.conf looked like:
[monitor://F:\scripts\DataDeletion\SplunkReports]
index = o365
disabled = false
source = DataDeletion
It ingested every CSV file under that DataDeletion directory, and it worked.
I then changed the index to testing so I could manage the new data a bit better while we were still testing it. One inputs.conf backup shows that I had this at some point:
[monitor://F:\scripts\DataDeletion\SplunkReports\*.csv]
index = testing
disabled = false
sourcetype = DataDeletion
crcSalt = <string>
Now, months later, I have changed the inputs.conf to ingest everything into the o365 index again and pushed that change out to the server class using the deployment server, yet the most recent data looks different: the last events we ingested still went into the testing index. This may be due to how the script is sending data into Splunk, but it looks like it is aggregating hundreds of separate lines into one event. My inputs.conf currently looks like this:
[monitor://F:\scripts\DataDeletion\SplunkReports\*]
I am just trying to grab everything under D:\DataDeletion\SplunkReports\ on the new Windows servers, ingest all of the CSV files under there, and break each line of the CSV into its own event. What is the proper syntax for this input, and what am I doing wrong? I have tried a few things and none of them seem to work: I've tried adding a whitelist, adding a blacklist, and I have recursive and crcSalt in there just to grab anything and everything. The shape I think I'm after is sketched below. And if the script isn't at fault for sending chunks of data in one event, would adding a props.conf fix how Splunk is ingesting this data? Thanks for any help.
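For clarity, this is roughly what I imagine the stanza should be (drive letter and names taken from above; the whitelist regex is my best guess):

[monitor://D:\DataDeletion\SplunkReports]
# my guess at the settings; index and sourcetype carried over from the earlier configs
index = o365
sourcetype = DataDeletion
disabled = false
recursive = true
whitelist = \.csv$
crcSalt = <SOURCE>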
Use INDEXED_EXTRACTIONS = CSV in props.conf for your sourcetype and push it to the Universal Forwarder too, along with inputs.conf. Structured-data parsing with INDEXED_EXTRACTIONS happens on the forwarder itself, so the props.conf has to be deployed there, not just to the indexers.
props.conf
[DataDeletion]
INDEXED_EXTRACTIONS = CSV
FIELD_DELIMITER = ,
# Replace with the actual field names from your CSV
FIELD_NAMES = field1, field2, field3, field4
# Adjust to match your timestamp format
TIME_FORMAT = %Y-%m-%d %H:%M:%S
# Replace with the field that actually contains the timestamp
TIMESTAMP_FIELDS = timestamp_field
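And a matching inputs.conf sketch for the forwarder (path and index assumed from your description; adjust to your environment). Note that the sourcetype must match the props.conf stanza name for the extractions to apply:

[monitor://D:\DataDeletion\SplunkReports]
# assumed path and index; whitelist is matched against the full path, keeping only CSV files
index = o365
sourcetype = DataDeletion
disabled = false
whitelist = \.csv$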
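If you would rather fix the event breaking on the indexers instead of using indexed extractions, a minimal props.conf sketch for that (same sourcetype assumed) would be:

[DataDeletion]
# one event per CSV row: disable line merging and break events on newlines
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)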