Good morning all,
I have been beating my head against this issue for a week or more. Let me give you the details.
We have one indexer and multiple Universal Forwarders in the field. On one of these forwarders I am running a scripted input to gather directory data for a file-monitoring solution.
###### Scripted Input to monitor jpeg files
[script://.\bin\dircontents.bat]
disabled = 0
## Run once per minute
interval = 60
sourcetype = Script:dir_files
index = filewatch
@echo off
D:
cd /seed
dir /b
The forwarder gathers this data from the script:
24Aug2017.txt
24Jan2018.txt
28Jul2016.txt
28Jul2016.txt~
29Jan2018.txt
INCHARGE-AM-PM-AL.seedfile
INCHARGE-AM-PM-AZ.seedfile
INCHARGE-AM-PM-GA-FL.seedfile
INCHARGE-AM-PM.seedfile
MitchDRSite.list
rcp.list
TSM-seed.list
This data comes in as one event with multiple lines. I want to break on the line feeds. That sounds simple enough.
[Script:dir_files]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
MAX_EVENTS = 10000
TRUNCATE = 0
After I deploy the configs to the UF, the data still comes in as a single event with multiple lines. Very frustrating!
I have tried many things and changed my regex around, but I just cannot find the solution.
Any help would be appreciated at this time.
Let me know what you think
I did have that thought this morning but wanted to get my question in. I will try that and see what happens.
I would rather split the events on the UF before indexing. That way I do not have to restart the production Splunk instance.
I'll try the props.conf on the indexer and will report the outcome.
Well, that worked as expected. The data broke on the line feeds at the indexer level.
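For anyone who hits the same problem later: the stanza that did the trick is the same one from my question, just placed on the indexer rather than the UF. A minimal sketch of what I deployed (the system/local path is just the simplest place to put it; an app directory works too, adjust for your deployment):

```ini
# $SPLUNK_HOME/etc/system/local/props.conf on the INDEXER
[Script:dir_files]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
MAX_EVENTS = 10000
TRUNCATE = 0
```

Note that the indexer does need a restart to pick up the props change, which is exactly what I was hoping to avoid on a production box.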
I would still like to know if the data can be split up at the UF before sending the data.
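One more note for the archives, based on my current understanding rather than anything I have tested: a Universal Forwarder does not run the parsing pipeline (aside from structured data handled by INDEXED_EXTRACTIONS), which would explain why LINE_BREAKER on the UF never took effect. If splitting at the forwarder is a hard requirement, I believe newer UF versions have a props.conf setting called force_local_processing that is supposed to make the forwarder parse the data locally — treat this as an untested pointer, not a confirmed fix:

```ini
# props.conf on the UF -- UNTESTED sketch; force_local_processing is
# from my reading of the props.conf spec, verify before relying on it
[Script:dir_files]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
force_local_processing = true
```

The safer option is probably a heavy forwarder, which runs the full parsing pipeline before sending data on to the indexer.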