Getting Data In

How do I ingest a large log as one single event?


Been working on this for a week... hence my question now. I have a log that can be anywhere between 3,000 and 20,000 lines. It's the output of a script that takes around 1 minute to complete. It writes to the log as the script progresses, and I want that entire log (start to finish) ingested into Splunk as a single event.

You can see that it's ingesting the log as multiple events (1st event at the bottom, last event at the top). I've tried many combinations of props.conf (see below for my current config) as well as "interval" in my inputs.conf (since removed).

Known log formatting:
Always starts with "Job ammolock submitted by teamA123 starting"
Always ends with "+ exit" as the last line of the log file.
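Given those two markers, one sketch of a props.conf stanza (untested, stanza name taken from the sourcetype below; note Splunk's defaults of MAX_EVENTS = 256 lines and TRUNCATE = 10000 bytes would otherwise truncate a 20,000-line event):

```ini
# props.conf (sketch) -- on the indexer or heavy forwarder
[blah:logs]
SHOULD_LINEMERGE = true
# Only start a new event at the known first line of the log
BREAK_ONLY_BEFORE = ^Job\s+\S+\s+submitted\s+by\s+\S+\s+starting
# Raise the defaults (256 lines / 10000 bytes) so the event is not truncated
MAX_EVENTS = 40000
TRUNCATE = 0
```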

6:03:33.000 AM      + ls -l j3483458_*
            + exit
            (381 lines)

6:03:31.000 AM      + cd /aa/6/prod/something/custom/spool
            (27 lines)

6:03:29.000 AM      + export RETURN_CODE=0
            (357 lines)

6:03:23.000 AM      **************************************************
            TOTAL RECORDS READ FROM TABLE = 000120882
            (60 lines)

6:02:45.000 AM      Job ammolock submitted by teamA123 starting 2019/12/19 06:002:45 AM
            (2269 lines)

index = blah
sourcetype = blah:logs
disabled = 0


What am I missing, folks?




Check out these other answers. Most solutions involve a BREAK_ONLY_BEFORE or LINE_BREAKER rule that never matches, versus the default carriage-return/newline breaker ([\r\n]+), and they assume the data is written all at once rather than with pauses between records. I believe a Splunk monitor input will, by default, only wait 3 seconds after reaching the end of a file (the time_before_close parameter) before closing it and breaking up the event.


Look at the time_before_close parameter (inputs.conf) and the multiline_event_extra_waittime parameter (limits.conf).
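A sketch of where those two settings live, assuming the script takes up to a minute with pauses between writes (the monitor path and the value 75 are illustrative, not from the question):

```ini
# inputs.conf (sketch) -- keep the file open longer after hitting EOF
[monitor:///path/to/joblog.log]
index = blah
sourcetype = blah:logs
time_before_close = 75

# limits.conf (sketch) -- extra wait before breaking multiline events
[inputproc]
multiline_event_extra_waittime = true
```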



Try scripted inputs, or add a oneshot input triggered by a script of your own once the log is complete.
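One way to implement the "script of your own" idea: wait until the log is complete (it ends with the known "+ exit" terminator), then hand the finished file to Splunk in one pass, e.g. with `splunk add oneshot`. A minimal Python sketch; the log path, poll interval, and marker constants are assumptions taken from the question, and the oneshot still relies on a props.conf rule to keep the file as one event:

```python
import subprocess
import time
from pathlib import Path

# Known markers from the question
START_MARKER = "Job ammolock submitted by teamA123 starting"
END_MARKER = "+ exit"

def is_complete(log_text: str) -> bool:
    """A log is complete when it starts and ends with the known markers."""
    lines = [ln for ln in log_text.splitlines() if ln.strip()]
    return (
        bool(lines)
        and lines[0].startswith(START_MARKER)
        and lines[-1].strip() == END_MARKER
    )

def ship_when_done(log_path: Path, poll_seconds: float = 5.0) -> None:
    """Poll until the script has written its final line, then index the file."""
    while not is_complete(log_path.read_text()):
        time.sleep(poll_seconds)
    # 'splunk add oneshot' indexes the whole file once; combined with a
    # BREAK_ONLY_BEFORE rule in props.conf it can arrive as a single event.
    subprocess.run(
        ["splunk", "add", "oneshot", str(log_path),
         "-index", "blah", "-sourcetype", "blah:logs"],
        check=True,
    )
```

This sidesteps the timing problem entirely: Splunk never sees a half-written file, so no close/wait tuning is needed.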
