
multi-line event props.conf confusion

iom100uk
Explorer

I'm trying to get the results of a script which outputs a largish table into Splunk, but something isn't right in the way the results are being split into separate events.

I want the complete table (about 100 lines) to be contained in one event so I can do magic with the multikv command. At the moment, each run is split across several events - some are 60+ lines, some a single line, and some somewhere in between.
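For reference, the kind of search I'm aiming for once each run lands in a single event is something like this (field names invented for illustration - the real columns come from the SNMP table):

index=main sourcetype=stem-snmptable | multikv | table ifIndex ifDescr ifOperStatus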


The actual script is being run on a search head, which has all of its outputs forwarded to the indexer. The script starts its output with the literal characters BOF and ends it with EOF - this works fine when the script is run directly. Config files below:
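To illustrate, each run's output looks roughly like this (again, column names are made up for the example):

BOF
ifIndex  ifDescr  ifOperStatus
1        eth0     up
2        eth1     down
...
EOF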

inputs.conf:

[script://$SPLUNK_HOME/etc/apps/stem-snmp/bin/stem-snmptable.sh]
disabled=false
index=main
interval=60
sourcetype=stem-snmptable

props.conf:

[stem-snmptable]
DATETIME_CONFIG = CURRENT
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = "(EOF)"
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = true
category = Custom
pulldown_type = 1
disabled = false


On the indexer I have the following props.conf in a custom app's local folder (is this right?)
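By that I mean a path along these lines, with <app> standing in for the actual app name:

$SPLUNK_HOME/etc/apps/<app>/local/props.conf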


[stem-snmptable]
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = true
category = Custom
pulldown_type = 1
disabled = false
MUST_BREAK_AFTER = "(EOF)"
MUST_NOT_BREAK_AFTER = "(BOF)"
DATETIME_CONFIG = CURRENT


So, where have I gone wrong? Do I need to put the indexer props.conf in a different location? Have I misunderstood the break and linemerge settings?


Any help much appreciated.

