multi-line event props.conf confusion

iom100uk
Explorer

I'm trying to get the results of a script that outputs a largeish table into Splunk, but something isn't right with the way the results are being split into events.

I want the complete table (about 100 lines) to be contained in one event so I can do magic with the multikv command. At the moment, each run is split across several events: some are 60+ lines, some are a single line, and some are in between.
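
For context, the sort of search I'm hoping to run once each run lands as a single event is something like this (the column names here are just placeholders, not the script's real headers):

index=main sourcetype=stem-snmptable
| multikv
| table ifIndex ifDescr ifOperStatus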

The script runs on a search head, which forwards all of its output to the indexer. The script starts its output with the literal characters BOF and ends it with EOF; this works fine when the script is run directly.
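
To illustrate, a single run's output looks roughly like this (column names and values are invented for the example; the real table is about 100 lines):

BOF
ifIndex  ifDescr  ifOperStatus
1        eth0     up
2        eth1     down
...
EOF

Config files below: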

inputs.conf:

[script://$SPLUNK_HOME/etc/apps/stem-snmp/bin/stem-snmptable.sh]
disabled=false
index=main
interval=60
sourcetype=stem-snmptable

props.conf:

[stem-snmptable]
DATETIME_CONFIG = CURRENT
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = "(EOF)"
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = true
category = Custom
pulldown_type = 1
disabled = false
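
In case it's useful, this is how I'm seeing the uneven splits (linecount is the default field Splunk records for each event):

index=main sourcetype=stem-snmptable
| stats count by linecount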

On the indexer I have the following in a custom app's local folder (is this right?):

[stem-snmptable]
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = true
category = Custom
pulldown_type = 1
disabled = false
MUST_BREAK_AFTER = "(EOF)"
MUST_NOT_BREAK_AFTER = "(BOF)"
DATETIME_CONFIG = CURRENT
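
For reference, I assume the effective settings on the indexer can be confirmed with btool, something like:

$SPLUNK_HOME/bin/splunk btool props list stem-snmptable --debug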

So, where have I gone wrong? Do I need to put the indexer props.conf in a different location? Have I misunderstood the break and linemerge settings?

Any help much appreciated.
