Getting Data In

Opsec_lea not breaking events reliably

EricPartington
Communicator

I have the lea-loggrabber.sh script working well and reliably pulling all new logs from the Check Point CMA into Splunk. I am starting to notice that about 10 messages per 24 hours are not breaking correctly; they end up around 257 lines long before the event breaks.

How can I force the events to be broken reliably when they are imported by lea-loggrabber.sh?

All events start with loc= and should end with \r\n.

I have the sourcetype set in inputs.conf where the script is called:

[script:///opt/splunkbeta/etc/apps/lea-loggrabber-splunk/bin/lea-loggrabber.sh]
disabled = 0
sourcetype = checkpoint_firewall

I am attempting to set the line breaking in props.conf:

[checkpoint_firewall]
TIME_PREFIX= time=
TIME_FORMAT= %d%b%Y %H:%M:%S
BREAK_ONLY_BEFORE=loc
#LINE_BREAKER = ([\r\n]+)(?=loc\=)
#LINE_BREAKER = ([\r\n])(?=loc\=)
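An alternative to BREAK_ONLY_BEFORE is to disable line merging entirely and let LINE_BREAKER delimit events on its own. A sketch (untested against your data, reusing the settings from the stanza above) would look like:

[checkpoint_firewall]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=loc=)
TIME_PREFIX = time=
TIME_FORMAT = %d%b%Y %H:%M:%S

With SHOULD_LINEMERGE = false, each LINE_BREAKER match starts a new event, so line-merging limits such as MAX_EVENTS no longer come into play.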

You can see my attempts at forcing a new event in the comments above.

Any suggestions on how to force a linebreak for these events?

0 Karma
1 Solution

gkanapathy
Splunk Employee

It's probably just because of the default MAX_EVENTS setting of 256. Just add:

MAX_EVENTS=999999

to your props.conf rules for the sourcetype.
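Combined with the stanza from the question, that would give something like this (a sketch assembled from the settings already posted):

[checkpoint_firewall]
TIME_PREFIX = time=
TIME_FORMAT = %d%b%Y %H:%M:%S
BREAK_ONLY_BEFORE = loc
MAX_EVENTS = 999999

MAX_EVENTS caps how many lines the line merger will join into a single event; the default of 256 matches the ~257-line truncation described above.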



EricPartington
Communicator

If that gives me events that could have 999999 lines in them, then I would want the opposite (MAX_EVENTS=1). I think I solved it with ALWAYS_BREAK_BEFORE=loc (I had to restart the Splunk daemon).
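For reference, the line-merging attribute documented in props.conf.spec is BREAK_ONLY_BEFORE (a regex), as in the original stanza; assuming that is the setting that actually took effect, the working stanza would be along these lines (sketch, untested):

[checkpoint_firewall]
TIME_PREFIX = time=
TIME_FORMAT = %d%b%Y %H:%M:%S
BREAK_ONLY_BEFORE = ^loc=

Line-breaking changes in props.conf only take effect after a restart of the Splunk daemon, which matches the restart noted above.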
