LEA Loggrabber CDATA ExecProcessor error

krugger
Communicator

Hi,

I have the lea-loggrabber.sh script correctly pulling data via OPSEC LEA from multiple firewalls. However, my splunkd.log is being flooded with error messages like the one below:

05-02-2013 16:00:38.477 +0100 ERROR ExecProcessor - message from "/opt/splunk/etc/apps/Splunk_TA_opseclea_linux22/bin/lea-loggrabber.sh --configentity Firewall03" sh: ![CDATA[1366363524@Firewall03: No such file or directory

How can I correct this?

1 Solution

krugger
Communicator

This issue kept spamming my splunkd.log, so I had to discard stderr by changing the inputs.conf stanza to:

[script:///opt/splunk/etc/apps/Splunk_TA_opseclea_linux22/bin/lea-loggrabber.my --configentity Firewall03 2>/dev/null]
disabled = 0
interval = 600
passAuth = admin
sourcetype = opsec
index = checkpoint
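If you would rather keep the stanza name a plain script path, another option is to discard stderr inside a small wrapper script and point inputs.conf at that instead. This is an untested sketch; `run_quiet` and the echo demo are illustrative stand-ins, not part of the TA:

```shell
#!/bin/sh
# Sketch: run a command with stderr discarded, so splunkd's ExecProcessor
# never sees the stray "![CDATA[..." lines. In practice "$@" would be the
# real lea-loggrabber invocation; echo stands in here so the sketch runs.
run_quiet() {
    "$@" 2>/dev/null
}

# Demo: the noise on stderr is dropped, stdout (the events) survives.
run_quiet sh -c 'echo "event data"; echo "noise" >&2'   # prints only: event data
```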


krugger
Communicator

This happens because the shell treats the ![CDATA as a command to be executed. It can be replicated with:

sh /opt/splunk/etc/apps/Splunk_TA_opseclea_linux22/bin/lea-loggrabber.my --configentity Firewall01 > banana
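When passAuth is set, splunkd writes the session key to the script's stdin, and the error message above suggests it arrives wrapped in CDATA. If that line is never read and ends up interpreted by sh, the leading < parses as redirection syntax and the shell chokes. A minimal sketch of that failure mode (the token value is copied from the error above; this is an illustration, not the TA's actual code path):

```shell
#!/bin/sh
# Hypothetical reproduction: feed the CDATA-wrapped token line to sh as if
# it were a command. The leading '<' is parsed as redirection syntax, so sh
# reports an error much like the one ExecProcessor logged and exits non-zero.
printf '%s\n' '<![CDATA[1366363524@Firewall03]]>' | sh 2>&1 \
    || echo "sh rejected the token line"
```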

I had to comment out the part that reads the token:

read auth_key
SPLUNK_TOK=$auth_key
export SPLUNK_TOK


wmccracken
New Member

I too am getting these errors. Any idea on how to fix this?
