Getting Data In

Raspberry Pi Universal Forwarder Bug Report for splunkforwarder-8.0.3-a6754d8441bf-Linux-arm.tgz:

BongoTheWhippet
Path Finder

On a Raspberry Pi 3 armv7l GNU/Linux, INDEXED_EXTRACTIONS=JSON in the props.conf file results in unrecoverable JSON StreamId processing errors:

05-06-2020 17:52:07.836 +0100 ERROR JsonLineBreaker - JSON StreamId:8017092045127549753 had parsing error:Unexpected character: '5' - data_source="/opt/splunkforwarder/var/log/splunk/metrics.log", data_host="rpi3", data_sourcetype="json"
05-06-2020 17:52:07.836 +0100 ERROR JsonLineBreaker - JSON StreamId:8017092045127549753 had parsing error:Unexpected character: '5' - data_source="/opt/splunkforwarder/var/log/splunk/metrics.log", data_host="rpi3", data_sourcetype="json"
05-06-2020 17:52:07.836 +0100 ERROR JsonLineBreaker - JSON StreamId:8017092045127549753 had parsing error:Unexpected character: '5' - data_source="/opt/splunkforwarder/var/log/splunk/metrics.log", data_host="rpi3", data_sourcetype="json"

The log expands so quickly that it fills /opt/splunkforwarder/var/log/splunk/splunkd.log to its maximum logrotate capacity.

Steps to duplicate bug:

  1. Install splunkforwarder-8.0.3-a6754d8441bf-Linux-arm.tgz onto a Raspberry Pi 3.
  2. Edit /opt/splunkforwarder/etc/system/local/props.conf and add the following stanza:

    [default]
    SHOULD_LINEMERGE = false
    KV_MODE = none
    INDEXED_EXTRACTIONS=JSON
    NO_BINARY_CHECK = true
    TRUNCATE = 0

  3. Add a local JSON file to the Splunk file monitor with $SPLUNK_HOME/bin/splunk add monitor /var/log/myvalidjsonfile.json -sourcetype json -host myhost -index myindex

  4. Restart splunk.

  5. Tail the log: tail -f $SPLUNK_HOME/var/log/splunk/splunkd.log

  6. Watch it scroll off the screen! The errors above are reported for both metrics.log and splunkd.log itself(!)

  7. Stop splunk.

  8. Edit props.conf again and remove the line INDEXED_EXTRACTIONS=JSON.

  9. Restart splunk.

  10. Your splunkd.log is back to normal again.
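
For step 3, any small newline-delimited JSON file works as a test input. A quick sketch (writing to /tmp rather than the report's /var/log path, which would need root):

    # Create a minimal newline-delimited JSON test file.
    # /tmp/myvalidjsonfile.json is a stand-in for the report's
    # /var/log/myvalidjsonfile.json.
    cat > /tmp/myvalidjsonfile.json <<'EOF'
    {"event": "test1", "value": 1}
    {"event": "test2", "value": 2}
    EOF

    # Confirm the file has one JSON object per line.
    wc -l < /tmp/myvalidjsonfile.json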

1 Solution

darrenfuller
Contributor

So, I think what is happening is that you are adding INDEXED_EXTRACTIONS=JSON to [default], which applies to every log the system is forwarding, including the Splunk logs themselves (everything in $SPLUNK_HOME/var/log/splunk), which are not JSON formatted.

You are better off using a specific sourcetype for your Pi logs and adding the indexed extractions there rather than in [default].

./D
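
Applied to the original report, the fix might look like the following props.conf sketch, scoping the extraction to a dedicated sourcetype ("pi_json" is a hypothetical name) instead of [default]:

    [pi_json]
    SHOULD_LINEMERGE = false
    KV_MODE = none
    INDEXED_EXTRACTIONS = JSON
    NO_BINARY_CHECK = true
    TRUNCATE = 0

The monitor would then reference that sourcetype, so only the intended file gets JSON extraction:

    $SPLUNK_HOME/bin/splunk add monitor /var/log/myvalidjsonfile.json -sourcetype pi_json -host myhost -index myindex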


BongoTheWhippet
Path Finder

You're right, but ewwww, that's expected behaviour?
