Splunk Enterprise

Splunk stream

Vadim_Peskov
Observer

Hi!
We use Splunk Stream 7.3.0. When an event in a log is longer than 1,000,000 characters, Splunk truncates it. The event is in JSON format. What settings should be applied in Splunk Stream so that Splunk parses the data correctly?
Thanks!


_JP
Contributor

IIRC Splunk Stream doesn't have its own truncation settings, so this ends up being governed by the truncation settings for your sourcetype in props.conf.  Can you share the stanza for your sourcetype?  Is TRUNCATE=1000000?  You might need to change it to TRUNCATE=0 to force Splunk to include the entire event.
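For reference, a minimal props.conf sketch along the lines suggested above. Note that TRUNCATE is applied at parse time, so the setting needs to live on the parsing tier (indexers or heavy forwarders), and splunkd needs a restart for the change to take effect:

```ini
# props.conf -- on the indexers or heavy forwarders, not the search head
[stream:ip]
# 0 disables truncation entirely; alternatively, set a limit larger
# than your biggest expected event, e.g. TRUNCATE = 5000000
TRUNCATE = 0
```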


Vadim_Peskov
Observer
[stream:ip]
TRUNCATE = 0


That did not help.
Any other suggestions?


_JP
Contributor

Did you see any change in the ingested data after you made the TRUNCATE change?  Also, try changing it to a specific test value, like 10237 — does that limit events to 10237 bytes?  This is mainly to confirm whether this particular TRUNCATE setting is what is limiting your data; if we can rule it out as the culprit, we know to dig further.
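One way to run the test above is to measure the longest ingested event and see whether it sits exactly at the configured limit. A hedged SPL sketch (adjust the search to your own index and time range):

```
sourcetype=stream:ip
| eval raw_len=len(_raw)
| stats max(raw_len) AS longest_event
```

If `longest_event` equals your TRUNCATE value (e.g. 10237 in the test, or 1000000 originally), that setting is almost certainly the one doing the cutting.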


Vadim_Peskov
Observer

Right now props.conf has:
[stream:ip]
TRUNCATE = 100000
I will change it to 0, test, and come back with the result.
