Splunk Enterprise

Splunk Stream

Vadim_Peskov
Observer

Hi!
We use Splunk Stream 7.3.0. When an event in the log is longer than 1,000,000 characters, Splunk truncates it. The event is in JSON format. Which settings should be applied in Splunk Stream so that Splunk parses the data correctly?
Thanks!


_JP
Contributor

IIRC Splunk Stream doesn't have its own truncation settings, and this ends up being caught by the truncation settings for your sourcetype in props.conf.  Can you share what the stanza for your sourcetype looks like?  Is TRUNCATE=1000000?  You might need to change it to TRUNCATE=0 to force Splunk to include the entire event.
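A minimal props.conf sketch of what I mean (the stanza name assumes your events land under the stream:ip sourcetype, as in Stream's defaults; per the Splunk docs, TRUNCATE = 0 disables truncation entirely):

```
# props.conf -- place on the parsing tier (indexers or heavy forwarders),
# not on the search head; a splunkd restart is needed for it to take effect.
[stream:ip]
# 0 = never truncate events for this sourcetype (default is 10000 bytes)
TRUNCATE = 0
```

Note that TRUNCATE is applied where the data is parsed, so if the file only exists on the search head the setting will appear to do nothing.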


Vadim_Peskov
Observer
[stream:ip]
TRUNCATE = 0


That did not help.
Any other suggestions?


_JP
Contributor

Did you see any change in the data being ingested when you made the TRUNCATE change?  Also, if you set it to something specific as a test, like 10237, do events then get limited to 10237 bytes?  This is mainly to see whether this particular TRUNCATE setting is what is limiting your data; if not, we can rule it out as the culprit and know to dig further.
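One way to check this empirically is to measure the longest raw event Splunk actually indexed for the sourcetype; a quick SPL sketch (the index name here is a placeholder, adjust to wherever your Stream data lands):

```
index=main sourcetype=stream:ip
| eval event_len=len(_raw)
| stats max(event_len) AS max_len
```

If max_len sits exactly at your TRUNCATE value, that setting is the one doing the cutting; if it tops out somewhere else, the limit is being imposed elsewhere.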


Vadim_Peskov
Observer

Currently in props.conf:
[stream:ip]
TRUNCATE = 100000
I will change it to 0, test, and report back with the results.
