All Apps and Add-ons

When using the Splunk Stream app, why does Splunk suddenly stop indexing NetFlow data every 2 hours?

fedecastiglio
New Member

Hi community,

I've configured Splunk Stream to ingest NetFlow data (stream collector and Splunk indexer running on the same box), and it's working. But exactly every 2 hours there is a 10-minute gap in the data. Packet captures show normal traffic during that gap, so it looks like Splunk is not indexing that data.

Any idea what the reason could be?

Thanks!


fedecastiglio
New Member

Looking at streamfwd.log, this seems to be the cause (several errors with various template IDs):

stream.NetflowReceiver - NetFlowDecoder::decodeFlow Unable to decode flow set data. No template with id <#>

So I guess Splunk Stream is missing some template records?
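That error is consistent with how NetFlow v9 works: data flow sets (ID >= 256) can only be decoded with a template the exporter sent earlier, so if the collector misses or evicts a template, every record referencing it is undecodable until the template is re-sent. A minimal sketch of that dependency (all helper names here are hypothetical, not Splunk Stream internals):

```python
# Hypothetical sketch of NetFlow v9 template-dependent decoding.
# Data flow sets reference a template (a list of field-type/length pairs)
# sent earlier by the exporter. A record arriving before its template
# cannot be decoded -- matching the "No template with id <#>" error.

templates = {}  # template_id -> list of (field_type, field_length)

def register_template(template_id, fields):
    """Cache a template record so later data flow sets can be decoded."""
    templates[template_id] = fields

def decode_data_flowset(flowset_id, payload):
    """Decode one data record, or report the missing template."""
    if flowset_id not in templates:
        # The situation streamfwd.log is reporting.
        return None, f"No template with id {flowset_id}"
    record, offset = {}, 0
    for field_type, field_len in templates[flowset_id]:
        raw = payload[offset:offset + field_len]
        record[field_type] = int.from_bytes(raw, "big")
        offset += field_len
    return record, None

# A data flow set arriving before its template cannot be decoded:
record, err = decode_data_flowset(256, b"\x0a\x00\x00\x01")
# Once the exporter (re-)sends template 256 (field 8 = IPV4_SRC_ADDR, 4 bytes),
# decoding succeeds:
register_template(256, [(8, 4)])
record, err = decode_data_flowset(256, b"\x0a\x00\x00\x01")
```

If the exporter's template refresh interval lines up with your 2-hour cycle, the gap would last until the next template packet arrives and is successfully received.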


fedecastiglio
New Member

Some updates:
After tuning the kernel parameters mentioned in the Stream documentation, I noticed some improvement.

We still have gaps every 2 hours, but they now last only 2 minutes.

These are the parameters:

(Before - Default)
net.core.rmem_default = 212992
net.core.rmem_max = 212992
net.core.netdev_max_backlog = 1000

(After)
net.core.rmem_default = 33554432
net.core.rmem_max = 33554432
net.core.netdev_max_backlog = 10000

The documentation suggests those parameters for high-volume packet capture. I'm not entirely sure what's going on, as I don't think we have that much traffic.
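For anyone following along, a sketch of applying and persisting those values on Linux (the sysctl file path is the conventional one; adjust for your distro, and run as root):

```shell
# Apply the larger receive buffers and backlog at runtime (values from above):
sysctl -w net.core.rmem_default=33554432
sysctl -w net.core.rmem_max=33554432
sysctl -w net.core.netdev_max_backlog=10000

# Persist across reboots:
cat <<'EOF' > /etc/sysctl.d/99-splunk-stream.conf
net.core.rmem_default = 33554432
net.core.rmem_max = 33554432
net.core.netdev_max_backlog = 10000
EOF

# During a gap, check whether the kernel is still dropping UDP datagrams
# (look for "receive buffer errors" growing between runs):
netstat -su | grep -i error
```

If "receive buffer errors" keeps climbing during the gaps, the bursts (e.g. periodic template/option refreshes from the exporter) may still be overflowing the socket buffer even though average traffic is low.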
