Getting Data In

Why are we seeing duplicate data coming from a Splunk Enterprise forwarder to an external syslog server?

jppham
New Member

I am seeing duplicate events coming from Splunk to our external syslog server (10.1.1.25).

It appears the same server is specified in two different stanzas below. Could this be the cause of the duplicate data we are seeing?
[syslog:Everything]
[syslog:Logger]

outputs.conf:

[syslog]
defaultGroup =

[syslog:Everything]
disabled = false
timestampformat = %b %e %H:%M:%S
server = 10.1.1.25:514

[syslog:Logger]
disabled = false
timestampformat = %b %e %H:%M:%S
server = 10.1.1.25:514

[tcpout]
maxQueueSize = 500KB
forwardedindex.0.whitelist = .*
forwardedindex.1.blacklist = _.*
forwardedindex.2.whitelist = _audit
forwardedindex.filter.disable = false
indexAndForward = false
autoLBFrequency = 30
blockOnCloning = true
compressed = false
disabled = false
dropClonedEventsOnQueueFull = 5
dropEventsOnQueueFull = -1
heartbeatFrequency = 30
maxFailuresPerInterval = 2
secsInFailureInterval = 1
maxConnectionsPerIndexer = 2
forceTimebasedAutoLB = false
sendCookedData = true
connectionTimeout = 20
readTimeout = 300
writeTimeout = 300
useACK = false

defaultGroup=nowhere
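
As I understand it, Splunk clones each event to every enabled syslog output group, so two stanzas pointing at the same server:port will each receive their own copy of every event. A minimal sketch of a consolidated config, assuming only the single destination at 10.1.1.25:514 is needed (keeping the Everything group name from the original and dropping the redundant Logger stanza):

[syslog:Everything]
disabled = false
timestampformat = %b %e %H:%M:%S
server = 10.1.1.25:514

# [syslog:Logger] removed: it targeted the same 10.1.1.25:514,
# so every event was being sent twice

If Logger was meant to reach a different destination, repoint its server setting instead of removing the stanza.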


Richfez
SplunkTrust

Have you tried commenting out one of those two stanzas, restarting Splunk, and seeing what happens?


jppham
New Member

My first thought was to comment out the first stanza, but I wasn't sure whether that was actually the cause. I will probably try that.

Thanks for the suggestion.
