Getting Data In

Why are we seeing duplicate data coming from a Splunk Enterprise forwarder to an external syslog server?

jppham
New Member

I am seeing duplicate events coming from Splunk to our external logger.
The external syslog server is 10.1.1.25.

It appears there are duplicate server specs in the two stanzas below. Could this be the cause of the duplicate data
we are seeing?
[syslog:Everything]
[syslog:Logger]

outputs.conf:

[syslog]
defaultGroup =

[syslog:Everything]
disabled = false
timestampformat = %b %e %H:%M:%S
server = 10.1.1.25:514

[syslog:Logger]
disabled = false
timestampformat = %b %e %H:%M:%S
server = 10.1.1.25:514

[tcpout]
maxQueueSize = 500KB
forwardedindex.0.whitelist = .*
forwardedindex.1.blacklist = _.*
forwardedindex.2.whitelist = _audit
forwardedindex.filter.disable = false
indexAndForward = false
autoLBFrequency = 30
blockOnCloning = true
compressed = false
disabled = false
dropClonedEventsOnQueueFull = 5
dropEventsOnQueueFull = -1
heartbeatFrequency = 30
maxFailuresPerInterval = 2
secsInFailureInterval = 1
maxConnectionsPerIndexer = 2
forceTimebasedAutoLB = false
sendCookedData = true
connectionTimeout = 20
readTimeout = 300
writeTimeout = 300
useACK = false

defaultGroup=nowhere


Richfez
SplunkTrust

Have you tried commenting one of those two stanzas out, restarting Splunk and seeing what happens?
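
For example, you could comment out one of the two groups pointing at 10.1.1.25 and leave the other active. A rough sketch of that test in outputs.conf (using only the stanza names from your post, not verified on a live system):

# [syslog:Everything]
# disabled = false
# timestampformat = %b %e %H:%M:%S
# server = 10.1.1.25:514

[syslog:Logger]
disabled = false
timestampformat = %b %e %H:%M:%S
server = 10.1.1.25:514

If the duplicates stop after the restart, the second group sending the same events to the same server was the cause.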


jppham
New Member

My first thought was to comment out the first stanza, but I wasn't sure whether that could be the cause.
I will probably try that.

Thanks for the suggestion.
