Getting Data In

How do you prevent Splunk from indexing duplicated events?

djime
New Member

How do you prevent Splunk from indexing duplicate events forwarded from different forwarders? The monitored log files record the same events, but on different servers. This duplication is required to keep the monitored events available even when one of the servers is powered off.

Thank you.

0 Karma
1 Solution

gjanders
SplunkTrust

Effectively no, universal forwarders are not aware of other universal forwarders. In fact, Splunk Enterprise instances are not aware of each other, so each heavy forwarder would also be standalone.

Therefore you would have to build a script, or find another way to monitor the file only on the instance that should currently be forwarding it... (or use another trick).

At the Splunk indexing tier it is also not possible to de-duplicate data on the way in, at least up to 7.2.x so far.
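As a rough sketch of that idea (the path, index, and sourcetype below are hypothetical, not from this thread), the script could toggle the disabled flag of the monitor stanza in inputs.conf on the standby server and reload the forwarder, so only the active host forwards the file:

    [monitor:///var/log/myapp/app.log]
    index = main
    sourcetype = myapp:log
    # A script or deployment server sets disabled = 1 on the standby host and
    # disabled = 0 on the active host, then reloads/restarts the forwarder.
    disabled = 0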


0 Karma


djime
New Member

Ok, thank you for the help

0 Karma

gjanders
SplunkTrust

Please click accept on the answer so this question is marked as answered when you are ready (feel free to wait for more answers)... thanks!

0 Karma

rashi83
Path Finder

@gjanders - Can we make a configuration change on the forwarder end to stop it from sending duplicate data?

0 Karma

gjanders
SplunkTrust

@rashi83 it would depend on what is causing it! The UF does not de-duplicate data, so if multiple monitored files have some level of duplicate content you may get duplicates in Splunk...

If you monitor unique files on the UF, you should not see duplicates in Splunk, apart from issues caused by performance problems or the useACK setting...
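For reference, useACK is set in outputs.conf on the forwarder; a minimal sketch (the group name and server names are hypothetical) looks like the following. With useACK enabled, the forwarder re-sends any data blocks that are not acknowledged by the indexer, which is the scenario where occasional duplicates can be indexed:

    [tcpout]
    defaultGroup = primary_indexers

    [tcpout:primary_indexers]
    server = indexer1.example.com:9997, indexer2.example.com:9997
    # Wait for indexer acknowledgement; unacknowledged blocks are re-sent,
    # which can occasionally result in the same events being indexed twice.
    useACK = true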

0 Karma

richgalloway
SplunkTrust

To prevent data loss, you probably want to index the duplicate events and remove the duplicates at search time.
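For example, a minimal search-time de-duplication sketch (the index and sourcetype names are hypothetical) that keeps only one copy of events with identical raw text:

    index=app_logs sourcetype=myapp:log
    | dedup _raw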

---
If this reply helps you, Karma would be appreciated.
0 Karma

djime
New Member

Thank you, but the goal is to not index the duplicate events. Any other ideas?

0 Karma