Hi all,
I am finding duplicate events when I search. I am a bit confused about where the issue lies and how to start investigating it.
Regards,
Shivanand
This can happen when you are using regex to extract fields but have not set KV_MODE = none.
If you are using regex to extract fields and have not set this value, Splunk will honor your regex extractions but, by default, will also try to determine the structure of the data on its own and parse the fields (XML, JSON, etc.).
Set KV_MODE = none in props.conf if you are using regex extractions.
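For example, a minimal props.conf sketch, assuming your data is tagged with a hypothetical sourcetype called my_sourcetype:

    [my_sourcetype]
    # disable automatic key-value extraction at search time,
    # so only the explicit regex extractions apply
    KV_MODE = none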
It's also possible to receive duplicate events if your forwarders are configured with useACK = true.
To see if this is the cause, look for "Possible duplication of events" in index=_internal.
https://docs.splunk.com/Documentation/Forwarder/8.0.2/Forwarder/Protectagainstthelossofin-flightdata...
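As a quick check (a sketch; the tcpout group name below is a placeholder), search the internal index for that warning and confirm whether acknowledgement is enabled in outputs.conf on the forwarder:

    index=_internal sourcetype=splunkd "Possible duplication of events"

    # outputs.conf on the forwarder
    [tcpout:primary_indexers]
    useACK = true

If useACK is on and the forwarder resends data after a missed acknowledgement, the same events can be indexed twice.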
Hi @shivanandbm,
first, check whether they arrive from the same host.
If yes, check outputs.conf and inputs.conf on that host to see whether there is any overlap (see the btool sketch below).
If not, check whether they arrive from a cluster: if yes, you have to re-design your log ingestion; if not, it is correct to have those duplicated events (they are genuinely separate events).
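A sketch of the overlap check on the forwarder, using btool (the paths assume a default $SPLUNK_HOME and are only an example):

    # List every monitor stanza with the file that defines it,
    # to spot two inputs watching the same path
    $SPLUNK_HOME/bin/splunk btool inputs list monitor --debug

    # List the tcpout groups, to spot the same data being sent
    # to the same indexers through more than one output group
    $SPLUNK_HOME/bin/splunk btool outputs list tcpout --debug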
Ciao.
Giuseppe
How can I check whether the duplicates are coming from a cluster?
Hi @shivanandbm,
if they don't arrive from the same host, they aren't duplicated events!
There's only one exception: if they arrive from a cluster.
About the cluster, sorry I wasn't clear: I asked whether they arrive from a monitored cluster (e.g. an Oracle cluster), because if you have a Universal Forwarder on both nodes of a cluster, it is possible to read the same events twice.
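One way to check (a sketch; replace your_index and your_sourcetype with your own values) is to group events by their raw text and see how many distinct hosts each one arrives from:

    index=your_index sourcetype=your_sourcetype
    | stats count dc(host) AS host_count values(host) AS hosts BY _raw
    | where count > 1

If host_count is 2 for the duplicated events, the same lines are being read by forwarders on two different hosts, which is the cluster scenario described above.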
Ciao.
Giuseppe
Thanks, I will investigate.