Getting Data In

How to remove duplicate logs or events?

Isaias_Garcia
Path Finder

I have this search string:
source=/xxxx/log/xxxx/server.log ERROR

and i got this:

2014-01-06 13:28:33,828 ERROR xxx.xxx.xxxUsersRolesLoginModule Failed to load users/passwords/role files
2014-01-06 13:28:32,878 ERROR xxx.xxx.xxxUsersRolesLoginModule Failed to load users/passwords/role files
2014-01-06 13:28:32,814 ERROR xxx.xxx.xxxUsersRolesLoginModule Failed to load users/passwords/role files

and so on..

Question: How do I remove the duplicates? I tried some of the suggestions I found on this site, but to no avail. I used dedup and uniq, but both returned the same number of duplicate logs. Please advise on the best way to remove duplicates and retain just one instance of each log. Thanks in advance.


somesoni2
Revered Legend

As suggested by ShaneNewman and lguinn, all these entries are unique, since they have different timestamp/workerthread/IP values. The only duplicate element among these events is the error message itself. To filter events based on the error message alone, you can do a field extraction or use the rex command, something like the example below:

source=/xxxx/log/xxxx/server.log ERROR | rex "\]\)\s(?<ErrorMessage>.*)" | dedup ErrorMessage
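Note that the rex pattern above (`\]\)\s`) assumes the raw event contains a `])` just before the message text. The sample events in the question show no brackets, so a variant anchored on the ERROR keyword and the logger class name may be closer to that format (the field name ErrorMessage is kept from the answer above; adjust the pattern to your actual events):

source=/xxxx/log/xxxx/server.log ERROR | rex "ERROR\s+\S+\s+(?<ErrorMessage>.+)" | dedup ErrorMessage

Here rex captures everything after the logger class name into ErrorMessage, and dedup then keeps one event per distinct message.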

Isaias_Garcia
Path Finder

Hello - Thanks for your suggestion; however, it did not seem to work. For the time being I just used the NOT operator to at least filter out the duplicate logs. Thank you.


ShaneNewman
Motivator

These are not duplicate logs. If you notice, the IP and port are different for each entry above. If you want to remove these, you will have to do field extraction and leave out the WorkerThread field from the dedup list.

Isaias_Garcia
Path Finder

Thanks, lguinn.


lguinn2
Legend

The timestamps and worker thread numbers are different as well.

It may be that all these entries represent the same problem or incident - but these are definitely unique events.

As Shane points out, you will need to create the fields - and decide how you define "duplicate" in this context. I'd be interested in seeing your definition of "duplicate".
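If "duplicate" is defined as "the same message within a short time window," one way to make that concrete is to bucket events by time and count the repeats instead of discarding them. A sketch, assuming the ErrorMessage extraction suggested earlier in the thread:

source=/xxxx/log/xxxx/server.log ERROR | rex "ERROR\s+\S+\s+(?<ErrorMessage>.+)" | bin _time span=1m | stats count AS occurrences earliest(_raw) AS sample BY _time ErrorMessage

This reports how many times each message repeated per minute while keeping one sample event for context.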

ShaneNewman
Motivator

Not in the eyes of Splunk. Because the port numbers are different, they are unique. That is why you will have to do field extraction to get them to be seen as duplicates.


Isaias_Garcia
Path Finder

Hi Shane, thanks for the prompt response. These are actually duplicate logs.
