Splunk Search

How to remove duplicate values across different fields?

alexspunkshell
Contributor

How do I remove duplicate values across two different fields? My search is:

| stats count by src dest

[screenshot of the search results]

 


richgalloway
SplunkTrust

Where are the duplicates?  I see the first 3 octets of some IP addresses match, but stats looks at the entire field, not just parts of it.  If you need to deduplicate on the first 3 octets, then use rex or split to extract it into a new field and dedup on that new field.
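
For example, a rough sketch of that approach (the src_subnet field name is just a placeholder):

| rex field=src "^(?<src_subnet>\d+\.\d+\.\d+)"
| dedup src_subnet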

---
If this reply helps you, Karma would be appreciated.

alexspunkshell
Contributor

@richgalloway Thanks for the reply.

I am getting similar IPs in both the src and dest fields.

So I want to remove rows from the results where src and dest are the same.


yuanliu
SplunkTrust

The most straightforward implementation is

| stats count by src dest
| where src != dest

Alternatively,

| where src != dest
| stats count by src dest
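
As a quick self-contained check, you can fake a few src/dest pairs with makeresults (the IP values below are made up) and apply the same filter:

| makeresults count=3
| streamstats count as n
| eval src=case(n=1, "10.0.0.1", n=2, "10.0.0.2", n=3, "10.0.0.3")
| eval dest=case(n=1, "10.0.0.1", n=2, "10.0.0.9", n=3, "10.0.0.3")
| stats count by src dest
| where src != dest

Only the 10.0.0.2 / 10.0.0.9 row survives. Of the two variants, putting the where before stats drops the matching events earlier, which is usually cheaper when the search returns many events.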

 

Taruchit
Contributor

Hi @alexspunkshell,

Please confirm whether I have understood your requirement correctly:

There are two fields in the result: src and dest.

If in a given row both src and dest are the same, then you need to filter those rows out of the result.

Thank you


alexspunkshell
Contributor

@Taruchit  You are right.
