
How can I remove duplicate values of the first field, but sum the second?

Crooda
New Member

Hi there,

I hope you can help me. I use URL Toolbox to extract the domain from my proxy logs.

lookup ut_parse_extended_lookup url | table ut_domain count | sort -count | head 100

The search returns results like the following table:

ut_domain         count
google.com        1000
heise.de          500
yahoo.com         20
google.com        200
yahoo.com         100

There are about 10,000 more URLs, and some of them appear very often.
I want a table with every unique URL, but with the counts summed, like this:

ut_domain         count
google.com        1200
heise.de          500
yahoo.com         120

Does anyone have an idea? Thank you very much.


richgalloway
SplunkTrust

Try this.

lookup ut_parse_extended_lookup url | stats sum(count) as Count by ut_domain | table ut_domain Count | sort -Count | head 100
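If you want to see how the stats part behaves on its own, here is a quick sketch that recreates the sample rows from the question with makeresults (the format=csv option assumes Splunk 9.0 or later) and runs the same aggregation; you can paste it into a search bar without the lookup. makeresults builds the five sample events, stats sum(count) collapses the duplicate domains into one row each, and sort orders them by the summed Count:

| makeresults format=csv data="ut_domain,count
google.com,1000
heise.de,500
yahoo.com,20
google.com,200
yahoo.com,100"
| stats sum(count) as Count by ut_domain
| sort -Count

stats sum(count) as Count by ut_domain emits one row per distinct ut_domain with the counts added up, which is exactly the deduplicated table you are after.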
---
If this reply helps you, Karma would be appreciated.


Crooda
New Member

It's working, thanks 🙂


woodcock
Esteemed Legend

Like this:

.... lookup ut_parse_extended_lookup url | table ut_domain count | stats sum(count) AS count by ut_domain | sort -count | head 100
