Splunk Search

How to group IP address subnets to build an average duration by subnet range?

matt4321
Explorer

I am using a search to get the average session duration from my Windows security event logs. I want to take the search below a step further and build average durations by subnet range.

Starting search currently is:

index=mswindows host=* Account_Name=* | transaction Logon_ID startswith="EventCode=4624" endswith="EventCode=4634" | eval duration=duration/60

From here I am able to average durations by Account_Name, Hostname, etc.
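For reference, the full pipeline with an average per account looks roughly like this (avg_duration_mins is just an illustrative field name):

index=mswindows host=* Account_Name=*
| transaction Logon_ID startswith="EventCode=4624" endswith="EventCode=4634"
| eval duration=duration/60
| stats avg(duration) AS avg_duration_mins BY Account_Name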

I would now like to group by subnet range, based on the SourceIP field:
10.144.50.0/23 as Citrix Internal
10.144.11.0/23 as Citrix External

I don't mind entering the octets for each range (Citrix Internal, Citrix External, etc.).

Any help would be appreciated!

1 Solution

javiergn
Super Champion

You can use the cidrmatch function to label your IPs:

| eval subnet = case(cidrmatch("10.144.50.0/23", YourIPField), "CitrixInternal", cidrmatch("10.144.11.0/23", YourIPField), "CitrixExternal")
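Plugged into your original search and rolled up per subnet, it would look roughly like this (using SourceIP as the field name mentioned in the question; substitute whatever your source IP field is actually called):

index=mswindows host=* Account_Name=*
| transaction Logon_ID startswith="EventCode=4624" endswith="EventCode=4634"
| eval duration=duration/60
| eval subnet = case(cidrmatch("10.144.50.0/23", SourceIP), "Citrix Internal", cidrmatch("10.144.11.0/23", SourceIP), "Citrix External")
| stats avg(duration) AS avg_duration_mins BY subnet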


matt4321
Explorer

Actually, to keep this simple, I really only need 4 subnets:
IPs 10.144.50.* and 10.144.51.* = Citrix Internal
and IPs 10.144.11.* and 10.144.12.* = Citrix External
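For those four ranges, a single case() with one cidrmatch() per block should cover it: 10.144.50.* plus 10.144.51.* is exactly 10.144.50.0/23, but 10.144.11.* and 10.144.12.* do not align to a single /23, so they need a /24 each (again using SourceIP as a stand-in for your source IP field):

| eval subnet = case(cidrmatch("10.144.50.0/23", SourceIP), "Citrix Internal", cidrmatch("10.144.11.0/24", SourceIP), "Citrix External", cidrmatch("10.144.12.0/24", SourceIP), "Citrix External")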
