Splunk Search

How to compute the average position of each value across events (e.g. Dog = positions 1 and 5, average for Dog = 3)?

JovanMilosevic
Path Finder

Hi, I wonder if anyone in the community can help me.

I'm trying to create an average of some data, and running into problems.

My data looks like this. The field name is Animals:

Event 1 - "Dog","Cat","Horse","Rabbit","Zebra"
Event 2 - "Zebra","Horse","Cat","Rabbit","Dog"

I'm trying to compute an average for each element (i.e. each animal) in the event, based on its position.

So:
Dog = positions 1 and 5
Cat = positions 2 and 3
Horse = positions 3 and 2
and so on.

What I want out of this is a table that looks like the following:

Animal Avg
Dog 3
Cat 2.5
Horse 2.5
Rabbit 4
Zebra 2.5

Many thanks for any help.

1 Solution

aalanisr26
Path Finder

index=win | head 2
| eval animals="Dog,Dog,Dog,Cat,Horse,Rabbit,Zebra"
| eval animals=split(animals,",")
| eval eventno=1
| accum eventno
| mvexpand animals
| streamstats count by eventno
| stats avg(count) by animals
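Adapted to the original question, where each event already carries the multivalue Animals field, a minimal sketch might look like this (the index name is an assumption; the hard-coded test string above is replaced by the real field):

```spl
index=your_index
| streamstats count as eventno          ``` number each event ```
| mvexpand Animals                      ``` one row per animal per event ```
| streamstats count as position by eventno  ``` 1-based position within each event ```
| stats avg(position) as Avg by Animals
```

This follows the same approach as the accepted answer: streamstats counts rows within each event after mvexpand, which yields the position, and stats averages that position per animal. If Animals arrives as a raw comma-separated string rather than a multivalue field, apply | eval Animals=split(Animals, ",") before the mvexpand.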



aalanisr26
Path Finder

The index=win | head 2 part was just for testing purposes, but you get the idea, right?


JovanMilosevic
Path Finder

That's got it!

Thanks - Really appreciate that.
