Splunk Search

Compiling of Data

albyva
Communicator

Using this set of data:

Time Host Type Packets

12:00 mothra A 5
12:05 mothra A 6
12:10 mothra A 7
12:00 mothra B 100
12:05 mothra B 200
12:10 mothra B 300

I want to compile and calculate the data to come out like this:

Time Host Packet_Loss

12:10 mothra 0.03

The problem I'm having is figuring out how to gather up all of Type=A and Type=B so I can then run
something like | eval packet_loss = typeA/typeB. I understand enough of Splunk to gather up this
data; my issue is getting everything under Type=A and Type=B combined onto the same row so I
can then run the math on it.
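
To make the goal concrete, I'm picturing something roughly like the search below, treating index=main, sourcetype=foo, and the lowercase field names type and packets as placeholders for whatever my data actually uses. I don't know whether chart is the right tool here:

index=main sourcetype=foo
| chart sum(packets) over host by type
| eval packet_loss = round(A / B, 2)
| table host packet_loss

The idea is that chart would give me one row per host with the Type=A and Type=B packet sums as columns named A and B, so I could divide them, but I'm not sure how to carry the time along with it.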

1 Solution

Damien_Dallimor
Ultra Champion

There is probably a better search to use if I knew more about your data and use case: how many host types you are dealing with, what time buckets you want to calculate the ratios over, etc.

But anyhow, based purely on the data set above, this search worked. It should at least get you pointed in the right direction. I bucketed the events into days, so the output is the packet loss per day between those 2 host types. The streamstats command lets me perform the ratio math on the current and previous event, i.e. host type A's packet sum and host type B's packet sum.

index=main sourcetype=foo
| bucket _time span=1d
| stats sum(packets) as total_packets by host, type, _time
| streamstats window=1 global=f current=f first(total_packets) as next_total_packets
| eval packet_loss = next_total_packets / total_packets
| table _time host packet_loss
| tail 1
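
If you would rather not depend on the order in which streamstats sees the two rows, another way to get the same number is to pivot the two types onto one row with conditional sums inside stats. This is only a sketch against the sample data above, still using the placeholder index=main sourcetype=foo and assuming lowercase type and packets fields:

index=main sourcetype=foo
| bucket _time span=1d
| stats sum(eval(if(type=="A", packets, 0))) as type_a_packets
        sum(eval(if(type=="B", packets, 0))) as type_b_packets
        by _time, host
| eval packet_loss = round(type_a_packets / type_b_packets, 2)
| table _time host packet_loss

For the sample above, both searches should land on 18/600 = 0.03. If Type=B can ever be missing in a bucket, guard the division so you don't divide by zero.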

albyva
Communicator

Thanks. I'll give this one a try and see how things pan out. I've never used bucket or streamstats, so I'll have some new toys to play with.

Thanks,
