Splunk Search

How to get result for device grouped by two different fields

hcastell
Path Finder

I have the following scenario:

x devices are connected to 8 different nodes, and the 8 nodes are connected to 3 switches. What I'm trying to do is generate a report that shows the number of nodes that have device failures, broken down by switch. I can generate a report that shows device failures by node, but I can't figure out how to get nodes with device failures broken down by switch.

The following search string allows me to get the device (STB) failure count by node (NodeName):

| eval foo=if(stb_fail==0, 0, 1)
| stats sum(foo) AS TotalBox by NodeName
| eval fooNode=case(TotalBox==1,"1", TotalBox==2,"2",TotalBox>2,">2")
| stats count by fooNode
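
For anyone who wants to sanity-check what this pipeline computes outside Splunk, here is a short Python sketch of the same bucketing logic over hypothetical per-node failure totals (all node names and values are invented):

```python
# Hypothetical per-node failure totals, as produced by
#   | stats sum(foo) AS TotalBox by NodeName
total_box = {"nodeA": 1, "nodeB": 2, "nodeC": 5, "nodeD": 1}

# Mirrors: eval fooNode=case(TotalBox==1,"1", TotalBox==2,"2", TotalBox>2,">2")
def bucket(total):
    if total == 1:
        return "1"
    if total == 2:
        return "2"
    if total > 2:
        return ">2"
    return None  # case() yields null when no clause matches

# Mirrors: stats count by fooNode (null buckets are not counted)
counts = {}
for node, total in total_box.items():
    b = bucket(total)
    if b is not None:
        counts[b] = counts.get(b, 0) + 1

print(counts)  # → {'1': 2, '2': 1, '>2': 1}
```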

I have a field called NodeName as well as another called SwitchName and have a table with values for both these fields. Hope my question is clear. Appreciate any assistance offered.


MuS
SplunkTrust

Hi hcastell,

if I understand this correctly, you should be able to use something like this run-everywhere command:

index=_internal  source=*metrics.log | stats count(sourcetype) AS c_st by series

Adapted to your needs, it would look like this. Note that SwitchName has to be carried through the first stats (added to its by clause); otherwise the field is dropped from the results and the final grouping by SwitchName returns nothing:

your base search here
| eval foo=if(stb_fail==0, 0, 1)
| stats sum(foo) AS TotalBox by NodeName, SwitchName
| eval fooNode=case(TotalBox==1,"1", TotalBox==2,"2", TotalBox>2,">2")
| stats count(fooNode) by SwitchName
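
To make the two-stage aggregation concrete, here is a small Python sketch of the same logic over hypothetical device records (switch/node names and stb_fail values are made up), assuming SwitchName is kept alongside NodeName in the first stage so it survives to the final grouping:

```python
from collections import defaultdict

# Hypothetical device records: (SwitchName, NodeName, stb_fail)
records = [
    ("sw1", "nodeA", 1), ("sw1", "nodeA", 0), ("sw1", "nodeB", 0),
    ("sw2", "nodeC", 1), ("sw2", "nodeC", 1), ("sw2", "nodeD", 1),
    ("sw3", "nodeE", 0),
]

# Stage 1: per (switch, node), count devices with stb_fail != 0
# (mirrors: eval foo=if(stb_fail==0,0,1)
#           | stats sum(foo) AS TotalBox by NodeName, SwitchName)
total_box = defaultdict(int)
for switch, node, stb_fail in records:
    total_box[(switch, node)] += 0 if stb_fail == 0 else 1

# Stage 2: per switch, count nodes with at least one failing device
# (mirrors: stats count(fooNode) by SwitchName — fooNode is only
#  non-null when TotalBox >= 1, so count() skips clean nodes)
failing_nodes_per_switch = defaultdict(int)
for (switch, node), failures in total_box.items():
    if failures >= 1:
        failing_nodes_per_switch[switch] += 1

print(dict(failing_nodes_per_switch))  # → {'sw1': 1, 'sw2': 2}
```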

hope this helps ...

cheers, MuS
