
Group by field / table

New Member

Hi,
I am grabbing interface errors from Cisco routers (via snmpget) that form a distinct path through the network. I want to present them in the same order as the path.

If I dedup on pathorder, it works, but not over any period of time. I want to be able to group the whole path (defined by pathorder, 1-19) and display this "table" over time.

index=interfacepath sourcetype=interfaceerrors | dedup pathorder | table _time, hostname, ifName, ifOutDiscards, ifOutErrors, ifInDiscards, ifInErrors, pathorder | sort pathorder

Sample of the data output (formatting might be screwy; hostname field removed for the sample data):


_time                 ifName              ifOutDiscards  ifOutErrors  ifInDiscards  ifInErrors  pathorder
2014-03-03 20:00:00   GigabitEthernet0/0  11508          0            0             0           1
2014-03-03 20:00:03   FastEthernet5/1     5471           30595        0             1           2
2014-03-03 20:00:13   POS2/0              3              0            4             13044       3
2014-03-03 20:00:24   POS2/0              674            0            14            368866      4

Does this make sense? I could be going about this the wrong way. Looking for suggestions!

I would love to be able to make a sparkline for each error and discard field showing errors over time in one table/chart, but I think I need to figure out the grouping first.
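
Something roughly like the following is what I have in mind for the sparklines (just a sketch using the field names above; the 10m span and max() aggregation are guesses on my part, so adjust as needed):

index=interfacepath sourcetype=interfaceerrors | stats sparkline(max(ifOutErrors), 10m) as ifOutErrors_trend, sparkline(max(ifOutDiscards), 10m) as ifOutDiscards_trend, sparkline(max(ifInErrors), 10m) as ifInErrors_trend, sparkline(max(ifInDiscards), 10m) as ifInDiscards_trend, latest(hostname) as hostname, latest(ifName) as ifName by pathorder | sort pathorder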

Thanks,
Ross Warren


Re: Group by field / table

SplunkTrust

Will there be multiple events with the same pathorder? Does the pathorder set 1-19 repeat for different polling runs?


Re: Group by field / table

New Member

Pathorder repeats 1-19; there will not be multiple events with the same pathorder number in the same 5-minute period. Clear as mud?

It takes about 5 min to get the data from all the routers and the cron job runs every 10 minutes.

Side note: if anyone wants some snmpget/walk scripts, I'm willing to share.

Thanks, Ross Warren


Re: Group by field / table

SplunkTrust

Assuming that within a 10-minute period there is only one event/entry per pathorder (since the cron job runs every 10 minutes, there will be one set of entries every 10 minutes), try this:

 index=interfacepath sourcetype=interfaceerrors | eval orig_time=_time | bucket span=10m _time | stats first(orig_time) as orig_time, first(hostname) as hostname, first(ifName) as ifName, first(ifOutDiscards) as ifOutDiscards, first(ifOutErrors) as ifOutErrors, first(ifInDiscards) as ifInDiscards, first(ifInErrors) as ifInErrors by pathorder, _time | table orig_time, hostname, ifName, ifOutDiscards, ifOutErrors, ifInDiscards, ifInErrors, pathorder | sort orig_time, pathorder
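
And if you also want to see how a single counter (say ifOutErrors) moves over time for every hop in the path, one rough option (untested, same field names assumed) is a timechart split by pathorder:

 index=interfacepath sourcetype=interfaceerrors | timechart span=10m max(ifOutErrors) by pathorder

Each pathorder value becomes its own series, so the chart shows all 19 hops over time.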

Re: Group by field / table

New Member

Wow, at first cut and paste this looks great! Thank you, somesoni2! I need to take the time to understand what is going on now. Give me a day to check it out!

WOWOWW!
