Splunk Search

How to frame a table from stats data?

ninja_panda
Engager

I want to create a Bar chart with the logs where the key would be the stats count field name and value would be the sum value

Query : 

search1
| eval has_error = if(match(_raw, "WARNING"),1,0)
| stats sum(has_error) as field1
| join instance
    [ search2
    | eval has_error = if(match(_raw, "WARNING"),1,0)
    | stats sum(has_error) as field2
    | join instance
        [ search3
        | eval has_error = if(match(_raw, "WARNING"),1,0)
        | stats sum(has_error) as field3
        | join instance
            [ search4
            | eval has_error = if(match(_raw, "WARNING"),1,0)
            | stats sum(has_error) as field4 ]]]
| stats sum(field1), sum(field2), sum(field3), sum(field4)

Current result:

field1 field2 field3 field4
30 44 122 6

 

Expected result:

Field Count
field1 30
field2 44
field3 122
field4 6
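
A minimal way to get from the single-row stats output to the expected two-column layout is Splunk's transpose command. This is a sketch appended to the query's final stats, with the sums renamed to plain field names first so transpose has clean headers:

| stats sum(field1) AS field1, sum(field2) AS field2, sum(field3) AS field3, sum(field4) AS field4
| transpose column_name=Field
| rename "row 1" AS Count

transpose turns the four columns into four rows; column_name renames the default "column" header, and the single data column comes out as "row 1", hence the rename.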



1 Solution

FelixLeh
Contributor

If I understand your search correctly, you want to count the number of errors across 4 different sources and show them in a bar chart.

If you already have a key that identifies the source in the separate searches, and "search1-4" is just index, sourcetype, etc., then try:

Let's say each "search" comes from a separate index.

(search1 OR search2 OR search3 OR search4)
| eval has_error = if(match(_raw, "WARNING"),1,0)
| where has_error=1
| stats count by index

Then you have the number of errors for each key.

 

If it is a more complex base search, try this:

search1
| eval key = "search1"
| append [ search search2 | eval key = "search2"]
| append [ search search3 | eval key = "search3"]
| append [ search search4 | eval key = "search4"]
| eval has_error = if(match(_raw, "WARNING"),1,0)
| where has_error=1
| stats count by key
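
If the chart needs the exact headers from the expected result, an optional rename after the final stats count by key gives them (a small sketch, assuming the key field built above):

| rename key AS Field, count AS Count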

_______________________________________

If this was helpful please consider awarding Karma. Thx!


ninja_panda
Engager

The second solution worked like a charm. Thank you!


FelixLeh
Contributor

Glad to hear that 🙂


gcusello
SplunkTrust
SplunkTrust

Hi @ninja_panda,

with a search like the one you shared, you can launch your search and then go for a 30-60 minute walk!

Splunk isn't a database: avoid the join command and use it only if you have no other solution, and always with little data.

You can use stats and eval to get the same result in a quicker way.

The main problem is identifying a filter for each search. If, e.g., each search uses a different index (or some other field, it's the same idea), you can try something like this:

index IN (index1, index2, index3, index4) "WARNING"
| stats count BY index
| eval field=case(index="index1","field1",index="index2","field2",index="index3","field3",index="index4","field4")
| table field count

If you have to use a different condition to identify the four searches, you can use something like this:

index IN (index1, index2, index3, index4) "WARNING"
| stats 
   count(eval(condition1)) AS field1
   count(eval(condition2)) AS field2
   count(eval(condition3)) AS field3
   count(eval(condition4)) AS field4
| transpose

Ciao.

Giuseppe

 


