Splunk Search

stats list() command will not display items after fieldformat. How can I resolve it?

feickertmd
Communicator

This is a fairly big query, so only run it for the past couple of weeks.

index=_internal source=*license_usage.log type="Usage"
| bin _time span=1d
| eval h=lower(h)
| stats sum(b) as b by idx, h, st, _time
| stats avg(b) as dailyAverage, max(b) as dailyMax by idx, h, st
| appendpipe [stats sum(dailyAverage) as dailyAverage sum(dailyMax) as dailyMax by idx, h | eval st="--TOTAL  IN MB BY HOST--"]
| appendpipe [stats sum(dailyAverage) as dailyAverage sum(dailyMax) as dailyMax by idx | eval st="##-TOTAL  IN MB BY APPLICATION-##"]
| eval dailyAverage=round(dailyAverage/1024/1024,3)
| eval dailyMax=round(dailyMax/1024/1024,3)
| sort idx, h, dailyAverage
| stats list(h) as h, list(st) as Sourcetype, list(dailyAverage) as dailyAverage, list(dailyMax) as dailyMax by idx
| rename h as "Server Name", st as "Data Source", dailyAverage as "Daily Data Average in MB", dailyMax as "Maximum Daily Data"

The issue is that I want the fields showing size in MB to have comma separators for readability. We usually get that with a line like this:

| fieldformat dailyAverage=tostring(dailyAverage,"commas")

This line works fine until I run the stats command and try to list the values: I end up with a blank dailyAverage column, while the corresponding size column still shows plain numbers with no commas. Somehow the list() function is negating the fieldformat command.

Any help appreciated.


lguinn2
Legend

The problem arises because of how fieldformat works. It only changes how the field is rendered at display time - the underlying value stays numeric, so there is effectively a "string version" for display alongside the original (numeric) version. Once stats runs, it operates on the underlying values and the display formatting is lost. Try this instead - it uses eval with tostring() so the comma-formatted string becomes the actual field value, and it does the conversion after the sort so the sort stays numeric:

 index=_internal source=*license_usage.log type="Usage"
 | bin _time span=1d
 | eval h=lower(h)
 | stats sum(b) as b by idx, h, st, _time
 | stats avg(b) as dailyAverage, max(b) as dailyMax by idx, h, st
 | appendpipe [stats sum(dailyAverage) as dailyAverage sum(dailyMax) as dailyMax by idx, h
              | eval st="--TOTAL  IN MB BY HOST--"]
 | appendpipe [stats sum(dailyAverage) as dailyAverage sum(dailyMax) as dailyMax by idx
              | eval st="##-TOTAL  IN MB BY APPLICATION-##"]
 | eval dailyAverage=round(dailyAverage/1024/1024,3)
 | eval dailyMax=round(dailyMax/1024/1024,3)
 | sort idx, h, dailyAverage
 | eval dailyAverage=tostring(dailyAverage,"commas")
 | stats list(h) as h, list(st) as Sourcetype, list(dailyAverage) as dailyAverage, list(dailyMax) as dailyMax by idx
 | rename h as "Server Name", st as "Data Source", dailyAverage as "Daily Data Average in MB", dailyMax as "Maximum Daily Data"

I can't actually test this completely, but I think it should work.

feickertmd
Communicator

That's what I needed - eval instead of fieldformat. Thanks, lguinn!

aholzer
Motivator

Are you doing the formatting before or after the stats?

If you do it before, it's like converting all your numbers to strings and then trying to calculate the average of a string.
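
As an aside, here is a minimal sketch of that failure mode (it uses eval rather than fieldformat, since eval really does replace the value with a string, and makeresults so it needs no real data). stringAvg should come back empty, because values like "1,500,000" are no longer numeric:

 | makeresults count=3
 | streamstats count as n
 | eval b=n*1500000
 | eval b_str=tostring(b,"commas")
 | stats avg(b) as numericAvg, avg(b_str) as stringAvg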


feickertmd
Communicator

I have tried it right before and right after the stats list() line. If I add it before the stats list() and delete everything after, I get the rendering I want. But as soon as I apply list(), it dies.


aholzer
Motivator

What you want to do is apply the formatting after you run the stats command: calculate your average from the numbers first, and then convert it to a string.
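
Something along these lines (just a sketch of the ordering - the base search and field names are borrowed from the query above):

 index=_internal source=*license_usage.log type="Usage"
 | stats avg(b) as dailyAverage by idx, h, st
 | eval dailyAverage=tostring(round(dailyAverage/1024/1024,3),"commas")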


feickertmd
Communicator

Tried it, adding the fieldformat after the stats but before the rename.

I supplied the code, and it's an internal index, so you can try it yourself.


aholzer
Motivator

Got it. That's because list() creates a multivalue field (mvfield), and fieldformat doesn't know how to handle those.

You will either have to figure out how to iterate through all the values in the mvfield, or calculate the daily average in a step before the stats list(), convert it to a formatted string there, and then put that string into list().
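
For the first option, a minimal sketch (assuming Splunk 8.0 or later, where the mvmap() eval function is available; makeresults stands in for real data) would be to walk the multivalue field after the list() and stringify each entry:

 | makeresults count=3
 | streamstats count as n
 | eval idx="main", dailyAverage=n*1234567
 | stats list(dailyAverage) as dailyAverage by idx
 | eval dailyAverage=mvmap(dailyAverage, tostring(dailyAverage, "commas"))

The second option - converting to a string before the list() - is what lguinn2's answer above does.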


feickertmd
Communicator

Tried that as well. If you add the fieldformat before the stats list() line, it applies to single-value fields and works great. But if you then try to aggregate them with list() after the fieldformat... boom.
