Splunk Search

Trouble with df script and charting

aferone
Builder

We use the "df" script to grab disk space data from one of our Linux servers. We use the following search to pull out the "Used" number for the XFS volume:

source="df" host="hostname" | multikv fields Filesystem Size Used Avail Use% Mounted | search xfs | table _time Used | eval Used=rtrim(Used, "[K,G,M,T]")

The number was in gigabytes, so we were able to chart it successfully, and it was in the hundreds (for example, 978, 988, 994). However, now that we have passed 1 terabyte, the number we pull is in single digits (1.0, 1.1, 1.2), and the chart is thrown off since it dive-bombs from 999 to 1.0.

Is there a way to convert the number (by multiplying by 1000), but only when the field contains a "T" (for terabyte), and not a "G", or "M", etc?

Thanks!

auntyem
Explorer

Here's what I did:
eval UsedG = case(match(Used,"[M]"),round(tonumber(rtrim(Used,"M"))/1024,3),match(Used,"[T]"),round(tonumber(rtrim(Used,"T"))*1024,3),match(Used,"[G]"),round(tonumber(rtrim(Used,"G")),3))
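
For reference, dropped into the search from the question it would look roughly like this (same field names as above, chart off UsedG instead of Used):

source="df" host="hostname" | multikv fields Filesystem Size Used Avail Use% Mounted | search xfs | eval UsedG=case(match(Used,"[M]"),round(tonumber(rtrim(Used,"M"))/1024,3),match(Used,"[T]"),round(tonumber(rtrim(Used,"T"))*1024,3),match(Used,"[G]"),round(tonumber(rtrim(Used,"G")),3)) | table _time UsedG

Everything is normalized to gigabytes, so the chart shouldn't collapse anymore when a volume crosses 1 TB.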

aferone
Builder

This worked! Thank you very much!!

gkanapathy
Splunk Employee

Honestly, I would consider this a flaw in the Splunk df.sh script. The native df shell command is perfectly capable of returning plain numbers (with the -k option), yet the Splunk df.sh script goes to some trouble on other platforms to return values in human-convenient (and machine-inconvenient) formats. I would probably file an enhancement request (ER), but it would be a bit problematic to implement since existing deployments already have data in the legacy format.

cphair
Builder

Pipe your search to an eval with an if().


... | eval TestVal=if(match(Value,"TB"),Value*1000,Value)

You will have fewer significant digits for the TB entries, but the scale will be corrected.

aferone
Builder

Thank you very much for your help. The answer below worked. I appreciate your time!

cphair
Builder

That might work, but it would make your current data different from your historical data, and people might get unexpected results if they search across both sets. If you don't care about that, or if you're going to delete the old data, it's probably fine.

If you don't mind, would you post a line from your "table _time Used" so I can see what the format looks like? Or print "table _time Used Multiplier" to see if the multiplier is always 1? The eval method should work, but without seeing the exact data it's hard to get the correct syntax.
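
That is, something along these lines, reusing the multikv portion of your search (just a diagnostic sketch, not tested against real df data):

source="df" host="hostname" | multikv fields Filesystem Size Used Avail Use% Mounted | search xfs | eval Multiplier=if(match(Used,"T"),1000,1) | table _time Used Multiplier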

aferone
Builder

Thank you again for replying. The search you provided runs, but it still doesn't change the "T" values.

Would I be able to just edit the df.sh script from the UNIX app so that the "df" results are not in human-readable format? Then I could just manipulate the results in the search.

Thanks again!

cphair
Builder

OK, on looking at this more carefully, it sounds like you need to both figure out what the letter at the end of the Used field is and strip it out so you can do math on the number. I don't have df data to test on, but I suspect your search should look something like:

source="df" host="hostname" | multikv fields Filesystem Size Used Avail Use% Mounted | search xfs | eval Multiplier=if(match(Used,"T"),1000,1) | eval Used=rtrim(Used, "[K,G,M,T]") | eval Used=Used*Multiplier | table _time Used
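
If the numbers come out on a consistent scale, swapping the final table for a timechart should get you the chart you're after (again, a sketch rather than something tested against real df data):

source="df" host="hostname" | multikv fields Filesystem Size Used Avail Use% Mounted | search xfs | eval Multiplier=if(match(Used,"T"),1000,1) | eval Used=rtrim(Used, "[K,G,M,T]") | eval Used=Used*Multiplier | timechart max(Used)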

aferone
Builder

Would I append that at the end of my search as written?
