How to add disk sizes in Splunk?

theouhuios
Motivator

Hello

I need to build a table that adds up the allocated space across several servers, and the data is being collected with a df command. How do we add values like 2.1G and 100M? Or is the only way to change how the data actually comes into Splunk, by changing the command used?

Regards

theou

MHibbin
Influencer

You could probably use eval to work out the data quantities in bytes, for example (assuming fields are being extracted as "Size" and "Used" from df -h):

* | rex field=Size "(?P<Size>\d+[\.,\d]*)G" | rex field=Used "(?P<Used>\d+[\.,\d]*)M" | eval Size=Size*1024*1024*1024 | eval Used=Used*1024*1024

This will give you the Size field and the Used field in bytes instead of gigabytes and megabytes (respectively).

This obviously gets messy very quickly... I would rather change the script so it doesn't use GB or MB or KB (or whatever), and use plain df instead of df -h... if you really want to handle larger block sizes, define the size in the command, e.g. df -m for 1 MB blocks.
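
If you do have to cope with mixed suffixes in a single field, one way (a rough sketch, untested, which assumes the extracted Size field still carries its unit letter, e.g. "2.1G" or "100M") is to pick a multiplier with case() and strip the letter with rtrim():

* | eval multiplier=case(like(Size,"%G"), 1024*1024*1024, like(Size,"%M"), 1024*1024, like(Size,"%K"), 1024, true(), 1) | eval SizeBytes=tonumber(rtrim(Size,"GMK"))*multiplier

case(), like(), tonumber() and rtrim() are all standard eval functions, and the same pattern would apply to the Used field.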

EDIT: To complete the answer... to work out the total of each column/field, you would pipe to the stats command and use the sum() function on your fields, e.g.

... | stats sum(Size) as totalSize sum(Used) as totalUsed

Hope this helps

MHibbin
Influencer

Okay cool, that would probably be easier. If it is not too much trouble, you could also convert the byte values back to KB etc. using the eval command, to make them human-readable in Splunk. I updated my answer with stats to answer the question fully.
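
For example (a sketch building on the stats search above, with hypothetical output field names):

... | stats sum(Size) as totalSize sum(Used) as totalUsed | eval totalSizeGB=round(totalSize/1024/1024/1024, 2) | eval totalUsedMB=round(totalUsed/1024/1024, 2)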

theouhuios
Motivator

Thanks. I will probably change the command itself.
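
For what it's worth, with plain df or df -k output the sizes are already numeric 1K blocks, so (assuming Size then extracts as a plain number) the whole search collapses to something like:

... | stats sum(Size) as totalKB | eval totalGB=round(totalKB/1024/1024, 2)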
