
How to add disk sizes in Splunk?

theouhuios
Motivator

Hello

I need to build a table that adds up the allocated space for different servers. The data is being grabbed with a df command. How do we add values like 2.1G and 100M? Or is the only way to change how the data actually comes into Splunk by changing the command used?

Regards

theou


MHibbin
Influencer

You could probably use eval to work out the data quantities in bytes, for example (assuming the fields "Size" and "Used" are being extracted from df -h):

* | rex field=Size "(?P<Size>\d+[\.,\d]*)G" | rex field=Used "(?P<Used>\d+[\.,\d]*)M" | eval Size=Size*1024*1024*1024 | eval Used=Used*1024*1024

This will give you the Size and Used fields in bytes instead of gigabytes and megabytes, respectively.

This obviously gets messy very quickly... I would rather change the script so it doesn't use GB, MB, KB (or whatever), and just use df instead of df -h. If you really want to handle larger block sizes, define the block size in the command itself, e.g. df -m for MB.
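
If you do want to keep the df -h output, one way to stop the per-unit rex chains from multiplying is a single case() expression that normalizes whichever suffix appears. This is only a sketch: the SizeNum and SizeUnit names are illustrative, and it assumes df -h prints at most a single K/M/G/T suffix after the number.

* | rex field=Size "(?P<SizeNum>[\d.]+)(?P<SizeUnit>[KMGT]?)"
  | eval Size=case(SizeUnit=="K", SizeNum*1024,
                   SizeUnit=="M", SizeNum*1024*1024,
                   SizeUnit=="G", SizeNum*1024*1024*1024,
                   SizeUnit=="T", SizeNum*1024*1024*1024*1024,
                   true(), SizeNum)

The same pattern applied to the Used field gives you both columns in bytes, after which the stats sum() below works unchanged.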

EDIT: To complete the answer: to work out the total of each column/field, pipe to a stats command and use the sum() function on your fields, e.g.

... | stats sum(Size) as totalSize sum(Used) as totalUsed

Hope this helps


MHibbin
Influencer

Okay, cool. That would probably be easier. If it's not too much trouble, you could also convert the byte values back to KB etc. with the eval command, to make them human-readable in Splunk. I updated the answer with stats to answer the question fully.
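
For example, something along these lines should work (a sketch only; it assumes the summed field is in bytes, and the choice of GB and the rounding precision are arbitrary):

... | stats sum(Size) as totalSize | eval totalSizeGB=round(totalSize/1024/1024/1024, 2)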


theouhuios
Motivator

Thanks. I will probably change the command itself.
