
How to add disk sizes in Splunk?

theouhuios
Motivator

Hello

I need to build a table that adds up the allocated space across different servers. The data is being collected with a df command. How do we add values like 2.1G and 100M? Or is the only way to change how the data actually comes into Splunk by changing the command used?

Regards

theou


MHibbin
Influencer

You could probably use eval to work out the data quantities in bytes, for example (assuming fields named "Size" and "Used" are being extracted from df -h):

* | rex field=Size "(?P<Size>\d+[\.,\d]*)G" | rex field=Used "(?P<Used>\d+[\.,\d]*)M" | eval Size=Size*1024*1024*1024 | eval Used=Used*1024*1024

This will give you the Size and Used fields in bytes instead of gigabytes and megabytes (respectively).

This obviously gets messy very quickly, though. I would rather change the script so it doesn't use G, M, K (or whatever) suffixes at all, i.e. run a plain df instead of df -h. If you really want to work in larger block sizes, define the block size in the command itself, e.g. df -m for megabytes. If you do have to stay with df -h output, see the sketch below.
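One way to keep the df -h route from needing a separate rex per unit is to capture the suffix as well and branch on it with eval's case() function. A rough sketch, not tested against your data; size_num and size_unit are just illustrative capture names, and it assumes the suffixes are K/M/G/T:

* | rex field=Size "(?P<size_num>\d+[\.,\d]*)(?P<size_unit>[KMGT])" | eval Size=size_num*case(size_unit=="K", 1024, size_unit=="M", 1024*1024, size_unit=="G", 1024*1024*1024, size_unit=="T", 1024*1024*1024*1024)

That way a single search handles whichever unit df -h happens to print for a given filesystem.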

EDIT: To complete the answer: to work out the total of each column/field, pipe to a stats command and use the sum() function on your fields, e.g.

... | stats sum(Size) as totalSize sum(Used) as totalUsed
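If you want the totals per server rather than one grand total, and assuming your df events carry the standard host field, you can add a by clause:

... | stats sum(Size) as totalSize sum(Used) as totalUsed by host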

Hope this helps


MHibbin
Influencer

Okay, cool, that would probably be easier. If it's not too much trouble, you could also convert the byte values back to KB etc. with the eval command, to make them human-readable in Splunk. I've updated my answer with stats to answer the question fully.
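For example, something like this (a sketch; it assumes the totalSize and totalUsed fields produced by the stats search above) would render the totals back in megabytes:

... | eval totalSizeMB=round(totalSize/1024/1024, 2) | eval totalUsedMB=round(totalUsed/1024/1024, 2)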


theouhuios
Motivator

Thanks. Will probably change the command itself.
