How to add disk sizes in Splunk?

theouhuios
Motivator

Hello

I need to build a table that adds up the allocated space for different servers, and the data is being grabbed with a df command. Now how do we add values like 2.1G and 100M? Or is the only way to change how the data actually comes into Splunk, by changing the command used?

Regards

theou

MHibbin
Influencer

You could probably use eval to work out the data quantities in bytes, for example (assuming fields are being extracted as "Size" and "Used" from df -h):

* | rex field=Size "(?P<Size>\d+[\.,\d]*)G" | rex field=Used "(?P<Used>\d+[\.,\d]*)M" | eval Size=Size*1024*1024*1024 | eval Used=Used*1024*1024

This will give you the Size field and the Used field in bytes instead of gigabytes and megabytes (respectively).

This obviously gets messy very quickly... I would rather change the script so it does not emit GB, MB or KB suffixes at all, and use plain df instead of df -h. If you really want a larger block size, define it in the command, e.g. df -m for 1 MB blocks.
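If changing the script is not an option, a single rex plus a case() eval can normalise mixed suffixes in one pass. A rough sketch (it assumes Size holds values like 2.1G or 100M and only covers the K/M/G suffixes):

* | rex field=Size "(?P<num>[\d.]+)(?P<unit>[KMG])" | eval SizeBytes=case(unit=="K", num*1024, unit=="M", num*1024*1024, unit=="G", num*1024*1024*1024, true(), num)

The same pattern would apply to Used or any other df column.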

EDIT: To complete the answer: to work out the total of each column/field, you would pipe to a stats command and use the sum() function on your fields, e.g.

... | stats sum(Size) as totalSize sum(Used) as totalUsed
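For example, to get a total per server (host is Splunk's default field for the originating server):

... | stats sum(Size) as totalSize sum(Used) as totalUsed by host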

Hope this helps

MHibbin
Influencer

Okay, cool, that would probably be easier. If it is not too much trouble, you could also convert the byte values back to KB etc. with the eval command to make them human-readable in Splunk. I updated my answer with stats to answer the question fully.
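Something like this, as a sketch on top of the stats search above (converting back to GB):

... | stats sum(Size) as totalSize sum(Used) as totalUsed | eval totalSizeGB=round(totalSize/1024/1024/1024, 2) | eval totalUsedGB=round(totalUsed/1024/1024/1024, 2)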

theouhuios
Motivator

Thanks. Will probably change the command itself.
