Splunk Search

Total Memory

SN1
Path Finder

Hello, I ran df -h on the indexer and got:

[Screenshot of df -h output]

Now I want the total, used, and available space, but using SPL. How can I achieve that?
In this case, if we sum all the partitions, the total would be 16T, used would be 12T, and available would be 5T. How can I get this result using SPL?

1 Solution

livehybrid
SplunkTrust

This variation of the partition-space search previously mentioned might work for you. This particular search comes from the Monitoring Console, so it should work; in fact, for me it does return the correct space for the "/" mount point.

| rest splunk_server=* /services/server/status/partitions-space
| eval free = if(isnotnull(available), available, free)
| eval usage = round((capacity - free) / 1024, 2)
| eval capacity = round(capacity / 1024, 2)
| eval compare_usage = usage." / ".capacity
| eval pct_usage = round(usage / capacity * 100, 2)
| stats first(fs_type) as fs_type first(compare_usage) as compare_usage first(pct_usage) as pct_usage by mount_point
| rename mount_point as "Mount Point", fs_type as "File System Type", compare_usage as "Disk Usage (GB)", pct_usage as "Disk Usage (%)"

I think I've had issues in the past where people haven't implemented this endpoint correctly and haven't split the stats by mount_point, which can mean you get broken stats based on a number of different mount points!
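
If you also want the single overall figure you asked about (total / used / free summed across every partition, like adding up the rows of df -h), one way is to append a summary row with addcoltotals on top of the same endpoint. This is just a rough sketch - the used_gb / free_gb / capacity_gb field names are mine for illustration, not fields returned by the endpoint:

| rest splunk_server=* /services/server/status/partitions-space
| eval free = if(isnotnull(available), available, free)
| eval used_gb = round((capacity - free) / 1024, 2)
| eval free_gb = round(free / 1024, 2)
| eval capacity_gb = round(capacity / 1024, 2)
| stats first(used_gb) as used_gb first(free_gb) as free_gb first(capacity_gb) as capacity_gb by mount_point
| addcoltotals labelfield=mount_point label="TOTAL" used_gb free_gb capacity_gb

The final "TOTAL" row holds the summed used, free, and capacity values across all mount points.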

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards

Will

SN1
Path Finder

Is there a possibility to do this using an index?


livehybrid
SplunkTrust

You can also use an mstats query against the _metrics index (the eval {metric_name}=val step below pivots each metric name into a field of the same name, so the same disk-space calculations can be reused):

| mstats latest(_value) as val WHERE index=_metrics AND metric_name=spl.intr.disk_objects.Partitions.data.* by data.mount_point, metric_name
| rename data.mount_point as mount_point
| eval metric_name=replace(metric_name,"spl.intr.disk_objects.Partitions.data.","")
| eval {metric_name}=val
| stats latest(*) as * by mount_point
| eval free = if(isnotnull(available), available, free) 
| eval usage = round((capacity - free) / 1024, 2) 
| eval capacity = round(capacity / 1024, 2) 
| eval compare_usage = usage." / ".capacity 
| eval pct_usage = round(usage / capacity * 100, 2) 
| stats first(compare_usage) AS compare_usage first(pct_usage) as pct_usage by mount_point 
| rename mount_point as "Mount Point", compare_usage as "Disk Usage (GB)", pct_usage as "Disk Usage (%)"

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards

Will


livehybrid
SplunkTrust

Hi @SN1 

Yes, the data is also in the _introspection index, so you can run the following search instead of using the REST endpoint if you prefer - this also means you can easily track it over time if needed.

index=_introspection host=macdev sourcetype=splunk_disk_objects 
| rename data.* as *
| eval free = if(isnotnull(available), available, free) 
| eval usage = round((capacity - free) / 1024, 2) 
| eval capacity = round(capacity / 1024, 2) 
| eval compare_usage = usage." / ".capacity 
| eval pct_usage = round(usage / capacity * 100, 2) 
| stats first(fs_type) as fs_type first(compare_usage) AS compare_usage first(pct_usage) as pct_usage by mount_point 
| rename mount_point as "Mount Point", fs_type as "File System Type", compare_usage as "Disk Usage (GB)", pct_usage as "Disk Usage (%)"
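
If you do want to track it over time rather than just pull the current values, a minimal sketch along these lines should work (the component=Partitions filter and the 1h span are assumptions you may want to adjust, and I've left out the host filter - add one back if you only want a specific indexer):

index=_introspection sourcetype=splunk_disk_objects component=Partitions
| rename data.* as *
| eval free = if(isnotnull(available), available, free)
| eval pct_usage = round((capacity - free) / capacity * 100, 2)
| timechart span=1h avg(pct_usage) by mount_point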

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards

Will


livehybrid
SplunkTrust

Hi @SN1 

I think this might have been mentioned on the previous thread about disk space that you raised, but you should probably look at implementing the Splunk Add-on for Unix and Linux, which includes disk monitoring.

https://splunkbase.splunk.com/app/833
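
Once the add-on's df input is enabled, a search roughly like the one below should get you close to your df -h view. The index, sourcetype, and field names (Size, Used, Avail, MountedOn) here are assumptions - they depend on your add-on version and how you configure it, so check the fields actually extracted on your own data first:

index=os sourcetype=df
| stats latest(Size) as Size latest(Used) as Used latest(Avail) as Avail by host, MountedOn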

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards

Will


livehybrid
SplunkTrust

If you do decide to go down the route of using the Linux TA, then just enable what you need (i.e. the disk monitoring), as it can become very busy very quickly - there are lots of different inputs!
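
For example, to pick up only the disk usage data you could enable just the df script in a local inputs.conf within the add-on and leave everything else at its default (disabled). Treat this as a sketch - check the stanza names in the add-on's default inputs.conf for your version; the interval and index values here are only examples:

# $SPLUNK_HOME/etc/apps/Splunk_TA_nix/local/inputs.conf
# Enable only the df (disk usage) scripted input
[script://./bin/df.sh]
disabled = 0
interval = 300
sourcetype = df
index = os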

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards

Will
