Hi Splunk experts,
Here is a search request:
| eventcount summarize=false report_size=true index=* | eval GB = round(size_bytes / 1024 / 1024 / 1024, 0) | sort - GB | head 10
How can I get these size counters for Splunk indexes over a period of time, say daily?
I'd like to check how fast volume utilization by the indexes is growing over time.
Just let Splunk do the heavy lifting.
index=_introspection component=Indexes | timechart avg("data.total_size") by data.name
I believe the introspection index was added in 6.1... so if you're stuck on an earlier version, you could have your search run daily and summary-index the results, then run your report on that summary index.
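For the pre-6.1 workaround, the scheduled daily search could look something like this (a sketch; it assumes a summary index named summary already exists, and renames the index field to idx so it survives being collected):

| eventcount summarize=false report_size=true index=* | eval GB = round(size_bytes / 1024 / 1024 / 1024, 2) | rename index as idx | collect index=summary

The report would then run against the summary data, e.g. index=summary | timechart span=1d max(GB) by idx.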
All worked, thank you.
Easy as pie:
index=_introspection component=Indexes | eval data.total_size = 'data.total_size' / 1024 | timechart span=1d max("data.total_size") by data.name
Note the use of single quotation marks on the RHS of the eval
to avoid interpreting the dot as the concatenation operator.
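To make the quoting concrete: in eval, double quotes produce a string literal while single quotes refer to a field by name, so something like this works (size_gb is just a name introduced here):

| eval size_gb = 'data.total_size' / 1024

whereas an unquoted data.total_size would be parsed as the value of a field named data concatenated (via the dot operator) with the value of a field named total_size.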
Running this query seems to give the typical size of a particular index on a single indexer. Suppose you have many indexers: why isn't this adding up to the full index size across all of them? How do you add it all up to get the elusive report many people have been looking for, namely index size over time? Finally, this seems to show only 10 indexes.
By now, the monitoring console in recent versions of Splunk should give you that info across all boxes.
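If the monitoring console isn't an option, one way to roll the sizes up across indexers yourself is to take the daily maximum per index per indexer, then sum over the indexers (a sketch; it assumes the host field on the introspection events identifies the indexer):

index=_introspection component=Indexes | bin _time span=1d | stats max("data.total_size") as size_mb by _time, data.name, host | stats sum(size_mb) as total_mb by _time, data.name | xyseries _time data.name total_mb

As for seeing only 10 indexes: timechart caps the number of series at 10 by default, and adding limit=0 to the timechart removes that cap.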
Ok, that did the trick!
And one last question: if I want data.total_size in GB, how do I apply eval?
Right, avg()
may not be what you're looking for. Consider this:
index=_introspection component=Indexes | timechart span=1d max("data.total_size") by data.name
That takes exactly one value per index per day, the highest value to be precise.
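Since the original goal was to see how fast utilization is growing, you can bolt a day-over-day delta onto that daily maximum (a sketch; size_mb, prev_mb, and growth_mb are names introduced here):

index=_introspection component=Indexes | bin _time span=1d | stats max("data.total_size") as size_mb by _time, data.name | streamstats current=f window=1 last(size_mb) as prev_mb by data.name | eval growth_mb = size_mb - prev_mb

Each row then carries the index's change in MB since the previous day.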
Thank you, there is such an index.
Is there a way to aggregate it by day? It shows a data point every 30 seconds.
Aggregation like avg isn't required; I'm just looking for a single size value per index per day.