Getting Data In

Help finding daily indexed data size by each index

dhavamanis
Builder

Need your help.

Can you please tell us how to find the daily indexed data size for each index?

1 Solution

ppablo
Retired

Hi @dhavamanis

There are a large number of the same, if not similar, questions already posted on Answers. Do the searches in this post answer your question? There are options for per day and per month.
http://answers.splunk.com/answers/154773/how-to-create-a-report-that-shows-max-indexed-volume-per-da...
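
The general approach is to sum the bytes recorded in license_usage.log per index per day. A minimal sketch, assuming the standard _internal license_usage.log data (the day and GB_indexed field names are just illustrative):

index=_internal source="*license_usage.log*" type=Usage
| eval day=strftime(_time, "%Y-%m-%d")
| stats sum(eval(b/1024/1024/1024)) AS GB_indexed by day idx

Keep in mind that license_usage.log reflects licensed (uncompressed) ingest volume per index, not the on-disk size of the indexes.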


dhavamanis
Builder

Thanks, I have just added a wildcard search for source to get the results.

index=_internal source="*license_usage.log*" type=Usage
| eval yearmonthday=strftime(_time, "%Y%m%d")
| eval yearmonth=strftime(_time, "%Y%m%d")
| stats sum(eval(b/1024/1024/1024)) AS volume_b by idx yearmonthday yearmonth
| chart sum(volume_b) over yearmonth by idx
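
For a monthly rollup instead of a daily one, the yearmonth eval would presumably use "%Y%m" rather than "%Y%m%d"; a sketch along those lines:

index=_internal source="*license_usage.log*" type=Usage
| eval yearmonth=strftime(_time, "%Y%m")
| stats sum(eval(b/1024/1024/1024)) AS volume_b by idx yearmonth
| chart sum(volume_b) over yearmonth by idx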

indut
Path Finder

Hi, why do we have 2 fields, yearmonthday and yearmonth, in this query?


indut
Path Finder

Hi, any update on this from anyone? Thank you!


ppablo
Retired

Great, I'm glad it helped you find your solution 🙂
