Is it possible to monitor all these log files, as well as the log files that will be generated daily in the future? If yes, how do I configure this in inputs.conf? I am following this naming convention so that I can archive files that are 10 days old and retain the rest; if I maintain a single log file, it is difficult to archive the older data. Please let me know the best approach.
Also, once the data in a log file is indexed, how long will it be retained in the Splunk index? Is it retained until we clean up the index in Splunk?
I think in your case you might take a look at Example 2 from the documentation on monitoring files and directories, which loads anything in /apache/ that ends in .log.
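Based on that example, the inputs.conf stanza would look something like this (the /apache/ path is the documentation's illustration; substitute your own log directory):

```ini
# Monitor every file under /apache/ whose name ends in .log,
# including files created in that directory in the future
[monitor:///apache/*.log]
```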
or for Windows
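On Windows the same idea uses a drive-letter path (C:\apache here is an assumed location, not something from your setup):

```ini
# Monitor every .log file under C:\apache, including future daily files
[monitor://C:\apache\*.log]
```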
Splunk will automatically pick up new files that match a monitor stanza, as well as additions and changes to existing ones, so the daily log files you generate in the future will be indexed without any further configuration.
The retention of your data depends on a number of factors, most often how much storage you have: Splunk will automatically remove the oldest data from an index when storage becomes an issue. You can set a retirement policy for your data by either size or age (time), and you can even do this per index if you care more about the retention of specific data. This document explains how Splunk data "ages" and "rolls" through different buckets depending on your settings: https://docs.splunk.com/Documentation/Splunk/6.5.2/Indexer/Setaretirementandarchivingpolicy
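As a sketch, a per-index retirement policy in indexes.conf might look like the following. The index name, sizes, and archive path are assumptions for illustration; frozenTimePeriodInSecs, maxTotalDataSizeMB, and coldToFrozenDir are the settings that control aging:

```ini
[my_app_logs]
# Roll events to frozen (deleted, or archived if coldToFrozenDir is set)
# once they are 10 days old (value is in seconds: 10 * 86400)
frozenTimePeriodInSecs = 864000
# Also cap the total size of this index at 100 GB (value is in MB)
maxTotalDataSizeMB = 100000
# Optional: copy frozen buckets here instead of deleting them
coldToFrozenDir = /opt/archive/my_app_logs
```

Whichever limit is hit first (age or size) triggers the roll to frozen.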
If you want to determine how long your data has actually been hanging around, you could try a search like this, which uses the dbinspect command to report bucket metadata (startEpoch, modTime) for every index:

| dbinspect index=*
| stats min(startEpoch) AS startEpoch, min(modTime) AS modTime by index, splunk_server
| convert ctime(startEpoch) AS startEpoch
| rename modTime AS "Oldest Bucket", startEpoch AS "Earliest Event Time"