Hi
I have a peculiar problem.
With a 2 GB enterprise license, I want to index 4 GB of data initially. After that, Splunk will only consume the updates to the logs, which we estimate at over 300 MB per day.
I don't have an obvious way to split the load, because the folder contains approximately 100,000 log files and the naming convention is the same for all of them.
I was wondering if anyone has past experience with, or approaches for, this type of scenario.
Note that 4 GB is the total size of the logs. Even when a rollout/application restart happens, files are retained for 15 days, and that may be more than 2 GB.
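To make the "split the load" part concrete, the only idea I have so far is a small script along the following lines. It is just a sketch: the paths and the daily limit are placeholders, and it assumes Splunk is monitoring the staging directory and picks up whatever lands there. Run once a day (e.g. from cron), it moves the oldest files from the backlog into the monitored folder until it hits the daily limit, so the initial 4 GB is spread over a few days instead of blowing the 2 GB quota in one go. I'm not sure this is the right way to go, hence the question.

```python
#!/usr/bin/env python3
"""Sketch: stage a log backlog into daily batches under a size limit.

Paths and the limit below are placeholders; Splunk is assumed to be
monitoring DEST_DIR as a normal file input.
"""
import os
import shutil
import sys

SOURCE_DIR = "/data/log_backlog"     # placeholder: folder with the ~100,000 log files
DEST_DIR = "/data/splunk_staging"    # placeholder: directory Splunk monitors
DAILY_LIMIT = int(1.8 * 1024 ** 3)   # stay safely under the 2 GB daily quota


def stage_one_batch(source_dir: str, dest_dir: str, limit_bytes: int) -> int:
    """Move files from source_dir into dest_dir until limit_bytes is reached.

    Returns the number of bytes moved. Call once per day until the
    source directory is empty.
    """
    moved = 0
    # Oldest files first, so events are indexed roughly in time order.
    entries = sorted(
        (e for e in os.scandir(source_dir) if e.is_file()),
        key=lambda e: e.stat().st_mtime,
    )
    for entry in entries:
        size = entry.stat().st_size
        if moved + size > limit_bytes:
            break
        shutil.move(entry.path, os.path.join(dest_dir, entry.name))
        moved += size
    return moved


if __name__ == "__main__":
    os.makedirs(DEST_DIR, exist_ok=True)
    moved = stage_one_batch(SOURCE_DIR, DEST_DIR, DAILY_LIMIT)
    print(f"Staged {moved / 1024 ** 2:.1f} MB into {DEST_DIR}", file=sys.stderr)
```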
If it is just a one-time 4 GB data input (to get up and running, so to speak), there is no real problem. Sure, you get a license violation, but as long as it is a one-time event you are fine. You can have 5 violations in a 30-day window; after that you can no longer search your data and need to get a reset key.
Hi,
You cannot index 4 GB of data with a 2 GB enterprise license; that is a license violation. If it is a summary index, nothing will happen, since summary indexing does not count against the license, but indexing 4 GB of regular data really is a licence violation.