I am wondering if someone out there can help with this.
We are evaluating Splunk Enterprise using the 60-day evaluation version. It is going into a live environment, with a universal forwarder sending data to the Splunk back end. My concern is that when we switch this on and point the forwarder at the Splunk back end, we will blow our 500 MB-a-day limit straight away, and we will not be able to gather any relevant test data because of the MB-per-day constraint, which would force us to rule Splunk out of our possible purchases.
The data is from Windows event logs, to start with. Is there a way of telling Splunk to start logging only from the date of installation, rather than reading all of the history contained in the event logs?
My other question is to do with capacity planning. Are there any Splunk apps that can run standalone to give you an idea of how much data would be sent to Splunk if it were installed?
We do not want to spend the money purchasing the product only to find that we cannot get what we want from it because we blow our licensing limits all the time.
Thanks in advance for any help.
First of all, blowing past 500 MB does not stop indexing. You'll get a license warning for that day, but you will only get locked out of non-internal searches after more than five warning days in a 30-day window.
Hence you could knowingly blow your trial license and simply keep an eye on the license usage volume to see how much capacity you'd have to buy later.
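One way to watch that volume day by day (a sketch, assuming you have access to the indexer's internal `_internal` index; `b` is the bytes field in Splunk's license usage log) is a search like:

```
index=_internal source=*license_usage.log type="Usage"
| eval GB = b/1024/1024/1024
| timechart span=1d sum(GB) AS daily_GB
```

That gives a per-day indexed volume figure you can compare against the 500 MB trial limit, or use for sizing a purchase.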
More importantly, you can contact Splunk Sales or your regional Splunk Partner for a larger evaluation license.
Concerning your question about ignoring historic Windows Event Log data: you can set current_only = 1 for those inputs - see http://docs.splunk.com/Documentation/Splunk/6.0.2/Admin/inputsconf for reference (search for current_only). Note that if you blow your license on the first day due to indexing historic data (regardless of by how much) but stay within the license afterwards, there's no real harm done.
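For illustration, an inputs.conf stanza on the universal forwarder might look like this (the Security channel is just an example; use whichever channels you monitor):

```
[WinEventLog://Security]
disabled = 0
current_only = 1
```

Be aware that with current_only = 1 the input only picks up events that arrive while it is running, so events logged while the forwarder is stopped are not collected later.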
Thanks. One of our main concerns is that Splunk is licensed on a daily GB allowance. I am hoping to find a rule of thumb, or a program, that will give us some idea of what we will generate on a daily basis. We know what we want to monitor; I just do not want to put forward Splunk as the solution only to find that it consumes 5 GB a day and costs us £30K+.
Finding a rule of thumb for any general Splunk installation is nigh-on impossible due to every scenario being unique.
The easiest - and free-of-license-cost - way is to just install Splunk on an evaluation/trial license, connect up all sources, and let them talk for a while. Whether you blow that license or not doesn't really matter.
Concerning your pricing example, you may want to talk with your sales people about more accurate figures once you have an idea about your expected volume.
If you don't have any sales contacts yet, talk to Splunk UK: http://www.splunk.com/view/offices/SP-CAAAGDK
Martin, thanks for the pointer on limiting historic inputs. That will come in handy as we begin our deployment soon.
I also agree that estimating overall volume is tricky, as there are SO many variables. My Splunk contact recommended using the Splunk On Splunk (SOS) app (see http://apps.splunk.com/app/748/) to measure this. I was able to use a 10% sample size during our POV and got some numbers that I hope are in the ballpark. Getting a 10% sample is not practical in every scenario, but it was for me. The tool also looks useful going forward.
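When extrapolating from a sample like that, a per-sourcetype breakdown helps show where the volume actually comes from. A sketch, again assuming access to the internal license usage log (`b` is bytes, `st` is sourcetype):

```
index=_internal source=*license_usage.log type="Usage"
| stats sum(b) AS bytes BY st
| eval GB = round(bytes/1024/1024/1024, 2)
| sort - GB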
Rush2112, perhaps that app could be helpful?