Hi Everyone,
I want to install the Splunk trial version. I have multiple Domain Controllers, File Servers, an Exchange Server, and Firewalls.
The idea is to demonstrate Splunk's capabilities.
Please tell me:
1 - The trial license is 500 MB/day, so what should my strategy be (how many indexers, search heads, and forwarders can I configure)?
2 - How do I estimate the space required for each data source? For example, how many events can be indexed for a DC?
3 - What should my architecture strategy be?
4 - How can I drop Windows events at the universal forwarder or at the index level?
5 - How can I filter network events? Should I do this at the network device itself, or can I drop events at the index level?
For #1-3: If your goal is to demonstrate/examine Splunk capabilities, then my advice is not to stand up your own test infrastructure; you would be MUCH better off simply firing up an ES or ITSI sandbox, which already has your sourcetypes coming in.
For #4, there is a GitHub app for cutting down on Windows event sizes that should be easy to find with any search engine.
For #5, in general, don't weigh down your indexers with filter work that can be done anywhere else.
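To illustrate filtering at the forwarder rather than the indexer, here is a minimal inputs.conf sketch for a Universal Forwarder collecting the Security event log; the EventCodes are placeholders you would swap for whatever is noisy in your own environment:

    [WinEventLog://Security]
    disabled = 0
    # Drop high-volume event codes at the forwarder so they never reach the indexer.
    # 4662 and 5156 are common examples of chatty Security events; adjust to your needs.
    blacklist1 = EventCode="4662"
    blacklist2 = EventCode="5156"

Note that blacklisting in inputs.conf applies to Windows event log inputs; for other data types you would fall back to nullQueue filtering at parse time.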
Hey @rashid47010,
There are certain limitations with the trial license; for example, clustering cannot be set up.
Refer to this doc for the limitations:
http://docs.splunk.com/Documentation/Splunk/7.0.3/Admin/MoreaboutSplunkFree
Let me know if this helps!!!
@rashid47010, for points 1 and 3: refer to the Splunk Validated Architectures whitepaper and the related .conf 2017 session.
For point 2: if you have a Splunk Admin/Architect, provide them with the device details, volume of data per day/week, expected data growth, retention requirements, etc. If not, reach out to Splunk Sales folks (a Sales Rep or Sales Engineer) who may be closely tied to you/your location, or get help from Professional Services.
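To make that concrete with a purely illustrative back-of-envelope estimate (the event rate and size here are assumptions, not measurements): a Domain Controller producing around 20 security events per second at roughly 500 bytes per event works out to about 20 x 500 x 86,400 ≈ 864 MB/day, which by itself already exceeds the 500 MB/day trial license, so you would need to filter heavily or onboard only a subset of hosts for the demo.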
For point 4: depending on your input data, some events can be filtered at the Universal Forwarder using the blacklist option (where supported) in inputs.conf.
If not, you can use the nullQueue to keep data from being indexed. Another option would be to use a scripted input and have your script drop the unwanted data before it is indexed.
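For reference, a minimal nullQueue sketch; this filtering happens at parsing time, so the props.conf/transforms.conf below belong on an indexer or heavy forwarder, and the sourcetype and EventCodes are illustrative assumptions:

props.conf:

    [WinEventLog:Security]
    TRANSFORMS-null = drop_noisy_eventcodes

transforms.conf:

    [drop_noisy_eventcodes]
    # Route matching events to the nullQueue so they are discarded before indexing.
    REGEX = (?m)^EventCode=(4662|5156)
    DEST_KEY = queue
    FORMAT = nullQueue

Keep in mind that events dropped this way still consume forwarder bandwidth and parsing CPU, which is why filtering at the forwarder is preferable when the input type supports it.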
Hi @niketnilay,
Thanks for sharing the useful information.
This is exactly what I was looking for.