Hello folks,
is there a tool that helps with sizing a server that will run accelerated data models?
Or what is the best way to achieve that goal?
It seems that the Splunk base configuration of 12 CPUs / 12 GB of RAM is not enough.
Thank you all.
Hi @linspec9721,
in this case the hardware reference starts from the minimum you know (12 CPUs and 12 GB of RAM) and grows if you have many users and/or many scheduled searches (e.g. if you have the Splunk Security Essentials app).
My hint is to start with the hardware reference and monitor your Splunk environment to see if there are peaks that require more resources.
Ciao.
Giuseppe
Hi @linspec9721,
the storage required for an accelerated data model, for one year, is around:
data indexed per day * 3.4
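The rule of thumb above can be sketched as a quick calculation (a minimal sketch; the 50 GB/day figure is just an illustrative assumption, not from this thread):

```python
def accel_storage_gb(indexed_gb_per_day: float, factor: float = 3.4) -> float:
    """Approximate storage (GB) for one year of data model acceleration,
    using the rule of thumb: daily indexed volume * 3.4."""
    return indexed_gb_per_day * factor

# Example: an assumed 50 GB/day of indexed data -> roughly 170 GB for one year
print(accel_storage_gb(50))
```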
Ciao.
Giuseppe
Hello @gcusello.
And regarding CPU and RAM sizing? The default 12/12 configuration seems not enough.
Thank you.
Hi @linspec9721,
are you speaking of ES or ITSI?
In those cases there are different configurations.
If instead you're speaking of plain Splunk Enterprise, the CPU and RAM requirements depend on the users; data model acceleration shouldn't cause problems, but obviously if you have many users querying the data models, you need more resources.
Ciao.
Giuseppe