Splunk Enterprise Security

How can an end user help improve overall Splunk Enterprise Security speed?

jsven7
Communicator

My Splunk Admin is the landlord and I'm the tenant. Let's say the landlord is dealing with personal matters and cannot tend to their normal house upkeep duties. How may I selfishly help my landlord in order to improve my Splunk ES experience? The context is an extremely slow ES experience.

Here are some of my thoughts:
o Enumerate all apps and versions and request upgrades
o Enumerate saved searches/alerts and their performance impact - make searches more efficient (see the example searches after this list)
o Enumerate all dashboards and their load - make searches more efficient
o Disable real-time searches unless they're truly necessary
o Enumerate unused knowledge objects - delete any unused
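
For the first two items, a couple of ad-hoc REST searches can give a quick inventory, assuming your role is allowed to read those endpoints (a rough sketch, adjust to your environment):

    | rest /services/apps/local splunk_server=local | table label title version disabled

    | rest /servicesNS/-/-/saved/searches splunk_server=local | search is_scheduled=1 disabled=0 | table title eai:acl.app cron_schedule dispatch.earliest_time dispatch.latest_time

The first lists installed apps and their versions; the second lists enabled scheduled searches with their owning app and cron schedule, which gives your admin a concrete list to review.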

Am I thinking in the right direction?
Version: 7.2.5.1
Enterprise Security Version: 5.2.2

1 Solution

richgalloway
SplunkTrust

It depends on what is causing the extremely slow ES experience. Finding the cause is often the topic of a week-long Professional Services engagement, so we're unlikely to find it here. Here are some things to consider, though.

Upgrading apps is unlikely to help unless the upgrade specifically addresses performance.
Making searches more efficient is always good, but may not be enough; start by identifying the heaviest searches (see the example below).
Normally, one should use real-time searches only when absolutely necessary. Note, however, that in ES "real-time" usually means indexed real-time, which is the preferred scheduling mode there.
Deleting unused knowledge objects will have little effect unless it significantly reduces the size of the search bundle.
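
To find which searches are worth optimizing first, the audit index usually shows the heaviest ones by runtime (a sketch; run it over a representative time range and adjust if your version's fields differ):

    index=_audit action=search info=completed | stats count avg(total_run_time) AS avg_runtime_s max(total_run_time) AS max_runtime_s BY user savedsearch_name | sort - avg_runtime_s

Ad-hoc searches show up with an empty savedsearch_name; the named ones map back to correlation searches, dashboards, and alerts that can be tuned or retired.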

Too many searches can overwhelm the indexers and cause poor performance. Reduce the number of searches or schedule them more wisely (the scheduler check below shows whether searches are being skipped).
It's possible the indexer tier is under-powered (not enough CPU or memory, slow disks, or not enough indexers).
Make sure data is evenly distributed among the indexers (see the tstats check below).
Data volume is a factor. Searching a lot of data will take a lot of time. Additional indexers can help.
If the servers run on virtual hardware, make sure Splunk has dedicated resources.
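
Two quick health checks along those lines, assuming you can search _internal (sketches, not a definitive diagnosis):

Scheduler pressure - a high skip ratio means more scheduled searches than the environment can run:

    index=_internal sourcetype=scheduler | stats count(eval(status="skipped")) AS skipped count AS total BY savedsearch_name | eval skip_pct=round(100*skipped/total,1) | sort - skip_pct

Data balance across indexers - counts that differ wildly by splunk_server suggest uneven forwarder load balancing:

    | tstats count WHERE index=* BY splunk_server

If skips are widespread or one indexer holds most of the data, that's concrete evidence to hand to your admin.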

I'm sure others will have different suggestions.

---
If this reply helps you, Karma would be appreciated.


jsven7
Communicator

Thank you for the guidance; this is helpful.
