Deployment Architecture

Number of Indexes to Host or Events

Contributor

We are just starting to dive into some more in-depth reports. In some cases we are seeing slow run times, which I expect given the millions of events to search through. We have one main index with over 100 hosts, though the sourcetype breaks it down further. Is this 'normal'? My primary question is: if we were to break the hosts up across a few different indexes, would that speed up searches? Or is that benefit negated as soon as you restrict the search to a specific host or sourcetype?

Any documentation would be helpful; most of this is handled by a separate team, so I have not worked with it directly.

1 Solution

Communicator

You should be able to have tens of thousands of hosts (or more) in one index with no issue; search speed should not be affected by the number of hosts in this case.

To debug what's slow in your search, look at the search job inspector, which will show where your search is spending most of its time. Most of the time the cause is an inefficiently written search. The key is to reduce the data as much as possible, as early as possible in the search, so that less data needs to be pulled off disk and processed. You don't say how many indexers you have, or whether your indexer is separate from your search head.
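The "reduce early" point can be illustrated with a small SPL sketch (the index, sourcetype, and field names here are invented for illustration). Filtering after a transforming command forces every event to be read and processed first:

```
index=main sourcetype=web_access | stats count by host, status | search status=500
```

Pushing the same filter into the initial search lets the indexers discard non-matching events before the expensive work happens:

```
index=main sourcetype=web_access status=500 | stats count by host
```

Running both through the job inspector should show the difference in where the time goes.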

Posting your search here (and perhaps a sample of the data, with anything sensitive redacted, of course) will get lots of replies, I'm sure, on how to optimize it.



SplunkTrust

Hi aohls,
if you see slow searches, you have to debug them.
First of all, check your infrastructure:

  • which disks are you using? Remember that you need at least 800 IOPS, which means four 15k disks in RAID 0 or eight 15k disks in RAID 10; don't use NAS!
  • then, how many CPUs do you have? Every search and subsearch takes one CPU core, so if you have a dashboard with 4 panels, and each panel has a search with two subsearches, you can calculate the load on your CPUs!
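That back-of-the-envelope CPU estimate can even be sketched in SPL itself (the numbers below are just the illustrative ones from the dashboard example above):

```
| makeresults
| eval panels=4, subsearches_per_panel=2
| eval cores_per_panel=1 + subsearches_per_panel
| eval total_cores=panels * cores_per_panel
```

With those values, total_cores works out to 4 × (1 + 2) = 12 cores consumed each time that one dashboard is loaded.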

Then you have to debug the use of your system:

  • how many real-time searches do you have? (see the previous point: a real-time search occupies a CPU core continuously);
  • how many users do you have?
  • how many scheduled searches do you have? (alerts, reports, summaries, etc.)

After these checks, you can use the Monitoring Console to understand whether your indexers (and search heads) are overloaded, and whether there are scheduled searches that overload your system.
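If you want to spot heavy scheduled searches without leaving the search bar, the scheduler's own logs can help. A sketch (assuming the _internal index is searchable from your search head, and using field names normally found in scheduler.log):

```
index=_internal sourcetype=scheduler status=success
| stats count avg(run_time) as avg_run_time by savedsearch_name
| sort - avg_run_time
```

This lists saved searches by average run time, which is a quick way to find the schedules that the Monitoring Console will also flag.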

Bye.
Giuseppe



Contributor

This is what I was hoping for, thanks. I have been debugging my searches but I wanted to check to make sure there was not some fundamental issue in the setup.
