
splunkweb reporting splunkd timeout

bmacias84
Champion

Hello,

I am currently running into problems with my Search Heads. Users are experiencing intermittent splunkd timeouts, which Splunkweb reports during searches, logins, and so on. When I start Splunk from the CLI I get the message "Timed out waiting for splunkd", yet splunkweb starts up immediately. The Search Head configuration has not changed in months, and splunkd.log does not report any errors or warnings at the times splunkweb displays these messages.

Configurations:

  • Search Head pooling
  • Splunk 5.0.5
  • Windows Server 2008 R2 Standard
  • Virtualized

Suggestions?


bmacias84
Champion

During my investigation I noticed that Splunk was taking an exorbitant amount of time to read its configuration files. This came to light when I started running Process Monitor (procmon). My initial thought was an ACL or network issue, but after watching the read/write activity in procmon it appeared that I had intermittent disk transfer (IOPS) issues with my CIFS share.

To confirm this, I disabled search head pooling; once it was disabled, Splunk Web functioned as expected. My next step was to create an NFS share on a remote system and copy all the pool configurations to it. Splunkweb worked like a charm. I then switched back and forth between local disk, the Windows NFS share, a NetApp share, and the CIFS share. Not very scientific, but it worked. To get my infrastructure teams to even look at the problem I needed concrete numbers, so I ran IOzone tests against all four setups.
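
For anyone who wants to reproduce the share swapping: toggling pooling is just the standard Splunk 5.x pooling CLI, run while splunkd is stopped. The share path below is a placeholder, so treat this as a rough sketch of the sequence rather than my exact commands:

  • splunk stop
  • splunk pooling disable
  • splunk start

and to point the pool at a different share before restarting:

  • splunk pooling enable \\<file_server>\<pool_share>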

Results (the spreadsheet is too large to attach):

  1. Local disk: fast (as expected)
  2. Windows NFS share: slower, but acceptable
  3. NetApp share: on par with the Windows NFS share
  4. CIFS share: slower by a factor of roughly 15 (unusable)

IOzone command for the full test (one thread):

  • iozone -a -b <drive_for_output>:\output.xls -f <drive_to_test>:\iozone.tmp -Q

IOzone command for the quick test (four threads):

  • iozone -i 0 -t 4 -F <drive_to_test>:\iozone.tmp <drive_to_test>:\1.tmp <drive_to_test>:\2.tmp <drive_to_test>:\3.tmp -Q
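
A quick note on the flags for anyone rerunning these (standard IOzone options, nothing Splunk-specific): -a runs the full automatic suite across record and file sizes and -b writes the results to an Excel-readable spreadsheet, while the quick version limits the run to test 0 (write/re-write) in throughput mode with four worker threads (-t 4), each thread using its own temp file from the -F list; -Q additionally produces per-offset latency files. Comparing those numbers across the four mounts is what produced the ranking above.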

Another helpful doc was the Splunk documentation on search head pooling configuration issues. Thanks to everyone who took a look at this.
