Monitoring Splunk

splunkweb reporting splunkd timeout

bmacias84
Champion

Hello,

I am currently running into problems with my search heads. Users are experiencing intermittent splunkd timeouts, which Splunkweb reports during searches, logins, and so on. When Splunk is started from the CLI I receive the message “timed out waiting for splunkd”, yet splunkweb starts up immediately. The search head configuration has not changed in months, and splunkd.log reports no errors or warnings at the times splunkweb displays these messages.
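For context, the start and status checks were run from the Splunk bin directory on the search head, along the lines of the following (the install path is just an example for a default Windows install):

  • cd "C:\Program Files\Splunk\bin"
  • splunk start
  • splunk status

The “timed out waiting for splunkd” message appears during splunk start; splunk status can then be used to confirm whether splunkd actually came up.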

Configurations:

  • Search Head pooling
  • Splunk 5.0.5
  • Windows Server 2008 R2 Standard
  • Virtualized

Suggestions?

1 Solution

bmacias84
Champion

During my investigation I noticed that Splunk was taking an exorbitant amount of time to read its configuration files. This came to light when I started running Process Monitor (procmon). My initial thought was an ACL or network issue, but after watching the read/write activity in Process Monitor it appeared that I had intermittent disk-transfer (IOPS) issues with my CIFS share.
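As a rough way to see what Process Monitor was pointing at, you can time a directory enumeration of the pooled configuration share and compare it against local disk. A minimal sketch, assuming PowerShell is available; \\fileserver\splunk_pool is a placeholder for the pooled storage path and C:\Program Files\Splunk is a placeholder for the local install:

  • powershell -Command "Measure-Command { Get-ChildItem -Recurse '\\fileserver\splunk_pool\etc' | Out-Null }"
  • powershell -Command "Measure-Command { Get-ChildItem -Recurse 'C:\Program Files\Splunk\etc' | Out-Null }"

If the first command is consistently far slower, that points at the share rather than at Splunk itself.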

To confirm this, I disabled search head pooling. Once it was disabled, Splunkweb functioned as expected. My next step was to create an NFS share on a remote system and copy all of the pool configurations to the new NFS share. Splunkweb worked like a charm. I then switched back and forth between local disk, the Windows NFS share, a NetApp share, and the CIFS share. Not very scientific, but it worked. To get my infrastructure team to even look at the problem I needed concrete numbers, so I ran IOzone tests against all four setups.
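For reference, this is roughly how pooling was toggled between tests using the Splunk 5.x search head pooling CLI (splunkd should be stopped first; the share path is a placeholder):

  • splunk stop
  • splunk pooling disable
  • splunk start

and to point the pool at a different share for each test:

  • splunk stop
  • splunk pooling enable \\fileserver\splunk_pool
  • splunk start

splunk pooling display shows the current pooling state if you want to confirm which share is in use.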

Results (the full spreadsheet is too large to include):

  1. Local disk: fast (as expected)
  2. Windows NFS share: slower, but acceptable
  3. NetApp share: on par with the Windows NFS share
  4. CIFS share: slower by a factor of 15 (unusable)

IOzone command for the full test (one thread):

  • iozone -a -b <drive_for_output>:\output.xls -f <drive_to_test>:\iozone.tmp -Q

IOzone command for the quick test (four threads):

  • iozone -i 0 -t 4 -F <drive_to_test>:\iozone.tmp <drive_to_test>:\1.tmp <drive_to_test>:\2.tmp <drive_to_test>:\3.tmp -Q
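For reference on the flags: -a runs IOzone's full automatic test suite, -b writes the results to an Excel-compatible spreadsheet, -f/-F name the temporary test files (one per thread in throughput mode), -i 0 restricts the run to the write/rewrite test, -t sets the number of throughput threads, and -Q additionally records offset/latency data files.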

Another helpful resource was the Splunk documentation on search head pooling configuration issues. Thanks to everyone who took a look at this.


