Getting Data In

splunk-kubernetes-logging in Kubernetes cluster getting error_class=Net::OpenTimeout error="execution expired"

solguin
New Member

I am trying to set up splunk-kubernetes-logging. My daemonset is running on my worker nodes, but fluentd is failing to flush its buffer and keeps logging the same error over and over. It seems it can't create a flush thread.

I installed using Helm (Splunk Connect for Kubernetes 1.2.0 release).
I have given the pod spec a privileged security context, so the container should be able to do anything it needs.

Please help.

splunk settings:
splunk:

  # Configurations for HEC (HTTP Event Collector)
  hec:
    # host is required and should be provided by user
    host: *****
    # port to HEC, optional, default 8088
    port:
    # token is required and should be provided by user
    token: *****
    # protocol has two options: "http" and "https", default is "https"
    protocol: http

resource settings:
resources:
  # limits:
  #  cpu: 100m
  #  memory: 200Mi
  requests:
   cpu: 100m
   memory: 200Mi

buffer settings:
buffer:
  "@type": memory
  total_limit_size: 600m
  chunk_limit_size: 200m
  chunk_limit_records: 100000
  flush_interval: 10s
  flush_thread_count: 2
  overflow_action: block
  retry_max_times: 3
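For context, the chart passes this values map through to the Splunk HEC output plugin, so the rendered fluentd configuration should look roughly like the following (a sketch of the rendered `<buffer>` section, not copied from actual chart output):

```
<buffer>
  @type memory
  total_limit_size 600m
  chunk_limit_size 200m
  chunk_limit_records 100000
  flush_interval 10s
  flush_thread_count 2
  overflow_action block
  retry_max_times 3
</buffer>
```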

Error:
2019-09-30 23:17:40 +0000 [info]: #0 fluentd worker is now running worker=0
2019-09-30 23:18:51 +0000 [warn]: #0 failed to flush the buffer. retry_time=0 next_retry_seconds=2019-09-30 23:18:52 +0000 chunk="593cd71313b4aadbf0ebbcb5a6188760" error_class=Net::OpenTimeout error="execution expired"
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/lib/ruby/2.5.0/net/http.rb:937:in `initialize'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/lib/ruby/2.5.0/net/http.rb:937:in `open'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/lib/ruby/2.5.0/net/http.rb:937:in `block in connect'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/lib/ruby/2.5.0/timeout.rb:103:in `timeout'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/lib/ruby/2.5.0/net/http.rb:935:in `connect'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/lib/ruby/2.5.0/net/http.rb:920:in `do_start'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/lib/ruby/2.5.0/net/http.rb:915:in `start'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/bundle/gems/net-http-persistent-3.0.1/lib/net/http/persistent.rb:710:in `start'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/bundle/gems/net-http-persistent-3.0.1/lib/net/http/persistent.rb:640:in `connection_for'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/bundle/gems/net-http-persistent-3.0.1/lib/net/http/persistent.rb:945:in `request'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/bundle/gems/fluent-plugin-splunk-hec-1.1.0/lib/fluent/plugin/out_splunk_hec.rb:355:in `send_to_hec'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/bundle/gems/fluent-plugin-splunk-hec-1.1.0/lib/fluent/plugin/out_splunk_hec.rb:167:in `write'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/bundle/gems/fluentd-1.4.0/lib/fluent/plugin/output.rb:1125:in `try_flush'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/bundle/gems/fluentd-1.4.0/lib/fluent/plugin/output.rb:1425:in `flush_thread_run'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/bundle/gems/fluentd-1.4.0/lib/fluent/plugin/output.rb:454:in `block (2 levels) in start'
2019-09-30 23:18:51 +0000 [warn]: #0 /usr/local/bundle/gems/fluentd-1.4.0/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'


mattymo
Splunk Employee

This is a network connection issue. Did you get it solved? It is usually caused by a firewall or a wrong host IP/DNS entry.
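One quick way to rule fluentd out is to check, from inside the logging pod's network, whether the HEC host/port is reachable at all. A minimal sketch using only the Python standard library (the host name below is a placeholder for your HEC host; 8088 is the default HEC port):

```python
import socket


def check_hec_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Open a plain TCP connection to the HEC endpoint.

    If this times out, it reproduces the same condition fluentd reports
    as Net::OpenTimeout: the pod cannot resolve or route to host:port.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:
        # DNS failure, connection refused, or connect timeout all land here.
        print(f"cannot reach {host}:{port} -> {exc}")
        return False


if __name__ == "__main__":
    # Placeholder host: replace with the value of splunk.hec.host.
    print(check_hec_reachable("splunk.example.com", 8088, timeout=3.0))
```

A timeout (rather than "connection refused") usually points at a firewall silently dropping packets or a wrong IP; a DNS error points at a bad host value or cluster DNS configuration.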

- MattyMo