Getting Data In

Can you help us fix a forwarding delay between event time and indexed time?

Nik_Shafiq
New Member

We have set up a Splunk forwarder to forward the latest logs from the same server, but we are having an issue where there is a huge difference between indexed time and event time. I can see a delay of almost 17 hours when I use this query to find the indexed time, the event time, and the delay:

index=myindex sourcetype=mysourcetype host=myhost | eval delay=(_indextime-_time)/60/60 | eval indexed_time=strftime(_indextime, "%+") | table indexed_time, _time, delay

This is what I got from splunkd.log:

01-08-2019 11:17:31.607 +0700 INFO  TailReader -   ...continuing.
01-08-2019 11:17:33.515 +0700 INFO  TcpOutputProc - Connected to idx=*heavy_forwarder*, pset=0, reuse=1.
01-08-2019 11:17:46.608 +0700 INFO  TailReader - Could not send data to output queue (structuredParsingQueue), retrying...
01-08-2019 11:17:51.609 +0700 INFO  TailReader -   ...continuing.
01-08-2019 11:17:55.286 +0700 INFO  HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_*sourcehostip*_*sourcehostname*_*sourcehostmac*
01-08-2019 11:18:06.610 +0700 INFO  TailReader - Could not send data to output queue (structuredParsingQueue), retrying...
01-08-2019 11:18:11.610 +0700 INFO  TailReader -   ...continuing.
01-08-2019 11:18:16.611 +0700 INFO  TailReader - Could not send data to output queue (structuredParsingQueue), retrying...

This issue only occurs on this particular host. Hence, I created a limits.conf file to change the thruput limit as suggested here:

link:troubleshootingindexdelay
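
For reference, the stanza I added to limits.conf on the forwarder looks roughly like this (the value shown is only illustrative; maxKBps = 0 removes the throughput limit entirely):

[thruput]
maxKBps = 0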

Unfortunately, that did not resolve the issue; we still see the same messages in splunkd.log.

Any idea which part I should look into next?


lakshman239
SplunkTrust

Looks like the queues are getting full and hence blocked. Are you sending the data/logs for this sourcetype to only one indexer? Look at metrics.log for pipeline/queue issues and address them. Also, look at your props/transforms for regex improvements.
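
To confirm which queues are blocking on that host, a search along these lines against the forwarder's internal logs should help, assuming the forwarder is sending its _internal data to the indexers (the host filter is a placeholder; adjust it to your forwarder's name):

index=_internal host=myhost source=*metrics.log* group=queue | eval fill_pct=round(current_size_kb/max_size_kb*100,1) | timechart avg(fill_pct) by name

Or, to just count blocked queue events:

index=_internal host=myhost source=*metrics.log* group=queue blocked=true | stats count by name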

You may also want to review how indexing works:
https://wiki.splunk.com/Community:HowIndexingWorks
