Monitoring Splunk

Date latency

uagraw01
Motivator

I am receiving logs from the forwarders and can see latency between index time and event time. The difference between index time and event time is about 15 to 16 hours on more than 300 forwarders. How can I fix this issue?


richgalloway
SplunkTrust

That's not (usually) a simple fix. There are a variety of causes, and finding the root cause will likely require intimate knowledge of your environment.

Some things to check include:

  • All servers are using NTP (or an equivalent time-sync service)
  • Time zones are set properly on each server
  • Event timestamps include a time zone indication or props.conf contains the TZ attribute
  • Props.conf has TIME_FORMAT attributes that correctly extract the time zone from event timestamps
  • All Splunk forwarders are always running
  • Any intermediate servers or processes are always running
  • Events are not cached by the generating server/process before they are sent to Splunk
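
As a sketch of where the timestamp-related settings above live (the sourcetype name and TZ value below are placeholders, not taken from this thread):

```
# props.conf on the parsing tier (indexers or heavy forwarders)
[my_sourcetype]                       # placeholder sourcetype
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S %z    # %z captures a numeric time zone offset from the event
MAX_TIMESTAMP_LOOKAHEAD = 25
TZ = UTC                              # fallback only when events carry no TZ indicator
```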
---
If this reply helps you, Karma would be appreciated.

uagraw01
Motivator

@richgalloway Will DATETIME_CONFIG = CURRENT work?


richgalloway
SplunkTrust
It will "work" in that it will assign the current time to each event that arrives. It masks the latency problem. It makes old events look like new events and may throw off your reports.
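
For completeness, the setting under discussion is a per-sourcetype props.conf attribute (the stanza name below is a placeholder); as noted, it replaces each event's timestamp rather than fixing the underlying lag:

```
# props.conf
[my_sourcetype]
DATETIME_CONFIG = CURRENT   # stamps events with index time; the original event time is lost
```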
---
If this reply helps you, Karma would be appreciated.

uagraw01
Motivator

@richgalloway Is there any other solution you can suggest? Our thruput limit is set to 1024KB and that is fine. Is there any fix we can apply to resolve this permanently?
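
For reference, the thruput ceiling mentioned here is the limits.conf setting on each forwarder (the value shown matches the one quoted; a ceiling too low for the data volume is itself a common cause of growing lag):

```
# limits.conf on the universal forwarder
[thruput]
maxKBps = 1024   # 0 disables the limit entirely
```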

 

 


richgalloway
SplunkTrust
I offered 7 possible solutions in my first reply. Have you checked them?
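
One way to check them against live data is to quantify the lag per forwarder. A search along these lines (illustrative; narrow index=* to the affected indexes) shows which hosts are behind and by how much:

```
index=* earliest=-4h
| eval lag_hours = (_indextime - _time) / 3600
| stats avg(lag_hours) AS avg_lag, max(lag_hours) AS max_lag, count BY host
| sort - avg_lag
```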
---
If this reply helps you, Karma would be appreciated.