Getting Data In

Universal forwarder trying to parse the data


I have a UF monitoring a couple of files on an AIX box.
The UF forwards the data to a HF; I verified this in outputs.conf.
There is no props.conf for that input on the UF, only on the HF, and those settings are obviously not being honored.
For some strange reason, I see "Breaking event" and "DateParserVerbose" errors on the UF.
Why does the parsing phase take place on the UF, instead of it only forwarding the data? I didn't get this behavior on any of my other UFs.
This is not an indexed_extraction.
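For context, the UF-to-HF topology described above is configured in outputs.conf on the UF. A minimal sketch, where the group name, hostname, and port are placeholders:

```ini
# $SPLUNK_HOME/etc/system/local/outputs.conf on the UF
# "hf_group" and "hf.example.com:9997" are placeholder values
[tcpout]
defaultGroup = hf_group

[tcpout:hf_group]
server = hf.example.com:9997
```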


Splunk Employee

The only data that will be parsed on a universal or lightweight forwarder is for sourcetypes using INDEXED_EXTRACTIONS, which do structured-data parsing at the tailing stage.
Usually: XML, JSON, IIS, etc.


Check your data format and sourcetype.
If that is the case, you can prevent the errors by tuning the sourcetype's parsing settings directly in props.conf on the forwarder (for example, raising MAX_EVENTS above the default of 256).
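As a sketch, such a stanza in props.conf on the forwarder might look like the following; the sourcetype name "my_json_source" is a placeholder, and the attributes shown are standard props.conf settings:

```ini
# $SPLUNK_HOME/etc/system/local/props.conf on the forwarder
# "my_json_source" is a placeholder sourcetype name
[my_json_source]
# Structured-data parsing happens on the forwarder for this sourcetype
INDEXED_EXTRACTIONS = json
# Raise the line-merge cap above the default of 256 to stop
# "Breaking event because limit of 256 has been exceeded"
MAX_EVENTS = 1000
```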


Revered Legend

Could you post some samples of the errors that you see on the UF? Also, when you say you're seeing those on the UF, do you mean in the actual splunkd.log file on the UF?



Exactly, on the UF box, in its splunkd.log:
"DateParserVerbose - Time parsed (Mon May 30 21:00:00 2016) is too far away from the previous event's time"
"AggregatorMiningProcessor - Breaking event because limit of 256 has been exceeded"
I would expect to see such messages only on the HF.
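For what it's worth, the DateParserVerbose warning above corresponds to the standard timestamp sanity bounds in props.conf; if this sourcetype really is parsed on the UF, relaxing them there would quiet it. A sketch with a placeholder stanza name and value:

```ini
# props.conf on whichever instance performs the parsing
# "my_sourcetype" is a placeholder sourcetype name
[my_sourcetype]
# Accept timestamps up to a day earlier than the previous event's
# time before warning (the default is 3600 seconds)
MAX_DIFF_SECS_AGO = 86400
```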
