I have a strange issue where I get lots of line breaking errors about a particular file, but I can't find the file in any of my indexes.
04-10-2018 13:12:54.887 +1000 WARN AggregatorMiningProcessor - Changing breaking behavior for event stream because MAX_EVENTS (256) was exceeded without a single event break. Will set BREAK_ONLY_BEFORE_DATE to False, and unset any MUST_NOT_BREAK_BEFORE or MUST_NOT_BREAK_AFTER rules. Typically this will amount to treating this data as single-line only. - data_source="/opt/local/sys/iw-home/OpenDeployNG/etc/deploy.cfg", data_host="itsun12", data_sourcetype="xxxxxx_appmon_prod"
When I try to find the offending events I don't see anything for that source.
index=* host=* source=/opt/local/sys/iw-home/OpenDeployNG/etc/deploy.cfg
Has anyone seen this before?
What time range did you search? If the line breaking fails this badly, I wouldn't be surprised if timestamping is also broken, causing those events to land in a different place on the timeline than you'd expect based on when the error occurred. Alternatively, you could run a
| metadata type=sources
search over all time and see if the troubled source shows up there.
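For example, to check every index you can read (run over All time), something like:

| metadata type=sources index=* | search source="/opt/local/sys/iw-home/OpenDeployNG/etc/deploy.cfg"

If the source shows up here but not in your event search, the events most likely landed outside the time range you searched.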
But rather than trying to find this data in Splunk, I would use the information in the error message to track down the data source, the forwarder that is sending it, and the relevant inputs, props, and transforms being applied, so you can check whether that config makes sense for that source file. A file located in etc/ and ending in .cfg sounds like a config file, which is not typically something you ingest into Splunk, and especially not with BREAK_ONLY_BEFORE_DATE=true: I've never seen a config file that has a date in front of every line.
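To illustrate, the line-breaking settings the warning refers to live in props.conf on the parsing tier (indexer or heavy forwarder). A minimal sketch for data that really is single-line, assuming xxxxxx_appmon_prod is the sourcetype you want to fix:

[xxxxxx_appmon_prod]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)

With SHOULD_LINEMERGE=false, each line becomes its own event and BREAK_ONLY_BEFORE_DATE and MAX_EVENTS no longer come into play, so the warning should stop.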
You may want to be more specific about the index in your search. It happens quite often to me that when I search index=* and then filter by source I don't get results, but when I search with an explicit index= and source=, I do. Probably Splunk is doing some optimizations to the search process, and when you have a lot of data it doesn't want to go over all of it if not necessary.
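For example (your_index is a placeholder for whichever index actually receives that forwarder's data):

index=your_index source="/opt/local/sys/iw-home/OpenDeployNG/etc/deploy.cfg"

Quoting the source value is also a good habit with paths like this.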