Getting Data In

Event Break: does not work on forwarded log file, works fine on local copy [screenshots included]


Hi all,

I've been stumped by a problem for two days now: I can't get event breaking to work when a file is forwarded from a server, even though it works with an exact copy of the same file as long as that copy is on the same machine as the indexer. Could anyone please take a look to see where I went wrong?

Sample Data:

(ORD_Area_Date)      INFO(19May13@21:33:12:646) Updated entrydate: SYBRCH
(ORD_Area_Date) INFO(19May13@21:33:12:731) Loading market
(ORD_Area_Date) INFO(19May13@21:33:12:747) Historical market not there - use default
(ORD_Area_Date) INFO(19May13@21:33:12:836) Loading current market

I want each event to break at (ORD_Area_Date). This pattern is used by multiple applications, so I cannot hard-code ORD_Area_Date; it needs to be a regex that matches a pair of brackets with some words in the middle.
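For reference, breaking on a pattern like this is often done with LINE_BREAKER instead of BREAK_ONLY_BEFORE. A minimal props.conf sketch for the indexer (the stanza name is the sourcetype used below; the regex is an assumption that matches a bracketed run of word characters at the start of each event):

[tarsan_dby_HMSf]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\([A-Za-z0-9_]+\)

With SHOULD_LINEMERGE = false, every LINE_BREAKER match starts a new event, so no merge-phase settings such as BREAK_ONLY_BEFORE are needed.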

Machine 1: Forwarder - Unix (inputs.conf)


disabled = false
index = tarsan_dev
sourcetype = tarsan_dby_HMSf

Machine 2: Indexer - Windows (props.conf)

BREAK_ONLY_BEFORE = \(([A-Za-z1-9_\s]+)\)\s
TIME_FORMAT = %d%b%y@%H:%M:%S:%f
pulldown_type = 1


Image 1 - sourcetype stanza used in props.conf above works fine in Data Preview with the local copy

Image 2 - Event breaking works fine on local copy

Image 3 - Event breaking not working on forwarded logs

Other Notes:

  1. Image 2 and Image 3 are both from the same indexer and the same Splunk Web. However, the local copy is in the 'main' index and the forwarded log is stored in 'tarsan_dev'.
  2. Events from Image 2 and Image 3 both use and have access to the same sourcetype (tarsan_dby_HMSf); it just doesn't seem to work for forwarded logs.
  3. The events highlighted in red are the same set of events, one with proper event breaks and one without. The timestamps match.

Any help would be greatly appreciated. Please let me know if additional information is required.


Splunk Employee

That's correct, it should be \s - thanks Kristian!


Ultra Champion

dart, jonathon: it seems the regex has a '\S' where it should have a '\s', which would cause it to fail, i.e. it tries to match non-whitespace characters instead of the whitespace between the (ORD_Area_Date) and the INFO.





Yes, the forwarding machine is using a Universal Forwarder. I have the props.conf on the Indexing machine, with the inputs.conf on the forwarder specifying the sourcetype.
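For context, the forwarder-side inputs.conf described here would look roughly like this (the monitor path is hypothetical; the disabled, index, and sourcetype values are from the stanza quoted in the question):

[monitor:///var/log/tarsan/ord.log]
disabled = false
index = tarsan_dev
sourcetype = tarsan_dby_HMSf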

I tried using the stanza above in the Data Preview, but it was unable to break up the events properly.



Splunk Employee

Hi Jonathon,

Is the forwarder a Universal Forwarder?

If it's not, line breaking happens on the forwarder; if it is, line breaking happens on the indexer.

I usually do this with a LINE_BREAKER and SHOULD_LINEMERGE being false.

LINE_BREAKER = ([\r\n]+)\([^\) ]+\)\S+\w+\(\d+
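Putting this stanza together with the correction discussed earlier in the thread (\s in place of \S), the full sketch would be:

[tarsan_dby_HMSf]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\([^\) ]+\)\s+\w+\(\d+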


I got it working in the end, but not through props.conf. I ended up adding the regex for the timestamp to datetime.xml and not specifying a sourcetype in the inputs.conf file.


Ultra Champion

Well, there might be a good reason for your approach, but all events must have a sourcetype, and I fail to see why you would have Splunk guess it if you already know what the sourcetype is or should be.

A lot of things depend on the sourcetype (field extractions, searches, eventtypes, etc.).


Ultra Champion

Are these all single-line events? If so, I would suggest setting your props accordingly:

TIME_FORMAT = %d%b%y@%H:%M:%S:%3N

NO_BINARY_CHECK is an input-phase setting, so you could try copying it to the forwarder as well. Although the %f seems to work, I haven't seen it in the official Splunk docs.
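For single-line events, the usual arrangement is to disable line merging and rely on the default LINE_BREAKER; a sketch of what those props might look like (the stanza name is the sourcetype from the question, and this is an assumption about the intended config, not the poster's exact one):

[tarsan_dby_HMSf]
SHOULD_LINEMERGE = false
TIME_FORMAT = %d%b%y@%H:%M:%S:%3N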

Hope this helps,



Hi Kristian,

Thanks for the suggestion. Unfortunately, I tried that and it didn't work; even with automatic breaking by timestamp, it still ends up clustering events together.

The annoying thing is that event breaking works fine if I don't specify the timestamp format, but then Splunk can't parse the date format and gets it wrong.
