Getting Data In

Multiline log event not being indexed correctly

tmurray3
Path Finder

Hi,

I have a log file being monitored that contains many similar events. The events have more or less the same fields, and a typical event looks like this:

2014-04-29 17:23:00,428 [[ACTIVE] ExecuteThread: '11' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO - logtype=DATA
details=Userid=ptal222 AppName=DEFAULT_TIER2B AuthType=AUTH_T2B result=AUTHENTICATED
transid=91f53cf1-2d7d-4b6b-a7a1-ab5545a343c6
transtype=AUTH
transdetail=AUTH_T2B
appclientid=DEFAULT_TIER2B
userid=rm_portal_2
adminid=null
lob=null
jclass=web.interceptor.BaseInterceptor
jmethod=afterCompletion
jline=75
epoch=1398817380428
authtype=AUTH_T2B

Splunk is discarding everything in the event prior to the field
epoch=1398817380428

When I search Splunk, the event is displaying as:

epoch=1398817380428
authtype=AUTH_T2B

Since each field is written on its own line, I assume Splunk treats the epoch= line as the start of a new event and indexes from that point onward. I cannot figure out how to get the entire event indexed. I want Splunk to use the timestamp at the beginning of the event (2014-04-29 17:23:00,428) to determine where a multi-line event starts.

I have tried adding the following to props.conf, but with no luck:

[af_dev]
NO_BINARY_CHECK = true
TIME_FORMAT=%Y-%m-%d %H:%M:%S,%3N
TIME_PREFIX=^(?=\d{4}-)
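
From what I can tell, TIME_PREFIX and TIME_FORMAT only control where the timestamp is extracted from, not how lines are merged into events, so I suspect I also need a line-merging rule along these lines (untested sketch based on my sample event above):

[af_dev]
NO_BINARY_CHECK = true
# merge lines, starting a new event only on a line beginning with a date/time stamp
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE = \d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3N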

Any thoughts/suggestions?


somesoni2
Revered Legend

Try this

[af_dev]
BREAK_ONLY_BEFORE=\d{4}-\d{2}-\d{2}
NO_BINARY_CHECK=1
SHOULD_LINEMERGE=true
TIME_FORMAT=%Y-%m-%d %H:%M:%S,%3Q
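
If line merging is still unreliable or slow on this data, an alternative (sketch only, adjust the regex to your actual events) is to break the raw stream directly on the timestamp and skip line merging altogether:

[af_dev]
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
# break before each line that starts with a YYYY-MM-DD HH:MM:SS,mmm timestamp;
# the newlines captured in the first group are consumed as the event boundary
LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3Q
MAX_TIMESTAMP_LOOKAHEAD = 25

Either way, the props.conf change needs to go on the instance that parses the data (indexer or heavy forwarder), and it only takes effect for data indexed after a restart.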