Getting Data In

How do I configure props.conf using my sample data for proper line breaking based on time?

plumainwfs
New Member

Sample log extract below:

Splunk reads the log as one event, takes the PricingDate (02/03/2016) as the event date, and pairs it with the time 12:00:01.093.

Please advise how I can generate a separate event for each timestamped entry. The number of lines can range from 1 to 257, depending on the log.

12:00:01.093 INFO  c.w.f.service.delegate.FOSJobManager - BatchId:147,LoopingType:BY_CUSTOMER,OrgId:126,PricingDate:02/03/2016,CurrentJob:DEEMED_ADJUSTMENT_JOB,LastJob:DEEMED_ADJUSTMENT_JOB,DataSource:jdbc/FOS,EmailFlag:N,SingleCustomerId:-1,SingleLocationId:-1,UserId:1234,RespId:56789,ApplnId:123
12:00:01.093 INFO  com.abc.fos.job.FOSJob - Entering run Method
12:00:01.327 INFO  com.abc.fos.job.FOSJob - DEEMED_ADJUSTMENT::Customer Count:815
12:00:01.369 INFO  com.abc.fos.job.FOSJob - Total work units created :815
12:00:01.373 INFO  com.abc.fos.job.FOSJob - Exiting run Method
12:01:31.228 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2272
12:01:31.228 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2094
12:01:31.579 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2454
12:01:31.645 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2079
12:01:32.064 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2353

somesoni2
Revered Legend

Try this in props.conf on your indexer or heavy forwarder:

[YourSourcetype]
# Break events with LINE_BREAKER instead of merging lines back together
SHOULD_LINEMERGE=false
# Break before any line that starts with an HH:MM:SS timestamp
LINE_BREAKER=([\r\n]+)(?=\s*\d+:\d+:\d+)
# Timestamp (with subseconds) sits at the very start of each event
TIME_FORMAT=%H:%M:%S.%N
TIME_PREFIX=^\s*
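If you want to sanity-check the LINE_BREAKER pattern outside Splunk first, a quick sketch (Splunk uses PCRE, but this particular pattern behaves the same in Python's re module; the sample lines are taken from the log extract above):

```python
import re

# The LINE_BREAKER regex from the props.conf stanza above:
# break before any line that starts with an H:M:S timestamp.
LINE_BREAKER = r"([\r\n]+)(?=\s*\d+:\d+:\d+)"

sample = (
    "12:00:01.093 INFO  com.abc.fos.job.FOSJob - Entering run Method\n"
    "12:00:01.327 INFO  com.abc.fos.job.FOSJob - DEEMED_ADJUSTMENT::Customer Count:815\n"
    "12:01:31.228 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2272"
)

# re.split keeps the captured newline separators as list items; drop them.
events = [chunk for chunk in re.split(LINE_BREAKER, sample) if chunk.strip()]

for event in events:
    print(event)
```

Each timestamped line comes out as its own event, which is what the stanza above should produce at index time.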

plumainwfs
New Member

Thanks somesoni2. The sample log above is actually from a file_name.out file, which is currently not being indexed by Splunk. I'm not sure whether Splunk is unable to read .out files; it did index this file at one point, and it works when I add the file manually.

I'd like to know how to troubleshoot why Splunk was able to index this file but then suddenly stopped.
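One place to start (assuming the forwarder is sending its internal logs to the indexers, which is the default): search the forwarder's splunkd log for messages mentioning the file, for example:

```
index=_internal sourcetype=splunkd source=*splunkd.log* "file_name.out"
```

Common causes for a monitored file that silently stops being indexed include the file being rewritten with the same initial bytes (so the CRC check treats it as already read; crcSalt in inputs.conf can help), file permission changes, or the file aging past an ignoreOlderThan setting.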

I am using a Splunk forwarder and monitoring the log file on the server (example: /abc/def/file_name.out).
My inputs.conf monitors that path and sends the data to a specific index with a set sourcetype.
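For reference, a minimal inputs.conf monitor stanza along these lines (the path is the example value from this thread; the index and sourcetype names below are placeholders, not verified settings):

```
[monitor:///abc/def/file_name.out]
index = your_index
sourcetype = YourSourcetype
disabled = false
```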

Not sure if this is a limitation or I am doing something wrong.
