Getting Data In

How do I configure props.conf using my sample data for proper line breaking based on time?

plumainwfs
New Member

Sample log extract below:

Splunk reads the log as one event, takes the pricing date (02/03/2016) as the event date, and matches it to the first time, 12:00:01.093.

Please advise: how do I generate a separate event for each timestamped entry? Entries can range from 1 to 257 lines depending on the log.

12:00:01.093 INFO  c.w.f.service.delegate.FOSJobManager - BatchId:147,LoopingType:BY_CUSTOMER,OrgId:126,PricingDate:02/03/2016,CurrentJob:DEEMED_ADJUSTMENT_JOB,LastJob:DEEMED_ADJUSTMENT_JOB,DataSource:jdbc/FOS,EmailFlag:N,SingleCustomerId:-1,SingleLocationId:-1,UserId:1234,RespId:56789,ApplnId:123
12:00:01.093 INFO  com.abc.fos.job.FOSJob - Entering run Method
12:00:01.327 INFO  com.abc.fos.job.FOSJob - DEEMED_ADJUSTMENT::Customer Count:815
12:00:01.369 INFO  com.abc.fos.job.FOSJob - Total work units created :815
12:00:01.373 INFO  com.abc.fos.job.FOSJob - Exiting run Method
12:01:31.228 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2272
12:01:31.228 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2094
12:01:31.579 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2454
12:01:31.645 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2079
12:01:32.064 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2353

somesoni2
Revered Legend

Try this in your props.conf on the Indexer/Heavy Forwarder:

[YourSourcetype]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)(?=\s*\d+:\d+:\d+)
TIME_FORMAT=%H:%M:%S.%N
TIME_PREFIX=^\s*
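To sanity-check those settings outside Splunk, the regex and timestamp format can be exercised in a quick sketch. Note this only approximates Splunk's behavior: Python's re.split stands in for Splunk's event breaking, Python's %f (microseconds) stands in for Splunk's %N, and the sample lines are abridged from the question.

```python
import re
from datetime import datetime

# LINE_BREAKER from the props.conf above: the capture group consumes the
# newline(s), and the lookahead requires the next line to start with a
# timestamp, so each timestamped line begins a new event.
LINE_BREAKER = r"([\r\n]+)(?=\s*\d+:\d+:\d+)"

sample = (
    "12:00:01.093 INFO  com.abc.fos.job.FOSJob - Entering run Method\n"
    "12:00:01.327 INFO  com.abc.fos.job.FOSJob - DEEMED_ADJUSTMENT::Customer Count:815\n"
    "12:01:31.228 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2272"
)

# re.split keeps the captured newline delimiters; every other element is
# an event, roughly mirroring how Splunk would break the stream.
events = re.split(LINE_BREAKER, sample)[0::2]
for e in events:
    # The first 12 characters are HH:MM:SS.mmm, matching TIME_PREFIX=^\s*
    ts = datetime.strptime(e[:12], "%H:%M:%S.%f")
    print(ts.time(), "->", e[13:])
```

With SHOULD_LINEMERGE=false, any lines not starting with a timestamp stay attached to the preceding event, which is what handles multi-line entries.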

plumainwfs
New Member

Thanks somesoni2. The sample log above is actually from a file_name.out file, which right now is not being indexed by Splunk. I'm not sure whether Splunk is unable to read file_name.out files; it did index this file at one point, and it works when I add the file manually.

How do I troubleshoot why Splunk was able to index this file before but suddenly stopped?

I am using a Splunk forwarder to monitor the log file on the server (example: /abc/def/file_name.out).
My inputs file contains a monitor stanza for that path and sends the data to a specific index with a set sourcetype.
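For reference, a minimal inputs.conf monitor stanza matching that description might look like the following; the index and sourcetype names are placeholders, not taken from the actual configuration:

[monitor:///abc/def/file_name.out]
index = your_index
sourcetype = YourSourcetype
disabled = false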

I'm not sure whether this is a limitation or whether I'm doing something wrong.
