Getting Data In

How do I configure props.conf using my sample data for proper line breaking based on time?

plumainwfs
New Member

Sample log extract below:

Splunk reads the log as one event, takes the pricing date (02/03/2016) as the actual date, and matches it to the time 12:00:01.093.

Please advise how I can generate a separate event for each entry based on its timestamp. Events can range from 1 to 257 lines depending on the log.

12:00:01.093 INFO  c.w.f.service.delegate.FOSJobManager - BatchId:147,LoopingType:BY_CUSTOMER,OrgId:126,PricingDate:02/03/2016,CurrentJob:DEEMED_ADJUSTMENT_JOB,LastJob:DEEMED_ADJUSTMENT_JOB,DataSource:jdbc/FOS,EmailFlag:N,SingleCustomerId:-1,SingleLocationId:-1,UserId:1234,RespId:56789,ApplnId:123
12:00:01.093 INFO  com.abc.fos.job.FOSJob - Entering run Method
12:00:01.327 INFO  com.abc.fos.job.FOSJob - DEEMED_ADJUSTMENT::Customer Count:815
12:00:01.369 INFO  com.abc.fos.job.FOSJob - Total work units created :815
12:00:01.373 INFO  com.abc.fos.job.FOSJob - Exiting run Method
12:01:31.228 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2272
12:01:31.228 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2094
12:01:31.579 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2454
12:01:31.645 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2079
12:01:32.064 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2353

somesoni2
Revered Legend

Try this in your props.conf on the Indexer/Heavy Forwarder:

[YourSourcetype]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)(?=\s*\d+:\d+:\d+)
TIME_FORMAT=%H:%M:%S.%3N
TIME_PREFIX=^\s*
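To sanity-check the LINE_BREAKER before deploying, you can simulate the split with the same regex outside Splunk. This is a quick sketch in Python (not part of Splunk itself); the abbreviated sample lines are taken from the log extract above.

```python
import re

# Splunk breaks events at the regex's capture group; the lookahead
# requires the text after the break to start with an HH:MM:SS timestamp.
line_breaker = re.compile(r"([\r\n]+)(?=\s*\d+:\d+:\d+)")

sample = (
    "12:00:01.093 INFO  com.abc.fos.job.FOSJob - Entering run Method\n"
    "12:00:01.327 INFO  com.abc.fos.job.FOSJob - DEEMED_ADJUSTMENT::Customer Count:815\n"
    "12:01:31.228 INFO  com.abc.fos.work.FOSWork - Work Unit Completed:DEEMED_ADJUSTMENT: Customer Id:2272"
)

# re.split keeps the captured newlines as separate elements; drop them.
events = [e for e in line_breaker.split(sample) if e.strip()]
for e in events:
    print(e)
```

If the regex is right for your data, each log line should come out as its own event (three here).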

plumainwfs
New Member

Thanks somesoni2. The sample log above is actually from a file_name.out file, which right now is not being indexed by Splunk. I'm not sure whether Splunk is unable to read file_name.out files; it did index them at one point, and it works when I add the file manually.

How do I troubleshoot why Splunk was able to index this file but then suddenly stopped?

I am using a Splunk forwarder and monitoring the log file on the server (example: /abc/def/file_name.out).
My inputs.conf monitors that path and sends the data to a specific index with the sourcetype set.

Not sure if this is a limitation or if I am doing something wrong.
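A few commands worth trying on the forwarder to troubleshoot a monitor input that stopped picking up a file (a sketch assuming a default $SPLUNK_HOME; adjust paths to your installation):

```shell
# Verify the forwarder still has a monitor configured for the file
$SPLUNK_HOME/bin/splunk list monitor

# Show the effective inputs.conf, including which app each setting comes from
$SPLUNK_HOME/bin/splunk btool inputs list --debug

# Check the tailing processor's view of the file (errors, ignored files, etc.)
grep TailingProcessor $SPLUNK_HOME/var/log/splunk/splunkd.log | tail -20
```

One common cause of a file that "suddenly stopped" indexing is a seek-checksum (CRC) collision when the file is rotated or rewritten with a similar beginning; per Splunk's inputs.conf documentation, adding `crcSalt = <SOURCE>` to the monitor stanza is the usual workaround. Whether that applies here depends on how file_name.out is written.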
