
Multiple log formats with customtime.xml not working

erikgrasman
Engager

I tried to do something like the multiple-timestamp approach described on the Splunk blog, because I can't get the log file itself changed.
To do that I created a props.conf with a DATETIME_CONFIG parameter that points to a custom multitime.xml (I also pointed DATETIME_CONFIG at a non-existent file, just to check whether I could find that error back in my _internal index, which I did).
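
For reference, something like the following _internal search surfaces timestamp-parsing warnings (just a sketch; the components listed are the usual timestamp-related ones, and the missing-file error may come from a different component):

index=_internal sourcetype=splunkd (component=DateParserVerbose OR component=AggregatorMiningProcessor)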

The log data looks like the sample below, and I want it broken into 6 events: 5 one-line events and one multiline event.
The 4th and 5th events are merged into one event instead of being broken into two. Does anyone see what I did wrong? (The example on the Splunk blog works 🙂 )

2018/01/18 13:14:21.3:switchx:T[XXXXXXXX]:AAAA.BB.CCCC:[DB3, (ACTIVE, 2018-01-09 12:27:38.185)]: partitioned-tables=20
2018/01/18 13:14:21.3:switchx:T[XXXXXXXX]:AAAA.BB.CCCC:[DB3, (ACTIVE, 2018-01-09 12:27:38.185)] prepared=true
2018/01/18 13:14:21.3:switchx:T[XXXXXXXX]:AAAA.BB.CCCC:[DB4, (ACTIVE, 2018-01-11 16:56:59.124)]: partitioned-tables=20
2018/01/18 13:14:21.3:switchx:T[XXXXXXXX]:AAAA.BB.CCCC:[DB4, (ACTIVE, 2018-01-11 16:56:59.124)] prepared=true
13:14:22 TPS=0 Act=0[0] Com=12345[67890] Ver=1.2.3-45-AB-678.901 Dln=2 Dnr=123 AA- Mem=7921/2999/329/0 CPU=1
Pool=0/s hit=0%  Script=0/xxxxx  Group out=12[34] in=123[67] FDR=off WIN dhr=12345 msg=246 lat=0 CP d=0.0k s=0.0k
AW - none
WA - none
DB[bytes/batch/size/transit/w+p+e+c=dur] -  DB1[0/0/0/0/0+0+0+0=0]
DB1: ACTIVE 18-01-11 15:51:22 switchy  AB=[131/130/77/0]  DW=[4267/18]  RW=[0/6]  L/W=6/2
DB2: ACTIVE 18-01-11 15:53:03 switchx  AB=[60/60/26/0]  DW=[3320/0]  RW=[0/0]  L/W=0/0
DB3: ACTIVE 18-01-09 12:27:38 switchz  AB=[60/60/26/0]  DW=[3320/0]  RW=[0/0]  L/W=0/0
DB4: ACTIVE 18-01-11 16:56:59 switchr  AB=[60/60/26/0]  DW=[3320/0]  RW=[0/0]  L/W=0/0
HSM: none
2018/01/18 13:14:23.3:switchx:T[XXXXXX]:abcd.efghij.management:Hrhrhr events for harouterx: null

props.conf

[multi_time]
DATETIME_CONFIG=/etc/apps/multitime/local/multitime.xml
LINE_BREAKER=([\r\n]+)(?:(?:\d{4}\/\d\d\/\d{2}\s\d{2})|(?:\d{2}:\d\d:\d\d\sTPS))
SHOULD_LINEMERGE=true
#BREAK_ONLY_BEFORE_DATE=true
TRUNCATE=5000
MAX_TIMESTAMP_LOOKAHEAD=25

multitime.xml

<datetime>
<!-- 2018/01/18 13:14:21.3 -->
<define name="_datetimeformat1" extract="year, month, day, hour, minute, second, subsecond">
<text>(\d{4})\/(\d{2})\/(\d{2})\s(\d{2}):(\d{2}):(\d{2}).(\d)</text>
</define>

<!-- 13:14:21 -->
<define name="_datetimeformat2" extract="hour, minute, second">
<text>(\d{2}):(\d{2}):(\d{2})\s</text>
</define>

<timePatterns>
<use name="_datetimeformat1"/>
<use name="_datetimeformat2"/>
</timePatterns>
<datePatterns>
<use name="_datetimeformat1"/>
<use name="_datetimeformat2"/>
</datePatterns>
</datetime>
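
For completeness, a sourcetype like this is typically assigned in inputs.conf on the monitoring instance, along these lines (the monitored path below is just a placeholder):

inputs.conf

[monitor:///var/log/switch/app.log]
sourcetype = multi_time
disabled = false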

woodcock
Esteemed Legend

Your configurations look fine to me so...

1: It might not be in the right place: it needs to be located on the first full instance of Splunk that handles the events. This could be a heavy forwarder, an intermediate forwarder, or an indexer.

2: They might not be loaded: after you put the configurations where they need to be, make sure they are owned by the same user that runs the Splunk process, have the correct file permissions, and then restart all Splunk processes there (you can verify what actually loaded with btool, as sketched below).

3: You might not be evaluating (testing) it properly: make sure that you run your search with All time on the time picker and add _index_earliest=-5m to show only events that have been indexed in the last 5 minutes.
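
For example, points 2 and 3 can be checked roughly like this (the index name is a placeholder):

$SPLUNK_HOME/bin/splunk btool props list multi_time --debug
$SPLUNK_HOME/bin/splunk restart

Then, with All time selected in the time picker:

index=YOUR_INDEX sourcetype=multi_time _index_earliest=-5m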


erikgrasman
Engager

By the way, in the original log there is no whitespace in front of each line 🙂
