Getting Data In

How to configure Splunk to break events that start with a certain pattern followed by a timestamp?

ngeorgieff
New Member

I have logs like the pattern below. I want to break the events at each record that starts with <94>1 followed by a timestamp.

<94>1 2016-08-31T17:31:25.633-07:00 hostname-1-p02.domain.com GAMFT - FTP Audit Log [gamft-ftp@46583 event_type="Connection Successful" remote_ip="10.0.203.75" severity="I" start_time="8/31/16 5:31:25 PM" end_time="8/31/16 5:31:25 PM" local_ip="10.11.215.194" local_port="8021" command="Connect" time_taken="14" remarks="Connection established" system_name="hostname-1-p02"]<94>1 2016-08-31T17:31:29.166-07:00 hostname-1-p02.domain.com GAMFT - FTP Audit Log [gamft-ftp@46583 event_type="Login Successful" remote_ip="10.0.203.75" user_name="splunk_test" severity="I" start_time="8/31/16 5:31:29 PM" end_time="8/31/16 5:31:29 PM" local_ip="10.11.215.194" local_port="8021" command="Login" time_taken="68" remarks="230 User logged in, proceed." system_name="hostname-1-p02" domain="Infrastructure Services"]<94>1 2016-08-31T17:31:31.402-07:00 hostname-1-p02.domain.com GAMFT - FTP Audit Log [gamft-ftp@46583 event_type="Logout" remote_ip="10.0.203.75" user_name="splunk_test" severity="I" start_time="8/31/16 5:31:31 PM" end_time="8/31/16 5:31:31 PM" local_ip="0.0.0.0" local_port="8021" command="Logout" time_taken="10" remarks="221 Goodbye." system_name="hostname-1-p02" domain="Infrastructure Services"]<94>1 2016-08-31T17:31:31.414-07:00 hostname-1-p02.domain.com GAMFT - FTP Audit Log [gamft-ftp@46583 event_type="Disconnect" remote_ip="10.0.203.75" user_name="splunk_test" severity="I" start_time="8/31/16 5:31:31 PM" end_time="8/31/16 5:31:31 PM" local_ip="0.0.0.0" local_port="8021" command="Disconnect" time_taken="8" remarks="Disconnected" system_name="hostname-1-p02" domain="Infrastructure Services"]
1 Solution

vprisiajni_splu
Splunk Employee

In my test with these sample events, the following seemed to work:

LINE_BREAKER = (<94>1)(\s)
SHOULD_LINEMERGE = false
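For reference, these settings belong in a props.conf stanza for the relevant sourcetype on the parsing tier (indexer or heavy forwarder). The sourcetype name below is only a placeholder; substitute your own:

```
[gamft_ftp_audit]
LINE_BREAKER = (<94>1)(\s)
SHOULD_LINEMERGE = false
```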

Explanation from our props.conf docs:

  • The regex must contain a capturing group -- a pair of parentheses which defines an identified subcomponent of the match.
  • Wherever the regex matches, Splunk considers the start of the first capturing group to be the end of the previous event, and considers the end of the first capturing group to be the start of the next event.
  • The contents of the first capturing group are discarded, and will not be present in any event. You are telling Splunk that this text comes between lines.
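To see what this breaking behavior does to the raw stream, here is a small Python sketch (an illustration, not Splunk itself): it splits on the <94>1 marker and discards it, so each resulting event begins at its timestamp. The abridged sample events are taken from the question.

```python
import re

# Two sample events concatenated into one raw chunk (abridged from the question).
raw = ('<94>1 2016-08-31T17:31:25.633-07:00 host GAMFT - FTP Audit Log [event_type="Connect"]'
       '<94>1 2016-08-31T17:31:29.166-07:00 host GAMFT - FTP Audit Log [event_type="Login"]')

# Mimic LINE_BREAKER = (<94>1)(\s): each match ends one event, the marker
# itself is discarded, and the remaining text starts the next event.
# (strip() stands in for the second group consuming the whitespace.)
events = [e.strip() for e in re.split(r'<94>1', raw) if e.strip()]

for e in events:
    print(e[:35])  # each event now starts with its own timestamp
```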

ngeorgieff
New Member

It works. Thank you very much for your help and explanation.

ddrillic
Ultra Champion

A good place to investigate this is the Configure event line breaking documentation.

ngeorgieff
New Member

I've tried a LINE_BREAKER regex and a data prefix, but neither worked for me.
