Getting Data In

Need to Split the events before parsing into Splunk

anandhalagaras1
Contributor

The lines below are being ingested as a single event instead of separate events. We want them split so that each event starts with the IP field and ends with the Email field; after the Email field, the next event should begin.

IP:aa.bbb.ccc.ddd##Browser:Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0##LoginSuccess Wire At:04-03-24 15:10:32##CookieFilePath:/xxx/yyy/abc.com/xyz/abc/forms/submitform/live/12345/98765_3598/clear.txt##ABC:12344564##Sessionid:xyz-a1-ddd_1##Form:xyz##Type:Live##LoginSuccess:Yes##SessionUserId:123##Email:xyz@google.com
IP:aa.bbb.ccc.ddd##Browser:Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0##LoginSuccess Wire At:04-03-24 17:12:32##CookieFilePath:/xxx/yyy/abc.com/xyz/abc/forms/submitform/live/12345/1234_9564/clear.txt##ABC:12344564##Sessionid:xyz-a1-ddd_1##Form:xyz##Type:Live##LoginSuccess:Yes##SessionUserId:123##Email:xyz@google.com
IP:aa.bbb.ccc.ddd##Browser:Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0##LoginSuccess Wire At:04-03-24 18:10:32##CookieFilePath:/xxx/yyy/abc.com/xyz/abc/forms/submitform/live/12345/9821_365/clear.txt##ABC:12344564##Sessionid:xyz-a1-ddd_1##Form:xyz##Type:Live##LoginSuccess:Yes##SessionUserId:123##Email:xyz@google.com
IP:aa.bbb.ccc.ddd##Browser:Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0##LoginSuccess Wire At:04-03-24 20:10:32##CookieFilePath:/xxx/yyy/abc.com/xyz/abc/forms/submitform/live/12345/222_123/clear.txt##ABC:12344564##Sessionid:xyz-a1-ddd_1##Form:xyz##Type:Live##LoginSuccess:Yes##SessionUserId:123##Email:xyz@google.com


So kindly let me know how we can get them split into separate events.

Labels (1)
0 Karma

jotne
Builder

Here are the settings for props.conf:

 

SHOULD_LINEMERGE=false      # Should always be false
LINE_BREAKER=([\r\n]+)IP    # Breaks events before "IP" (assuming all lines start with IP)
NO_BINARY_CHECK=true
TIME_FORMAT=%e-%m-%y %T     # Sets the time format
TIME_PREFIX=At:             # Use the time found after "At:"
MAX_TIMESTAMP_LOOKAHEAD=20  # Do not search further than needed for the timestamp
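For completeness, these settings go in a props.conf stanza named after the sourcetype of the data; "my_login_logs" below is only a placeholder, not a name from this thread:

[my_login_logs]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)IP
NO_BINARY_CHECK = true
TIME_FORMAT = %e-%m-%y %T
TIME_PREFIX = At:
MAX_TIMESTAMP_LOOKAHEAD = 20

Note that in LINE_BREAKER only the text matched by the first capturing group (the newlines) is discarded, so the literal IP stays at the start of each new event.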

 

0 Karma

kiran_panchavat
Influencer

@anandhalagaras1 You can apply these settings on the heavy forwarders (HFs), if you have any.


Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!
0 Karma

gcusello
SplunkTrust
SplunkTrust

Hi @anandhalagaras1,

did you try SHOULD_LINEMERGE = false?

Ciao.

Giuseppe

0 Karma

anandhalagaras1
Contributor

@gcusello Yes, I have updated props.conf on the UF on the server. Since I don't have access to the indexers, it didn't work; our search heads are hosted in Splunk Cloud and managed by Splunk Support.

So what do I need to do if I need to apply the settings directly to the indexers?

0 Karma

isoutamo
SplunkTrust
SplunkTrust

Hi

Here is the new MASA diagram, which shows where to put those settings and on which server: https://splunk-usergroups.slack.com/files/U0483CQG4/F06PKREDNLW/masa.pdf?origin_team=T047WPASC&origi...

r. Ismo

0 Karma

gcusello
SplunkTrust
SplunkTrust

Hi @anandhalagaras1,

you have to associate SHOULD_LINEMERGE = false with the sourcetype of your data on the UFs and on the Splunk Cloud Search Heads.
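As a minimal sketch of that association (the monitor path and sourcetype name are hypothetical, not from this thread): inputs.conf on the UF assigns the sourcetype, and the props.conf stanza keyed to that sourcetype carries the line-breaking settings wherever parsing happens:

# inputs.conf (on the UF; path is a placeholder)
[monitor:///var/log/login/clear.txt]
sourcetype = my_login_logs

# props.conf (on the parsing tier, e.g. HF, indexers, or a Splunk Cloud app)
[my_login_logs]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)IP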

Ciao.

Giuseppe

0 Karma

anandhalagaras1
Contributor

@gcusello As previously stated, I implemented the setting SHOULD_LINEMERGE = false on the Splunk Cloud SH, which successfully resolved the issue. However, the logs also contain HTML events, which are now being treated as individual events, making it difficult to extract the desired fields. Could you please advise on how we can address this?
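Since the well-formed events are key:value pairs separated by ##, one search-time sketch (index and sourcetype names below are placeholders) is the extract command with custom delimiters:

index=main sourcetype=my_login_logs
| extract pairdelim="##" kvdelim=":"
| table IP Email SessionUserId LoginSuccess

For a single field, a rex would also work, e.g. | rex "Email:(?<Email>[^#]+)". Whether this helps with the embedded HTML depends on how those events are structured.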

0 Karma