Splunk Search

Multiple lines with the same time

dobarnes
New Member

I have logs from a custom application being streamed into Splunk using a universal forwarder. The problem I have is that there are multiple lines with the same timestamp. See below.


19:12:51.790,16526719,TCP,2,3404,2226
19:12:51.790,66870655,TCP,10,53743,355114
19:12:51.790,199246079,TCP,5,2937,5715
19:12:51.790,281972991,TCP,2,55722,43156
19:12:51.790,282382591,TCP,11,2458,11480


I have extracted the fields of this data using the props.conf and transforms.conf files; however, when I do a search by what we call Cust_id, it only pulls out the information from the first line logged for a timestamp. In the above example it would only find Cust_id = 16526719.

How can I adjust my query to find the Cust_id for every line that is indexed in Splunk?
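
For reference, the search I am running is roughly like this (the sourcetype name here is just a placeholder for my application's sourcetype):

sourcetype=my_custom_app Cust_id=16526719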


dobarnes
New Member

Thanks for the answer. It worked once I applied the line-breaking settings on the indexer.


dobarnes
New Member

I restarted Splunk and ran a real-time search to see if the newly indexed data would be line-broken. It did not address the issue.

FYI - I am using a search head and multiple indexers. I have only made the adjustment on the search head. Do I need to do this on the indexers as well?


_d_
Splunk Employee

YES! The LINE_BREAKER and SHOULD_LINEMERGE settings are effective only at index time (read: indexer) and ignored by the search head :). On the other hand, EXTRACT-cust_id is used by the search head to perform the extraction of the field at search time.
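
To sketch the split (using the same placeholder sourcetype name as in the stanza below):

On each indexer - props.conf (index-time line breaking):

[my_sourcetype]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)

On the search head - props.conf (search-time field extraction):

[my_sourcetype]
EXTRACT-cust_id = (?i)\.\d{3},(?<cust_id>\d+),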


dobarnes
New Member

I tried both suggestions; however, I am unable to get Splunk to break on every line.


MarioM
Motivator

And it will only apply to new data.


_d_
Splunk Employee

dobarnes, did you restart Splunk (the indexer) after editing the line-breaking settings in props.conf?


_d_
Splunk Employee

It is more efficient, faster, and easier for Splunk to break on every line than to line-merge for each event. Try the following stanza in props.conf - it will both break the stream of data at every line AND extract a cust_id field for each event:

[my_sourcetype]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
EXTRACT-cust_id = (?i)\.\d{3},(?<cust_id>\d+),
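
Once the indexer has been restarted and new data has come in, a quick way to confirm that every line is now its own event is a search along these lines (the sourcetype name is just the placeholder from the stanza above):

sourcetype=my_sourcetype | stats count by cust_id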

Hope this helps.

> please upvote and accept answer if you find it useful - thanks!

MarioM
Motivator

Do you need to keep it as multiline? Because I would get Splunk to treat each line as a single event and use delimiters to extract the fields.

props.conf:

[your_data_sourcetype]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
REPORT-delims=commalist

transforms.conf:

[commalist] 
DELIMS = "," 
FIELDS = field1, field2, field3, ...
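
For the sample rows in the question, the FIELDS list might look something like the following - note that apart from cust_id the column names are just guesses, since the columns are not labelled in the post:

[commalist]
DELIMS = ","
FIELDS = log_time, cust_id, protocol, conn_count, bytes_in, bytes_out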