I have a log event that contains multiple entries within it. I would like to extract each entry and build a table with one row per entry.
For example, this is the log:
[22/Mar/2018:17:06:23 -0700] id 100 “GET /URL1” 200 276
[22/Mar/2018:17:06:23 -0700] id 101 “GET /URL2” 200 276
[22/Mar/2018:17:06:23 -0700] id 102 “GET /URL3” 200 276
Desired table:
100 URL1
101 URL2
102 URL3
Try this run-anywhere search:
| makeresults
| eval _raw="[22/Mar/2018:17:06:23 -0700] id 100 “GET /URL1” 200 276
[22/Mar/2018:17:06:23 -0700] id 101 “GET /URL2” 200 276
[22/Mar/2018:17:06:23 -0700] id 102 “GET /URL3” 200 276"
| rex max_match=0 "id\s+(?<id>\d+)\s+\“\w+\s+\/(?<url>\w+)"
| eval c=mvzip(id,url)
| mvexpand c
| rex field=c "(?<id>[^\,]+)\,(?<url>.*)"
| table id url
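Outside Splunk, the same extract/zip/expand logic can be sketched in Python as a sanity check on the regex. This is only an illustration of what the search does, not part of the search itself; note the curly quotes in the pattern, which match the ones in the sample log:

```python
import re

# Sample multi-line event, mirroring the run-anywhere search above
raw = """[22/Mar/2018:17:06:23 -0700] id 100 \u201cGET /URL1\u201d 200 276
[22/Mar/2018:17:06:23 -0700] id 101 \u201cGET /URL2\u201d 200 276
[22/Mar/2018:17:06:23 -0700] id 102 \u201cGET /URL3\u201d 200 276"""

# Equivalent of | rex max_match=0: find every (id, url) pair in the event
pairs = re.findall(r"id\s+(\d+)\s+\u201c\w+\s+/(\w+)", raw)

# pairs plays the role of mvzip(id, url); iterating it is the mvexpand step,
# and each dict is one row of the final | table id url output
rows = [{"id": i, "url": u} for i, u in pairs]
```

Here `rows` comes out as `[{"id": "100", "url": "URL1"}, {"id": "101", "url": "URL2"}, {"id": "102", "url": "URL3"}]`, matching the table the search produces.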
In your environment, try:
<your base search>
| rex max_match=0 "id\s+(?<id>\d+)\s+\“\w+\s+\/(?<url>\w+)"
| eval c=mvzip(id,url)
| mvexpand c
| rex field=c "(?<id>[^\,]+)\,(?<url>.*)"
| table id url
Let me know if this helps!
Hmm, I'm curious to understand: does each event in Splunk contain multiple lines of the log? I don't know your data or how it's coming into Splunk, but ideally each of these lines would be an individual event, with line breaking configured at ingest. If that can't be done and you have multiple lines per event, there are ways of handling it, but it would be better to understand how events are coming in before explaining how to split multivalue or multi-line events, if that is not the case.
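If you do control ingestion, line breaking is configured in props.conf on the indexer or heavy forwarder. A minimal sketch, assuming a hypothetical sourcetype name and that each entry starts with the bracketed timestamp shown in the sample:

```ini
# Hypothetical sourcetype stanza; replace with your actual sourcetype
[my_custom_log]
SHOULD_LINEMERGE = false
# Break before each bracketed timestamp; the first capture group is consumed
LINE_BREAKER = ([\r\n]+)(?=\[\d{2}/\w{3}/\d{4})
TIME_PREFIX = \[
TIME_FORMAT = %d/%b/%Y:%H:%M:%S %z
```

With each entry landing as its own event, the mvzip/mvexpand workaround above becomes unnecessary and a single `rex` per event suffices.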