Splunk Search

Is there a way to use rex to divide api name?

mikeyty07
Communicator

I have access logs which print like this:
server - - [date& time] "GET /google/page1/page1a/633243463476/googlep1?sc=RT&lo=en_US HTTP/1.1" 200 350 85
for which my rex is:
| rex field=_raw "(?<SRC>\d+\.\d+\.\d+\.\d+).+\]\s\"(?<http_method>\w+)\s(?<uri_path>\S+)\s(?<uri_query>\S+)\"\s(?<statusCode>\d+)\s(?<body_size>\d+)\s\s(?<response_time>\d+)"

Is there a way to separate the uri into two or three parts?

 /google/page1/page1a/633243463476/googlep1?sc=RT&lo=en_US 

TO

 /google
/page1/page1a/633243463476/googlep1?sc=RT&lo=en_US 

OR

/google
/page1/page1a/633243463476/googlep1 
?sc=RT&lo=en_US
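For reference, the rex above can be sanity-checked outside Splunk with Python's re module. This is a rough sketch: the client IP 10.1.2.3 and the double space before the response time are assumptions, since the posted sample line anonymizes the host.

```python
import re

# Hypothetical sample line: IP and the double space before "85" are assumed
# so that every part of the rex has something to match.
log = '10.1.2.3 - - [01/Jan/2024:12:00:00] "GET /google/page1/page1a/633243463476/googlep1?sc=RT&lo=en_US HTTP/1.1" 200 350  85'

# Same pattern as the rex, with Splunk's (?<name>...) written as Python's (?P<name>...)
pattern = (r'(?P<SRC>\d+\.\d+\.\d+\.\d+).+\]\s"(?P<http_method>\w+)\s'
           r'(?P<uri_path>\S+)\s(?P<uri_query>\S+)"\s(?P<statusCode>\d+)\s'
           r'(?P<body_size>\d+)\s\s(?P<response_time>\d+)')

m = re.search(pattern, log)
print(m.groupdict())
# Note: with this rex, uri_path holds the whole URI (query string included)
# and uri_query actually captures "HTTP/1.1".
```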

 

1 Solution

bowesmana
SplunkTrust

This will get the 3 parts

(?<uri_root>/[^/]+)(?<uri_path>[^?\s]+)\s?(?<uri_query>\S+)
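To see what this pattern yields, here is a quick check with Python's re module against the URI from the question (Splunk's (?<name>) becomes Python's (?P<name>)):

```python
import re

uri = "/google/page1/page1a/633243463476/googlep1?sc=RT&lo=en_US"

m = re.search(r'(?P<uri_root>/[^/]+)(?P<uri_path>[^?\s]+)\s?(?P<uri_query>\S+)', uri)
print(m.group('uri_root'))   # /google
print(m.group('uri_path'))   # /page1/page1a/633243463476/googlep1
print(m.group('uri_query'))  # ?sc=RT&lo=en_US
```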


mikeyty07
Communicator

Thank you, worked like a charm. However, I used
(?<uri_root>/[^/]+)(?<uri_path>[^?\s]+)\s(?<uri_query>\S+)
and uri_query seemed to capture HTTP/1.1 rather than the query string.
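The difference comes from running the pattern against the raw line rather than the bare URI: with a mandatory \s, there is no whitespace anywhere before the "?", so the engine backtracks until uri_query lands on HTTP/1.1. A small sketch against a simplified fragment of the raw event (Python named-group syntax):

```python
import re

raw = 'GET /google/page1/page1a/633243463476/googlep1?sc=RT&lo=en_US HTTP/1.1'

# Mandatory \s: the only whitespace is before HTTP/1.1, so the match is
# forced there (the other groups end up mangled too, e.g.
# uri_root='/googlep1?sc=RT&lo=en_U').
m1 = re.search(r'(?P<uri_root>/[^/]+)(?P<uri_path>[^?\s]+)\s(?P<uri_query>\S+)', raw)
print(m1.group('uri_query'))  # HTTP/1.1

# Optional \s?: the match can end at the query string itself.
m2 = re.search(r'(?P<uri_root>/[^/]+)(?P<uri_path>[^?\s]+)\s?(?P<uri_query>\S+)', raw)
print(m2.group('uri_query'))  # ?sc=RT&lo=en_US
```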

Could you also please check this follow-up question?
https://community.splunk.com/t5/Splunk-Search/Using-lookup-command-after-rex-field/td-p/624450


yuanliu
SplunkTrust

An alternative to regex is to use split, which can be more semantically explicit (and slightly more efficient). Assuming that you have that field uri:

 

| eval uri = split(uri, "?")
| eval uri_query = "?" . mvindex(uri, 1) ``` ?sc=RT&lo=en_US ```
| eval uri = split(mvindex(uri, 0), "/")
| eval root = "/" . mvindex(uri, 1) ``` /google ```
| eval remainder = "/" . mvjoin(mvindex(uri, 2, -1), "/")

 

This gives

remainder                            root     uri_query
/page1/page1a/633243463476/googlep1  /google  ?sc=RT&lo=en_US
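The same logic can be mirrored in Python to sanity-check it outside Splunk; split/mvindex/mvjoin map roughly to str.split, indexing, and str.join:

```python
uri = "/google/page1/page1a/633243463476/googlep1?sc=RT&lo=en_US"

path, _, query = uri.partition("?")    # | eval uri = split(uri, "?")
uri_query = "?" + query                # "?" . mvindex(uri, 1)
parts = path.split("/")                # split(mvindex(uri, 0), "/")
root = "/" + parts[1]                  # "/" . mvindex(uri, 1)
remainder = "/" + "/".join(parts[2:])  # "/" . mvjoin(mvindex(uri, 2, -1), "/")

print(root, remainder, uri_query)
# /google /page1/page1a/633243463476/googlep1 ?sc=RT&lo=en_US
```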

 


mikeyty07
Communicator

The search didn't give any results. Also, how do I get results for all the other companies, like facebook and twitter?

