Splunk Search

How to write a regular expression to match all patterns of these dynamic URLs?

jagadeeshm
Contributor

I have front-end events with several dynamic URI patterns. I am trying to generate a report that summarizes the average, perc95, and perc80 of the response time by URI.

Some of the URI patterns include:

/market/us/asdfadf/home.do
/market/us/123123/home.do
/market/us/dfq3we/home.do
/market/us/dfq3we/home.do
/shop/us/buy.do
/shop/us/buy.do
/market/us/shop/dfq3we/home.do
/market/us/shop/dfq3we/home.do
/market/us/shop/dfq3we/home.do

I can certainly apply a regex pattern using sed to convert the dynamic alphanumeric segments to -- (sketched below, after the list) and display the counts as:

/market/us/--/home.do
/shop/us/buy.do
/market/us/shop/--/home.do
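
To illustrate, the sed-style replace I'm applying today looks roughly like this, with one expression per known prefix (uri stands in for my actual field name):

<my base search>
| rex field=uri mode=sed "s/^(\/market\/us(\/shop)?\/)[^\/]+/\1--/"
| stats count by uri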

But my problem is that I don't have a list of all the matching patterns, so I can't write an eval for each one. I am wondering if there is a dynamic way of writing the regular expression so that all patterns of these URLs are identified.

Thanks!

0 Karma

lguinn2
Legend

I suggest that you look at a free app that can parse URLs/URIs for you: URL Toolbox
And on a related topic, there is also an app that can parse user agent strings: TA-user-agent

Yes, you can write this yourself, but I prefer to use apps for this kind of functionality when possible!
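
As a rough sketch of how URL Toolbox gets invoked (this is from memory, so double-check the macro names and output fields against the app's docs; uri and response_time are assumed field names):

<your base search>
| eval list="*"
| `ut_parse_extended(uri, list)`
| stats avg(response_time) perc95(response_time) perc80(response_time) by ut_path

Parsing alone won't collapse the dynamic segments, but it saves you from writing the URL-splitting regexes yourself.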

0 Karma

somesoni2
Revered Legend

You need some sort of rule/pattern for consolidating URIs, based either on the portion of the URI you want to remove OR the portion you want to keep. Without that, it doesn't seem feasible.
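
For example, if your rule is "mask any path segment containing a digit" (which catches 123123 and dfq3we in your samples, but not a purely alphabetic token like asdfadf), a single sed-style rex handles all URIs at once (field names assumed):

<your base search>
| eval uri_pattern=uri
| rex field=uri_pattern mode=sed "s/\/[a-z]*\d+[a-z0-9]*(?=\/)/\/--/g"
| stats avg(response_time) perc95(response_time) perc80(response_time) by uri_pattern

The point stands, though: some rule like this has to come from you, because nothing in the data itself says which segments are IDs.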

0 Karma