Hello,
I am looking at creating a dashboard which shows us the least visited domains in the last 30 days. I also want to set up an hourly email alert for the most and least visited domains. Unfortunately, I don't have any domain tools/applications installed, so I want to create a search for the time being.
Use case:
URL count in the last 30 days within a certain location
Only see the top-level domains
Set up an email alert every hour
I currently have this set up, but it doesn't work properly:
index=proxy OR index=web gateway src_country="country"
| regex url="(?<TLD>\.\w+?)(?:$|\/)"
| search
([|inputlookup of users.csv])
| stats count as Total by url
Thanks,
Mark Nicholls
You can just add "| sort 2 +Total" (for the 2 least visited URLs) or "| sort 2 -Total" (for the 2 most visited URLs).
If the above doesn't work as expected, please provide more details on how the data looks now and what you expect.
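For example, on the search from your question it would look something like this (just a sketch; note I've used rex field=url rather than the regex command, since regex only filters events and won't actually extract the TLD capture group, and I'm assuming users.csv has a field name that matches a field in your events):
index=proxy OR index=web gateway src_country="country"
| rex field=url "(?<TLD>\.\w+?)(?:$|\/)"
| search [| inputlookup users.csv]
| stats count as Total by url
| sort 2 +Total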
Hi Somesoni,
Thank you for your response.
Thankfully I have made some progress on this; the search now looks like this:
index=proxy OR index=web gateway src_country="country"
| rex "\/\/(?:[^@\/\n]+@)?(?:www\.)?(?<url>[^:\/\n]+)"
| search
([|inputlookup users.csv])
| stats count as Total by url
| sort 0 - "Total"
The results are laid out perfectly with the count and top-level domains, but I'm getting a lot of URLs like 'google.com', 'youtube.com', etc., which is obviously normal traffic. I'm now looking into adding the Alexa top 1M domains to the search so we can exclude all the known URLs.
Would this type of search be possible? We would have two inputlookups in one search: one matching a specific user set and the other excluding known domains. How could I implement that in the search query?
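Something like this is what I'm picturing (just a sketch; I'm assuming the Alexa list is loaded as a lookup file called alexa_top_1m.csv with a domain column, and I'm not sure a subsearch will cope with a million rows):
index=proxy OR index=web gateway src_country="country"
| rex "\/\/(?:[^@\/\n]+@)?(?:www\.)?(?<url>[^:\/\n]+)"
| search [| inputlookup users.csv]
| search NOT [| inputlookup alexa_top_1m.csv | rename domain as url | fields url]
| stats count as Total by url
| sort 0 - Total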
Also, would it be possible to have an email sent with an attachment showing the 20 least visited URLs? Our current alert doesn't seem to be working; these are its settings (the search I'd schedule for it is sketched after them):
Alert type: Real-time
Expires: 72 hours
Trigger when: Number of results is equal to 20 in 60 minutes
Trigger: Once
Throttle: 58 minutes
Action: Send an email with a .csv attachment
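The search I'd schedule would be something along these lines (again just a sketch; sort 20 +Total keeps only the 20 lowest counts so the .csv attachment carries just those):
index=proxy OR index=web gateway src_country="country"
| rex "\/\/(?:[^@\/\n]+@)?(?:www\.)?(?<url>[^:\/\n]+)"
| search [| inputlookup users.csv]
| stats count as Total by url
| sort 20 +Total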
Thanks,
Mark Nicholls