Splunk Search

scheduled search with changing query

jonathanfalconi
Explorer

Hi,

I'm using Splunk 6.1.

I have a group of people who want monthly reports based on their list of known URLs: a search run against our proxy logs using this changing list of URLs. I used a script to convert their list of URLs into one long search string, e.g. index=proxylogs google.com OR web.com OR blah.com
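The kind of script described above could look roughly like this (a minimal sketch; the function name and the index name default are my own assumptions):

```python
def build_search(urls, index="proxylogs"):
    """Join a list of URLs into a single OR'd Splunk search string.

    Blank lines and surrounding whitespace in the input are ignored.
    """
    terms = " OR ".join(u.strip() for u in urls if u.strip())
    return f"index={index} {terms}"

# Example:
# build_search(["google.com", "web.com", "blah.com"])
# -> "index=proxylogs google.com OR web.com OR blah.com"
```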

How could this be automated? They could provide a URL from which we fetch their latest list; we would then need to build the query from that list, run it, and finally export the results to Excel and email them back, or something similar.

Any hints?


somesoni2
Revered Legend

One option I can think of is as follows:

a) Keep the changing list of URLs as a CSV file. You then have two options for making this data available in Splunk:
   1) Add the CSV file as a lookup file.
      Pros: the latest data is always available, so the search query is simpler.
      Cons: the lookup must be updated manually each time a new list arrives.
   2) Add the CSV file as a data input. A forwarder can let Splunk index it automatically.
      Pros: can be fully automated.
      Cons: the query to retrieve the data is slightly more complex.
b) Create a saved search along these lines:
   1) If using the CSV file as a lookup file:
      index=proxylogs [| inputlookup URL_lookup.csv | table URL | rename URL as query] ... rest of the search
   2) If using the CSV file as a data input (say index=urlData and source=reportURLs):
      index=proxylogs [search index=urlData source=reportURLs | eventstats max(_time) as max | where _time=max | table URL | rename URL as query] ... rest of the search

c) Configure the saved search to email the respective recipients with the search results attached as a CSV.
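For step (c), the scheduling and email settings can live in savedsearches.conf. A sketch of such a stanza (the stanza name, recipients, and cron schedule are placeholders):

```
[Monthly URL Report]
search = index=proxylogs [| inputlookup URL_lookup.csv | table URL | rename URL as query]
enableSched = 1
cron_schedule = 0 6 1 * *
action.email = 1
action.email.to = team@example.com
action.email.sendresults = 1
action.email.inline = 0
action.email.format = csv
```

This runs at 06:00 on the first of each month and attaches the results as a CSV, which the recipients can open directly in Excel.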

Ayn
Legend

You could use something like importutil to grab the URLs directly each time the saved search runs. To turn its output into a filter, run importutil in a subsearch, something like this:

index=proxylogs [| importutil "http://some/url" | multikv | fields url]