How can I create a scheduled report that runs every hour and makes GET requests to fetch data from an open data source?
Basically, it would query the same page each hour and keep the latest available data on premises.
There are probably several apps on Splunkbase that can help with this.
Could you please share a few?
Thanks
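For the scheduling part of the question, once a working fetch query exists it can be saved as a report with an hourly cron schedule. Below is a minimal sketch of a savedsearches.conf stanza; the stanza name, destination index, and the fetch search itself are placeholders rather than anything from a specific app, and the same schedule can also be set from the UI via the report's Edit Schedule dialog.

# Minimal sketch of an hourly scheduled search (savedsearches.conf).
# The stanza name, index, and the search line are placeholders.
[fetch_open_data_hourly]
enableSched = 1
# run at the top of every hour
cron_schedule = 0 * * * *
dispatch.earliest_time = -1h
dispatch.latest_time = now
# keep the fetched results in a local index so the latest data stays on premises
search = | curl method=GET uri="https://example.com/latest.json" | collect index=open_data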
Thanks for the help. That seems to be correct. Can you please help out with one other problem?
Within the Webtools application, I ran the query below to fetch data from RapidAPI:
| curl method=GET user=<splunk_username> pass=<splunk_password> headerfield={
    'x-rapidapi-host': "community-open-weather-map.p.rapidapi.com",
    'x-rapidapi-key': "<apikey-from-rapidapi>"
}
data={"q":"London,uk","lat":"0","lon":"0","callback":"test","id":"2172797","lang":"null","units":"imperial","mode":"json"} uri="https://community-open-weather-map.p.rapidapi.com/weather"
I am getting the results below (please see the attached screenshot).
See the example of using custom headers here
https://splunkbase.splunk.com/app/4146/#/details
The headerfield option takes a field name, not a string. In any case, you need to quote the entire JSON payload and escape the double-quoted JSON field names and values.
Add the debug=t option to the curl query and you will get some feedback.
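Putting that together, a corrected version of your search might look something like the sketch below. This is only a sketch, not a verified query: it assumes the option names from your original search (method, headerfield, data, uri, debug) are what the Webtools curl command expects, that curl can run as a streaming command after makeresults (as in the custom-headers example on the app page), and that <apikey-from-rapidapi> is replaced with your real key.

| makeresults
| eval header="{\"x-rapidapi-host\": \"community-open-weather-map.p.rapidapi.com\", \"x-rapidapi-key\": \"<apikey-from-rapidapi>\"}"
| curl method=GET headerfield=header debug=t
    data="{\"q\":\"London,uk\",\"units\":\"imperial\",\"mode\":\"json\"}"
    uri="https://community-open-weather-map.p.rapidapi.com/weather"

The two changes that matter are that the header JSON now lives in a field (header) created with eval, which is what headerfield expects, and that both JSON payloads are wrapped in double quotes with the inner quotes escaped.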