How do I use the REST API Modular Input to index data from a web service?

banderson7
Communicator

I'm trying to bring in web monitoring reports (number of runs, uptime, number of errors, etc.) in JSON format using a REST call to our monitoring provider (Rigor), which has a REST API. I still have to figure out what to do with that JSON data once it's in, but that's the next bridge.
I'm able to call the Rigor REST API using Postman with the following string:

GET https://my.rigor.com/reports/uptimes/report.json?location=all&start_date=custom&start=firstdate&end=...

And that gives me my data points in JSON format.

When I put (I think) the same string into the REST API Modular Input in Splunk, I'm not sure whether it works. I have the data supposedly going to main as a test, but searching the main index brings up no results. I've probably entered something wrong in my modular input, but I'm not sure where. These are my settings:

Endpoint URL: https://my.rigor.com
HTTP Method: GET
Authentication Type: none
HTTP Header Properties: blank
URL Arguments: location=all,api_key=key,start_date=custom
Response Type: json
Response Handler: blank
Response Handler Arguments: blank
Response Filter Pattern: blank
Streaming Request: unchecked
Index Error Responses: checked
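For comparison, here's a sketch (in Python, with placeholder values) of the full URL the Postman call above builds. One thing worth double-checking, and this is an assumption since I can't see the full config: the Postman URL includes the /reports/uptimes/report.json path, while the Endpoint URL above is only the host.

```python
from urllib.parse import urlencode

# Placeholder values standing in for the real Rigor path, key, and dates.
base = "https://my.rigor.com/reports/uptimes/report.json"  # full path, not just the host

# Mirrors the modular input's comma-separated URL Arguments field.
params = {
    "location": "all",
    "api_key": "key",        # placeholder API key
    "start_date": "custom",
}

url = base + "?" + urlencode(params)
print(url)
```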

When I enable this data input, I don't see anything in the main index (though I'm unsure whether I should yet), and when I search the _internal index for the host, I see the following entry:

HttpListener - Socket error from 127.0.0.1 while accessing /servicesNS/nobody/launcher/data/inputs/rest/RIgor/: Broken pipe

Also, I'm uncertain how to call this data when I'm ready to do so. In search, do I use index=main sourcetype=_json and go from there?

Thanks for any help and your patience. 🙂


banderson7
Communicator

So I'm getting the data, thanks for the help. But now what do I do? 🙂
I have an entry or two with epoch timestamps and datapoints like avg_response_time, errors, max_response_time, etc., that I'd like to graph, or at least put in a table. I suppose I need to define my fields and poke around.
Thanks again, and if you have any pointers on this, I'd appreciate it.


Damien_Dallimor
Ultra Champion

If your response data is JSON, then the fields should be auto-extracted for you already.

If you post a sample of your JSON events, then I can show you some simple Splunk searches to get you started.


banderson7
Communicator
{"stats":{"min_response_time":4566,"errors":0,"percentage_uptime":100.0,"max_response_time":12788,"run_count":29,"avg_response_time":7802},"graph":{"legend":{"enabled":false},"tooltip":{"shared":true,"useHTML":true,"followPointer":true},"credits":{"enabled":false},"xAxis":{"type":"datetime","title":{"text":""},"dateTimeLabelFormats":{"month":"%b %e","day":"%b %e","hour":"%l:%M%P","year":"%b","week":"%b %e"}},"title":{"text":""},"yAxis":{"tickWidth":1,"labels":{"enabled":true,"format":"{value}%"},"title":{"text":""},"min":0,"gridLineWidth":0},"chart":{"type":"column","zoomType":"x"},"exporting":{"enabled":false,"chartOptions":{"title":{"text":"scrubbed test history"},"subtitle":{"text":"11/19/2015"}},"filename":"scrubbed test history: 11/19/2015"},"plotOptions":{"column":{"stacking":"percent"},"area":{"stacking":"percent","marker":{"enabled":false}},"series":{"pointPadding":0,"cursor":"pointer","groupPadding":0.05,"point":{"events":{}}}},"series":[{"color":"#C24747","data":[{"x":1447902292000.0,"y":0.0,"interval":"hour"},{"x":1447905892000.0,"y":0.0,"interval":"hour"},{"x":1447909492000.0,"y":0.0,"interval":"hour"},{"x":1447913092000.0,"y":0.0,"interval":"hour"},{"x":1447916692000.0,"y":0.0,"interval":"hour"},{"x":1447920292000.0,"y":0.0,"interval":"hour"},{"x":1447923892000.0,"y":0.0,"interval":"hour"},{"x":1447927492000.0,"y":0.0,"interval":"hour"}],"name":"Downtime"},{"color":"#339933","data":[{"x":1447902292000.0,"y":1.0,"interval":"hour"},{"x":1447905892000.0,"y":1.0,"interval":"hour"},{"x":1447909492000.0,"y":1.0,"interval":"hour"},{"x":1447913092000.0,"y":1.0,"interval":"hour"},{"x":1447916692000.0,"y":1.0,"interval":"hour"},{"x":1447920292000.0,"y":1.0,"interval":"hour"},{"x":1447923892000.0,"y":1.0,"interval":"hour"},{"x":1447927492000.0,"y":1.0,"interval":"hour"}],"name":"Uptime"}]},"graph_start":1447902292000.0,"graph_end":1447931092000.0,"uptime":{"maximum_response_time":12788,"percentage_uptime":100.0,"run_count":29,"average_response_time":7802,"minimum_response_time":4566}}

I guess I'm looking for the number of runs in a particular time window, as well as the number of runs that have errors.
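Assuming the fields are auto-extracted as described above, those two numbers live under the top-level stats object. Here's a quick Python sketch (using a trimmed stand-in for the stats block from the event above) of pulling them out:

```python
import json

# Trimmed stand-in for the "stats" portion of the Rigor response above.
sample = (
    '{"stats":{"min_response_time":4566,"errors":0,'
    '"percentage_uptime":100.0,"max_response_time":12788,'
    '"run_count":29,"avg_response_time":7802}}'
)

stats = json.loads(sample)["stats"]
runs = stats["run_count"]      # total runs in the reporting window
error_runs = stats["errors"]   # runs that reported errors
print(runs, error_runs)
```

In Splunk itself, something like `index=main sourcetype=_json | table stats.run_count stats.errors` should surface the same auto-extracted fields (untested sketch; adjust the index and sourcetype to match your input).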


banderson7
Communicator

Makes sense. This may work better once the service is open through the firewall and I'm pulling from it directly. I'll let you know if I continue having problems.
