
Splunk Index to pull Data from SAP ALM

andgarciaa
Explorer

One of my customers is using a tool with a REST API available via SAP ALM Analytics.

API Ref.

https://api.sap.com/api/CALM_ANALYTICS/overview


They are looking to get data from the API into a Splunk index, so we suggest having an intermediary application (such as a scheduled function) pull the data from SAP and send it to Splunk using an HEC token.

Is it possible to use something in Splunk directly to pull the data from the third party, or is the suggested approach a good way to go?
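
Roughly what we had in mind for the intermediary function, as a minimal Python sketch. The SAP ALM endpoint path, token handling and record shape below are placeholders, not the real CALM_ANALYTICS contract, so treat it as an illustration only:

```python
import requests

# Placeholders - take the real endpoint, OAuth flow and payload shape from
# https://api.sap.com/api/CALM_ANALYTICS/overview
SAP_ALM_URL = "https://<tenant>.alm.cloud.sap/<calm-analytics-endpoint>"  # hypothetical
SAP_TOKEN = "<oauth-bearer-token>"

HEC_URL = "https://<splunk-host>:8088/services/collector/event"
HEC_TOKEN = "<hec-token>"


def pull_from_sap_alm():
    """Fetch analytics records from the SAP ALM REST API (response shape assumed)."""
    resp = requests.get(
        SAP_ALM_URL,
        headers={"Authorization": f"Bearer {SAP_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # assumed to be a list of JSON records


def send_to_hec(records):
    """Forward each record to Splunk HEC as a JSON event."""
    for record in records:
        payload = {
            "event": record,
            "sourcetype": "sap:calm:analytics",  # example sourcetype name
            "source": "calm_analytics_api",
        }
        resp = requests.post(
            HEC_URL,
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
            json=payload,
            timeout=30,
        )
        resp.raise_for_status()


if __name__ == "__main__":
    send_to_hec(pull_from_sap_alm())
```

The scheduled function would simply run this on a timer (for example every few minutes) and keep track of what has already been sent.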

 

1 Solution

deepakc
Builder

Option 1 - If the app can send data directly to HEC, that is the path of least resistance, so try that first. Send the data as JSON, make sure it includes a timestamp, and then create a sourcetype based on props and transforms (a rough props example follows after the docs link below).

You will need to send to the HEC /event or /raw endpoint - most likely the /event endpoint - so test this out first.

https://docs.splunk.com/Documentation/SplunkCloud/9.1.2312/Data/HECRESTendpoints 
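
Note that with the /event endpoint Splunk normally takes the timestamp from the optional time field in the HEC envelope, while index-time parsing settings (LINE_BREAKER, TIME_*) mainly come into play if you send to /raw. Either way, a rough starting point for the sourcetype could look like this - the timestamp field name and format are guesses, so adjust them to the real payload:

```
# props.conf - example sourcetype for the incoming JSON (field name/format assumed)
[sap:calm:analytics]
KV_MODE = json
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = "timestamp"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 40
TRUNCATE = 100000
```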


Option 2 - A possible Splunk component is the Splunk Add-on Builder. This requires installing it onto a Splunk instance such as a lab/test environment or a Splunk Heavy Forwarder, not onto production Splunk servers. You then develop a custom add-on that polls the API and collects the data for indexing. See the link below for reference.

https://docs.splunk.com/Documentation/AddonBuilder/4.2.0/UserGuide/ConfigureDataCollection  

Option 3 - Write Python code that pulls the data and sends it to HEC (this obviously requires code development).
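
If you go down this route, it is worth batching the events - HEC accepts several JSON event objects stacked in a single request body, which is much cheaper than one HTTP call per record. A rough sketch (host, token and sourcetype are placeholders):

```python
import json
import requests

HEC_URL = "https://<splunk-host>:8088/services/collector/event"
HEC_TOKEN = "<hec-token>"


def send_batch(records, sourcetype="sap:calm:analytics"):
    """Send many records in one HEC request by concatenating event envelopes."""
    body = "".join(
        json.dumps({"event": record, "sourcetype": sourcetype}) for record in records
    )
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        data=body.encode("utf-8"),
        timeout=30,
    )
    resp.raise_for_status()
```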


Option 4 - There are some API collection apps on Splunkbase, but I think those are third-party and may require a licence.

https://splunkbase.splunk.com/app/1546 

 


Dare2SplunkSAP
Explorer

I disagree with the suggested solution. Why not use something out of the box, like PowerConnect, to send the data to Splunk? You can do it directly from the systems reporting to ALM. PowerConnect is fully supported for ABAP, Java, and most SAP SaaS offerings.
