All Apps and Add-ons

Splunk Index to pull Data from SAP ALM

andgarciaa
Explorer

One of my customers is using a tool with a REST API available via SAP ALM Analytics.

API Ref.

https://api.sap.com/api/CALM_ANALYTICS/overview


They are looking to get data from the API into a Splunk index, so we suggested an intermediary application (like a scheduled function) to pull the data from SAP and send it to Splunk using an HEC token.

Is it possible to use something in Splunk directly to pull the data from the third party, or is the suggested intermediary approach (sketched below) a good way to go?
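For context, here is a rough sketch of the intermediary we have in mind, in Python with the requests library. The SAP resource path, the bearer token, the sourcetype name, and the HEC host are placeholders, not real values from our environment or the CALM_ANALYTICS API reference above.

```python
# Rough sketch of the intermediary "scheduled function". Everything in angle
# brackets (and the SAP resource path) is a placeholder, not a real value
# from our environment or the CALM_ANALYTICS API reference above.
import requests

SAP_API_URL = "https://<calm-tenant>/api/calm-analytics/v1/<resource>"  # placeholder path
SAP_BEARER_TOKEN = "<oauth-token-for-sap-alm>"                          # placeholder token

HEC_URL = "https://<splunk-host>:8088/services/collector/event"
HEC_TOKEN = "<your-hec-token>"


def fetch_from_sap():
    """Pull analytics data from the SAP ALM Analytics API."""
    resp = requests.get(
        SAP_API_URL,
        headers={"Authorization": f"Bearer {SAP_BEARER_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def send_to_hec(records):
    """Forward each JSON record to the Splunk HEC event endpoint."""
    for record in records:
        payload = {
            "event": record,                    # the JSON record itself
            "sourcetype": "sap:alm:analytics",  # made-up sourcetype name
            "source": "calm_analytics_api",
        }
        resp = requests.post(
            HEC_URL,
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
            json=payload,
            timeout=30,
        )
        resp.raise_for_status()


if __name__ == "__main__":
    # Triggered on a schedule (cron job, cloud scheduled function, etc.).
    data = fetch_from_sap()
    send_to_hec(data if isinstance(data, list) else [data])
```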

 

1 Solution

deepakc
Builder

Option 1 - If the app can send directly to HEC, that is the path of least resistance, so try that first: send the data as JSON, ensure it has a timestamp, and then create a sourcetype based on props and transforms.

You will need to send to either the HEC event or HEC raw endpoint - most likely the event endpoint - so test this out first (a small sketch of both follows the link below).

https://docs.splunk.com/Documentation/SplunkCloud/9.1.2312/Data/HECRESTendpoints 
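To show the difference between the two endpoints, here is a rough sketch using Python's requests library. Host, port, token, sourcetype and the channel identifier are placeholders; check the HEC documentation above for the exact requirements of your Splunk version.

```python
# Rough sketch of the two HEC endpoints. Host, port, token, sourcetype and
# channel are placeholders - check the HEC documentation linked above.
import json
import requests

HEC_BASE = "https://<splunk-host>:8088/services/collector"
HEADERS = {"Authorization": "Splunk <your-hec-token>"}
record = {"timestamp": "2024-05-01T10:00:00Z", "status": "OK"}  # example API record

# /event endpoint: wrap the payload in a JSON envelope under "event".
requests.post(
    f"{HEC_BASE}/event",
    headers=HEADERS,
    json={"event": record, "sourcetype": "sap:alm:analytics"},
    timeout=30,
)

# /raw endpoint: the body is indexed as-is; metadata goes in the query string.
# The raw endpoint may also require a channel identifier (any GUID you choose).
requests.post(
    f"{HEC_BASE}/raw",
    headers=HEADERS,
    params={
        "sourcetype": "sap:alm:analytics",
        "channel": "11111111-2222-3333-4444-555555555555",  # placeholder GUID
    },
    data=json.dumps(record),
    timeout=30,
)
```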


Option 2 - A possible Splunk component is the Splunk Add-on Builder. This requires installing it onto a Splunk instance such as a lab/test environment or a Splunk heavy forwarder, not onto production Splunk servers. You then develop a custom add-on to poll the API and collect the data for indexing. See the link below for reference.

https://docs.splunk.com/Documentation/AddonBuilder/4.2.0/UserGuide/ConfigureDataCollection  

Option 3 - Write Python code that pulls the data from the API and sends it to HEC (this obviously requires code development); a minimal batching sketch follows.
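As a starting point for Option 3, here is a minimal batching sketch, again Python with the requests library; the URL, token and sourcetype are placeholders. The HEC event endpoint accepts several event objects concatenated in one request body, which keeps the number of HTTP round trips down when the API returns many records.

```python
# Minimal batching sketch for Option 3. URL, token and sourcetype are
# placeholders. Multiple event objects are concatenated into one request
# body for the HEC event endpoint.
import json
import requests

HEC_URL = "https://<splunk-host>:8088/services/collector/event"
HEC_TOKEN = "<your-hec-token>"


def send_batch(records, sourcetype="sap:alm:analytics"):
    """Send a list of JSON records to HEC as one batched request."""
    body = "".join(
        json.dumps({"event": r, "sourcetype": sourcetype}) for r in records
    )
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        data=body,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # HEC acknowledgement, e.g. {"text": "Success", "code": 0}
```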


Option 4 - There are some API collection apps on Splunkbase, but I think those are third-party and may require a licence.

https://splunkbase.splunk.com/app/1546 

 


Dare2SplunkSAP
Explorer

I disagree with the solution suggested. Why not use something out of the box, like PowerConnect, to send the data to Splunk? You can do it directly from the systems reporting to ALM. PowerConnect is fully supported for ABAP, Java, and most SAP SaaS offerings.
