All Apps and Add-ons

Splunk Index to pull Data from SAP ALM

andgarciaa
Explorer

One of my customers is using a tool with a REST API available via SAP ALM Analytics.

API reference:

https://api.sap.com/api/CALM_ANALYTICS/overview


They are looking to get data from the API into a Splunk index, so we suggested an intermediary application (like a scheduled function) that pulls data from SAP and sends it to Splunk using an HEC token.

Is it possible to use something in Splunk directly to pull the data from the third party, or is the suggested approach a good way to go?
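
For context, a minimal sketch of what we have in mind for the SAP side of that intermediary, assuming the Cloud ALM Analytics API is called with an OAuth 2.0 client-credentials token (the token URL, API path, and environment variable names below are placeholders, not our actual configuration):

# Hypothetical sketch of the SAP side of the intermediary: fetch an OAuth
# token via client credentials and pull one analytics resource as JSON.
# TOKEN_URL, API_BASE and the resource path are placeholders - see the
# CALM_ANALYTICS reference on api.sap.com for the real endpoints.
import os
import requests

TOKEN_URL = os.environ["CALM_TOKEN_URL"]        # placeholder env var
API_BASE = os.environ["CALM_API_BASE"]          # placeholder env var
CLIENT_ID = os.environ["CALM_CLIENT_ID"]
CLIENT_SECRET = os.environ["CALM_CLIENT_SECRET"]

def get_token():
    # Standard OAuth 2.0 client-credentials flow
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(CLIENT_ID, CLIENT_SECRET),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_analytics(path):
    # GET one analytics resource; the path comes from the API reference
    resp = requests.get(
        API_BASE + path,
        headers={"Authorization": "Bearer " + get_token()},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(fetch_analytics("/api/calm-analytics/v1/..."))  # placeholder path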

1 Solution

deepakc
Builder

Option 1 - If the app can send direct to HEC, that is the path of least resistance, so try that first: send the data as JSON, make sure each record carries a timestamp, and then create a sourcetype with the appropriate props and transforms.

You will need to send to the HEC event or HEC raw endpoint (/services/collector/event or /services/collector/raw) - most likely the event endpoint - so test this out first.

https://docs.splunk.com/Documentation/SplunkCloud/9.1.2312/Data/HECRESTendpoints 
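
For example, a minimal call to the HEC event endpoint looks roughly like this (host, port, token, index and sourcetype are placeholders):

# Minimal HEC event-endpoint sketch; host, token, index and sourcetype
# are placeholders. The /services/collector/event endpoint expects a
# JSON envelope with the event payload and, ideally, an epoch "time".
import time
import requests

HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # placeholder
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                    # placeholder

payload = {
    "time": time.time(),                 # epoch timestamp for the event
    "index": "sap_alm",                  # placeholder index
    "sourcetype": "sap:calm:analytics",  # placeholder sourcetype
    "event": {"project": "demo", "metric": "open_tasks", "value": 42},
}

resp = requests.post(
    HEC_URL,
    headers={"Authorization": "Splunk " + HEC_TOKEN},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # HEC replies with {"text": "Success", "code": 0} on success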


Option 2 - A possible Splunk-side approach is the Splunk Add-on Builder. It needs to be installed on a Splunk instance such as a lab/test environment or a Splunk heavy forwarder, not on production Splunk servers. You then develop a custom add-on that polls the API and collects the data for indexing. See the reference below.

https://docs.splunk.com/Documentation/AddonBuilder/4.2.0/UserGuide/ConfigureDataCollection  

Option 3 - Write Python code that pulls the data and sends it to HEC (this obviously requires code development); a rough sketch follows.
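
Here is what such a collector could look like, assuming the API returns JSON records that carry an epoch timestamp field (all URLs, environment variable names, field names and the checkpoint scheme below are assumptions):

# Hypothetical scheduled collector: pull from SAP Cloud ALM Analytics,
# forward to Splunk HEC, and checkpoint the last timestamp seen so a
# re-run does not index duplicates. All URLs and field names below are
# placeholders - adapt them to the real CALM_ANALYTICS responses.
import json
import os
import time
import requests

CALM_URL = os.environ["CALM_ANALYTICS_URL"]    # full URL of one analytics resource
CALM_TOKEN = os.environ["CALM_BEARER_TOKEN"]   # OAuth token for the SAP API
HEC_URL = os.environ["SPLUNK_HEC_URL"]         # e.g. https://splunk:8088/services/collector/event
HEC_TOKEN = os.environ["SPLUNK_HEC_TOKEN"]
CHECKPOINT_FILE = "calm_checkpoint.json"

def load_checkpoint():
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return json.load(f).get("last_ts", 0.0)
    return 0.0

def save_checkpoint(ts):
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"last_ts": ts}, f)

def run_once():
    last_ts = load_checkpoint()
    records = requests.get(
        CALM_URL,
        headers={"Authorization": "Bearer " + CALM_TOKEN},
        timeout=60,
    ).json().get("results", [])              # "results" key is an assumption

    newest = last_ts
    for rec in records:
        ts = rec.get("timestamp", time.time())   # field name is an assumption
        if ts <= last_ts:
            continue                             # already indexed on a previous run
        requests.post(
            HEC_URL,
            headers={"Authorization": "Splunk " + HEC_TOKEN},
            json={"time": ts, "sourcetype": "sap:calm:analytics", "event": rec},
            timeout=30,
        ).raise_for_status()
        newest = max(newest, ts)

    save_checkpoint(newest)

if __name__ == "__main__":
    run_once()   # call this from cron or another scheduler at the desired interval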


Option 4 - There are some API collection apps on Splunkbase, but I think those are third-party and may require a licence.

https://splunkbase.splunk.com/app/1546 


Dare2SplunkSAP
Explorer

I disagree with the suggested solution. Why not use something out of the box, like PowerConnect, to send the data to Splunk? You can do it directly from the systems reporting to ALM. PowerConnect is fully supported for ABAP, Java, and most SAP SaaS offerings.


srbruso_AWI
New Member

SAP EC Payroll Cloud does not allow PowerConnect to be installed as an add-on, unlike the other SAP Cloud systems, so customers are forced to use SolMan or SAP Cloud ALM to monitor jobs.
