Getting Data In

IBM Common Data Provider for z Systems (mainframe): How to integrate to Splunk?

koshyk
Super Champion

I've seen multiple posts and links about integrating mainframes with Splunk. I can see plenty of theory and feature lists, but I want to see how things are done in practice, if you have any hands-on experience with the Common Data Provider:

  1. Common Data Provider: Is it a package for z Systems that IBM has to install on each mainframe server, or on a specific master server?
  2. How costly is it? Or is it part of the support package for z Systems?
  3. How is the data pushed from the "Data Streamer" component of z Systems to Splunk? Via syslog?
  4. Is there a TA or a data-format example for the type of data streamed from z Systems?

I feel the SplunkBase app is just an advertisement that redirects to the IBM website, with no practical examples in it.

1 Solution

skalliger
Motivator

Hi,

luckily, we recently had a meeting with IBM about CDP (Common Data Provider).
I don't know if I'm allowed to share too much detail, so I'll point out a few key features.

  1. CDP is usually installed with IBM as a proof of concept (PoC) first. The CDP is installed on every LPAR, and there is a central instance for configuring all the CDPs installed on the LPARs. To configure them, you define one or more security policies (which data to collect, etc.).
  2. CDP's list price is REALLY expensive. You might want to contact IBM for a realistic price.
  3. The CDP consists of different "jobs" (I'm no hostie, so don't judge my wording), of which the Data Streamer is one. There is a receiving software package that needs to be installed on one of your servers, preferably a heavy forwarder. The receiver then writes a file for every sourcetype, which the HF reads and forwards on to the indexers. So, no, it's not syslog. The transfer can also be encrypted.
  4. IBM has its own Splunk app, which you only get if you buy the CDP package.
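To illustrate point 3: once the receiver writes out per-sourcetype files, the heavy forwarder picks them up like any other file input. A minimal sketch of what such monitor stanzas might look like; the paths, sourcetype names, and index here are purely hypothetical, not IBM's actual layout:

```
# inputs.conf on the heavy forwarder (hypothetical paths and sourcetypes)
[monitor:///opt/cdp-receiver/output/syslog/*.log]
sourcetype = zos:syslog
index = mainframe

[monitor:///opt/cdp-receiver/output/smf/*.log]
sourcetype = zos:smf
index = mainframe
```

The real file locations and sourcetype names would come from the CDP receiver's own configuration and IBM's documentation.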

Any more questions?
IBM has a fairly new partnership with Splunk, so there might be some changes in the near future.

Skalli

Edit: added some more detail in 1.


lfedak_splunk
Splunk Employee

Hey @koshyk, If @skalliger answered your question, remember to "√Accept" the answer to award karma points 🙂


singh_1234567
Loves-to-Learn Lots

Hi mate,

I have a question: when you say "CDP receiver," what is it exactly?

And can we use the CDP receiver to forward the logs to a universal forwarder, and then forward them on to a heavy forwarder?

Please confirm, awaiting your reply.
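On the forwarding part of this question: chaining a universal forwarder to a heavy forwarder is standard Splunk configuration, independent of CDP (whether the CDP receiver can hand its output to a UF at all is a question for IBM's documentation). A generic sketch, with a hypothetical hostname and the default receiving port:

```
# outputs.conf on the universal forwarder (hostname is hypothetical)
[tcpout]
defaultGroup = heavy_forwarders

[tcpout:heavy_forwarders]
server = hf01.example.com:9997
```

The heavy forwarder would need a matching receiving input (`[splunktcp://9997]` in its inputs.conf) for this to work.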


koshyk
Super Champion

Thank you mate, very practical answer. We are looking into Syncsort as an option now, too.
