
Hadoop Ops App and Cloudera

Excelsys
New Member

I took a look at the Hadoop Ops App documentation, and it seems to say that only Cloudera's CDH3 distribution (which was already out of date a year ago) is supported, not CDH4. Can anyone confirm this, or is it a typo and CDH4 is supported?


pierre4splunk
Splunk Employee

Hi Excelsys,

First, I would refer you to my response to a related question asked earlier:

http://splunk-base.splunk.com/answers/75897/hadoopops-and-cdh4/76147

HadoopOps assumes that your HDFS and MapReduce service instances are based on the core daemons that have remained the same across all stable releases of Apache Hadoop (0.20.x through 1.0.4): the HDFS NameNode, SecondaryNameNode, and DataNode, plus the MapReduce v1 JobTracker and TaskTracker.

Most customers using HadoopOps on CDH4 today are deploying MapReduce v1 rather than the MapReduce 2.0 / YARN architecture with its ResourceManager and NodeManager daemons (Cloudera's docs advise the same, since MR2 is not yet considered production-ready). HadoopOps does not yet support MapReduce 2.0 / YARN deployments, regardless of distribution. It is on our roadmap, but so far we haven't seen significant adoption in production environments.
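
If you're not sure which framework a given CDH4 cluster is actually running, one quick check (a sketch only; the config directory, e.g. /etc/hadoop/conf, varies by install) is the mapreduce.framework.name property in mapred-site.xml. YARN deployments normally set it to "yarn", while MRv1 deployments leave it unset (or set it to "classic") and run the JobTracker and TaskTracker daemons instead:

    <!-- mapred-site.xml; the exact config directory is an assumption, adjust to your install -->
    <property>
      <name>mapreduce.framework.name</name>
      <!-- "yarn" means MR2/YARN; absent or "classic" means MRv1 (JobTracker/TaskTracker) -->
      <value>yarn</value>
    </property>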

Note that HadoopOps is not officially certified on CDH4, and our docs don't provide detailed install steps for navigating the CDH4 installation paths. But if you're familiar with the data inputs HadoopOps depends on, it's just a matter of identifying the right locations of properties and logs in your Hadoop environment. Many customers are using HadoopOps with CDH4 and extending the app with Splunk Core features as needed (e.g. to monitor HBase, ZooKeeper, and Hive).
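
To make "identifying the right locations of properties and logs" a bit more concrete, here is a minimal sketch of the kind of monitor stanzas involved. The log paths, the index name (hadoopops), and the sourcetype below are placeholders based on typical CDH4 package defaults (HDFS daemon logs under /var/log/hadoop-hdfs, MRv1 daemon logs under /var/log/hadoop-0.20-mapreduce); they are not the values HadoopOps ships with, so compare against the app's own inputs for the real stanza names:

    # inputs.conf sketch; paths, index, and sourcetype are assumptions, not HadoopOps defaults
    [monitor:///var/log/hadoop-hdfs/*.log]
    index = hadoopops
    sourcetype = hadoop_daemon_log
    disabled = false

    [monitor:///var/log/hadoop-0.20-mapreduce/*.log]
    index = hadoopops
    sourcetype = hadoop_daemon_log
    disabled = false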

The HadoopOps install experience will continue to improve. In future versions, the app and docs will rely more on the Splunk admin to provide the access paths or endpoints for the necessary inputs, rather than assuming knowledge of how Hadoop services are packaged in a particular vendor's tarball.

Hope this helps. Let us know how it goes!


pierre4splunk
Splunk Employee

By the way, with regard to becoming familiar with the data inputs HadoopOps depends on, one useful resource is the inputs.conf.template shipped with v1.1, which includes descriptions of each input stanza.
