All Apps and Add-ons

Can you help me find or load main class org.apache.hadoop.fs.FsShell during configure Splunk Hadoop Connect application?

yko84109
Loves-to-Learn

Hi,

I'm trying to configure the Splunk Hadoop Connect application with the following configurations:

HDFS URI: mynamenode:8020
HADOOP_HOME: /opt/cloudera/parcels/CDH
JAVA_HOME: /usr/lib/jvm/java (also tried /usr/java/latest)
Namenode HTTP Port: 50070

And I get the following error:

Unable to connect to Hadoop cluster 'hdfs://mynamenode:8020/' with principal 'None': Failed to run Hadoop CLI job command '-ls' with options 'hdfs://mynamenode:8020/' Error: Could not find or load main class org.apache.hadoop.fs.FsShell.

How can I solve this?

Thanks!


rdagan_splunk
Splunk Employee

To debug this, can you run the following commands from the command line:
which hadoop
and
which java
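For reference, Hadoop Connect expects the home directories, not the executables themselves. A minimal sketch of turning a `which` result into the value to enter in the app (the path below is a hypothetical CDH parcel layout, used only for illustration):

```shell
# Suppose `which hadoop` printed this path (typical for a CDH parcel install):
hadoop_bin=/opt/cloudera/parcels/CDH/bin/hadoop

# The HADOOP_HOME value is that path with the trailing /bin/hadoop removed,
# i.e. two dirname calls; the same trick works for `which java` -> JAVA_HOME.
hadoop_home=$(dirname "$(dirname "$hadoop_bin")")
echo "$hadoop_home"   # prints /opt/cloudera/parcels/CDH
```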


yko84109
Loves-to-Learn

Hi.
That's what I'm doing, and I still get the same error.


rdagan_splunk
Splunk Employee

As you can see in this video ( https://www.youtube.com/watch?v=TmYHsabpk_Q ), when you set up the Java path and Hadoop path inside Splunk Hadoop Connect, you should not include the trailing bin/hadoop or bin/java part.
Can you share the output of the ' which hadoop ' and ' which java ' commands?
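For example, the form fields would look something like this (paths are illustrative only; use whatever your `which hadoop` and `which java` output shows, minus the `bin/...` suffix):

```
HDFS URI:           hdfs://mynamenode:8020
HADOOP_HOME:        /opt/cloudera/parcels/CDH    <- not .../CDH/bin/hadoop
JAVA_HOME:          /usr/lib/jvm/java            <- not .../java/bin/java
Namenode HTTP Port: 50070
```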
