Getting Data In

Error configuring Hadoop Connect on Windows.


I installed Hortonworks and the latest version of Splunk on the same Windows 2008 Server VMware machine. Within Splunk Web, I installed the latest Hadoop Connect for Windows. When I add a new HDFS cluster, it tells me: "Cannot find Java command under bin directory JAVA_HOME='C:\java\jdk1.6.0_31\'". I know this is the correct directory, because I can navigate to it on that machine, and it does contain a 'bin' directory with all the Java executables. I also tried reformatting the JAVA_HOME entry field as any of the following:
and it still gives the same error.

What am I to enter to make this work?
Thanks, Bill.



In the configuration section you should have the following:

  • namenode (x.x.x.x:8020, where x.x.x.x is your namenode IP)
  • hadoop home (/opt/hadoop-2.2.0 on my CentOS box; this is the base directory where your Hadoop client is installed)
  • java home (/usr/local/java/jdk1.6.0_31/ for me; yours should be something like C:\Program Files\Java\jre, or wherever your root Java install is)


I did as instructed.
HADOOP_HOME: C:\hdp\hadoop-
JAVA_HOME: C:\java\jdk1.6.0_31
Namenode HTTP Port: 50070

The above directories are correct, as I verified them in a Command Prompt, and the HDFS URI is correct, because the following command works:
hadoop fs -ls hdfs://
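One detail worth double-checking here: 50070 is the namenode's HTTP (web UI) port, while the hdfs:// URI used by hadoop fs normally carries the RPC port, which defaults to 8020 in many distributions. A quick way to see which host and port a URI actually specifies (the hostname below is a hypothetical example):

```python
from urllib.parse import urlparse

# Hypothetical namenode host; substitute your own hdfs:// URI.
uri = "hdfs://namenode.example.com:8020/"
parsed = urlparse(uri)
print(parsed.hostname, parsed.port)  # -> namenode.example.com 8020
```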

The entire error response I am receiving is:
Unable to connect to Hadoop cluster 'hdfs://' with principal 'None': Invalid JAVA_HOME. Cannot find Java command under bin directory JAVA_HOME='C:\java\jdk1.6.0_31'..

What should I do to fix this error?