I successfully connected Splunk to my Hadoop cluster, and I could also successfully run the test commands specified in the Splunk Hadoop Connect documentation.
Test write access to the Hadoop cluster
To test write access to your Hadoop cluster, run this command in the path where you want to export data:
$HADOOP_HOME/bin/hadoop fs -touchz hdfs://<namenode>:<port>/<path>/foo.txt
$HADOOP_HOME/bin/hadoop fs -rm hdfs://<namenode>:<port>/<path>/foo.txt
But when I schedule it to send the log files, I get the error below:
ERROR: Error in 'movehdfs' command: {"error": "moveFromLocal: File /CYF-TS1/WinEventLog_System/17a9a87ea0c80e26fe8f4f366a516928_1461297600_1461326400_17_1.json.gz.hdfs.COPYING could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.", "id": "HCERR0001", "message": "Failed to run hadoop CLI Job", "cmd": "-moveFromLocal", "options": "/opt/splunk/var/run/splunk/dispatch/1461353114.62/dump/CYF-TS1/WinEventLog:System/17a9a87ea0c80e26fe8f4f366a516928_1461297600_1461326400_17_1.json.gz,hdfs://172.16.30.140:8020/CYF-TS1/WinEventLog_System/17a9a87ea0c80e26fe8f4f366a516928_1461297600_1461326400_17_1.json.gz.hdfs"}
Please let me know how to resolve this issue.
Have you tried running the same command that Hadoop Connect runs?
It looks like Hadoop Connect is trying to run the command hadoop fs -moveFromLocal.
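The "could only be replicated to 0 nodes instead of minReplication (=1)" message usually means the datanode is full, dead, or excluded by the namenode, rather than a Splunk problem. A minimal diagnostic sketch, assuming HADOOP_HOME is set on the Splunk server and using the namenode address 172.16.30.140:8020 taken from the error message (test.json.gz is a hypothetical placeholder file, not one of your exports):

```shell
# Check datanode capacity and liveness -- look for "DFS Remaining: 0"
# or a dead/excluded datanode in the report:
$HADOOP_HOME/bin/hadoop dfsadmin -report

# Re-run the exact operation Hadoop Connect performs, as the same OS
# user that runs Splunk, with a small local test file:
echo test | gzip > /tmp/test.json.gz
$HADOOP_HOME/bin/hadoop fs -moveFromLocal /tmp/test.json.gz \
    hdfs://172.16.30.140:8020/CYF-TS1/WinEventLog_System/test.json.gz
```

If the manual moveFromLocal fails the same way, the issue is on the cluster side (disk space, datanode registration, or a firewall blocking the datanode's data-transfer port), not in the Splunk configuration.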
Have you seen this video on installing the app: http://www.splunk.com/view/SP-CAAAHBZ
Note:
- Do not include hdfs:// in the HDFS URI
- Do not include bin/hadoop in the Hadoop Home
- Do not include bin/java in the Java Home
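To make the notes above concrete, here is an illustrative sketch of how those three settings might look in the Hadoop Connect cluster configuration. The host and paths are assumptions for this example; substitute your own namenode address, Hadoop installation directory, and JDK directory:

```shell
# HDFS URI -- host:port only, no hdfs:// scheme:
#   172.16.30.140:8020
# Hadoop Home -- the installation root, not .../bin/hadoop:
#   /usr/lib/hadoop
# Java Home -- the JDK root, not .../bin/java:
#   /usr/lib/jvm/java

# Quick sanity check that the Home values are roots, not binaries:
test -x /usr/lib/hadoop/bin/hadoop && echo "Hadoop Home looks right"
test -x /usr/lib/jvm/java/bin/java && echo "Java Home looks right"
```

Hadoop Connect appends bin/hadoop and bin/java itself, which is why including them in the settings produces broken paths like .../bin/hadoop/bin/hadoop.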