Error while connecting Splunk with Hadoop

VP34333
New Member

Hi Team,
I am getting the below error while adding the cluster configuration information:

Unable to connect to Hadoop cluster 'hdfs://192.168.102.82:8020/' with principal 'None': Invalid HADOOP_HOME. Cannot find Hadoop command under bin directory HADOOP_HOME='/usr/lib/hadoop'.

Please find below the details I have entered.

HDFS URI *

namenode.hadoop.example.com:8020 (non-HA) or hadoop.cluster.logical.name (HA)

HOST06:8020

HADOOP_HOME *

/usr/lib/hadoop

JAVA_HOME *

/usr/java/jdk1.7.0_45

Please let me know what I am doing wrong.
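That error message means Splunk could not find an executable `hadoop` launcher under `$HADOOP_HOME/bin`. A quick way to verify this directly on the Linux host (a minimal sketch, assuming a POSIX shell; the path is the one from your post):

```shell
# Succeeds only if DIR/bin/hadoop exists and is executable,
# which is what the provider setup is looking for.
check_hadoop_home() {
    [ -x "$1/bin/hadoop" ]
}

# Path taken from the post; adjust to your install.
if check_hadoop_home /usr/lib/hadoop; then
    echo "hadoop launcher found"
else
    echo "no executable hadoop under /usr/lib/hadoop/bin"
fi
```

If the check fails, locate the real launcher (some distributions lay the install out differently or symlink `hadoop` from `/usr/bin`) and point HADOOP_HOME at the directory whose `bin/` actually contains it.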

Thanks
Vikas Pabale


rahulgaikwad198
New Member

Hi All,
Getting the below error while connecting Hunk with a highly available Hadoop cluster (HA NameNode and HA ResourceManager).

04-20-2016 04:09:30.605 INFO  ERP.ha_poc -  VirtualIndex$Splitter - generateSplits started, vix.name=emp_index ...
04-20-2016 04:09:31.320 INFO  ERP.ha_poc -  TimelineClientImpl - Timeline service address: http://hadoop1.poc.com:8188/ws/v1/timeline/
04-20-2016 04:09:31.351 INFO  ERP.ha_poc -  RMProxy - Connecting to ResourceManager at /52.XX.XX.8:8050
04-20-2016 04:09:32.298 WARN  ERP.ha_poc -  SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SplunkJournalRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SplunkJournalRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=/journal\.gz$.
04-20-2016 04:09:32.300 WARN  ERP.ha_poc -  SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.ValueAvroRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.ValueAvroRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=\.avro$.
04-20-2016 04:09:32.301 WARN  ERP.ha_poc -  SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SimpleCSVRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SimpleCSVRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=\.([tc]sv)(?:\.(?:gz|bz2|snappy))?$.
04-20-2016 04:09:32.303 WARN  ERP.ha_poc -  SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SequenceFileRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SequenceFileRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=\.seq$.
04-20-2016 04:09:32.313 INFO  ERP.ha_poc -  JobSubmitterInputFormat - using class=com.splunk.mr.input.SplunkLineRecordReader to process split=/apps/hive/warehouse/emp/000000_0:0+134217728
04-20-2016 04:09:32.473 WARN  ERP.ha_poc -  ResourceMgrDelegate - getBlacklistedTrackers - Not implemented yet
04-20-2016 04:09:32.475 INFO  ERP.ha_poc -  ClusterInfoLogger - Hadoop cluster spec: provider=ha_poc, tasktrackers=2, map_inuse=1, map_slots=20, reduce_inuse=1, reduce_slots=4
04-20-2016 04:09:32.598 INFO  ERP.ha_poc -  SplunkBaseMapper - using class=com.splunk.mr.input.SplunkLineRecordReader to process split=/apps/hive/warehouse/emp/000000_0:0+134217728
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -  BlockReaderFactory - I/O error constructing remote block reader.
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -  java.net.ConnectException: Connection refused
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer(DFSClient.java:3454)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer(BlockReaderFactory.java:777)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:694)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:355)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:618)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at java.io.DataInputStream.read(DataInputStream.java:149)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.fillBuffer(UncompressedSplitLineReader.java:59)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.util.LineReader.readDefaultLine(LineReader.java:216)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.util.LineReader.readLine(LineReader.java:174)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.readLine(UncompressedSplitLineReader.java:91)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.skipUtfByteOrderMark(LineRecordReader.java:144)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:184)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.SplunkLineRecordReader.nextKeyValue(SplunkLineRecordReader.java:39)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.doStream(SplunkBaseMapper.java:410)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:375)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:331)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:644)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:656)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:653)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.FileSplitGenerator.sendSplitToAcceptor(FileSplitGenerator.java:28)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.FileSplitGenerator.generateSplits(FileSplitGenerator.java:79)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1418)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1396)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.addStatus(VirtualIndex.java:576)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.listStatus(VirtualIndex.java:609)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$Splitter.generateSplits(VirtualIndex.java:1566)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1485)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1437)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VixSplitGenerator.generateSplits(VixSplitGenerator.java:55)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:674)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.executeImpl(SplunkMR.java:936)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.execute(SplunkMR.java:771)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.runImpl(SplunkMR.java:1518)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.run(SplunkMR.java:1300)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.main(SplunkMR.java:1546)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -  DFSInputStream - Failed to connect to /10.XX.XX.84:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -  java.net.ConnectException: Connection refused
    [... same stack trace as above ...]
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -  BlockReaderFactory - I/O error constructing remote block reader.
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -  java.net.ConnectException: Connection refused
    [... same stack trace as above ...]
04-20-2016 04:09:32.887 WARN  ERP.ha_poc -  DFSInputStream - Failed to connect to /10.XX.XX.31:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused
04-20-2016 04:09:32.887 WARN  ERP.ha_poc -  java.net.ConnectException: Connection refused
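Note that the NameNode lookup is succeeding here; the `Connection refused` is against datanode port 50010, so Hunk can list the file but cannot read its blocks. A quick sketch to list which datanode addresses are being refused (assuming the log excerpt above is saved as `search.log`; the file name is an assumption, use your ERP search log):

```shell
# Extract the distinct datanode host:port pairs that refused
# connections from the DFSInputStream warnings.
if [ -f search.log ]; then
    grep -o 'Failed to connect to /[^ ]*' search.log \
        | sed 's|Failed to connect to /||' \
        | sort -u
fi
```

Every address it prints needs to be reachable from the Hunk search head on the datanode transfer port; firewalls, or datanodes bound to cluster-internal addresses (`dfs.datanode.address`), are the usual causes.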

McDonaldPa08
Explorer

I experienced errors trying to run Hunk searches and found that the issue was a result of insufficient Splunk-user permissions. Make sure you created a sudo user as per the installation instructions and are running Splunk with the proper permissions.


bsheppard_splun
Splunk Employee

You may also want to check the port number. The Hadoop ecosystem does not have standardized port numbers. Cloudera uses 8020 for the NameNode default port. It's also fairly common to use 9000.
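To confirm which port the cluster actually uses, running `hdfs getconf -confKey fs.defaultFS` on a cluster node prints the configured URI. As a sketch, the host and port can then be split out with plain parameter expansion (the example URI below is the one from the field hint, not your cluster's value):

```shell
# Example value; on the cluster, obtain the real one with:
#   hdfs getconf -confKey fs.defaultFS
uri="hdfs://namenode.hadoop.example.com:8020"

hostport=${uri#hdfs://}     # drop the scheme
hostport=${hostport%%/*}    # drop any trailing path
host=${hostport%%:*}
port=${hostport##*:}
echo "host=$host port=$port"
```

With an HA logical name there is no host:port at all, which is why the HDFS URI field hint distinguishes the two forms.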


nhaddadkaveh_sp
Splunk Employee

Vikas,

HDP 2.0.6 is compatible with Hunk.


VP34333
New Member

Thanks nhaddadkaveh_splunk,
we now have the connection between Hunk and Hadoop working.

Thanks,
Vikas Pabale


VP34333
New Member

I am still getting the same error.

I installed Splunk on Windows 7 (32-bit), and Hortonworks Data Platform (HDP) 2.0.6 (pseudo-distributed mode) is installed on a Linux server; both are on the same network.

Thanks,
Vikas Pabale


nhaddadkaveh_sp
Splunk Employee
Splunk Employee

In order to work with a Hadoop cluster you need to install Hunk, not Splunk. Also, we don't support Hunk on Windows.


VP34333
New Member

We have Hortonworks Data Platform (HDP) 2.0.6 (pseudo-distributed mode).

Please let me know whether this distribution is compatible or not.

Thanks,
Vikas Pabale


sdorich
Communicator

I've never tried to connect Splunk with Hadoop, but with an error such as yours I'd first make sure JAVA_HOME is set to the location of the correct Java version. Then I'd find the Hadoop command under the bin directory and literally tell Splunk exactly which directory to look in (HADOOP_HOME='/usr/lib/hadoop/bin/'). Have you read all the sections in Splunk's documentation for this? You are probably just missing something. Try this link.

Again, I've never actually tried what you're doing myself; these are just a couple of ideas to try.
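Both paths can be sanity-checked at once on the Linux host (a minimal sketch, assuming a POSIX shell; the paths are the ones from the original post, adjust to your hosts):

```shell
# Paths from the original post; adjust to your install.
JAVA_HOME=/usr/java/jdk1.7.0_45
HADOOP_HOME=/usr/lib/hadoop

# Report whether each required launcher exists and is executable.
for f in "$JAVA_HOME/bin/java" "$HADOOP_HOME/bin/hadoop"; do
    if [ -x "$f" ]; then
        echo "ok: $f"
    else
        echo "missing or not executable: $f"
    fi
done
```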
