
Error when Hunk connects to HA Hadoop (HA NameNode and HA ResourceManager)


Hi all,

I'm getting the error below when Hunk connects to an HA Hadoop cluster:

BlockReaderFactory - I/O error constructing remote block reader.
java.net.ConnectException: Connection refused

Please let me know if you need more details, and help me resolve this issue.

Log:

04-20-2016 04:09:30.605 INFO ERP.hapoc - VirtualIndex$Splitter - generateSplits started, vix.name=empindex ...
04-20-2016 04:09:31.320 INFO ERP.hapoc - TimelineClientImpl - Timeline service address: http://hadoop1.poc.com:8188/ws/v1/timeline/
04-20-2016 04:09:31.351 INFO ERP.hapoc - RMProxy - Connecting to ResourceManager at /XX.XX.XX.8:8050
04-20-2016 04:09:32.298 WARN ERP.hapoc - SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SplunkJournalRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SplunkJournalRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/0000000, regex=/journal.gz$.
04-20-2016 04:09:32.300 WARN ERP.hapoc - SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.ValueAvroRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.ValueAvroRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/0000000, regex=.avro$.
04-20-2016 04:09:32.301 WARN ERP.hapoc - SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SimpleCSVRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SimpleCSVRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/0000000, regex=.([tc]sv)(?:.(?:gz|bz2|snappy))?$.
04-20-2016 04:09:32.303 WARN ERP.hapoc - SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SequenceFileRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SequenceFileRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/0000000, regex=.seq$.
04-20-2016 04:09:32.313 INFO ERP.hapoc - JobSubmitterInputFormat - using class=com.splunk.mr.input.SplunkLineRecordReader to process split=/apps/hive/warehouse/emp/0000000:0+134217728
04-20-2016 04:09:32.473 WARN ERP.hapoc - ResourceMgrDelegate - getBlacklistedTrackers - Not implemented yet
04-20-2016 04:09:32.475 INFO ERP.hapoc - ClusterInfoLogger - Hadoop cluster spec: provider=hapoc, tasktrackers=2, mapinuse=1, mapslots=20, reduceinuse=1, reduceslots=4
04-20-2016 04:09:32.598 INFO ERP.hapoc - SplunkBaseMapper - using class=com.splunk.mr.input.SplunkLineRecordReader to process split=/apps/hive/warehouse/emp/0000000:0+134217728
04-20-2016 04:09:32.878 WARN ERP.hapoc - BlockReaderFactory - I/O error constructing remote block reader.
04-20-2016 04:09:32.878 WARN ERP.hapoc - java.net.ConnectException: Connection refused
04-20-2016 04:09:32.878 WARN ERP.hapoc - at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer(DFSClient.java:3454)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer(BlockReaderFactory.java:777)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:694)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:355)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:618)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at java.io.DataInputStream.read(DataInputStream.java:149)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.fillBuffer(UncompressedSplitLineReader.java:59)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.util.LineReader.readDefaultLine(LineReader.java:216)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.util.LineReader.readLine(LineReader.java:174)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.readLine(UncompressedSplitLineReader.java:91)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.skipUtfByteOrderMark(LineRecordReader.java:144)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:184)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.SplunkLineRecordReader.nextKeyValue(SplunkLineRecordReader.java:39)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkBaseMapper.doStream(SplunkBaseMapper.java:410)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:375)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:331)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:644)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:656)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:653)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.FileSplitGenerator.sendSplitToAcceptor(FileSplitGenerator.java:28)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.FileSplitGenerator.generateSplits(FileSplitGenerator.java:79)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1418)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1396)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.addStatus(VirtualIndex.java:576)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.listStatus(VirtualIndex.java:609)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.VirtualIndex$Splitter.generateSplits(VirtualIndex.java:1566)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1485)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1437)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.input.VixSplitGenerator.generateSplits(VixSplitGenerator.java:55)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:674)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkMR$SearchHandler.executeImpl(SplunkMR.java:936)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkMR$SearchHandler.execute(SplunkMR.java:771)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkMR.runImpl(SplunkMR.java:1518)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkMR.run(SplunkMR.java:1300)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
04-20-2016 04:09:32.878 WARN ERP.hapoc - at com.splunk.mr.SplunkMR.main(SplunkMR.java:1546)
04-20-2016 04:09:32.884 WARN ERP.hapoc - DFSInputStream - Failed to connect to /XX.XX.XX.XX:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused
04-20-2016 04:09:32.884 WARN ERP.hapoc - java.net.ConnectException: Connection refused
[same stack trace as above, repeated]
04-20-2016 04:09:32.886 WARN ERP.hapoc - BlockReaderFactory - I/O error constructing remote block reader.
04-20-2016 04:09:32.886 WARN ERP.hapoc - java.net.ConnectException: Connection refused
[same stack trace as above, repeated]
04-20-2016 04:09:32.887 WARN ERP.hapoc - DFSInputStream - Failed to connect to /XX.XX.XX.XX:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused
04-20-2016 04:09:32.887 WARN ERP.hapoc - java.net.ConnectException: Connection refused
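
Side note: the `SplunkBaseMapper` WARN lines earlier in the log ("Could not create preprocessor object, will try the next one") are benign. Hunk tries each record reader's path regex in turn and falls back to the plain line reader when none matches, which is exactly what the subsequent INFO line reports. A minimal Python sketch of that first-match selection (regexes transcribed from the log with dots escaped; this is an illustration, not Hunk's actual code):

```python
import re

# Reader names and path regexes as printed in the WARN lines above
# (dots escaped here; the log prints them unescaped).
READERS = [
    ("com.splunk.mr.input.SplunkJournalRecordReader", r"/journal\.gz$"),
    ("com.splunk.mr.input.ValueAvroRecordReader",     r"\.avro$"),
    ("com.splunk.mr.input.SimpleCSVRecordReader",     r"\.([tc]sv)(?:\.(?:gz|bz2|snappy))?$"),
    ("com.splunk.mr.input.SequenceFileRecordReader",  r"\.seq$"),
]
FALLBACK = "com.splunk.mr.input.SplunkLineRecordReader"

def pick_reader(path: str) -> str:
    """Return the first reader whose path regex matches, else the line reader."""
    for name, pattern in READERS:
        if re.search(pattern, path):
            return name
    return FALLBACK
```

The Hive warehouse file `/apps/hive/warehouse/emp/0000000` matches none of the regexes, so the line reader is selected; those warnings are unrelated to the connection failure.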


Splunk Employee

It looks as if one of your two data nodes is unreachable. The key error is: `Failed to connect to /XX.XX.XX.XX:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused` — port 50010 is the default DataNode data-transfer port, so check that the DataNode process is running on that host and that the port is open from the Hunk search head.
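
A "Connection refused" here means the client already got the block locations from the NameNode but the TCP connection to the DataNode's transfer port was actively rejected. A quick reachability probe from the Hunk search head can confirm which DataNode is down — a small, hypothetical Python check (the hostnames below are placeholders; substitute your actual DataNodes):

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError, timeouts, DNS failures
        return False

# Placeholder addresses -- substitute each DataNode in your cluster.
# for host in ("datanode1.example.com", "datanode2.example.com"):
#     print(host, port_open(host, 50010))  # 50010 = default DataNode transfer port
```

If this returns False for a node, verify the DataNode process is up on that host and that no firewall blocks port 50010 from the search head.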
