<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: error while connecting splunk with hadoop in All Apps and Add-ons</title>
    <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152455#M13751</link>
    <description>&lt;P&gt;We have Hortonworks Data Platform (HDP) 2.0.6 (pseudo distributed mode)&lt;/P&gt;

&lt;P&gt;Please let me know whether this distribution is compatible or not.&lt;/P&gt;

&lt;P&gt;Thanks Vikas Pabale &lt;/P&gt;</description>
    <pubDate>Tue, 04 Mar 2014 18:03:17 GMT</pubDate>
    <dc:creator>VP34333</dc:creator>
    <dc:date>2014-03-04T18:03:17Z</dc:date>
    <item>
      <title>error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152451#M13747</link>
      <description>&lt;P&gt;Hi Team,&lt;BR /&gt;
I am getting the below error while adding the cluster configuration information.&lt;/P&gt;

&lt;P&gt;Unable to connect to Hadoop cluster 'hdfs://192.168.102.82:8020/' with principal 'None': Invalid HADOOP_HOME. Cannot find Hadoop command under bin directory HADOOP_HOME='/usr/lib/hadoop'.&lt;/P&gt;

&lt;P&gt;Please find below the details I have provided.&lt;/P&gt;

&lt;H1&gt;HDFS URI *&lt;/H1&gt;

&lt;H1&gt;namenode.hadoop.example.com:8020 (non-HA) or hadoop.cluster.logical.name (HA)&lt;/H1&gt;

&lt;P&gt;HOST06:8020&lt;/P&gt;

&lt;H1&gt;HADOOP_HOME *&lt;/H1&gt;

&lt;P&gt;/usr/lib/hadoop&lt;/P&gt;

&lt;H1&gt;JAVA_HOME *&lt;/H1&gt;

&lt;P&gt;/usr/java/jdk1.7.0_45&lt;/P&gt;

&lt;P&gt;Please let me know where I am going wrong.&lt;/P&gt;
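
The "Invalid HADOOP_HOME" message says the app could not find an executable hadoop command under the bin directory of the configured HADOOP_HOME. A minimal sketch of an equivalent check (this is illustrative, not the app's actual code; the path is the one from the error above):

```python
import os

def hadoop_home_valid(hadoop_home):
    """Return True if an executable 'hadoop' command exists under bin/."""
    cmd = os.path.join(hadoop_home, "bin", "hadoop")
    return os.path.isfile(cmd) and os.access(cmd, os.X_OK)

print(hadoop_home_valid("/usr/lib/hadoop"))
```

If this prints False on the search head, the configured HADOOP_HOME does not contain bin/hadoop and the connection attempt will fail with the error shown.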

&lt;P&gt;Thanks&lt;BR /&gt;
Vikas Pabale&lt;/P&gt;</description>
      <pubDate>Mon, 28 Sep 2020 15:53:16 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152451#M13747</guid>
      <dc:creator>VP34333</dc:creator>
      <dc:date>2020-09-28T15:53:16Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152452#M13748</link>
      <description>&lt;P&gt;I've never tried to connect Splunk with Hadoop, but with an error such as yours I'd first make sure JAVA_HOME is set to the location of the correct version of Java needed. Then I'd find the Hadoop command under the bin directory and literally tell Splunk exactly what directory to look in (HADOOP_HOME='/usr/lib/hadoop/bin/'). Have you read all the sections in Splunk's documentation for this? You are probably just missing something. Try &lt;A href="http://docs.splunk.com/Documentation/HadoopConnect/latest/DeployHadoopConnect/Configuretheapp" target="_blank"&gt;this&lt;/A&gt; link.&lt;/P&gt;
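
To "find the Hadoop command" without guessing, a rough sketch that walks a candidate directory tree and reports every directory containing an executable named hadoop (purely illustrative; the root path is an assumption from the error message):

```python
import os

def find_hadoop(root):
    """Walk root and yield each directory holding an executable 'hadoop'."""
    for dirpath, dirnames, filenames in os.walk(root):
        if "hadoop" in filenames:
            cmd = os.path.join(dirpath, "hadoop")
            if os.access(cmd, os.X_OK):
                yield dirpath

for d in find_hadoop("/usr/lib/hadoop"):
    print(d)
```

Whichever directory it prints is where the hadoop binary actually lives, which you can compare against what the app expects.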

&lt;P&gt;To be clear, I've never actually tried to do what you're doing; these are just a couple of ideas to try.&lt;/P&gt;</description>
      <pubDate>Mon, 28 Sep 2020 15:53:24 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152452#M13748</guid>
      <dc:creator>sdorich</dc:creator>
      <dc:date>2020-09-28T15:53:24Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152453#M13749</link>
      <description>&lt;P&gt;You can follow these instructions:&lt;BR /&gt;
&lt;A href="http://docs.splunk.com/Documentation/HadoopConnect/latest/DeployHadoopConnect/Downloadandinstalltheapp"&gt;http://docs.splunk.com/Documentation/HadoopConnect/latest/DeployHadoopConnect/Downloadandinstalltheapp&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 14 Feb 2014 22:24:27 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152453#M13749</guid>
      <dc:creator>nhaddadkaveh_sp</dc:creator>
      <dc:date>2014-02-14T22:24:27Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152454#M13750</link>
      <description>&lt;P&gt;You may also want to check the port number. The Hadoop ecosystem does not have standardized port numbers. Cloudera uses 8020 for the NameNode default port. It's also fairly common to use 9000. &lt;/P&gt;</description>
      <pubDate>Wed, 19 Feb 2014 07:25:39 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152454#M13750</guid>
      <dc:creator>bsheppard_splun</dc:creator>
      <dc:date>2014-02-19T07:25:39Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152455#M13751</link>
      <description>&lt;P&gt;We have Hortonworks Data Platform (HDP) 2.0.6 (pseudo distributed mode)&lt;/P&gt;

&lt;P&gt;Please let me know whether this distribution is compatible or not.&lt;/P&gt;

&lt;P&gt;Thanks Vikas Pabale &lt;/P&gt;</description>
      <pubDate>Tue, 04 Mar 2014 18:03:17 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152455#M13751</guid>
      <dc:creator>VP34333</dc:creator>
      <dc:date>2014-03-04T18:03:17Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152456#M13752</link>
      <description>&lt;P&gt;I experienced errors trying to run Hunk searches and found that the issue was a result of insufficient Splunk-user permissions.  Make sure you created a sudo user as per the installation instructions and are running Splunk with the proper permissions.  &lt;/P&gt;</description>
      <pubDate>Tue, 04 Mar 2014 18:07:13 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152456#M13752</guid>
      <dc:creator>McDonaldPa08</dc:creator>
      <dc:date>2014-03-04T18:07:13Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152457#M13753</link>
      <description>&lt;P&gt;Vikas,&lt;/P&gt;

&lt;P&gt;HDP 2.0.6 is compatible with Hunk.&lt;/P&gt;</description>
      <pubDate>Tue, 04 Mar 2014 18:11:31 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152457#M13753</guid>
      <dc:creator>nhaddadkaveh_sp</dc:creator>
      <dc:date>2014-03-04T18:11:31Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152458#M13754</link>
      <description>&lt;P&gt;I am still getting the same error.&lt;/P&gt;

&lt;P&gt;I installed Splunk on Windows 7 (32-bit) &amp;amp; Hortonworks Data Platform (HDP) 2.0.6 (pseudo-distributed mode) is installed on a Linux server; both are on the same network.&lt;/P&gt;

&lt;P&gt;Thanks ,&lt;BR /&gt;
Vikas Pabale&lt;/P&gt;</description>
      <pubDate>Wed, 05 Mar 2014 05:03:10 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152458#M13754</guid>
      <dc:creator>VP34333</dc:creator>
      <dc:date>2014-03-05T05:03:10Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152459#M13755</link>
      <description>&lt;P&gt;In order to work with a Hadoop cluster you need to install Hunk, not Splunk. Also, we don't support Hunk on Windows.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Mar 2014 05:34:15 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152459#M13755</guid>
      <dc:creator>nhaddadkaveh_sp</dc:creator>
      <dc:date>2014-03-05T05:34:15Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152460#M13756</link>
      <description>&lt;P&gt;Thanks, nhaddadkaveh_splunk,&lt;BR /&gt;
we have now completed the connection between Hunk and Hadoop.&lt;/P&gt;

&lt;P&gt;Thanks ,&lt;BR /&gt;
Vikas Pabale&lt;/P&gt;</description>
      <pubDate>Wed, 05 Mar 2014 10:33:06 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152460#M13756</guid>
      <dc:creator>VP34333</dc:creator>
      <dc:date>2014-03-05T10:33:06Z</dc:date>
    </item>
    <item>
      <title>Re: error while connecting splunk with hadoop</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152461#M13757</link>
      <description>&lt;P&gt;Hi All,&lt;BR /&gt;
I am getting the below error while connecting Hunk with a highly available Hadoop (HA NameNode and HA ResourceManager) cluster.&lt;/P&gt;
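
The repeated "Connection refused" lines in the log below are for the datanode port (50010), which suggests the search head cannot open a TCP connection to the datanodes at all. A quick reachability probe can be sketched like this (the host name is a hypothetical placeholder; substitute your datanode address):

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Attempt a TCP connection; True means the port accepted it."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical datanode address; replace with your own host and port.
print(can_connect("datanode.example.com", 50010))
```

If this returns False from the Hunk host but True from inside the cluster, the problem is likely a firewall or binding issue rather than the Hunk configuration itself.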

&lt;PRE&gt;&lt;CODE&gt;04-20-2016 04:09:30.605 INFO  ERP.ha_poc -  VirtualIndex$Splitter - generateSplits started, vix.name=emp_index ...
04-20-2016 04:09:31.320 INFO  ERP.ha_poc -  TimelineClientImpl - Timeline service address: &lt;A href="http://hadoop1.poc.com:8188/ws/v1/timeline/" target="test_blank"&gt;http://hadoop1.poc.com:8188/ws/v1/timeline/&lt;/A&gt;
04-20-2016 04:09:31.351 INFO  ERP.ha_poc -  RMProxy - Connecting to ResourceManager at /52.XX.XX.8:8050
04-20-2016 04:09:32.298 WARN  ERP.ha_poc -  SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SplunkJournalRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SplunkJournalRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=/journal\.gz$.
04-20-2016 04:09:32.300 WARN  ERP.ha_poc -  SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.ValueAvroRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.ValueAvroRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=\.avro$.
04-20-2016 04:09:32.301 WARN  ERP.ha_poc -  SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SimpleCSVRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SimpleCSVRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=\.([tc]sv)(?:\.(?:gz|bz2|snappy))?$.
04-20-2016 04:09:32.303 WARN  ERP.ha_poc -  SplunkBaseMapper - Could not create preprocessor object, will try the next one ... class=com.splunk.mr.input.SequenceFileRecordReader, message=File path does not match regex to use this record reader, name=com.splunk.mr.input.SequenceFileRecordReader, path=hdfs://hacluster/apps/hive/warehouse/emp/000000_0, regex=\.seq$.
04-20-2016 04:09:32.313 INFO  ERP.ha_poc -  JobSubmitterInputFormat - using class=com.splunk.mr.input.SplunkLineRecordReader to process split=/apps/hive/warehouse/emp/000000_0:0+134217728
04-20-2016 04:09:32.473 WARN  ERP.ha_poc -  ResourceMgrDelegate - getBlacklistedTrackers - Not implemented yet
04-20-2016 04:09:32.475 INFO  ERP.ha_poc -  ClusterInfoLogger - Hadoop cluster spec: provider=ha_poc, tasktrackers=2, map_inuse=1, map_slots=20, reduce_inuse=1, reduce_slots=4
04-20-2016 04:09:32.598 INFO  ERP.ha_poc -  SplunkBaseMapper - using class=com.splunk.mr.input.SplunkLineRecordReader to process split=/apps/hive/warehouse/emp/000000_0:0+134217728
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -  BlockReaderFactory - I/O error constructing remote block reader.
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -  java.net.ConnectException: Connection refused
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer(DFSClient.java:3454)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer(BlockReaderFactory.java:777)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:694)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:355)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:618)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at java.io.DataInputStream.read(DataInputStream.java:149)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.fillBuffer(UncompressedSplitLineReader.java:59)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.util.LineReader.readDefaultLine(LineReader.java:216)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.util.LineReader.readLine(LineReader.java:174)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.readLine(UncompressedSplitLineReader.java:91)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.skipUtfByteOrderMark(LineRecordReader.java:144)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:184)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.SplunkLineRecordReader.nextKeyValue(SplunkLineRecordReader.java:39)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.doStream(SplunkBaseMapper.java:410)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:375)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:331)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:644)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:656)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:653)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.FileSplitGenerator.sendSplitToAcceptor(FileSplitGenerator.java:28)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.FileSplitGenerator.generateSplits(FileSplitGenerator.java:79)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1418)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1396)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.addStatus(VirtualIndex.java:576)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.listStatus(VirtualIndex.java:609)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$Splitter.generateSplits(VirtualIndex.java:1566)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1485)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1437)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.input.VixSplitGenerator.generateSplits(VixSplitGenerator.java:55)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:674)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.executeImpl(SplunkMR.java:936)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.execute(SplunkMR.java:771)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.runImpl(SplunkMR.java:1518)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.run(SplunkMR.java:1300)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
04-20-2016 04:09:32.878 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.main(SplunkMR.java:1546)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -  DFSInputStream - Failed to connect to /10.XX.XX.84:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -  java.net.ConnectException: Connection refused
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer(DFSClient.java:3454)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer(BlockReaderFactory.java:777)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:694)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:355)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:618)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at java.io.DataInputStream.read(DataInputStream.java:149)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.fillBuffer(UncompressedSplitLineReader.java:59)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.util.LineReader.readDefaultLine(LineReader.java:216)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.util.LineReader.readLine(LineReader.java:174)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.readLine(UncompressedSplitLineReader.java:91)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.skipUtfByteOrderMark(LineRecordReader.java:144)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:184)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.SplunkLineRecordReader.nextKeyValue(SplunkLineRecordReader.java:39)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.doStream(SplunkBaseMapper.java:410)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:375)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:331)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:644)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:656)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:653)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.FileSplitGenerator.sendSplitToAcceptor(FileSplitGenerator.java:28)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.FileSplitGenerator.generateSplits(FileSplitGenerator.java:79)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1418)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1396)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.addStatus(VirtualIndex.java:576)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.listStatus(VirtualIndex.java:609)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$Splitter.generateSplits(VirtualIndex.java:1566)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1485)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1437)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.input.VixSplitGenerator.generateSplits(VixSplitGenerator.java:55)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:674)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.executeImpl(SplunkMR.java:936)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.execute(SplunkMR.java:771)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.runImpl(SplunkMR.java:1518)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.run(SplunkMR.java:1300)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
04-20-2016 04:09:32.884 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.main(SplunkMR.java:1546)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -  BlockReaderFactory - I/O error constructing remote block reader.
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -  java.net.ConnectException: Connection refused
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer(DFSClient.java:3454)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer(BlockReaderFactory.java:777)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:694)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:355)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:618)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at java.io.DataInputStream.read(DataInputStream.java:149)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.fillBuffer(UncompressedSplitLineReader.java:59)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.util.LineReader.readDefaultLine(LineReader.java:216)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.util.LineReader.readLine(LineReader.java:174)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.UncompressedSplitLineReader.readLine(UncompressedSplitLineReader.java:91)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.skipUtfByteOrderMark(LineRecordReader.java:144)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:184)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.SplunkLineRecordReader.nextKeyValue(SplunkLineRecordReader.java:39)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.doStream(SplunkBaseMapper.java:410)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:375)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkBaseMapper.stream(SplunkBaseMapper.java:331)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:644)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:656)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler$1.accept(SplunkMR.java:653)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.FileSplitGenerator.sendSplitToAcceptor(FileSplitGenerator.java:28)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.FileSplitGenerator.generateSplits(FileSplitGenerator.java:79)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1418)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$FileSplitter.accept(VirtualIndex.java:1396)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.addStatus(VirtualIndex.java:576)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$VIXPathSpecifier.listStatus(VirtualIndex.java:609)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex$Splitter.generateSplits(VirtualIndex.java:1566)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1485)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.VirtualIndex.generateSplits(VirtualIndex.java:1437)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.input.VixSplitGenerator.generateSplits(VixSplitGenerator.java:55)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.streamData(SplunkMR.java:674)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.executeImpl(SplunkMR.java:936)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR$SearchHandler.execute(SplunkMR.java:771)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.runImpl(SplunkMR.java:1518)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.run(SplunkMR.java:1300)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
04-20-2016 04:09:32.886 WARN  ERP.ha_poc -      at com.splunk.mr.SplunkMR.main(SplunkMR.java:1546)
04-20-2016 04:09:32.887 WARN  ERP.ha_poc -  DFSInputStream - Failed to connect to /10.XX.XX.31:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection refused
04-20-2016 04:09:32.887 WARN  ERP.ha_poc -  java.net.ConnectException: Connection refused
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Wed, 20 Apr 2016 06:51:32 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/error-while-connecting-splunk-with-hadoop/m-p/152461#M13757</guid>
      <dc:creator>rahulgaikwad198</dc:creator>
      <dc:date>2016-04-20T06:51:32Z</dc:date>
    </item>
  </channel>
</rss>

