
Hunk with Yarn - Class Not Found Exception

rdagan_splunk
Splunk Employee

Running Hunk with Yarn and seeing these issues:

ERROR .. - Error while waiting for MapReduce job to complete, job_id=[! http://:8088/proxy/application_/ job_], state=FAILED, reason=Application application_ failed 1 times due to AM Container for appattempt_ exited with exitCode: 1 due to:

So you go to http://:8088 --> drill down to your specific job --> select 'Log' (normally on the lower right side of the page).

The log may be useful and show: "Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster ... Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.v2.app.MRAppMaster"

Or the log may not show anything useful at all, for example: 'The requested application exited before setting a tracking URL.'
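When the ResourceManager UI shows nothing useful, the YARN CLI can usually retrieve the aggregated container logs directly. A minimal sketch (the application ID below is a placeholder; substitute the application_<...> value from your error message):

```shell
# Pull the aggregated container logs for the failed application
yarn logs -applicationId application_1400000000000_0001

# Narrow down to the missing-class error in the AM container log
yarn logs -applicationId application_1400000000000_0001 | grep -A 2 "ClassNotFoundException"
```

Note that `yarn logs` only works after the application has finished and requires log aggregation to be enabled on the cluster.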

1 Solution

rdagan_splunk
Splunk Employee

To fix this issue:
1) Go to http://:8088/conf
2) Find the key "yarn.application.classpath"
*** The exact same value should also be found in your Hadoop nodes' (servers') yarn-site.xml
3) Copy the value of this YARN classpath
4) Go to the Hunk provider settings, add the flag "vix.yarn.application.classpath", and paste in the value from 'yarn.application.classpath'
For example:
vix.yarn.application.classpath = $HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,$YARN_HOME/*,$YARN_HOME/lib/*
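The /conf endpoint returns Hadoop's standard configuration XML (`<configuration><property><name>...</name><value>...</value></property>...</configuration>`), so the value can also be extracted programmatically. A minimal sketch, assuming that response format (the sample XML and the `get_conf_value` helper are illustrative; in practice you would fetch the real XML from http://<rm-host>:8088/conf with urllib):

```python
# Sketch: pull yarn.application.classpath out of the XML served by the
# ResourceManager's /conf endpoint, ready to paste into the Hunk provider's
# vix.yarn.application.classpath setting.
import xml.etree.ElementTree as ET

def get_conf_value(conf_xml, key):
    """Return the <value> of the <property> whose <name> matches key, or None."""
    root = ET.fromstring(conf_xml)
    for prop in root.findall("property"):
        if prop.findtext("name") == key:
            return prop.findtext("value")
    return None

# Illustrative sample of the /conf response format (value shortened)
sample = """<configuration>
  <property>
    <name>yarn.application.classpath</name>
    <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*</value>
  </property>
</configuration>"""

print(get_conf_value(sample, "yarn.application.classpath"))
```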


hsesterhenn_spl
Splunk Employee

Hi,

I have seen another error where this solution might be very helpful:

If you install Splunk (with a Splunk Analytics for Hadoop license, or if you just want to use Hadoop Data Roll) on a separate search head, and in addition are only using the "Hadoop client libraries" for your distribution (Apache Hadoop, Hortonworks HDP, Cloudera CDH, ...), you might need to copy mapred-site.xml and yarn-site.xml from a NodeManager or DataNode of your cluster to
/path/to/your/Hadoop Packages/etc/hadoop/conf/

Otherwise you need to set a couple of different parameters in your Splunk Hadoop provider (vix.yarn..., vix.mapred..., vix.mapreduce...).
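For reference, Hunk providers are defined as stanzas in indexes.conf, where these vix.* settings live alongside the classpath fix from the accepted answer. A hedged sketch (the stanza name, paths, and host names are placeholders for your own environment, and the classpath value is truncated):

```ini
[provider:MyHadoopProvider]
vix.family = hadoop
vix.env.JAVA_HOME = /usr/lib/jvm/java-8
vix.env.HADOOP_HOME = /opt/hadoop
vix.fs.default.name = hdfs://namenode:8020
vix.mapreduce.framework.name = yarn
vix.yarn.resourcemanager.address = resourcemanager:8032
vix.yarn.application.classpath = $HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,...
```

If the client-side mapred-site.xml and yarn-site.xml are in place as described above, most of these values are picked up from there instead.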

If
index=hadoop-index | head 10
returns data, HDFS is working.

If (in Smart Mode!)
index=hadoop-index | stats count
returns a result without errors, you've done the right thing.

And don't forget this cool blog post:

https://www.splunk.com/blog/2014/05/14/hunkonhunk.html

HTH,

Holger


