
Hunk with Yarn - Class Not Found Exception

rdagan_splunk
Splunk Employee

Running Hunk with Yarn and seeing these errors:

ERROR .. - Error while waiting for MapReduce job to complete, job_id=[! http://:8088/proxy/application_/ job_], state=FAILED, reason=Application application_ failed 1 times due to AM Container for appattempt_ exited with exitCode: 1 due to:

So you go to http://:8088, drill down to your specific job, and select 'Log' (normally on the lower right side of the page).

The log may be useful and show: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster ... Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.v2.app.MRAppMaster ...

Or the log may not show anything useful, for example: 'The requested application exited before setting a tracking URL.'

1 Solution

rdagan_splunk
Splunk Employee

To fix this issue:
1) Go to http://:8088/conf
2) Find the key "yarn.application.classpath"
*** This exact same value should also be found in yarn-site.xml on your Hadoop nodes (servers)
3) Copy the value of this YARN classpath
4) Go to the Hunk provider settings, add the flag "vix.yarn.application.classpath", and paste in the value from yarn.application.classpath
For example:
vix.yarn.application.classpath = $HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,$YARN_HOME/*,$YARN_HOME/lib/*
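As a rough sketch of steps 2 and 3, you can pull the value out of a yarn-site.xml-style document programmatically. The XML below is a made-up minimal sample, not taken from any particular cluster:

```python
# Sketch: extract yarn.application.classpath from a yarn-site.xml-style document,
# as you would before pasting it into vix.yarn.application.classpath.
import xml.etree.ElementTree as ET

# Illustrative sample only -- your cluster's real file will have many more properties.
SAMPLE_YARN_SITE = """<configuration>
  <property>
    <name>yarn.application.classpath</name>
    <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*</value>
  </property>
</configuration>"""

def get_property(xml_text, key):
    """Return the <value> of the <property> whose <name> matches key, or None."""
    root = ET.fromstring(xml_text)
    for prop in root.iter("property"):
        if prop.findtext("name") == key:
            return prop.findtext("value")
    return None

classpath = get_property(SAMPLE_YARN_SITE, "yarn.application.classpath")
print("vix.yarn.application.classpath =", classpath)
```

The same function works on the XML returned by the ResourceManager's /conf endpoint, since it uses the same `<property><name>...<value>...` layout.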


hsesterhenn_spl
Splunk Employee

Hi,

I have seen another error where this solution might be very helpful:

If you install Splunk (with a Splunk Analytics for Hadoop license, or just to use Hadoop Data Roll) on a separate search head, and in addition you are only using the "Hadoop client libraries" for your distribution (Apache Hadoop, Hortonworks HDP, Cloudera CDH, ...), you might need to copy mapred-site.xml and yarn-site.xml from a NodeManager or DataNode of your cluster to:
/path/to/your/Hadoop Packages/etc/hadoop/conf/

Otherwise you need to set a couple of different parameters in your Splunk Hadoop provider (vix.yarn..., vix.mapred..., vix.mapreduce...).
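As a rough sketch of what such a provider stanza can look like in indexes.conf (every hostname, port, and path below is a hypothetical placeholder; check the setting names and values against your own cluster and Splunk version):

```ini
# Hypothetical Hunk / Splunk Analytics for Hadoop provider stanza in indexes.conf.
# All hostnames, ports, and paths are placeholders -- substitute your own.
[provider:my-hadoop-provider]
vix.family = hadoop
vix.env.JAVA_HOME = /usr/lib/jvm/java-8-openjdk
vix.env.HADOOP_HOME = /opt/hadoop
vix.fs.default.name = hdfs://namenode.example.com:8020
vix.mapreduce.framework.name = yarn
vix.yarn.resourcemanager.address = resourcemanager.example.com:8032
vix.yarn.resourcemanager.scheduler.address = resourcemanager.example.com:8030
vix.yarn.application.classpath = $HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,$YARN_HOME/*,$YARN_HOME/lib/*
```

If the client-side mapred-site.xml and yarn-site.xml are in place, most of the vix.yarn.* and vix.mapreduce.* values can be picked up from those files instead.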

If
index=hadoop-index | head 10
returns data, HDFS is working.

If (in Smart Mode!)
index=hadoop-index | stats count
returns a result without errors, you've done the right thing.

And don't forget this cool blog post:

https://www.splunk.com/blog/2014/05/14/hunkonhunk.html

HTH,

Holger


