All Apps and Add-ons

Splunk Analytics for Hadoop - Kerberos

sarnagar
Contributor

Per the doc below on setting up Kerberos authentication for Hadoop:

https://docs.splunk.com/Documentation/Splunk/7.0.0/HadoopAnalytics/ConfigureKerberosauthentication

To which realm do the parameters below refer? Our Hadoop cluster is in one realm and our Splunk cluster is in a different realm. Whose realm details should I provide here?

vix.java.security.krb5.kdc =
vix.java.security.krb5.realm =

1 Solution

rdagan_splunk
Splunk Employee

Splunk Analytics for Hadoop is only a Hadoop client, so the realm should be the Hadoop server's realm.
The default realm and the KDC for that realm are listed in the Kerberos configuration file, krb5.conf. On Linux it is normally found at /etc/krb5.conf.
Splunk needs these values, so just copy them from krb5.conf into these two settings:
vix.java.security.krb5.realm
vix.java.security.krb5.kdc
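For example, the mapping from krb5.conf to the provider stanza looks like the sketch below (the realm and KDC hostname here are illustrative placeholders, not values from this thread):

```ini
# /etc/krb5.conf on a host that can reach the Hadoop cluster
# (illustrative realm/KDC values)
[libdefaults]
    default_realm = HADOOP.EXAMPLE.COM

[realms]
    HADOOP.EXAMPLE.COM = {
        kdc = kdc01.hadoop.example.com
    }

# Corresponding provider settings in indexes.conf on the Splunk search head:
# [provider:myhadoop]
# vix.java.security.krb5.realm = HADOOP.EXAMPLE.COM
# vix.java.security.krb5.kdc = kdc01.hadoop.example.com
```

In other words, both settings describe the Hadoop cluster's Kerberos realm; nothing about the Splunk host's own realm goes into these two flags.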


sarnagar
Contributor

Hi @rdagan ,

I have the user splunkd1@TS.fitco.com on the Splunk node and the user splunkd1@RT.rtp.com on the Hadoop cluster.

I created a keytab file for splunkd1@TS.fitco.com and configured it in indexes.conf, and I get this error when executing Hadoop commands on the Splunk host:

ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "splunkdev@TS.company2.COM/xx.xx.xx.xxx"; destination host is: "SLPP02.HADOOP.company.COM":8020;
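When debugging a "Failed to find any Kerberos tgt" error like this, it can help to confirm outside of Splunk that the keytab and principal actually yield a ticket. A sketch using the standard MIT Kerberos tools, with the keytab path and principal taken from the config below (this needs a reachable KDC, so it is not runnable standalone):

```shell
# List the principals stored in the keytab; the principal Splunk uses
# (vix.kerberos.principal) must appear here exactly, realm included.
klist -kt /home/splunkd1/splunkd1.keytab

# Try to obtain a TGT manually with the same keytab and principal.
kinit -kt /home/splunkd1/splunkd1.keytab splunkdev@TS.company2.COM

# Show the ticket cache; a TGT for the principal's realm should be listed.
klist
```

If kinit itself fails here, the problem is in the keytab contents or KDC reachability rather than in the Splunk provider configuration.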

[hadoopidx]
coldPath = $SPLUNK_DB/hadoopidx/colddb
enableDataIntegrityControl = 0
enableTsidxReduction = 0
homePath = $SPLUNK_DB/hadoopidx/db
maxTotalDataSizeMB = 20480
thawedPath = $SPLUNK_DB/hadoopidx/thaweddb

[provider:eihadoop]
vix.command.arg.3 = $SPLUNK_HOME/bin/jars/SplunkMR-hy2.jar
vix.dfs.namenode.kerberos.principal = hdfs/_HOST@HADOOP.company.COM
vix.env.HADOOP_HOME = /opt/local/hadoop-2.6.0-cdh5.9.1
vix.env.HUNK_THIRDPARTY_JARS = $SPLUNK_HOME/bin/jars/thirdparty/common/avro-1.7.7.jar,$SPLUNK_HOME/bin/jars/thirdparty/common/avro-mapred-1.7.7.jar,$SPLUNK_HOME/bin/jars/thirdparty/common/commons-compress-1.10.jar,$SPLUNK_HOME/bin/jars/thirdparty/common/commons-io-2.4.jar,$SPLUNK_HOME/bin/jars/thirdparty/common/libfb303-0.9.2.jar,$SPLUNK_HOME/bin/jars/thirdparty/common/parquet-hive-bundle-1.6.0.jar,$SPLUNK_HOME/bin/jars/thirdparty/common/snappy-java-1.1.1.7.jar,$SPLUNK_HOME/bin/jars/thirdparty/hive_1_2/hive-exec-1.2.1.jar,$SPLUNK_HOME/bin/jars/thirdparty/hive_1_2/hive-metastore-1.2.1.jar,$SPLUNK_HOME/bin/jars/thirdparty/hive_1_2/hive-serde-1.2.1.jar
vix.env.JAVA_HOME = /usr/java/jdk1.8.0_102
vix.family = hadoop
vix.fs.default.name = hdfs://SLPP02.HADOOP.company.COM:8020
vix.hadoop.security.authentication = kerberos
vix.hadoop.security.authorization = 1
vix.javaprops.java.security.krb5.kdc = SLP013.HADOOP.company.COM
vix.javaprops.java.security.krb5.realm = HADOOP.company.COM
vix.mapreduce.framework.name = yarn
vix.output.buckets.max.network.bandwidth = 0
vix.splunk.home.hdfs = /user/splunkdev/hadoopanalytics/
vix.yarn.nodemanager.principal = yarn/_HOST@HADOOP.company.COM
vix.yarn.resourcemanager.address = https://SLPP08.HADOOP.company.COM:8090/cluster
vix.yarn.resourcemanager.principal = yarn/_HOST@HADOOP.company.COM
vix.yarn.resourcemanager.scheduler.address = https://SLPP015.HADOOP.company.COM:8090/cluster/scheduler
vix.mapreduce.jobtracker.kerberos.principal = mapred/_HOST@HADOOP.company.COM
vix.kerberos.keytab = /home/splunkd1/splunkd1.keytab
vix.kerberos.principal = splunkdev@TS.company2.COM

[splunk_index_archive]
vix.output.buckets.from.indexes = hadoopidx
vix.output.buckets.older.than = 172800
vix.output.buckets.path = /user/splunkdev/splunk_index_archive
vix.provider = eihadoop
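One thing to note about the stanza above: the keytab principal (splunkdev@TS.company2.COM) is from a different realm than the Hadoop services (HADOOP.company.COM). For a client holding a TGT in one realm to obtain service tickets in another, the two KDCs must have cross-realm trust configured; without it, "Failed to find any Kerberos tgt" is the expected failure. A minimal client-side krb5.conf sketch for such a setup (realm names taken from the post; the TS KDC hostname is hypothetical):

```ini
# Sketch only: both realms defined, plus a capath so a TGT from
# TS.company2.COM can be used to reach services in HADOOP.company.COM.
# Cross-realm principals (krbtgt/HADOOP.company.COM@TS.company2.COM)
# must also exist on the KDCs themselves.
[realms]
    TS.company2.COM = {
        kdc = kdc.ts.company2.com        # hypothetical hostname
    }
    HADOOP.company.COM = {
        kdc = SLP013.HADOOP.company.COM  # from the stanza above
    }

[capaths]
    TS.company2.COM = {
        HADOOP.company.COM = .
    }
```

The simpler alternative, as the accepted answer implies, is to create the Splunk principal and keytab in the Hadoop cluster's realm so that only one realm is involved.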



sarnagar
Contributor

Hi @rdagan,

vix.kerberos.principal =
vix.kerberos.keytab =

Do this principal and keytab refer to the user that I create on the Hadoop cluster, i.e. /user/splunkuser/ in Hadoop?
