All Apps and Add-ons

Hunk not returning any data either for exploring or during a search

jwalzerpitt
Influencer

Hunk is configured to point to our Hadoop cluster and I've tried two ways to access data:

1) Using Explore Data, I select a Provider and then a Virtual Index, but when it prompts me to select a file, no files are listed (I confirmed the files are there via Hue). There are no error messages.

2) Under Virtual Index, I select 'Search' and I get, "Error while running external process, return_code=126. See search.log for more info"

My environment variables are as follows:

Java Home = /bin/java
Hadoop Home = /usr/lib/hadoop

Any help would be appreciated

Thx


jwalzerpitt
Influencer

Was just informed the Hadoop cluster is Kerberized, so that may be an issue...

Ran the following to test connectivity:

hadoop fs -ls hdfs://:8020
ls: Call From / to :8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Warning: fs.defaultFs is not set when running "ls" command.

Is there a specific directory I need to be in when I run the commands from http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/FileSystemShell.html#cat ?

Thx


suarezry
Builder

You're missing the FQDN for the namenode, e.g.:
hadoop fs -ls hdfs://mynamenode.mydomain.com:8020/


jwalzerpitt
Influencer

Sorry - I used the < and > brackets, which stripped out the server name.

I did run:

hadoop fs -ls hdfs://servername:8020
ls: Call From servername/IP to servername:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Warning: fs.defaultFs is not set when running "ls" command.
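
Before digging into Hadoop configuration, a low-level TCP probe can show whether the namenode RPC port is reachable at all from the Splunk server. This is a hedged sketch: "servername" is a placeholder for your namenode's FQDN, and 8020 is assumed to be its RPC port.

```shell
# Probe TCP reachability of the namenode RPC port using bash's /dev/tcp.
# Replace "servername" with your namenode's FQDN; 8020 is the usual RPC port.
if timeout 3 bash -c 'cat < /dev/null > /dev/tcp/servername/8020' 2>/dev/null; then
  echo "port reachable"
else
  echo "port NOT reachable"
fi
```

If this prints "port NOT reachable", the "Connection refused" from hadoop fs is a network/firewall/listener issue rather than a Hadoop client problem.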


kschon_splunk
Splunk Employee

If the Hadoop cluster is kerberized, you need to run the kinit command before trying the hadoop command line tool. This may look like:

kinit -k -t PATH_TO_KEYTAB_FILE PRINCIPAL

See http://web.mit.edu/kerberos/krb5-1.12/doc/user/user_commands/kinit.html for details.
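
To make that concrete, the sequence on the Splunk server might look like the following. This is only a sketch: the keytab path, principal, and server name are placeholders for your site's values.

```shell
# Obtain a Kerberos ticket from the keytab, verify it, then retry the hadoop CLI.
# The keytab path and principal below are illustrative placeholders.
kinit -k -t /etc/security/keytabs/splunk.service.keytab splunk/host.example.com@EXAMPLE.COM
klist                                    # confirm a valid TGT is now cached
hadoop fs -ls hdfs://servername:8020/    # the earlier ls, retried with a ticket
```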


jwalzerpitt
Influencer

Thx for the info - I'll dig into that and give it a shot and let you know


suarezry
Builder

Try doing a command line hdfs cat on the splunk server:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/FileSystemShell.html#cat

This will verify that your Hadoop client install is working. Did it return results?
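
For example (hedged: the server name and file path are placeholders; use any file you can already see in Hue):

```shell
# Read a known HDFS file back through the local hadoop client.
# "servername" and the path are placeholders for your environment.
hadoop fs -cat hdfs://servername:8020/path/to/somefile.txt
```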
