Splunk Enterprise

DFS manager: Spark node failed to initialize

rabindrakumarpa
Explorer

Hi All,

After deploying the DFS manager on the search head (master), when I try to start Splunk I see the following error:

"The Spark node failed to initialize" after the debug message "Building configuration information".

Also, the DFS manager (application page) says "Unable to connect Spark master", although the Splunk search head is working fine.

I checked that ports 8005-8010 are open, and I changed JAVA_HOME in splunk_dfs_manager.sh and spark_app.conf because it was complaining about a missing JAVA_HOME.
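
In case it helps, this is roughly how I verified the ports (a quick sketch; host and range per our deployment):

# check that each DFS port in the 8005-8010 range is reachable on the search head
for port in $(seq 8005 8010); do
    nc -z -w 2 localhost "$port" && echo "port $port open" || echo "port $port closed"
done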

Attaching a snippet of the splunkd log.

Please advise.


burwell
SplunkTrust

Have you tried the newest DFS manager, 1.2.1? That might help.

rabindrakumarpa
Explorer

I just redid the setup with DFS manager 1.2.1. The message is a little more meaningful now, but the problem persists:

"Failed to get spark home path from app configuration."

I checked spark_app.conf and it seems to be correct. However, server.conf has spark_home blank. I attempted to update that as well, but the issue remained the same:

spark_home = /opt/splunk/etc/apps/splunk_dfs_manager/vendor/spark-2.3.3-bin-hadoop2.7
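
For what it's worth, a quick way to sanity-check that this path actually points at a usable Spark distribution (paths as above; adjust to yours):

# verify the configured spark_home contains a real Spark 2.3.3 install
SPARK_HOME=/opt/splunk/etc/apps/splunk_dfs_manager/vendor/spark-2.3.3-bin-hadoop2.7
ls -l "$SPARK_HOME/bin/spark-class" "$SPARK_HOME/sbin/start-master.sh"
"$SPARK_HOME/bin/spark-submit" --version    # should report version 2.3.3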

 

Thanks in advance.


burwell
SplunkTrust

I don't have spark_home set and don't have an issue.
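
For comparison, one way to see the merged settings Splunk actually picks up (a rough sketch using btool, which prints the effective config and the file each value comes from):

$SPLUNK_HOME/bin/splunk btool server list --debug | grep -i spark
$SPLUNK_HOME/bin/splunk btool spark_app list --debug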

I think you should file a case with Splunk support. DFS configuration with the splunk_dfs_manager app is definitely finicky.


rabindrakumarpa
Explorer

Thank you! Sure, we will raise a case with Splunk.

Is there a recommended configuration strategy for enabling Data Fabric Search without the DFS manager? Any doc/link along those lines would be quite helpful.

We are following:

https://docs.splunk.com/Documentation/DFS/1.1.1/DFS/InstallationChecklist

Thanks in advance.


burwell
SplunkTrust

Hi. I was a beta tester of Splunk 7.3.1, and we "rolled our own" since the DFS manager was not available.

Starting with 8.0, we used the DFS manager.

The documentation says they don't support rolling your own.

You have to install Spark 2.3.3 and Java yourself on your Spark nodes and manage the Spark configurations.
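
If you do go down that road anyway, the manual setup on each Spark node looks roughly like this (the download URL and install paths are illustrative; adjust for your environment):

# illustrative manual Spark 2.3.3 setup on a node; URL and paths are examples
cd /opt
curl -LO https://archive.apache.org/dist/spark/spark-2.3.3/spark-2.3.3-bin-hadoop2.7.tgz
tar -xzf spark-2.3.3-bin-hadoop2.7.tgz
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64    # wherever your JDK 8 lives
export SPARK_HOME=/opt/spark-2.3.3-bin-hadoop2.7

# on the master node:
"$SPARK_HOME/sbin/start-master.sh"
# on each worker node, pointing at the master (default master port is 7077):
# "$SPARK_HOME/sbin/start-slave.sh" spark://<master-host>:7077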
