Hadoop installation error, "ERROR: Cannot execute hdfs-config.sh"
I am following this tutorial to install Hadoop on my computer. As far as I can tell, I followed the instructions exactly up to source ~/.profile, but when I try to format HDFS by entering hdfs namenode -format, it gives me the following error:
ERROR: Cannot execute /usr/local/Cellar/hadoop/3.0.0/libexec/hdfs-config.sh
I searched extensively online but could not find a solution.
Solution 1:[1]
@BIKI I just ran into the same problem. The Hadoop 3.0.0 release has an unusual file structure that does not work when the home directory is set the way you would expect it to be.
I am on macOS High Sierra 10.13 and installed via Homebrew, but I think you would see something similar on Ubuntu or any UNIX-like system.
Bottom line, if you want to track down the errors, check your HADOOP_HOME in your profile (.bash_profile) and the scripts that are started when you kick off Hadoop. In my case, I have an alias set in my profile called hstart and it calls the following files:
start-dfs.sh
AND
start-yarn.sh
These files in turn call hdfs-config.sh, which cannot be found with the original home directory setting.
My Hadoop home directory was set to:
export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0
And I changed it to:
export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0/libexec
Of course you have to source your configuration profile, and in my case it was:
source .bash_profile
For me, this did the trick. Hope that helps!
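As a sanity check, you can confirm that the script the start-up files look for actually exists under the corrected HADOOP_HOME. This sketch assumes the Homebrew layout described above; adjust the version number to match your install:

```shell
# Assumes a Homebrew install of Hadoop 3.0.0 (adjust the version to yours).
export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0/libexec
# The start-dfs.sh/start-yarn.sh scripts resolve hdfs-config.sh relative to
# $HADOOP_HOME/libexec, so this file must exist for the error to disappear:
ls "$HADOOP_HOME/libexec/hdfs-config.sh"
```

If the ls fails, HADOOP_HOME is still pointing one level too high in the Cellar tree.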
Solution 2:[2]
Same issue for Hadoop 3.1.1 and above installed via Homebrew: HADOOP_HOME was not set up properly. Execute:
$ echo $HADOOP_HOME
If it prints "/usr/local/Cellar/hadoop", you have to append your specific Hadoop version:
$ export HADOOP_HOME=/usr/local/Cellar/hadoop/3.1.1
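To make the setting survive new shells, the export can go in your profile. A minimal sketch, assuming bash and a 3.1.1 install (adjust the version to whatever ls /usr/local/Cellar/hadoop shows):

```shell
# Append the versioned HADOOP_HOME to the profile and reload it.
echo 'export HADOOP_HOME=/usr/local/Cellar/hadoop/3.1.1' >> ~/.bash_profile
source ~/.bash_profile
# Should now print the versioned path:
echo "$HADOOP_HOME"
```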
Solution 3:[3]
Another reason you may get this error is missing permissions for running Hadoop on localhost. We usually configure SSH so that Hadoop can run in non-root mode without prompting for passwords. Either use sudo every time you run Hadoop, or set up the SSH key correctly. Example:
$ ssh-keygen -t rsa -P ""
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
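After installing the key, sshd is strict about file permissions, so it is worth tightening them and verifying that passwordless login actually works. A hedged follow-up sketch, assuming an SSH server (Remote Login on macOS) is enabled on localhost:

```shell
# sshd silently rejects keys whose files are group- or world-writable.
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
# Should log in and print "ok" without prompting for a password:
ssh localhost 'echo ok'
```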
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
| Solution 2 | njjnex |
| Solution 3 | Tyler2P |
