localhost: ERROR: Cannot set priority of datanode process 2984

I set up and configured a multi-node Hadoop cluster. The errors below appear when I start it. My Ubuntu version is 16.04 and my Hadoop version is 3.0.2.

Starting namenodes on [master]
Starting datanodes
localhost: ERROR: Cannot set priority of datanode process 2984
Starting secondary namenodes [master]
master: ERROR: Cannot set priority of secondarynamenode process 3175
2018-07-17 02:19:39,470 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting resourcemanager
Starting nodemanagers

Can anyone tell me what is going wrong?



Solution 1:[1]

I had the same error and fixed it by ensuring that the datanode and namenode locations have the right permissions and are owned by the user starting hadoop daemons.
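As a sketch of that fix: assuming the daemons are started by the current user, and using placeholder directories in place of whatever `dfs.namenode.name.dir` and `dfs.datanode.data.dir` point to in your `hdfs-site.xml` (the `/tmp/hadoop` paths below are assumptions, not values from the question):

```shell
# Assumed values; substitute the user that starts the daemons and the
# directories configured in your hdfs-site.xml.
HDFS_USER="$(id -un)"
NN_DIR="${NN_DIR:-/tmp/hadoop/dfs/name}"   # dfs.namenode.name.dir
DN_DIR="${DN_DIR:-/tmp/hadoop/dfs/data}"   # dfs.datanode.data.dir

mkdir -p "$NN_DIR" "$DN_DIR"
chown -R "$HDFS_USER" "$NN_DIR" "$DN_DIR"  # may require sudo on a real cluster
chmod -R 700 "$NN_DIR" "$DN_DIR"
ls -ld "$NN_DIR" "$DN_DIR"                 # verify owner and mode
```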

Solution 2:[2]

Check that:

  1. The directory path properties in hdfs-site.xml under $HADOOP_CONF_DIR point to valid locations: dfs.namenode.name.dir, dfs.datanode.data.dir, and dfs.namenode.checkpoint.dir.

  2. The Hadoop user has write permission for these paths.

If write permission is missing for any of these paths, the corresponding process will fail to start and the error you see can occur.
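For reference, the relevant hdfs-site.xml entries look like the following; the /opt/hadoop paths are placeholders for illustration, not values from the question:

```xml
<!-- hdfs-site.xml: paths are examples, adjust to your layout -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>/opt/hadoop/dfs/name</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/opt/hadoop/dfs/data</value>
</property>
<property>
  <name>dfs.namenode.checkpoint.dir</name>
  <value>/opt/hadoop/dfs/namesecondary</value>
</property>
```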

Solution 3:[3]

I had the same error and tried the methods above, but they didn't work for me.

I set XXX_USER in all the xxx-env.sh files and got the same result.

Finally I set HADOOP_SHELL_EXECNAME="root" in ${HADOOP_HOME}/bin/hdfs, and the error disappeared.

The default value of HADOOP_SHELL_EXECNAME is "hdfs".
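For comparison, the per-daemon user variables that the answer mentions are set in hadoop-env.sh in Hadoop 3.x. This sketch uses root only to match the answer above (a dedicated hadoop user is generally preferable); the variable names are the standard Hadoop 3 ones:

```shell
# In ${HADOOP_HOME}/etc/hadoop/hadoop-env.sh (Hadoop 3.x).
# Running daemons as root is shown only to mirror the answer above;
# a dedicated, unprivileged hadoop user is the safer choice.
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
```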

Solution 4:[4]

I had the same error when I renamed my Ubuntu home directory, and had to edit core-site.xml, changing the value of the property hadoop.tmp.dir to the new path.
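The property in question lives in core-site.xml; the path below is a placeholder for the renamed home directory, not a value from the answer:

```xml
<!-- core-site.xml: point hadoop.tmp.dir at the new home directory path -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/newusername/hadooptmpdata</value>
</property>
```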

Solution 5:[5]

Append the native library path to your HADOOP_OPTS, like this:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"

or, if HADOOP_OPTS is not already set:

export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"

Solution 6:[6]

I had the same issue. Check the hadoop/logs directory, look for the datanode's .log file, and run more nameofthefile.log to check for errors. Mine was a configuration problem; once I fixed it, everything worked.
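The log check above can be sketched as follows; the install path and log file naming pattern are assumptions based on a typical Hadoop layout:

```shell
# Assumed install path; adjust HADOOP_HOME for your cluster.
HADOOP_HOME="${HADOOP_HOME:-/opt/hadoop}"
HADOOP_LOG_DIR="${HADOOP_LOG_DIR:-$HADOOP_HOME/logs}"

# Most recent datanode log, if one exists (name pattern is an assumption).
LOG=$(ls -t "$HADOOP_LOG_DIR"/hadoop-*-datanode-*.log 2>/dev/null | head -n 1)
if [ -n "$LOG" ]; then
  # Surface the error lines that explain why the daemon failed to start.
  grep -iE 'error|exception|caused by' "$LOG" | tail -n 20
else
  echo "No datanode log found under $HADOOP_LOG_DIR"
fi
```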

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Rohan Grover
Solution 2 Sumukh
Solution 3 the Tin Man
Solution 4 the Tin Man
Solution 5 the Tin Man
Solution 6 Damini Suthar