localhost: ERROR: Cannot set priority of datanode process 32156

I am trying to install Hadoop on Ubuntu 16.04, but when I start Hadoop it gives me the following error:

localhost: ERROR: Cannot set priority of datanode process 32156.
Starting secondary namenodes [it-OptiPlex-3020]
2017-09-18 21:13:48,343 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting resourcemanager
Starting nodemanagers

Can someone please tell me why I am getting this error? Thanks in advance.



Solution 1:[1]

I had to deal with the same issue and kept getting the following exception:

Starting namenodes on [localhost]
Starting datanodes
localhost: ERROR: Cannot set priority of datanode process 8944
Starting secondary namenodes [MBPRO-0100.local]
2019-07-22 09:56:53,020 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

As others have mentioned, you first need to make sure that all path parameters are set correctly, which is what I checked first. Then I followed these steps to solve the issue:

1- Stop dfs service and format hdfs:

sbin/stop-dfs.sh
sudo bin/hdfs namenode -format

2- Change permissions for the hadoop temp directory:

sudo chmod -R 777 /usr/local/Cellar/hadoop/hdfs/tmp

3- Start service again:

sbin/start-dfs.sh

Good luck

Solution 2:[2]

I suggest you take a look at your hadoop datanode logs. This is probably a configuration issue.

In my case, folders configured in dfs.datanode.data.dir didn't exist and an exception was thrown and written to log.
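Along those lines, here is a small sketch of how one might check that every directory listed in dfs.datanode.data.dir exists and is writable. The demo config file and /tmp paths below are stand-ins for illustration, not values from the original answer; point CONF at your real $HADOOP_HOME/etc/hadoop/hdfs-site.xml instead.

```shell
# Demo: verify the directories listed in dfs.datanode.data.dir.
# CONF defaults to a throwaway demo file created below.
CONF=${CONF:-demo-hdfs-site.xml}
[ -f "$CONF" ] || cat > "$CONF" <<'EOF'
<configuration>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/tmp/hadoop-demo/dn1,/tmp/hadoop-demo/dn2</value>
  </property>
</configuration>
EOF
# Extract the comma-separated directory list (crude parse, fine for simple files).
dirs=$(grep -A1 'dfs.datanode.data.dir' "$CONF" \
  | sed -n 's:.*<value>\(.*\)</value>.*:\1:p' | tr ',' ' ')
for d in $dirs; do
  mkdir -p "$d"                                     # create it if it is missing
  [ -w "$d" ] && echo "$d: writable" || echo "$d: NOT writable"
done
```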

Solution 3:[3]

I ran into the same error when installing Hadoop 3.0.0-RC0. In my case, all services started successfully except the Datanode.

I found that some settings in hadoop-env.sh were incorrect in version 3.0.0-RC0 but had been correct in version 2.x.

I ended up replacing my hadoop-env.sh with the official one and setting JAVA_HOME and HADOOP_HOME. Now the Datanode works fine.
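For reference, the two exports look something like this in hadoop-env.sh. The paths below are examples only; use your own JDK and Hadoop install locations.

```shell
# Example values only; adjust both paths to your installation.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_HOME=/usr/local/hadoop
```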

Solution 4:[4]

Faced the same issue and flushed the datanode and namenode folders, which I had placed in /hadoop_store/hdfs/namenode and /hadoop_store/hdfs/datanode.

After deleting the folders, recreate them and then run hdfs namenode -format.

Then start Hadoop.

After the fix the logs look good:

Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [ip]
2019-02-11 09:41:30,426 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

jps:

21857 NodeManager
21697 ResourceManager
21026 NameNode
22326 Jps
21207 DataNode
21435 SecondaryNameNode

Solution 5:[5]

Problem solved here! (Neither of the two highest-ranked answers worked for me.)

This issue happens when you run your Hadoop processes (namenode-user, datanode-user, ...) as a user that does not own all of your Hadoop files and folders.

Just run sudo chown -R YOURUSER:YOURUSER /home/YOURUSER/hadoop/*

Solution 6:[6]

The solution in my case was to add export HADOOP_SHELL_EXECNAME=root as the last line of $HADOOP_HOME/etc/hadoop/hadoop-env.sh; otherwise the environment variable defaults to hdfs.
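That is, the line appended to hadoop-env.sh is simply:

```shell
# Appended as the last line of $HADOOP_HOME/etc/hadoop/hadoop-env.sh;
# without it, HADOOP_SHELL_EXECNAME defaults to hdfs.
export HADOOP_SHELL_EXECNAME=root
```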

Solution 7:[7]

I have encountered the same issue as well.

My problem was that the datanode folder did not have the required permissions; I changed them with sudo chmod 777 ./datanode/

My advice is to check all the relevant paths/folders and make them 777 first (you can tighten the permissions again afterwards).
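As a self-contained illustration of that permission check (the scratch path below is made up for the demo, not a real datanode directory):

```shell
# Simulate a datanode directory with missing permissions, then fix it.
DIR=/tmp/perm-demo/datanode
mkdir -p "$DIR"
chmod 000 "$DIR"     # simulate the broken state (no read/write/execute)
[ -w "$DIR" ] || echo "$DIR is not writable; the datanode would fail to start"
chmod 777 "$DIR"     # the fix from this answer (tighten again afterwards)
echo "$DIR is writable: $([ -w "$DIR" ] && echo yes || echo no)"
```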

There might be other reasons that lead to the datanode failing to start. Common ones are:

  1. a wrong configuration in hdfs-site.xml
  2. a folder specified in hdfs-site.xml that was not created or is not writable
  3. a log folder without write permissions. The log folder is usually under $HADOOP_HOME; change its permissions with e.g. sudo chmod ...
  4. an SSH configuration that is not set up correctly or was lost somehow; try ssh datanode1 to check
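For point 2, a minimal hdfs-site.xml fragment of the kind being referred to might look like the following; the path is an example (borrowed from another answer above), not a required value:

```xml
<property>
    <name>dfs.datanode.data.dir</name>
    <value>/hadoop_store/hdfs/datanode</value>
</property>
```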

If everything has been checked and something still does not work, log in to the datanode server, go to the $HADOOP_HOME/logs folder, and check the log files to debug.

Solution 8:[8]

For me the other solutions didn't work. It was not related to directory permissions.

There is a JSVC_HOME entry in hadoop-env.sh that needs to be uncommented.

Download and build jsvc from here: http://commons.apache.org/proper/commons-daemon/jsvc.html

Alternatively, the jsvc jar is also present in the hadoop directory.

Solution 9:[9]

  1. This can occur for various reasons; it is best to check the logs at $HADOOP_HOME/logs

  2. In my case the /etc/hosts file was misconfigured, i.e. my host name was not resolving to localhost
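For illustration, a correct /etc/hosts entry maps the machine's host name to the loopback address; my-hostname below is a placeholder for your actual host name:

```text
127.0.0.1   localhost my-hostname
```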

Bottom line: Check your namenode/datanode log files :)

Solution 10:[10]

I also encountered this error and found that it came from the core-site.xml file; I changed the file to this form:

<configuration>
    <property>
            <name>fs.defaultFS</name>
            <value>hdfs://master:9000</value>
    </property>     
</configuration>

Solution 11:[11]

This can be caused by many things, usually a mistake in one of the configuration files, so it is best to check the log files.
