Unable to put files into HDFS due to NameNode in safe mode because of low resources
I am new to the Hadoop ecosystem. I have been trying to put a CSV file into HDFS, inside a directory that I was able to create, but when I do I get this error:
put: Cannot create file/user/Checkouts_By_Title_Data_Lens_2005.csv._COPYING_. Name node is in safe mode.
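For context, the command I run is roughly the following (reconstructed from the error above, so the local file name and target directory are assumptions):
# assumed invocation; local CSV in the working directory, /user/ as the destination
hdfs dfs -put Checkouts_By_Title_Data_Lens_2005.csv /user/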
I tried restarting the NameNode, as well as formatting it. I also tried the following commands:
hdfs dfsadmin -safemode leave
hdfs dfsadmin -safemode forceExit
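To verify the state afterwards I query it directly (the standard status check):
hdfs dfsadmin -safemode get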
Although these commands do report:
SafeMode is OFF
it turns back into safe mode because of low resources. I then ran
hdfs fsck /
which gave me this response:
Status: HEALTHY
Number of data-nodes: 1
Number of racks: 1
Total dirs: 2
Total symlinks: 0
Replicated Blocks:
Total size: 0 B
Total files: 0
Total blocks (validated): 0
Minimally replicated blocks: 0
Over-replicated blocks: 0
Under-replicated blocks: 0
Mis-replicated blocks: 0
Default replication factor: 3
Average block replication: 0.0
Missing blocks: 0
Corrupt blocks: 0
Missing replicas: 0
Erasure Coded Block Groups:
Total size: 0 B
Total files: 0
Total block groups (validated): 0
Minimally erasure-coded block groups: 0
Over-erasure-coded block groups: 0
Under-erasure-coded block groups: 0
Unsatisfactory placement block groups: 0
Average block group size: 0.0
Missing block groups: 0
Corrupt block groups: 0
Missing internal blocks: 0
FSCK ended at Sat May 14 15:53:19 IST 2022 in 3 milliseconds
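From what I have read, the NameNode puts itself back into safe mode when free space on any of its storage volumes drops below the dfs.namenode.resource.du.reserved threshold (100 MB by default), so my next step was to check free disk space and the threshold currently in effect. This is just a sketch; which mount point matters depends on where dfs.namenode.name.dir points in my setup:
# check free space on the volume holding the NameNode metadata (mount path is setup-specific)
df -h
# print the reserved-space threshold currently in effect (104857600 bytes = 100 MB default)
hdfs getconf -confKey dfs.namenode.resource.du.reserved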
I can't seem to get past this safe mode issue. Please help, and let me know if I missed anything I should share.