How do I split a huge file into chunks of equal size in bash?
I have a huge file of about 4 GB of data, i.e. 523 rows and 2,655,566 columns. I would like to read the whole file in equally divided chunks. What is the best approach?
Here I have only 522 lines but 2,655,566 columns. Is it better to split by lines or by a number of megabytes?
Solution 1:[1]
Maybe try using
split --verbose file_name
If you want to specify each split's size manually, use something like
split -b 100M file_name
where 100M means 100 megabytes; similarly, you can use the suffixes K, M, G, T, P, or E.
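As a concrete sketch (assuming GNU coreutils split and an example file name data.csv; the chunk_ and rows_ prefixes are arbitrary), the byte-based and line-based variants look like this:
# Assumed example file name; replace with your own.
FILE=data.csv
# Split into 100 MB pieces with numeric suffixes (chunk_00, chunk_01, ...).
# Note: -b splits on byte boundaries, so a row may be cut mid-chunk.
split --verbose -b 100M --numeric-suffixes "$FILE" chunk_
# Alternatively, split by line count so each chunk holds whole rows.
# With a few hundred rows, -l 100 yields a handful of files of at most 100 lines each.
split --verbose -l 100 --numeric-suffixes "$FILE" rows_
# Sanity check: concatenating the pieces should reproduce the original file.
cat chunk_* | cmp - "$FILE" && echo "chunks are intact"
Since the file has only a few hundred very long rows, splitting by lines (-l) keeps each row intact, whereas -b can cut a row in the middle of a chunk; which is preferable depends on how the chunks will be read back.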
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Tarun Teja |