What would be the best way to copy a large file from a local system to a pod in Kubernetes? [closed]

I have a large file of around 350GB that needs to be copied to a pod with a PV. I have been trying to use kubectl cp, but after all my attempts I haven't been able to copy the whole file. Is there a better way to copy it?

Note: The Kubernetes cluster and the local system on which the file exists are on the same network in a cloud environment.

The error I am getting:

client_loop: send disconnect: Broken pipe


Solution 1:[1]

...350GB that need to be copied to the pod with PV

How about creating a PV that is already loaded with the 350GB of data? In that case, all your pod needs is the PVC.
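A minimal sketch of that approach: if a PV holding the data already exists (here called data-pv; all names and sizes are illustrative), the pod only needs a claim that binds to it explicitly:

```yaml
# Hypothetical PVC that binds to a pre-loaded PV (names/sizes are placeholders)
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: data-pvc
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 400Gi
  volumeName: data-pv   # bind directly to the PV that already holds the data
  storageClassName: ""  # empty string disables dynamic provisioning
```

Setting volumeName pins the claim to that specific pre-loaded volume instead of letting the provisioner create a fresh, empty one.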

Updated: In the case of Portworx, you can refer to an existing PV that was previously dynamically created and retained via:

...
persistentVolumeReclaimPolicy: Retain
portworxVolume:
  volumeID: <existing volume id>

See the Portworx documentation here.
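Put together, the retained Portworx volume can be wrapped in a full PV spec along these lines (a sketch; the name, capacity, and volume ID are placeholders you would substitute):

```yaml
# Hypothetical PV wrapping an existing, retained Portworx volume
apiVersion: v1
kind: PersistentVolume
metadata:
  name: px-existing-pv
spec:
  capacity:
    storage: 400Gi
  accessModes:
    - ReadWriteOnce
  persistentVolumeReclaimPolicy: Retain
  portworxVolume:
    volumeID: <existing volume id>   # the ID of the retained Portworx volume
```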

Solution 2:[2]

By using a PVC (persistent volume claim), you can make a large amount of volume data available to your pods.

A persistent volume claim is dedicated storage that Kubernetes has carved out for your application pod from the storage that was made available through a storage class. You can refer to this link.
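A sketch of that flow, assuming the cluster has a storage class (the class name, claim name, and image below are assumptions): the PVC requests storage from the class, and the pod mounts the claim.

```yaml
# Hypothetical dynamically provisioned PVC; storageClassName depends on your cluster
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: standard   # substitute the class your cluster provides
  resources:
    requests:
      storage: 400Gi
---
# Pod that mounts the claim at /data
apiVersion: v1
kind: Pod
metadata:
  name: app
spec:
  containers:
    - name: app
      image: busybox
      command: ["sleep", "infinity"]
      volumeMounts:
        - name: data
          mountPath: /data
  volumes:
    - name: data
      persistentVolumeClaim:
        claimName: app-data
```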

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution sources:
Solution 1
Solution 2: Tatikonda vamsikrishna