Azure Data Factory is throwing an error while fetching data from on-premises Hadoop to Data Lake Storage
I am trying to copy data from an on-premises HDFS system to Azure Data Lake using Azure Data Factory. I have successfully set up the connection and can browse the files in the HDFS directory. When I run the copy activity, it throws the error below:
```
Failure happened on 'Sink' side. ErrorCode=UserErrorFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Upload file failed at path rawzone\cibil_enq_file_20200831.parquet.,Source=Microsoft.DataTransfer.Common,''Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to read a 'Hdfs' file. File path: 'Data_Archival/Bureau_Data_Mart/Bureau_Retail20/cibil_enq_file_20200831/cibil_enq_file_20200831.parquet'. Response details: '{}'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (403).
```
I also get the error when creating the dataset and previewing the data. I am attaching snippets for both scenarios.
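Since the inner exception is a plain HTTP 403 while reading the HDFS file, one way to narrow things down is to request the same file directly through the WebHDFS REST API, outside of Data Factory. The sketch below only builds the WebHDFS `OPEN` URL for the failing file; the namenode host, port, and user are placeholders, not values from the question, so substitute whatever the Data Factory linked service actually uses:

```python
# Hypothetical connection details -- replace with the values from the
# HDFS linked service in Data Factory.
NAMENODE_HOST = "namenode.example.com"
WEBHDFS_PORT = 50070  # default WebHDFS port on Hadoop 2.x
FILE_PATH = ("Data_Archival/Bureau_Data_Mart/Bureau_Retail20/"
             "cibil_enq_file_20200831/cibil_enq_file_20200831.parquet")

def webhdfs_open_url(host: str, port: int, path: str, user: str) -> str:
    """Build the WebHDFS URL that reads `path` as `user` (simple auth)."""
    return (f"http://{host}:{port}/webhdfs/v1/{path}"
            f"?op=OPEN&user.name={user}")

url = webhdfs_open_url(NAMENODE_HOST, WEBHDFS_PORT, FILE_PATH, "hdfs")
print(url)
# Fetch the printed URL with e.g.:  curl -i -L "<url>"
# A 403 from this direct request too would point at HDFS-side permissions
# or authentication rather than anything on the Data Factory sink.
```

If the direct WebHDFS read succeeds but the copy activity still fails, the problem is more likely with the identity or authentication type configured on the linked service than with the file itself.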
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
