SQL bulk insert csv with currency sign separator ¤

I have a .csv file that uses the currency sign (¤) as its field separator. When I execute this query to bulk load it into a table, it raises an error. The file is UTF-8 encoded.

BULK INSERT dbo.test
FROM 'file.csv'
WITH (DATA_SOURCE = 'MyAzureBlobStorage',
      FIRSTROW = 2,
      CODEPAGE = 65001, --UTF-8 encoding
      FIELDTERMINATOR = '¤',  --CSV field delimiter
      ROWTERMINATOR = '\n'    --row delimiter
     );

The error I get is:

The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.

This works fine with a semicolon as the separator.
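A likely explanation: a semicolon is a single byte in UTF-8, while ¤ (U+00A4) encodes as the two bytes 0xC2 0xA4, and BULK INSERT's terminator matching can mishandle multi-byte delimiters. A commonly suggested workaround is to specify the terminator as a hex byte sequence instead of a literal character. This is a sketch, assuming SQL Server's hex-notation terminators are available (the `ROWTERMINATOR = '0x0a'` form is documented; applying the same notation to `FIELDTERMINATOR` is the assumption here):

```sql
BULK INSERT dbo.test
FROM 'file.csv'
WITH (DATA_SOURCE = 'MyAzureBlobStorage',
      FIRSTROW = 2,
      CODEPAGE = 65001,            --UTF-8 encoding
      FIELDTERMINATOR = '0xC2A4',  --¤ as its raw UTF-8 byte sequence
      ROWTERMINATOR = '0x0a'       --\n as a hex byte
     );
```

If that is not accepted on your SQL Server version, re-encoding the file so the delimiter is a single byte (e.g. converting to a code page where ¤ is one byte, and setting CODEPAGE accordingly) is another avenue to test.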



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
