Extract all fields as JSON from staged CSVs in Snowflake

I'm trying to replace external tables over a set of CSV files with native Snowflake tables loaded via COPY INTO. With the external table I get a JSON "value" field of type VARIANT, along with any other transformed columns I define. I'd like to replicate that with COPY INTO, but when I select from the staged files directly I can't seem to get the same "value" field, only the individual positional columns ($1, $2, ...) that I can then transform.
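
For reference, the external-table behaviour being replaced looks roughly like this (my_ext_table is a placeholder name; for CSV-based external tables the keys in VALUE are typically positional, c1, c2, ..., rather than the file's header names):

select
    value,                        // VARIANT holding the whole row
    value:c1::varchar as col1     // a transformed column derived from it
from my_ext_table;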

I've tried using a CSV file format with RECORD_DELIMITER = '0x0A' and then splitting with split($1, ','). That gives me an array of all the values, but not the same JSON with the headers as keys.

create or replace file format fftest
    type = CSV
    record_delimiter = '0x0A'
    field_delimiter = none
    field_optionally_enclosed_by = '"'
    escape = '\\';

select 
    $1::variant as value,
    split($1, ',') as splitted,
    splitted[0]::varchar as col1
from @STAGE
    (file_format => 'fftest', pattern => '.*csv');
    
// value            | splitted                  | col1
// "val1,val2..."   | ["val1", "val2",...]      | val1

Any advice?



Solution 1:[1]

Use PARSE_JSON on the varchar column to convert it to JSON.
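
That answer is terse, so here is a minimal sketch of one way to apply it, not the answerer's exact code: the header names (col_a, col_b, col_c) are placeholders assumed to be known in advance, and the values are assumed to be plain (no embedded commas, quotes, or escapes).

select
    parse_json(
        '{"col_a":"' || split_part($1, ',', 1) || '",' ||
         '"col_b":"' || split_part($1, ',', 2) || '",' ||
         '"col_c":"' || split_part($1, ',', 3) || '"}'
    ) as value,
    value:col_a::varchar as col1
from @STAGE
    (file_format => 'fftest', pattern => '.*csv');

If the files contain a header row, it will come through as data too unless the file format skips it (e.g. SKIP_HEADER = 1). If the end goal is loading a native table, the same expression can in principle be used inside a COPY INTO my_table FROM (select ... from @STAGE) transformation, although COPY only permits a subset of functions there, so it is worth verifying that the ones used are supported. OBJECT_CONSTRUCT('col_a', split_part($1, ',', 1), ...) would build the same VARIANT without hand-assembling the JSON string, sidestepping the quoting caveat.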

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Jeffrey Jacobs