Snowflake table not populated even though pipe is running
I'm using Snowflake's Kafka connector to get some data into a table. I can see the files in the stage (when I run LIST @STAGE_NAME;), and according to SYSTEM$PIPE_STATUS, the pipe is running. Still, I don't see anything in the copy history. However, when I manually refresh the pipe, the table gets populated shortly afterwards.
Does someone know what could be causing this?
Here's the connector configuration in case it helps (nothing special):
{
  "buffer.count.records": "2",
  "buffer.flush.time": "10",
  "buffer.size.bytes": "1",
  "input.data.format": "JSON",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "name": "mysnowflakesink",
  "snowflake.database.name": "MY_DB",
  "snowflake.private.key": "abc",
  "snowflake.schema.name": "STREAM_DATA",
  "snowflake.topic2table.map": "customers:CUSTOMERS_TEST",
  "snowflake.url.name": "xyz.snowflakecomputing.com:443",
  "snowflake.user.name": "my_user",
  "tasks.max": "8",
  "topics": "customers",
  "value.converter": "com.snowflake.kafka.connector.records.SnowflakeJsonConverter"
}
Solution 1:[1]
It turned out to be my mistake. I'm driving the connector programmatically (outside the Kafka Connect runtime), and what I failed to do was call the task's preCommit method. Under the hood, preCommit is what triggers the pipe to ingest the staged files, so without it the data just sits in the stage.
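To make that concrete, here is a minimal sketch of driving the sink task by hand. It assumes you run the task yourself instead of through the Connect runtime; the config map, topic/partition, and offset value are illustrative placeholders, not the asker's actual code. The key point is the preCommit call at the end, which the Connect framework would normally make for you on its offset-commit interval:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

import com.snowflake.kafka.connector.SnowflakeSinkTask;

public class ManualSinkDriver {
    public static void main(String[] args) {
        // Hypothetical config map mirroring the JSON shown in the question.
        Map<String, String> config = new HashMap<>();
        config.put("name", "mysnowflakesink");
        config.put("topics", "customers");
        // ... remaining connector properties ...

        SnowflakeSinkTask task = new SnowflakeSinkTask();
        task.start(config);
        // ... task.open(partitions); task.put(records); ...

        // Without this call, records are buffered and written to the stage,
        // but Snowpipe is never told to ingest them -- which matches the
        // symptom of files in the stage and an empty table.
        long nextOffset = 42L; // placeholder: next offset to commit
        Map<TopicPartition, OffsetAndMetadata> offsets = new HashMap<>();
        offsets.put(new TopicPartition("customers", 0),
                    new OffsetAndMetadata(nextOffset));
        task.preCommit(offsets);

        task.close(offsets.keySet());
        task.stop();
    }
}
```

In the normal deployment model the Connect worker calls preCommit periodically, which is why the problem only appears when the task lifecycle is managed manually.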
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Haris Osmanagić |
