Snowflake - connector settings

Hello, I am trying to export a CSV file from EasyMorph to a Snowflake database, but I am not really sure how to do it and would need additional guidance/documentation.

To export data to Snowflake, I guess we first need to define these three objects: 1) a database, 2) a schema (a folder in the database), and 3) a file format (e.g. CSV plus other format-related settings).
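In Snowflake SQL terms, I imagine that would be something along these lines (all names below are just placeholders I made up):

```sql
-- Placeholder names for illustration only
CREATE DATABASE IF NOT EXISTS EASYMORPH_DB;
CREATE SCHEMA IF NOT EXISTS EASYMORPH_DB.STAGING;   -- "folder" inside the database
CREATE FILE FORMAT IF NOT EXISTS EASYMORPH_DB.STAGING.CSV_FORMAT
  TYPE = 'CSV'
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';
```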

In EasyMorph it seems like we can only select the database and then see the default “Information_Schema” information. We are not able to select another schema in the selected database.

Also, if we do not use a file format, it would be useful to know in more detail how we should create a Snowflake table that will ingest the whole CSV file with the right format.

More detailed documentation on the Snowflake connector would be much appreciated. Thank you.

Hello @PatBen, and welcome to the Community!

You can import your CSV to EasyMorph with the “Import delimited text file” action and then use the “Bulk export to database” action to export data to Snowflake (you’ll have to use an ODBC connector since the native Snowflake connector is not supported by that action). With this approach, you won’t have to define file format settings.

You can use the “Create table” command of the “Database command” action to create a Snowflake table. You should place this action after the “Import delimited text file” action in order to be able to create database columns for all the imported columns. The action will autodetect column types, but I would recommend switching the “Data types” field to “Advanced” and manually adjusting the autodetected column types if necessary.
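For reference, the result corresponds to an ordinary Snowflake CREATE TABLE statement, roughly like the one below (the table and column names are made up for illustration; adjust the types to your data):

```sql
-- Hypothetical table matching an imported CSV; adjust names and types as needed
CREATE TABLE IF NOT EXISTS EASYMORPH_DB.STAGING.SALES (
    ORDER_ID   NUMBER,
    ORDER_DATE DATE,
    CUSTOMER   VARCHAR,
    AMOUNT     NUMBER(18,2)
);
```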

You’ll have to run the configured “Create table” command in order to be able to select the created table in the “Bulk export to database” action.

If you want to export your CSV file directly to Snowflake without importing it to EasyMorph,
you can try to use the “Custom command” command of the “Database command” action to execute all the required queries, including the creation of the target table. Please refer to the Snowflake documentation for the exact syntax of the required commands.

We never tried this approach, so I can’t guarantee that it will work. But I can’t think of any reason not to try it. If you want to export a local file, you’ll have to use an ODBC connector. But if you want to export a file from a cloud storage supported by Snowflake (like Amazon S3), you can use a native connector as well.
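As a rough, untested sketch of what such a custom command could look like for a file that is already in S3 (the stage name, bucket URL, table, and columns are placeholders, and the authentication for the stage is omitted):

```sql
-- Hypothetical external stage over an S3 bucket; credentials or a storage
-- integration would be required in practice and are omitted here
CREATE STAGE IF NOT EXISTS EASYMORPH_DB.STAGING.S3_STAGE
  URL = 's3://my-bucket/exports/';

-- Create the target table, then load every CSV found in the stage into it
CREATE TABLE IF NOT EXISTS EASYMORPH_DB.STAGING.SALES (
    ORDER_ID   NUMBER,
    ORDER_DATE DATE,
    AMOUNT     NUMBER(18,2)
);

COPY INTO EASYMORPH_DB.STAGING.SALES
  FROM @EASYMORPH_DB.STAGING.S3_STAGE
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
  PATTERN = '.*\.csv';
```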

Thank you Andrew.

I am not sure how to install/use an ODBC connector. Let me know if you have more information on how to do this.

The alternative way for me would be to use SnowSQL (by Snowflake) from my computer, or even the UI Loader from the Snowflake Classic console - but I thought it would be easier to connect EasyMorph directly to Snowflake.

Why not develop a full Snowflake connector, creating an EasyMorph entity in Snowflake with the following parameters?

Account: Account locator (e.g. fd45221)
Account URL: https://fd45221.ca-central1.aws.snowflakecomputing.com
Database: EASYMORPH_DB
Username: EASYMORPH_USER
Password: xxxxxxxxxxx
Warehouse: EASYMORPH_WH
Role: OWNER_XX
Type: Snowflake

Thanks

As for an ODBC connector: please refer to this link for ODBC driver installation and ODBC DSN creation and configuration instructions: Installing and Configuring the ODBC Driver for Windows — Snowflake Documentation

When the ODBC DSN is ready, you can create an ODBC connector in EasyMorph’s connector manager under New connector -> SQL -> ODBC.

Can you please elaborate on what you mean by the “full Snowflake connector”? EasyMorph has a native Snowflake connector, but that connector doesn’t support bulk export because the library used internally by the connector didn’t have that feature implemented at the time the connector was created.

Andrew, thank you. I will have a look at the documentation you posted.

Regarding the “full Snowflake connector”, I am not really sure, but that is what I saw in Extractor/Loader solutions that specialize in moving data. To establish a connection between the source and the target, they need to create an “Entity” to make the connection and load the data.

That said, I am not a technical guy so I cannot tell you more. I am just investigating the different alternatives to ETL/ELT local files to Snowflake. :wink:

Without a bulk export option, you’re always going to be constrained by ISP upload speeds of an uncompressed file from local to the cloud. So even if you figure out the direct Snowflake connection (I’ve used it for small files), if your files are very big (>100M rows), it’s much better to use SnowSQL.

You’ll need to create a stage, which is basically Snowflake’s version of blob storage, and push local files into it with SnowSQL. What’s great about this approach is that SnowSQL heavily compresses the file with gzip before it uploads to the stage, so it’s much, much faster than a direct import. Once the files are in the stage, you can easily pull them into the Snowflake DB and purge the stage using the worksheet view. Michael Kahan has a great video on his YouTube channel with a step-by-step walkthrough. It’s a hassle to set up, but it makes things easy and fast once you complete the upfront work.
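Roughly, the SnowSQL side of that workflow looks like this (all names and the local path below are placeholders):

```sql
-- Internal stage that will receive the compressed upload
CREATE STAGE IF NOT EXISTS EASYMORPH_DB.STAGING.CSV_STAGE;

-- PUT runs from SnowSQL and gzip-compresses the file before upload
-- (AUTO_COMPRESS defaults to TRUE)
PUT file://C:\data\big_file.csv @EASYMORPH_DB.STAGING.CSV_STAGE;

-- Load the staged file into the table and remove it from the stage afterwards
COPY INTO EASYMORPH_DB.STAGING.BIG_TABLE
  FROM @EASYMORPH_DB.STAGING.CSV_STAGE
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
  PURGE = TRUE;
```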


Interesting. Thank you Rick.