
This page will walk through the setup of Snowflake in K.


Step 1) Establish Snowflake Access

K only needs access to metadata tables, which Snowflake stores in the shared SNOWFLAKE database. You do not need to give K access to any of the data stored in your own databases.
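
To see exactly which metadata views this covers, you can list them from a role that already has access to the shared database (an optional check):

-- List the metadata views in the shared SNOWFLAKE database
show views in schema SNOWFLAKE.ACCOUNT_USAGE;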

Create a Snowflake user with read access to the metadata tables in the SNOWFLAKE database (the ACCOUNT_USAGE schema), as follows.

Log in with a user that has permissions to create roles and users:

-- Create a new role for the Catalog user
create role CATALOG_READ_ONLY;

-- Grant the role access to the Account Usage metadata.
-- IMPORTED PRIVILEGES covers read access to every view in
-- SNOWFLAKE.ACCOUNT_USAGE (object-level grants are not permitted
-- on a shared database).
grant imported privileges on database SNOWFLAKE to role CATALOG_READ_ONLY;
grant monitor on account to role CATALOG_READ_ONLY;

-- Create a new user for K and grant it the role (remove the [])
create user [kada_user] password = ['abc123!@#'] default_role = CATALOG_READ_ONLY default_warehouse = [warehouse];
grant role CATALOG_READ_ONLY to user [kada_user];
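
To confirm the grants took effect, you can log in as the new user and run a quick sanity check (a minimal sketch; note that ACCOUNT_USAGE views can lag live activity, so recent queries may take a while to appear):

-- Run these as the new K user to confirm the metadata views are readable
use role CATALOG_READ_ONLY;
use warehouse [warehouse]; -- the warehouse from above (remove the [])
select query_id, user_name, start_time
from SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
order by start_time desc
limit 10;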

From the above, record the following details; they are needed for the setup in Step 2:

  1. User name / Password

  2. Role

  3. Warehouse

  4. Snowflake account (found in the URL of your Snowflake instance, between https:// and .snowflakecomputing.com; for example, https://abc123.ap-southeast-2.snowflakecomputing.com gives the account abc123.ap-southeast-2). You can also look this up with the query below.
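
If you are unsure of the value, the account locator and region can also be retrieved from within Snowflake (a quick check; depending on your deployment, the account identifier may be the locator alone or the locator plus region):

-- Returns the account locator and the region of this Snowflake instance
select current_account(), current_region();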

The Snowflake integration uses username/password authentication. Key-pair authentication will be supported in an upcoming release.


Step 2) Connecting K to Snowflake

In K, create the Snowflake source using the details recorded in Step 1. Note that scheduling a source can take up to 15 minutes to propagate the change.


Step 3) Manually run an ad hoc load to test Snowflake setup

A manual source load will also require a manual run of the following jobs to load all metrics and indexes with the manually loaded metadata. Both can be found on the Batch Manager page.

  • DAILY

  • GATHER_METRICS_AND_STATS

Troubleshooting failed loads

  • If the job failed at the extraction step

    • Check the error. Contact KADA Support if required.

    • Rerun the source job

  • If the job failed at the load step, the failed directory in the landing folder will contain the file with issues.

    • Find the bad record and fix the file

    • Rerun the source job