Info

Integration to Snowflake DMF is in private preview. Please reach out to your KADA account manager to enable it.

This page will guide you through the setup of Snowflake Data Metrics Functions (DMF) for integration with K using the direct connect method.

Integration details

Scope          Included   Comments
Metadata       YES
Lineage        YES
Usage          YES
Data Quality   YES        See known limitations below

Note

Known limitations

K does not currently support the out-of-the-box DMFs (in the SNOWFLAKE.CORE schema).

K only uses user-defined DMFs because K measures DQ results as a percentage of rows that meet the test condition.

K expects the DMF result to be a value between 0 and 100.

Out-of-the-box DMFs measure the number of rows that meet the test condition, which does not provide sufficient context.

Replacement DMFs for the out-of-the-box DMFs are available here: Setting up Snowflake Data Metrics Functions
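For illustration, a user-defined DMF that returns a percentage rather than a raw row count could be sketched as below. The database, schema, function and argument names are placeholders, not values from this guide.

Code Block
-- Hypothetical user-defined DMF: returns the percentage (0 to 100) of rows
-- whose value is not null, matching the 0-100 result K expects.
-- dq_db.dmfs, not_null_percent and the argument names are placeholders.
CREATE DATA METRIC FUNCTION IF NOT EXISTS dq_db.dmfs.not_null_percent(
  arg_t TABLE(arg_c VARCHAR))
RETURNS NUMBER
AS
'SELECT (COUNT_IF(arg_c IS NOT NULL) * 100) / NULLIF(COUNT(*), 0) FROM arg_t';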

...

Step 1) Establish Snowflake Access

Ensure that you have set up Snowflake in K as per Snowflake

...

Add the following grants to the K user

Option 1: Extending the K user with access to Snowflake DMF

Assuming you followed the Snowflake integration instructions and created a role called CATALOG_READ_ONLY, follow the steps below to add the grants that allow the user to read Snowflake DMF details.

Code Block
--Log in with a user that has the permissions to assign/update roles

--Add the following DQ related roles to the KADA role created previously
GRANT APPLICATION ROLE SNOWFLAKE.DATA_QUALITY_MONITORING_VIEWER TO ROLE CATALOG_READ_ONLY;
GRANT DATABASE ROLE SNOWFLAKE.DATA_METRIC_USER TO ROLE CATALOG_READ_ONLY;
GRANT DATABASE ROLE SNOWFLAKE.OBJECT_VIEWER TO ROLE CATALOG_READ_ONLY;
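To confirm the grants were applied, you can list the grants on the role (a standard Snowflake command, assuming the CATALOG_READ_ONLY role name from above):

Code Block
--Check that the DQ related roles now appear against the KADA role
SHOW GRANTS TO ROLE CATALOG_READ_ONLY;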

Option 2: Creating a user with access to Snowflake DMF

To create a user with general access to the metadata available in the Snowflake Account Usage schema:
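As a rough, hypothetical sketch (the role name, user name and password value are illustrative placeholders, not values from this guide), such a user could be created as follows:

Code Block
--Hypothetical sketch: role, user and password values are placeholders
CREATE ROLE IF NOT EXISTS KADA_DMF_ROLE;

--Account Usage metadata plus the DQ related roles used in Option 1
GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE KADA_DMF_ROLE;
GRANT APPLICATION ROLE SNOWFLAKE.DATA_QUALITY_MONITORING_VIEWER TO ROLE KADA_DMF_ROLE;
GRANT DATABASE ROLE SNOWFLAKE.DATA_METRIC_USER TO ROLE KADA_DMF_ROLE;
GRANT DATABASE ROLE SNOWFLAKE.OBJECT_VIEWER TO ROLE KADA_DMF_ROLE;

CREATE USER IF NOT EXISTS KADA_DMF_USER
  PASSWORD = '<password>'
  DEFAULT_ROLE = KADA_DMF_ROLE;
GRANT ROLE KADA_DMF_ROLE TO USER KADA_DMF_USER;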

...

Info

Snowflake integration uses username/password authentication. Key-pair authentication will be supported in an upcoming release.

...

Step 2) Connecting K to Snowflake

  • Select Platform Settings in the side bar

  • In the pop-out side panel, under Integrations click on Sources

  • Click Add Source and select Snowflake



  • Select Direct Connect and add your Snowflake details and click Next

  • Fill in the Source Settings and click Next

    • Name: The name you wish to give your Snowflake DMF source in K e.g. Snowflake DMF

    • Host: Add a host value

      • If you use your Snowflake account value make sure to add -dq at the end to differentiate this source from your Snowflake source with the same host value.

    • Information Database Role: Add the role the user should use

    • Information Database: Default is Snowflake if using Snowflake DB. Otherwise enter the Database created to store metadata views from Step 1

    • Warehouse: Add the warehouse the user should use

    • Snowflake Host: Add your Snowflake account (excluding the https:// at the start)

      • e.g. abc123.australia-east.azure.snowflakecomputing.com

  • Add the Connection details and click Save & Next when connection is successful

    • Snowflake account: Use the same details (Snowflake account) you previously added in the Snowflake Host setting, however this time remove the .snowflakecomputing.com at the end

      • e.g. abc123.australia-east.azure

    • Username: Add the Snowflake user name

    • Password: Add the Snowflake user password

...

  • Test your connection and click Save

  • Click Finish Setup to create your new Snowflake DMF source

  • Return to the Sources page and locate the new Snowflake DMF source that you loaded

  • Click on the clock icon to select Edit Schedule and set your preferred schedule for the Snowflake load

Note that scheduling a source can take up to 15 minutes to propagate the change.

...

Step 3) Manually run an ad hoc load to test Snowflake setup

  • Next to your new Source, click on the Run manual load icon

  • Confirm how you want the source to be loaded

  • After the source load is triggered, a pop up bar will appear taking you to the Monitor tab in the Batch Manager page. This is the usual page you visit to view the progress of source loads

Info

A manual source load will also require a manual run of:

  • DAILY

  • GATHER_METRICS_AND_STATS

to load all metrics and indexes with the manually loaded metadata. These can be found in the Batch Manager page.

Warning

Troubleshooting failed loads

  • If the job failed at the extraction step

    • Check the error. Contact KADA Support if required.

    • Rerun the source job

  • If the job failed at the load step, the failed directory in the landing folder will contain the file with issues.

    • Find the bad record and fix the file

    • Rerun the source job