This page will walk through the setup of BigQuery in K.
...
Step 1) Set up a Google Cloud Service Account
Info: This step is performed by the Google Cloud Admin.
Create a Service Account by going to the Service Accounts page in the Google Cloud console (IAM & Admin > Service Accounts)
Give the Service Account a name (e.g. KADA BQ Integration)
Select the Projects that include the BigQuery instance(s) that you want to catalog
Click Save
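If you prefer to script this step instead of using the console, the Service Account can also be created through the IAM API. The following is a minimal sketch, assuming the google-api-python-client package, Application Default Credentials with permission to manage service accounts, and placeholder project and account IDs:

```python
from googleapiclient import discovery

# Uses Application Default Credentials; "my-project" and the
# account ID below are placeholders - substitute your own values.
iam = discovery.build("iam", "v1")

account = iam.projects().serviceAccounts().create(
    name="projects/my-project",
    body={
        "accountId": "kada-bq-integration",
        "serviceAccount": {"displayName": "KADA BQ Integration"},
    },
).execute()

print("Created service account:", account["email"])
```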
Create a Service Token
Click on the Service Account
Select the Keys tab. Click on Create new key
Select the JSON option. After clicking ‘CREATE’, the JSON file will automatically download to your device. Provide this to the user(s) who will complete the next steps
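Before handing the key file over, it can be worth confirming that it parses and loads as valid credentials. A minimal sketch, assuming the google-auth package is installed and the file was saved as kada-bq-integration.json (a placeholder name):

```python
from google.oauth2 import service_account

# The file name is a placeholder for the JSON key downloaded above.
creds = service_account.Credentials.from_service_account_file(
    "kada-bq-integration.json"
)
print("Service account:", creds.service_account_email)
print("Project:", creds.project_id)
```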
Add grants on the Service Account by going to the IAM page in the Google Cloud console
Click on ADD
Add the Service Account to the ‘New principals’ field.
Grant the following roles to this principal:
BigQuery Job User
BigQuery Metadata Viewer
BigQuery Read Session User
BigQuery Resource Viewer
Click SAVE
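Once the roles are granted, a quick way to confirm the first two is to run a trivial query job (BigQuery Job User) and list datasets (BigQuery Metadata Viewer) using the key from Step 1. A minimal sketch, assuming the google-cloud-bigquery package and the placeholder key file name used earlier:

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Placeholder file name for the JSON key downloaded in Step 1.
creds = service_account.Credentials.from_service_account_file(
    "kada-bq-integration.json"
)
client = bigquery.Client(credentials=creds, project=creds.project_id)

# BigQuery Job User: the account can run a query job.
print(list(client.query("SELECT 1 AS ok").result()))

# BigQuery Metadata Viewer: the account can enumerate datasets.
for dataset in client.list_datasets():
    print(dataset.dataset_id)
```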
...
Step 2) Connecting K to BigQuery
Select Platform Settings in the side bar
In the pop-out side panel, under Integrations, click on Sources
Click Add Source and select BigQuery
...
Info: Scheduling a source can take up to 15 minutes for the change to propagate.
...
Step 3) Manually run an ad hoc load to test the BigQuery setup
Next to your new Source, click on the Run manual load icon
Confirm how you want the source to be loaded
After the source load is triggered, a pop-up bar will appear taking you to the Monitor tab in the Batch Manager page, where you can view the progress of source loads
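This page does not specify how K's extractor reads BigQuery, but catalog tools commonly pull query history from the INFORMATION_SCHEMA jobs views, which is what the BigQuery Resource Viewer role permits. If the load fails on permissions, a hedged way to check that access independently is the sketch below, assuming the google-cloud-bigquery package, the placeholder key file from Step 1, and a us region qualifier (swap in the region of your datasets):

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Placeholders: the key file from Step 1, and the `region-us`
# qualifier, which should match the region of your BigQuery datasets.
creds = service_account.Credentials.from_service_account_file(
    "kada-bq-integration.json"
)
client = bigquery.Client(credentials=creds, project=creds.project_id)

query = """
    SELECT job_id, user_email, creation_time
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    ORDER BY creation_time DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.job_id, row.user_email, row.creation_time)
```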
...