About Collectors
...
Pre-requisites
Python 3.6 - 3.10
Access to the KADA Collector repository that contains the SQL Server whl
The repository is currently hosted in KADA’s Azure Blob Storage. You will be given a SAS token to access the repository. Reach out to KADA Support (support@kada.ai) if you do not have access.
Download the SQL Server whl (e.g. kada_collectors_extractors_sqlserver_azure-#.#.#-py3-none-any.whl)
Access to K landing directory
Access to SQL Server (see section below)
Check the SQLServer instance port
Run the following query and note the local tcp port.
Code Block
SELECT local_tcp_port
FROM sys.dm_exec_connections
WHERE session_id = @@SPID
GO
...
Collector server minimum requirements
SQL Server Requirements
Setting up SQL Server for metadata extraction is a two-step process.
Step 1: Establish SQLServer Access
Note
Run the CREATE LOGIN statement in the master database using an Azure SQL Admin user.
Code Block
CREATE LOGIN kadauser WITH PASSWORD = 'PASSWORD';
CREATE USER kadauser FROM LOGIN kadauser;
Note
Run the CREATE USER and GRANT statements in each database in scope for metadata collection.
Code Block
CREATE USER kadauser FROM LOGIN kadauser;
GRANT VIEW DEFINITION TO kadauser;
GRANT VIEW DATABASE STATE TO kadauser;
GRANT CONTROL TO kadauser; -- required for extended events sys.fn_xe_file_target_read_file
The following table should also be available for SELECT by the user created above in each database:
...
Run the following script to set up Extended Events logging.
Note
Apply the following statement in each database in scope for metadata collection.
Code Block
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<REPLACE with your key: abc1234>';

CREATE DATABASE SCOPED CREDENTIAL [<REPLACE with your blob container URL: https://your.blob.core.windows.net/extended-events>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<REPLACE WITH YOUR SAS TOKEN: sp=racwdl ...>';

-- Make sure the file name in ADD TARGET package0.event_file (SET filename = N'...') below is unique per database.
CREATE EVENT SESSION [KADA] ON DATABASE
ADD EVENT sqlserver.sp_statement_completed (
    ACTION(package0.event_sequence, sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_id, sqlserver.database_name, sqlserver.query_hash, sqlserver.session_id, sqlserver.transaction_id, sqlserver.username)
    WHERE (
        (
            [statement] LIKE '%CREATE %'
            OR [statement] LIKE '%DROP %'
            OR [statement] LIKE '%MERGE %'
            OR [statement] LIKE '%FROM %'
        )
        --AND [sqlserver].[server_principal_name] <> N'USERS_TO_EXCLUDE'
        AND [sqlserver].[is_system] = (0)
        AND NOT [statement] LIKE 'Insert into % Values %'
        AND [sqlserver].[query_hash] <> (0)
    )
),
ADD EVENT sqlserver.sql_statement_completed (
    SET collect_statement = (1)
    ACTION(package0.event_sequence, sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_id, sqlserver.database_name, sqlserver.query_hash, sqlserver.session_id, sqlserver.transaction_id, sqlserver.username)
    WHERE (
        (
            [statement] LIKE '%CREATE %'
            OR [statement] LIKE '%DROP %'
            OR [statement] LIKE '%MERGE %'
            OR [statement] LIKE '%FROM %'
        )
        --AND [sqlserver].[server_principal_name] <> N'USERS_TO_EXCLUDE'
        AND [sqlserver].[is_system] = (0)
        AND NOT [statement] LIKE 'Insert into % Values %'
        AND [sqlserver].[query_hash] <> (0)
    )
)
ADD TARGET package0.event_file (SET filename = N'https://your.blob.core.windows.net/extended-events/<REPLACE with your db name: database1>.xel')
WITH (MAX_MEMORY = 4096 KB, EVENT_RETENTION_MODE = ALLOW_MULTIPLE_EVENT_LOSS, MAX_DISPATCH_LATENCY = 30 SECONDS, MAX_EVENT_SIZE = 0 KB, MEMORY_PARTITION_MODE = NONE, TRACK_CAUSALITY = ON, STARTUP_STATE = ON)
GO
...
Some Python packages also have dependencies on OS-level packages, so you may need to install additional OS packages if the install below fails.
You can download the latest Core Library and Azure SQL whl via Platform Settings → Sources → Download Collectors.
...
Run the following command to install the collector
...
The collector requires a set of parameters to connect to and extract metadata from SQLServer Azure.
FIELD | FIELD TYPE | DESCRIPTION | EXAMPLE |
---|---|---|---|
server | string | SQLServer Azure server. If using a custom port, append it after a comma, e.g. “mydatabase.database.windows.net,1433” | “mydatabase.database.windows.net” |
host | string | The onboarded host value in K; generally this is the same as the server value, depending on what you onboarded it as. | |
username | string | Username to log into the SQLServer Azure account | “myuser” |
password | string | Password to log into the SQLServer Azure account | |
databases | list<string> | A list of databases to extract from SQLServer Azure | [“dwh”, “adw”] |
driver | string | The ODBC driver to use; generally this is ODBC Driver 17 for SQL Server. If you have another driver installed, use that instead (see the sketch after this table for a quick way to list installed drivers). | “ODBC Driver 17 for SQL Server” |
meta_only | boolean | Extract metadata only, without enabling extended events. Currently only true is supported. | true |
output_path | string | Absolute path to the output location where files are to be written | “/tmp/output” |
mask | boolean | To enable masking or not | true |
compress | boolean | To gzip the output or not | true |
events_name | string | The name of the extended event session created for each database; the session name must be exactly the same in every database. Required when meta_only is false. | KADA |
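If the driver specified in the config cannot be found at run time, a quick check is to list the ODBC drivers visible to Python. The sketch below assumes the collector connects through pyodbc; that assumption is based on the ODBC driver parameter above and is not stated in this guide.
Code Block
import pyodbc  # assumption: the collector connects via ODBC using pyodbc

# List the ODBC drivers installed on this machine. The value used in the
# "driver" field must appear in this list, e.g. "ODBC Driver 17 for SQL Server".
print(pyodbc.drivers())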
These parameters can be added directly into the run, or you can pass them in via a JSON file. The following is an example you can use; it is referenced in the example run code below.
...
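The example configuration itself is omitted above. As a rough sketch only, built from the fields in the table (all values below are illustrative placeholders, not values from this guide), the JSON could be generated as follows; the file name matches the default that the run script below looks for.
Code Block
import json

# Illustrative placeholder values only - replace with your own environment details.
config = {
    "server": "mydatabase.database.windows.net",  # append ",<port>" when using a custom port
    "host": "mydatabase.database.windows.net",    # the host value onboarded in K
    "username": "myuser",
    "password": "<PASSWORD>",
    "databases": ["dwh", "adw"],
    "driver": "ODBC Driver 17 for SQL Server",
    "meta_only": True,        # currently only true is supported
    "events_name": "KADA",    # extended event session name; required when meta_only is false
    "output_path": "/tmp/output",
    "mask": True,
    "compress": True
}

# The run script below defaults to kada_sqlserver_azure_extractor_config.json
# in the same directory as the script.
with open("kada_sqlserver_azure_extractor_config.json", "w") as f:
    json.dump(config, f, indent=4)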
Code Block
import os
import argparse
from kada_collectors.extractors.utils import load_config, get_hwm, publish_hwm, get_generic_logger
from kada_collectors.extractors.sqlserver_azure import Extractor

get_generic_logger('root') # Set to use the root logger, you can change the context accordingly or define your own logger

_type = 'sqlserver_azure'
dirname = os.path.dirname(__file__)
filename = os.path.join(dirname, 'kada_{}_extractor_config.json'.format(_type))

parser = argparse.ArgumentParser(description='KADA SqlServer Azure Extractor.')
parser.add_argument('--config', '-c', dest='config', default=filename, help='Location of the configuration json, default is the config json in the same directory as the script.')
parser.add_argument('--name', '-n', dest='name', default=_type, help='Name of the collector instance.')
args = parser.parse_args()

start_hwm, end_hwm = get_hwm(args.name)
ext = Extractor(**load_config(args.config))
ext.test_connection()
ext.run(**{"start_hwm": start_hwm, "end_hwm": end_hwm})
publish_hwm(_type, end_hwm)
...