...

  1. Platform Settings
    On the bottom left of the screen, click the GEAR icon and select Platform Settings.

    Then set up the following properties depending on your deployment:

    Code Block
    storage_type = [LOCAL, S3, AZURE]
    
    # For Azure blob storage
    root_folder = <azure storage container name>
    storage_azure_access_key
    storage_azure_storage_account
    
    # For AWS S3 configuration
    root_folder = <s3 bucket name>
    storage_aws_region 
    storage_aws_access_key
    storage_aws_secret_access_key
    
    # Set this for LOCAL storage
    storage_local_root_directory
    
    # Set this if connecting to sources directly
    enable_connection_test = true
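
    For example, an AWS S3 deployment might look like the following sketch (the bucket name, region and keys shown are illustrative placeholders only, not values from this guide):

    Code Block
    storage_type = S3
    root_folder = my-kada-bucket
    storage_aws_region = ap-southeast-2
    storage_aws_access_key = <access key id>
    storage_aws_secret_access_key = <secret access key>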
  2. Integrating sources to KADA
    KADA needs to be configured for each source that you want to integrate. This can be configured via the KADA front end. See How to: Onboard a new source

  3. KADA Platform Initial load

    1. Set up the following Platform Setting values for the initial load

      Code Block
      celery_batch_task_soft_time_limit = 0
      celery_batch_task_time_limit = 0
      metric_window = 30
    2. KADA provides a built-in Batch manager for triggering the loading of sources.

    3. See https://kadaai.atlassian.net/wiki/spaces/KS/pages/675708946/How+to%3A+Onboard+a+new+source#4.-Manually-Triggering-Source-loads

    4. Once the sources have been loaded, manually trigger the following platform jobs. See https://kadaai.atlassian.net/wiki/spaces/KS/pages/1740931226/KADA+Batch+Manager#Manually-triggering-a-Platform-job
      1. GATHER_METRICS_AND_STATS
      2. POST_PROCESS_QUERIES
      3. DAILY

  4. Schedule sources to load.
    KADA provides a scheduler to periodically load the sources you have configured.
    Set up the following Platform Setting value to enable the scheduler to run.

    Code Block
    enable_platform_batch = true

    Each Source can now be scheduled to run. See https://kadaai.atlassian.net/wiki/spaces/KS/pages/675708946/How+to%3A+Onboard+a+new+source#3.-Scheduling-a-Source

Upgrading KADA

KADA generally releases updates each month. See our Release versions to find the latest available version.

To check your version, see How to: Check the version of K platform.

If a new version is available, use the following steps to upgrade:

...

Code Block
--Query To Create Extended Events Session
CREATE EVENT SESSION [KADA] ON SERVER ADD EVENT sqlserver.sp_statement_completed (
	ACTION(package0.collect_system_time, package0.event_sequence, sqlos.task_time, sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_id, sqlserver.database_name, sqlserver.nt_username, sqlserver.query_hash, sqlserver.server_instance_name, sqlserver.server_principal_name, sqlserver.server_principal_sid, sqlserver.session_id, sqlserver.session_nt_username, sqlserver.transaction_id, sqlserver.username) WHERE (
		(
			[statement] LIKE '%CREATE %'
			OR [statement] LIKE '%DROP %'
			OR [statement] LIKE '%MERGE %'
			OR [statement] LIKE '%FROM %'
			)
		AND [sqlserver].[server_principal_name] <> N'USERS_TO_EXCLUDE'
		AND [sqlserver].[is_system] = (0)
		AND NOT [statement] LIKE 'Insert into % Values %'
		AND [sqlserver].[Query_hash] <> (0)
		)
	), ADD EVENT sqlserver.sql_statement_completed (
	SET collect_statement = (1) ACTION(package0.collect_system_time, package0.event_sequence, sqlos.task_time, sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_id, sqlserver.database_name, sqlserver.nt_username, sqlserver.query_hash, sqlserver.server_instance_name, sqlserver.server_principal_name, sqlserver.session_id, sqlserver.session_nt_username, sqlserver.transaction_id, sqlserver.username) WHERE (
		(
			[statement] LIKE '%CREATE %'
			OR [statement] LIKE '%DROP %'
			OR [statement] LIKE '%MERGE %'
			OR [statement] LIKE '%FROM %'
			)
		AND [sqlserver].[server_principal_name] <> N'USERS_TO_EXCLUDE'
		AND [sqlserver].[is_system] = (0)
		AND NOT [statement] LIKE 'Insert into % Values %'
		AND [sqlserver].[Query_hash] <> (0)
		)
	) ADD TARGET package0.event_file (SET filename = N'G:\extended events\Extendedevents.xel', max_file_size = (20), max_rollover_files = (100))
	WITH (MAX_MEMORY = 4096 KB, EVENT_RETENTION_MODE = ALLOW_MULTIPLE_EVENT_LOSS, MAX_DISPATCH_LATENCY = 30 SECONDS, MAX_EVENT_SIZE = 0 KB, MEMORY_PARTITION_MODE = NONE, TRACK_CAUSALITY = ON, STARTUP_STATE = ON)
GO
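
-- Creating the session does not start it; STARTUP_STATE = ON only starts it automatically
-- when the SQL Server instance (re)starts. To begin capturing events immediately, the
-- session can be started manually:
ALTER EVENT SESSION [KADA] ON SERVER STATE = START;
GO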

-- Check if the session is dropping events and see other data about the session
-- https://sqlperformance.com/2019/10/extended-events/understanding-event-loss-with-extended-events
SELECT
   s.name, 
   s.total_regular_buffers,
   s.regular_buffer_size,
   s.total_large_buffers,
   s.large_buffer_size,
   s.dropped_event_count,
   s.dropped_buffer_count,
   s.largest_event_dropped_size
FROM sys.dm_xe_sessions AS s;

-- Also check log growth rate. Apply filters to remove noise.
-- some filters:
-- [sqlserver].[server_principal_name] = N'name of principal'
-- [sqlserver].[is_system] = (0)
-- [sqlserver].[client_app_name] = N'name of app'

-- NB: grouping by XML is not supported in SQL Server. Use an external tool for this.
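
-- One option for further processing (path assumed to match the event_file target defined
-- above) is to read the captured .xel files back into SQL Server as XML:
SELECT
   CAST(event_data AS XML) AS event_data_xml,
   file_name,
   file_offset
FROM sys.fn_xe_file_target_read_file(N'G:\extended events\Extendedevents*.xel', NULL, NULL, NULL);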

1.3. Oracle 11g+, Oracle Cloud and Oracle Analytics

Requires an Oracle wallet and the following items to be updated in the cerebrum-oic.yaml file.

...