
Add Redshift connector to Recon #2339

Open
bishwajit-db wants to merge 9 commits into main from feature/redshift-recon

Conversation

@bishwajit-db
Contributor

Add Redshift connector to Recon

@codecov

codecov Bot commented Mar 18, 2026

Codecov Report

❌ Patch coverage is 94.02985% with 4 lines in your changes missing coverage. Please review.
✅ Project coverage is 66.30%. Comparing base (df7e9f6) to head (4a89b8c).

Files with missing lines Patch % Lines
src/databricks/labs/lakebridge/deployment/job.py 20.00% 3 Missing and 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2339      +/-   ##
==========================================
+ Coverage   66.10%   66.30%   +0.19%     
==========================================
  Files          99      100       +1     
  Lines        9291     9358      +67     
  Branches      989      993       +4     
==========================================
+ Hits         6142     6205      +63     
- Misses       2970     2973       +3     
- Partials      179      180       +1     

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.


@github-actions

github-actions Bot commented Mar 18, 2026

❌ 59/147 passed, 88 failed, 6 skipped, 1h19m6s total

❌ test_run_empty_result_pipeline: RuntimeError: Pipeline execution failed due to errors in steps: empty_result_step (1.074s)
... (skipped 1933 bytes)
OR [databricks.REDSHIFT_DATABASE.lakebridge.assessments.pipeline] Pipeline execution failed due to errors in steps: empty_result_step
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_run_pipeline_with_combined_ddl: RuntimeError: Pipeline execution failed due to errors in steps: inventory, usage (2.134s)
... (skipped 3920 bytes)
ROR [databricks.REDSHIFT_DATABASE.lakebridge.assessments.pipeline] Pipeline execution failed due to errors in steps: inventory, usage
[gw7] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_run_pipeline_with_ddl: RuntimeError: Pipeline execution failed due to errors in steps: inventory, usage (1.778s)
... (skipped 3896 bytes)
ROR [databricks.REDSHIFT_DATABASE.lakebridge.assessments.pipeline] Pipeline execution failed due to errors in steps: inventory, usage
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_synapse_query_execution: sqlalchemy.exc.InterfaceError: (pyodbc.InterfaceError) ('28000', "[28000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Login failed for user 'REDSHIFT_DATABASE-CLOUD_ENV-TEST_CATALOG-admin'. (18456) (SQLDriverConnect)") (851ms)
... (skipped 243 bytes)
 this error at: https://sqlalche.me/e/20/rvf5)
[gw5] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_connection_test: sqlalchemy.exc.InterfaceError: (pyodbc.InterfaceError) ('28000', "[28000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Login failed for user 'REDSHIFT_DATABASE-CLOUD_ENV-TEST_CATALOG-admin'. (18456) (SQLDriverConnect)") (908ms)
... (skipped 243 bytes)
 this error at: https://sqlalche.me/e/20/rvf5)
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_mssql_connector_execute_query: sqlalchemy.exc.InterfaceError: (pyodbc.InterfaceError) ('28000', "[28000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Login failed for user 'REDSHIFT_DATABASE-CLOUD_ENV-TEST_CATALOG-admin'. (18456) (SQLDriverConnect)") (808ms)
... (skipped 243 bytes)
 this error at: https://sqlalche.me/e/20/rvf5)
[gw8] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_profiler_connection_synapse_success: assert '\u2713 Dedicated SQL pool connection successful' in '10:07 DEBUG [databricks.REDSHIFT_DATABASE.lakebridge.contexts.application] Added User-Agent extra cmd=test-profiler-connection\n10:07 DEBUG [databricks.REDSHIFT_DATABASE.lakebridge.contexts.application] Added User-Agent extra profiler_source_tech=synapse\n10:07 DEBUG [databricks.REDSHIFT_DATABASE.lakebridge] User: User(active=True, display_name=\'REDSHIFT_DATABASE-tool-identity\', emails=[ComplexValue(display=None, primary=True, ref=None, type=\'work\', value=\'3fe685a1-96cc-4fec-8cdb-6944f5c9787e\')], entitlements=[], external_id=\'92f40178-1ee9-4156-a6da-28a376a12109\', groups=[ComplexValue(display=\'users\', primary=None, ref=\'Groups/153383108335587\', type=\'direct\', value=\'153383108335587\'), ComplexValue(display=\'admins\', primary=None, ref=\'Groups/149832780896743\', type=\'indirect\', value=\'149832780896743\'), ComplexValue(display=\'REDSHIFT_DATABASE.scope.tool\', primary=None, ref=\'Groups/531996560706268\', type=\'direct\', value=\'531996560706268\'), ComplexValue(display=\'REDSHIFT_DATABASE.scope.admin\', primary=None, ref=\'Groups/847659649002239\', type=\'direct\', value=\'847659649002239\'), ComplexValue(display=\'REDSHIFT_DATABASE.scope.account-admin\', primary=None, ref=\'Groups/688239313962730\', type=\'direct\', value=\'688239313962730\')], id=\'1425339244351829\', name=Name(family_name=None, given_name=\'REDSHIFT_DATABASE-tool-identity\'), roles=[], schemas=[, ], user_name=\'3fe685a1-96cc-4fec-8cdb-6944f5c9787e\')\n10:07 INFO [databricks.REDSHIFT_DATABASE.lakebridge] Testing connection for source technology: synapse\n10:07 INFO [databricks.REDSHIFT_DATABASE.lakebridge.connections.synapse_connection_helpers] Testing connection to dedicated SQL pool...\n10:07 ERROR [databricks.REDSHIFT_DATABASE.lakebridge.connections.synapse_connection_helpers] \u2717 Failed to connect to dedicated SQL pool: (pyodbc.InterfaceError) (\'28000\', "[28000] 
[Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Login failed for user \'REDSHIFT_DATABASE-CLOUD_ENV-TEST_CATALOG-admin\'. (18456) (SQLDriverConnect)")\n(Background on this error at: https://sqlalche.me/e/20/rvf5)\n10:07 ERROR [databricks.REDSHIFT_DATABASE.lakebridge] Failed to connect to the source system: Connection failed for SQL pools - dedicated: Failed to connect to dedicated SQL pool: (pyodbc.InterfaceError) (\'28000\', "[28000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Login failed for user \'REDSHIFT_DATABASE-CLOUD_ENV-TEST_CATALOG-admin\'. (18456) (SQLDriverConnect)")\n(Background on this error at: https://sqlalche.me/e/20/rvf5)\n10:07 CRITICAL [databricks.REDSHIFT_DATABASE.lakebridge] Connection validation failed. Exiting...\n' (1.984s)
... (skipped 11503 bytes)
 at: https://sqlalche.me/e/20/rvf5)
10:07 CRITICAL [databricks.REDSHIFT_DATABASE.lakebridge] Connection validation failed. Exiting...
[gw9] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_synapse_connector_execute_query: sqlalchemy.exc.InterfaceError: (pyodbc.InterfaceError) ('28000', "[28000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Login failed for user 'REDSHIFT_DATABASE-CLOUD_ENV-TEST_CATALOG-admin'. (18456) (SQLDriverConnect)") (860ms)
... (skipped 243 bytes)
 this error at: https://sqlalche.me/e/20/rvf5)
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_synapse_connection_check: sqlalchemy.exc.InterfaceError: (pyodbc.InterfaceError) ('28000', "[28000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Login failed for user 'REDSHIFT_DATABASE-CLOUD_ENV-TEST_CATALOG-admin'. (18456) (SQLDriverConnect)") (794ms)
... (skipped 243 bytes)
 this error at: https://sqlalche.me/e/20/rvf5)
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_installs_and_runs_maven_morpheus: assert None is not None (46ms)
... (skipped 10703 bytes)
ks.REDSHIFT_DATABASE.lakebridge.transpiler.installers] Failed to install transpiler: Databricks databricks-morph-plugin transpiler
10:08 DEBUG [tests.integration.install.test_install_and_run] Gathering transpiler logs...
❌ test_synapse_with_credential_format: sqlalchemy.exc.InterfaceError: (pyodbc.InterfaceError) ('28000', "[28000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]Login failed for user 'REDSHIFT_DATABASE-CLOUD_ENV-TEST_CATALOG-admin'. (18456) (SQLDriverConnect)") (869ms)
... (skipped 243 bytes)
 this error at: https://sqlalche.me/e/20/rvf5)
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_installs_and_runs_pypi_bladebridge: assert None is not None (22ms)
... (skipped 10257 bytes)
ks-bb-plugin
10:08 ERROR [databricks.REDSHIFT_DATABASE.lakebridge.transpiler.installers] Failed to install transpiler: bladebridge
10:08 DEBUG [tests.integration.install.test_install_and_run] Gathering transpiler logs...
❌ test_run_pipeline: RuntimeError: Pipeline execution failed due to errors in steps: inventory, usage (30.908s)
... (skipped 7888 bytes)
ROR [databricks.REDSHIFT_DATABASE.lakebridge.assessments.pipeline] Pipeline execution failed due to errors in steps: inventory, usage
[gw1] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_snowflake_read_schema_happy: databricks.REDSHIFT_DATABASE.lakebridge.reconcile.exception.DataSourceRuntimeException: Runtime exception occurred while fetching schema using select column_name, case when numeric_precision is not null and numeric_scale is not null then concat(data_type, '(', numeric_precision, ',' , numeric_scale, ')') when lower(data_type) = 'text' then concat('varchar', '(', CHARACTER_MAXIMUM_LENGTH, ')') else data_type end as data_type from remorph.INFORMATION_SCHEMA.COLUMNS where lower(table_name)='diamonds' and table_schema = 'SANDBOX' order by ordinal_position : [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m22.279s)
... (skipped 3183 bytes)
iamonds' and table_schema = 'SANDBOX' order by ordinal_position : [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw2] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_for_report_type_schema: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m22.067s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw6] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_mock_data_source_no_catalog: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m21.809s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw8] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_reconcile_data_without_mismatches_and_missing: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m22.023s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw9] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_schema_recon_with_data_source_exception: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m22.034s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_mock_data_source_happy: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m22.43s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw7] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_data_recon_with_source_exception: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m22.294s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw5] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_reconcile_data_with_threshold_and_row_report_type: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m22.01s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw0] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_reconcile_data_with_mismatches_and_missing: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.346s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw2] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_compare_data_for_report_all: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.334s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw6] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_sql_server_read_schema_happy: databricks.REDSHIFT_DATABASE.lakebridge.reconcile.exception.DataSourceRuntimeException: Runtime exception occurred while fetching schema using SELECT COLUMN_NAME AS 'column_name', CASE WHEN DATA_TYPE IN ('int', 'bigint') THEN DATA_TYPE WHEN DATA_TYPE IN ('smallint', 'tinyint') THEN 'smallint' WHEN DATA_TYPE IN ('decimal' ,'numeric') THEN 'decimal(' + CAST(NUMERIC_PRECISION AS VARCHAR) + ',' + CAST(NUMERIC_SCALE AS VARCHAR) + ')' WHEN DATA_TYPE IN ('float', 'real') THEN 'double' WHEN CHARACTER_MAXIMUM_LENGTH IS NOT NULL AND DATA_TYPE IN ('varchar','char','text','nchar','nvarchar','ntext') THEN DATA_TYPE WHEN DATA_TYPE IN ('date','time','datetime', 'datetime2','smalldatetime','datetimeoffset') THEN DATA_TYPE WHEN DATA_TYPE IN ('bit') THEN 'boolean' WHEN DATA_TYPE IN ('binary','varbinary') THEN 'binary' ELSE DATA_TYPE END AS 'data_type' FROM INFORMATION_SCHEMA.COLUMNS WHERE LOWER(TABLE_NAME) = LOWER('reconcile_in') AND LOWER(TABLE_SCHEMA) = LOWER('dbo') AND LOWER(TABLE_CATALOG) = LOWER('REDSHIFT_DATABASE_CLOUD_ENV_TEST_CATALOG_remorph') : [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.565s)
... (skipped 5628 bytes)
G) = LOWER('REDSHIFT_DATABASE_CLOUD_ENV_TEST_CATALOG_remorph')  : [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw8] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_reconcile_data_with_mismatch_and_no_missing: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.409s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw9] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_build_query_for_snowflake_src_for_non_integer_primary_keys: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m22.46s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw1] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_capture_mismatch_data_and_cols: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.438s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw7] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_schema_recon_with_general_exception: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.454s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_build_query_for_oracle_src: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (1m22.116s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw3] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_for_wrong_report_type: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.365s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw5] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_build_query_for_snowflake_src: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.52s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw0] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_reconcile_aggregate_data_mismatch_and_missing_records: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.294s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw2] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_compare_data_for_report_hash: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.359s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw6] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_reconcile_data_missing_and_no_mismatch: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.42s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw9] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_aggregates_reconcile_store_aggregate_metrics: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.384s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw1] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_capture_mismatch_data_and_cols_no_mismatch: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.46s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw7] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_data_recon_with_general_exception: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.376s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_build_query_for_databricks_src: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.39s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw3] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_capture_mismatch_data_and_cols_fail: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.525s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw5] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_capture_mismatch_data_and_cols_special_column_names: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.345s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw0] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_capture_start_snowflake_all: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.359s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw6] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_databricks_read_schema_happy: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (54.647s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw8] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_generate_final_reconcile_output_row: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.389s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw2] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_for_report_type_is_data: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.335s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw9] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_reconcile_aggregate_data_missing_records: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.39s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw1] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_test_recon_capture_start_databricks_row: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.199s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw7] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_capture_start_oracle_with_exception: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.46s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_build_query_for_snowflake_without_transformations: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.487s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw3] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_compare_data_special_column_names: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.321s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw5] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_apply_threshold_for_mismatch_with_missing: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.343s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw0] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_test_recon_capture_start_databricks_data: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.36s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw6] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_generate_final_reconcile_output_data: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.338s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw2] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_generate_final_reconcile_output_schema: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.333s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw9] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_generate_final_reconcile_output_all: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.473s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw1] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_capture_start_oracle_schema: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.378s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw7] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_capture_start_with_exception: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.565s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_generate_final_reconcile_output_exception: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.326s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw3] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_apply_threshold_for_mismatch_with_true_absolute: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.38s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw5] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_apply_threshold_for_mismatch_with_schema_fail: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.434s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw0] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_apply_threshold_for_mismatch_with_wrong_absolute_bound: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.391s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw6] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_apply_threshold_for_mismatch_with_true_percentage_bound: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.395s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw2] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_apply_threshold_for_mismatch_with_invalid_bounds: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.433s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw9] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_apply_threshold_for_only_threshold_mismatch_with_true_absolute: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.355s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw1] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_random_sampler_count: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.335s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw7] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_stratified_sampler_count: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.303s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpiles_informatica_to_sparksql_non_interactive[False]: assert (None is not None) (19ms)
... (skipped 10000 bytes)
ricks-bb-plugin
10:11 ERROR [databricks.REDSHIFT_DATABASE.lakebridge.transpiler.installers] Failed to install transpiler: bladebridge
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_teradata_sql: assert (None is not None) (1ms)
assert (None is not None)
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_teradata_sql_non_interactive[True]: assert (None is not None) (1ms)
assert (None is not None)
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_teradata_sql_non_interactive[False]: assert (None is not None) (1ms)
assert (None is not None)
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_gets_maven_artifact_version: assert None is not None (26ms)
... (skipped 9757 bytes)
a.xml (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1017)')))
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_downloads_from_maven: assert False (13ms)
... (skipped 9840 bytes)
0.pom (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1017)')))
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_gets_pypi_artifact_version: assert None is not None (12ms)
... (skipped 9681 bytes)
/json (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1017)')))
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpiles_all_dbt_project_files: ValueError: No such transpiler: Morpheus (379ms)
... (skipped 11890 bytes)
h file or directory: '/tmp/pytest-of-runner/pytest-0/popen-gw4/test_transpiles_all_dbt_projec0/REDSHIFT_DATABASE/remorph-transpilers'
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_sql_file: ValueError: No such transpiler: Morpheus (376ms)
... (skipped 11876 bytes)
 No such file or directory: '/tmp/pytest-of-runner/pytest-0/popen-gw4/test_transpile_sql_file0/REDSHIFT_DATABASE/remorph-transpilers'
[gw4] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_stratified_sampler_negative_count: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.287s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw3] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_redshift_schema_compare: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.463s)
... (skipped 14807 bytes)
bricks query: CREATE TABLE dummy ("col escaped2" INTEGER)
        Source equality check: True
        Databricks equality check: True
[gw1] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpiles_informatica_to_sparksql: assert (None is not None) (20ms)
... (skipped 10000 bytes)
ricks-bb-plugin
10:12 ERROR [databricks.REDSHIFT_DATABASE.lakebridge.transpiler.installers] Failed to install transpiler: bladebridge
[gw1] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_random_sampler_negative_count: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.38s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw7] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpiles_informatica_to_sparksql_non_interactive[True]: assert (None is not None) (14ms)
... (skipped 10000 bytes)
ricks-bb-plugin
10:12 ERROR [databricks.REDSHIFT_DATABASE.lakebridge.transpiler.installers] Failed to install transpiler: bladebridge
[gw7] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_snowflake_schema_compare: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.573s)
... (skipped 270233 bytes)
cks query: CREATE TABLE dummy ("col `$ escaped6" DOUBLE)
        Source equality check: True
        Databricks equality check: False
[gw5] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_apply_threshold_for_mismatch_with_wrong_percentage_bound: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.449s)
pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded.
[gw8] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_redshift_job_succeeds: databricks.sdk.errors.sdk.OperationFailed: failed to reach TERMINATED or SKIPPED, got RunLifeCycleState.INTERNAL_ERROR: Task run_reconciliation failed with message: Library installation failed for library due to user error. Error messages: (7m34.975s)
... (skipped 41154 bytes)
SHIFT_DATABASE.pytester.fixtures.baseline] removing cluster fixture: <databricks.sdk.service._internal.Wait object at 0x7f5d1b2a81f0>
[gw9] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_schema_compare: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.416s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw9] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_snowflake_job_succeeds: databricks.sdk.errors.sdk.OperationFailed: failed to reach TERMINATED or SKIPPED, got RunLifeCycleState.INTERNAL_ERROR: Task run_reconciliation failed with message: Workload failed, see run output for details. (8m20.013s)
... (skipped 158868 bytes)
SHIFT_DATABASE.pytester.fixtures.baseline] removing cluster fixture: <databricks.sdk.service._internal.Wait object at 0x7f8eecf92650>
[gw2] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_tsql_schema_compare: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.423s)
... (skipped 25687 bytes)
ricks query: CREATE TABLE dummy ([col_timestamp] BINARY)
        Source equality check: True
        Databricks equality check: False
[gw2] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_recon_sql_server_job_succeeds: databricks.sdk.errors.sdk.OperationFailed: failed to reach TERMINATED or SKIPPED, got RunLifeCycleState.INTERNAL_ERROR: Task run_reconciliation failed with message: Workload failed, see run output for details. (10m1.068s)
... (skipped 166961 bytes)
SHIFT_DATABASE.pytester.fixtures.baseline] removing cluster fixture: <databricks.sdk.service._internal.Wait object at 0x7f37188f8130>
[gw6] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_oracle_schema_compare: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.475s)
... (skipped 29241 bytes)
 CREATE TABLE dummy ("col `$ escaped6" DOUBLE PRECISION)
        Source equality check: True
        Databricks equality check: False
[gw6] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_databricks_schema_compare: pyspark.errors.exceptions.base.RetriesExceeded: [RETRIES_EXCEEDED] The maximum number of retries has been exceeded. (27.47s)
... (skipped 69 bytes)
e maximum number of retries has been exceeded.
[gw0] linux -- Python 3.10.20 /home/runner/work/lakebridge/lakebridge/.venv/bin/python

Running from acceptance #4136

@bishwajit-db bishwajit-db force-pushed the feature/redshift-recon branch from fb2d451 to e952d15 Compare March 25, 2026 12:59
@bishwajit-db bishwajit-db force-pushed the feature/redshift-recon branch from e952d15 to b095af4 Compare March 25, 2026 13:01
Contributor

@m-abulazm m-abulazm left a comment

Looks good. We need to run the integration tests to be sure.

@bishwajit-db bishwajit-db changed the title WIP: Add Redshift connector to Recon Add Redshift connector to Recon Mar 25, 2026
@BesikiML BesikiML self-requested a review April 8, 2026 13:03
@BesikiML
Contributor

BesikiML commented Apr 8, 2026

Summary

Thanks for adding Redshift support to Recon.

Scope

The change set is large relative to the PR title (“Add Redshift connector to Recon”): it includes many unrelated areas (e.g. profiler/Synapse, workflows, broader config/telemetry). For reviewability and release clarity, consider updating the title/description to reflect the full scope.

Verify before merge

  1. Redshift JDBC driver class mismatch

    • jdbc_reader.py registers com.amazon.redshift.Driver.
    • tests/unit/reconcile/connectors/test_redshift.py expects com.amazon.redshift.jdbc42.Driver.
      Please align implementation, tests, and documentation with the actual driver JAR we expect on the cluster (commonly the JDBC 4.2 class for the current AWS Redshift JDBC driver).
  2. execute.py argument contract
    Reconcile entrypoint now branches on [operation_name] vs [operation_name, install_folder] for Installation. Please confirm this matches how the deployed recon job passes arguments, and consider a short comment or test so future job definition changes do not break installs.
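On point 1, a cheap way to prevent this kind of drift is to define the driver class once and assert on it from the tests. A minimal sketch (the names `REDSHIFT_DRIVER` and `jdbc_options` are hypothetical, not the actual lakebridge layout):

```python
# Illustrative sketch only: pin the expected Redshift JDBC driver class in
# one place so the reader registration and the unit tests cannot drift apart.
# REDSHIFT_DRIVER and jdbc_options are hypothetical names, not the actual
# lakebridge API.

REDSHIFT_DRIVER = "com.amazon.redshift.jdbc42.Driver"  # JDBC 4.2 class


def jdbc_options(url: str, user: str, password: str) -> dict:
    """Build the option map handed to spark.read.format('jdbc').options(...)."""
    return {
        "url": url,
        "driver": REDSHIFT_DRIVER,
        "user": user,
        "password": password,
    }
```

With this shape, the unit test and the reader registration can only disagree if the shared constant itself changes.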

Suggestions (non-blocking)

  • Connectivity: Many Redshift deployments require SSL or extra JDBC properties; optional support or docs for common parameters would help operators.
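To make that suggestion concrete, here is one possible shape for passing optional SSL / extra JDBC properties. This is purely illustrative, since the current framework does not expose such a hook; the helper name and example properties follow common Redshift JDBC usage, not any existing lakebridge config:

```python
from typing import Optional

# Illustrative only: layering optional SSL / extra JDBC properties on top of
# the base connection options. build_reader_options is a hypothetical helper;
# ssl / sslmode are common Redshift JDBC property names.

def build_reader_options(
    url: str,
    user: str,
    password: str,
    extra_options: Optional[dict] = None,
) -> dict:
    base = {
        "url": url,
        "driver": "com.amazon.redshift.jdbc42.Driver",
        "user": user,
        "password": password,
    }
    # User-supplied properties (e.g. ssl=true, sslmode=verify-full) are
    # merged over the defaults.
    merged = {**base, **(extra_options or {})}
    # Guard against an override blanking out a required key.
    for required in ("url", "driver", "user", "password"):
        if not merged.get(required):
            raise ValueError(f"missing required JDBC option: {required}")
    return merged
```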

@bishwajit-db
Contributor Author

bishwajit-db commented Apr 8, 2026

Scope: The PR doesn't seem to affect any other part of the code apart from recon. Please elaborate.

  1. Fixed.
  2. The PR doesn't change the file execute.py. Could you please elaborate on what needs to be handled in this PR?
  3. The current framework doesn't support adding custom JDBC properties yet. That will require framework changes and should be tracked as a separate issue.

@BesikiML

@BesikiML
Contributor

BesikiML commented Apr 17, 2026

bishwajit-db

Thanks for the follow-up — a few clarifications on my earlier review.

Scope

Looking at the current branch again, the changes are reconcile-scoped (Redshift connector, adapter wiring, hash/query bits, constants, docs, tests). My comment about the PR being much broader than the title does not apply to this revision; I may have been thinking of an older diff or I mixed it up. No action needed from you on that unless you still have unrelated commits that are not on this PR.

execute.py

You are right: execute.py isn’t touched in this PR. The argv / Installation note should not have been part of my review for #2339 — that was my mistake. If we still want to harden the job entrypoint, I will treat that as a separate PR against whatever actually invokes recon.

Redshift JDBC driver

Thanks for fixing the mismatch between implementation and tests.

SSL / custom JDBC properties

Fair to track as a follow-up issue if the framework does not support extra options yet.

Thanks again for the work on Redshift recon.


Labels

enhancement New feature or request

4 participants