Delivery Solutions Architect at Databricks
Accelerating customer data & AI initiatives from architecture to go-live. I build production-ready exemplars, migration tooling, and self-service templates so teams can ship data platforms faster.
Databricks Asset Bundle for migrating MLflow registered models between Unity Catalog catalogs on the same metastore. Preserves versions, artifacts, metrics, params, tags, aliases, and direct UC grants.
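The catalog-to-catalog move hinges on Unity Catalog's three-level naming. A minimal sketch of the name retargeting, assuming the usual `<catalog>.<schema>.<model>` form (`retarget_uc_name` is an illustrative helper name, not part of the bundle):

```python
def retarget_uc_name(full_name: str, dst_catalog: str) -> str:
    """Swap the catalog component of a Unity Catalog three-level name.

    Migrating a registered model between catalogs on the same metastore
    keeps the schema and model components and changes only the catalog.
    """
    catalog, schema, model = full_name.split(".")
    return f"{dst_catalog}.{schema}.{model}"
```

The bundle itself then re-registers each version under the retargeted name and copies artifacts, run metadata, and grants along with it.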
Programmatic migration and management of Databricks AI/BI (Lakeview) dashboards across workspaces, with diff and validation utilities.
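A minimal sketch of the diff utility's core idea, assuming dashboards are compared via their serialized JSON definitions (the function and its arguments are illustrative, not the project's actual API):

```python
import difflib
import json


def dashboard_diff(source: dict, target: dict) -> list[str]:
    """Unified diff of two dashboard definitions.

    Serializing with sorted keys gives a stable text form, so the diff
    reflects real definition changes rather than key-ordering noise.
    """
    left = json.dumps(source, indent=2, sort_keys=True).splitlines(keepends=True)
    right = json.dumps(target, indent=2, sort_keys=True).splitlines(keepends=True)
    return list(difflib.unified_diff(left, right, fromfile="source", tofile="target"))
```

An empty result means the two workspace copies are definition-identical, which is also a cheap post-migration validation check.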
Hands-on Databricks workshop for manufacturing analytics (OEE — overall equipment effectiveness — quality, downtime, and safety) with AI/BI Genie, dashboards, and row-level security.
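The OEE metric at the center of the workshop is the standard product of three factors, each a fraction between 0 and 1:

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """Overall Equipment Effectiveness = Availability x Performance x Quality.

    Each factor is a fraction in [0, 1]; the product is the fraction of
    planned production time that was truly productive.
    """
    for factor in (availability, performance, quality):
        if not 0.0 <= factor <= 1.0:
            raise ValueError("each OEE factor must be a fraction in [0, 1]")
    return availability * performance * quality
```

For example, 90% availability, 95% performance, and 99% quality yields an OEE of about 84.6%.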
Customer-facing exemplar for high-performance vectorized AES and FF1 format-preserving encryption on Databricks. Runs on both serverless and classic compute and is packaged as a Databricks Asset Bundle (DAB).
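To illustrate only the *format-preserving* property (ciphertext keeps the plaintext's length and alphabet), here is a deliberately toy digit-substitution sketch. It is emphatically not FF1 and not cryptographically secure; the real exemplar implements the NIST FF1 construction:

```python
import random

_DIGITS = "0123456789"


def _digit_perm(key: int) -> list[str]:
    # Key-derived permutation of the digit alphabet (toy key schedule).
    perm = list(_DIGITS)
    random.Random(key).shuffle(perm)
    return perm


def toy_fpe_encrypt(digits: str, key: int) -> str:
    """Map each digit through a key-derived permutation: output has the
    same length and alphabet as the input. Toy only — NOT FF1, NOT secure."""
    perm = _digit_perm(key)
    return "".join(perm[int(d)] for d in digits)


def toy_fpe_decrypt(cipher: str, key: int) -> str:
    """Invert the permutation to recover the original digits."""
    inverse = {c: str(i) for i, c in enumerate(_digit_perm(key))}
    return "".join(inverse[c] for c in cipher)
```

The point of format preservation in practice: an encrypted card number or national ID still fits the column type and validation rules of the systems that store it.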
AI-powered logistics incident response application built on Databricks: a full-stack TypeScript app demonstrating real-time analytics and LLM-driven workflows.
Zero-downtime nightly replication pattern on Databricks using an atomic schema swap: a production-ready blueprint for safe, repeatable cutovers.
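One way to sketch the cutover bookkeeping is as a generated plan of table renames promoting freshly replicated staging tables into the live schema. This is illustrative only (the schema, staging, and table names are made up, and the project's actual atomic, schema-level swap mechanism is not reproduced here):

```python
def swap_plan(schema: str, staging: str, tables: list[str]) -> list[str]:
    """Generate a rename sequence that promotes staging tables into the
    live schema, keeping the old copies until the swap completes.

    A simplified, per-table illustration of the cutover order; a real
    pipeline would wrap this with validation and rollback handling.
    """
    statements = []
    for table in tables:
        statements.append(f"ALTER TABLE {schema}.{table} RENAME TO {schema}.{table}__old")
        statements.append(f"ALTER TABLE {staging}.{table} RENAME TO {schema}.{table}")
        statements.append(f"DROP TABLE {schema}.{table}__old")
    return statements
```

Because readers only ever see the live schema name, a correctly ordered swap means consumers never observe a half-loaded table.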
Data Engineering Templates -- Production-ready pipelines with Auto Loader, Spark Declarative Pipelines, and medallion architectures, packaged as Databricks Asset Bundles
MLOps & Governance Tooling -- Unity Catalog model migration, lineage tracking, and governance automation
Developer Experience -- Backstage templates and self-service tooling that let teams spin up data platform infrastructure in minutes
