Azure Databricks Data Architect
Remote
Blue Ocean Group
$150,000 - $180,000
Full Time
Senior Level
7+ years
Posted 3 weeks ago
About This Role
Architect and drive the evolution of a financial services organization's data platform from traditional ETL/warehouse patterns to a modern Azure-native ELT and lakehouse architecture. This highly hands-on role involves significant platform ownership, technical leadership, and direct implementation of data solutions using Azure and Databricks.
Responsibilities
- Design and implement data ingestion, storage, and transformation layers
- Define architectural standards and best practices for the data platform
- Lead technical decision-making on platform components and tools
- Migrate on-prem/legacy systems to Azure cloud-native architecture
- Build CI/CD pipelines for automated testing, deployment, and monitoring
- Implement data quality, governance, and security frameworks
- Serve as escalation point for production issues and complex technical challenges
- Translate business requirements into technical solutions and align cross-functional teams
- Balance innovation with pragmatism and cost-effectiveness
Requirements
- 5+ years of production experience in the Azure cloud ecosystem
- Azure Databricks - designing and optimizing lakehouse architectures
- Azure Data Factory (ADF) - building and orchestrating complex data pipelines
- Azure Data Lake Storage Gen2 - implementing scalable data lake solutions
- Azure Synapse Analytics, SQL MI, or equivalent data warehouse platforms
- Hands-on CI/CD & DevOps experience (Azure DevOps, Git workflows, Jenkins, GitHub Actions)
- Infrastructure as Code (Terraform, ARM templates, or Bicep)
- 7-10+ years designing ETL/ELT pipelines at enterprise scale
- Production experience migrating legacy ETL systems to modern ELT patterns
- Advanced SQL (T-SQL, optimization, performance tuning)
- Python for data engineering (Pandas, PySpark, or similar)
- Multi-terabyte data warehouse design and management
- Data modeling (dimensional, Data Vault, or lakehouse patterns)
- Data governance frameworks and metadata management
- Security and compliance in regulated industries
- Disaster recovery and high availability design
Nice to Have
- Power BI for analytics enablement
- Delta Lake or similar transactional data lake formats
- Snowflake, Fabric, or competitive modern platforms experience
- Financial services, insurance, or regulated industry background
- Experience with legacy system modernization
- Team leadership or mentoring experience
Skills
Power BI*, Python*, SQL*, Azure*, Fabric*, Jenkins*, Azure DevOps*, Git*, Terraform*, Snowflake*, Databricks*, Pandas*, GitHub Actions*, Delta Lake*, PySpark*, ARM Templates*, Azure Data Factory (ADF)*, Bicep*, Azure Synapse Analytics*, Azure Data Lake Storage Gen2*, SQL MI*

* Required skills
Benefits
Dental Insurance
Profit sharing
Employee-owned equity shares
Vision Insurance
Professional development budget
401(k) Match
Medical Insurance
FSA/HSA
About Blue Ocean Group
Blue Ocean Group is partnering with a stable, employee-owned mid-market financial services organization undergoing significant Azure-based data platform modernization.
Professional Services