Databricks Manager

Deloitte | McLean, VA | $130,800 - $241,000
Full Time | Manager Level | 6+ years experience | Visa Sponsorship

Posted 1 month ago (expired)

About This Role

This Manager role focuses on leading innovation in big data architecture and analytics, shaping best practices, advising senior stakeholders, and ensuring data solutions align with business objectives and drive measurable results for clients. You will oversee the end-to-end design, deployment, and optimization of enterprise-scale data engineering solutions using Databricks on major cloud platforms.

Responsibilities

  • Architect and Deliver Solutions: Lead the development, implementation, and scaling of advanced data engineering solutions using Databricks across AWS, Azure, or GCP environments
  • Champion Best Practices: Establish, document, and promote best-in-class approaches for data architecture, integration, and modeling
  • Pipeline Ownership: Oversee the design, development, and maintenance of robust data pipelines and data architectures that support large-scale, enterprise data needs
  • Drive Excellence: Initiate and manage efforts to improve data quality, operational efficiency, and process scalability
  • Technology Leadership: Evaluate, pilot, and integrate new big data and analytics technologies, ensuring the organization remains at the cutting edge
  • Strategic Data Governance: Consult on, design, and implement governance, security, and compliance strategies tailored to modern cloud data ecosystems
  • Team Leadership and Mentoring: Lead, coach, and develop teams of data engineers and architects, fostering technical growth and effective project delivery
  • Stakeholder Engagement: Communicate technical concepts and business value to diverse stakeholders
  • DevOps and Automation: Oversee the implementation of CI/CD practices for streamlined deployments and operations

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 6+ years of hands-on experience in data engineering with a strong focus on Databricks on AWS, Azure, or GCP
  • 6+ years of technical proficiency with cloud-native databases, storage solutions, and distributed compute platforms
  • Deep understanding of Lakehouse architecture, Apache Spark, Delta Lake, and related big data technologies
  • Advanced skills in data warehousing, 3NF, dimensional modeling, and enterprise-level data lakes
  • Experience with Databricks components including Delta Live Tables, Autoloader, Structured Streaming, Databricks Workflows, and orchestration tools (e.g., Apache Airflow)
  • Expertise in designing and supporting incremental data loads and building metadata-driven ingestion/data quality frameworks using PySpark
  • Hands-on experience with Databricks Unity Catalog and implementing fine-grained security and access control
  • Proven track record in deploying code and solutions via automated CI/CD pipelines
  • 4+ years leadership experience in managing complex, cross-functional data projects and technical teams
  • Experience with performance optimization of data engineering pipelines, code, and compute resources
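To make the "metadata-driven ingestion/data quality frameworks" requirement above concrete, here is a minimal sketch of the pattern in plain Python. In practice this would be built with PySpark DataFrames and Delta tables on Databricks; the names (`SourceConfig`, `ingest`) and the sample `orders` feed are hypothetical, used only to illustrate the idea of driving ingestion and validation from declarative metadata rather than hand-written per-source code.

```python
# Hypothetical sketch of a metadata-driven ingestion/data-quality framework.
# A real Databricks implementation would operate on PySpark DataFrames and
# write valid/quarantined rows to Delta tables; plain dicts stand in here
# so the example is self-contained and runnable anywhere.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class SourceConfig:
    """Metadata describing one source: its name, required columns, and quality checks."""
    name: str
    required_columns: list[str]
    checks: list[Callable[[dict], bool]] = field(default_factory=list)

def ingest(config: SourceConfig, rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split rows into (valid, quarantined) according to the source's metadata."""
    valid, quarantined = [], []
    for row in rows:
        has_columns = all(col in row for col in config.required_columns)
        passes_checks = all(check(row) for check in config.checks)
        (valid if has_columns and passes_checks else quarantined).append(row)
    return valid, quarantined

# Example: a hypothetical 'orders' feed whose metadata requires an order_id
# and a positive amount.
orders = SourceConfig(
    name="orders",
    required_columns=["order_id", "amount"],
    checks=[lambda r: r.get("amount", 0) > 0],
)
good, bad = ingest(orders, [
    {"order_id": 1, "amount": 9.99},
    {"order_id": 2, "amount": -5},   # fails the quality check
    {"amount": 3.50},                # missing order_id
])
```

The point of the pattern is that onboarding a new source means adding a new `SourceConfig` entry, not new pipeline code; the same idea scales to PySpark by expressing the checks as column expressions.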

Nice to Have

  • Master's degree
  • Comprehensive knowledge of the AWS, Azure, and GCP cloud ecosystems and associated big data stacks
  • Demonstrated skill in performance tuning and optimization within Databricks/Apache Spark environments
  • Commitment to staying current with the latest Databricks feature releases and platform enhancements
  • Exceptional communication and stakeholder management abilities, including comfort interfacing with executive leadership
  • Experience with Databricks Lakeflow
  • Experience in AI/ML

Skills

AWS, Azure, PowerShell, Jenkins, AWS CodePipeline, Azure DevOps, Databricks, Apache Spark, Apache Airflow, GCP, TFS, Delta Lake, PySpark, Delta Live Tables, Autoloader, Structured Streaming, Databricks Workflows, Databricks Unity Catalog

All listed skills are required.

About Deloitte

A company transforming technology platforms, driving innovation, and modernizing mission-critical operations for clients, especially in the Life Sciences sector.

Professional Services
View all jobs at Deloitte →