Technical Data Analyst
Diligente Technologies
San Francisco Bay Area, CA
Contract
Mid Level
8+ years
Posted 1 month ago
This job has expired
About This Role
The Technical Data Analyst will be responsible for analyzing large datasets, building and maintaining dashboards and reports, and partnering with data engineers to optimize ETL/ELT pipelines for an enterprise-scale data platform.
Responsibilities
- Write complex, high-performance SQL queries to analyze large, structured and semi-structured datasets
- Write SQL to process raw data, support Kafka ingestion and ADF pipelines, and perform data validation and QA
- Use Python for data analysis, automation, validation, and lightweight data engineering tasks
- Build, enhance, and maintain dashboards and reports using Tableau and ThoughtSpot
- Partner with data engineers to design, validate, and optimize ETL/ELT pipelines
- Work extensively with Databricks (Spark, notebooks, Delta tables) for data exploration and analytics
- Perform data quality checks, reconciliations, and root-cause analysis to ensure data accuracy and consistency
- Translate business requirements into technical data solutions and semantic layers
- Support self-service analytics by documenting datasets, metrics, and business definitions
- Collaborate across teams to troubleshoot data issues and improve reporting performance
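The data quality checks and reconciliations described above can be sketched as a minimal, dependency-free Python example. In practice this work would likely run against pandas DataFrames or Delta tables; the `validate_extract` helper and the sample key lists here are illustrative assumptions, not part of the role.

```python
from collections import Counter


def validate_extract(source_keys, target_keys):
    """Basic reconciliation between a source and target key column:
    row counts, keys dropped in flight, and duplicate keys introduced.
    (Hypothetical helper for illustration only.)"""
    dupes = [k for k, n in Counter(target_keys).items() if n > 1]
    return {
        "source_rows": len(source_keys),
        "target_rows": len(target_keys),
        # Keys present in the source but missing after the pipeline ran
        "missing_in_target": sorted(set(source_keys) - set(target_keys)),
        # Keys that appear more than once in the target
        "duplicate_keys": sorted(dupes),
    }


# Example: key 3 was dropped and key 2 was duplicated in flight.
report = validate_extract([1, 2, 3, 4], [1, 2, 2, 4])
```

A report like this feeds directly into root-cause analysis: a nonempty `missing_in_target` points at a filter or join dropping rows, while `duplicate_keys` usually indicates a fan-out join upstream.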
Requirements
- 8-10 years of software development and deployment experience
- 5+ years of hands-on experience with SQL, Databricks, ADF, DataStage (or another ETL tool), SSAS cubes, Cognos, Tableau, ThoughtSpot, and other BI tools
- Strong proficiency in SQL, including complex joins, window functions, CTEs, and performance optimization
- Strong Python skills for data analysis and scripting (e.g., pandas, NumPy)
- Hands-on experience with Databricks and distributed data processing concepts
- Hands-on experience working with ETL tools and data pipelines (batch and/or streaming)
- Proficiency in reporting and visualization tools such as Tableau, ThoughtSpot, Cognos, SSAS Cubes
- Solid understanding of data warehousing concepts, data modeling, and analytics best practices
- Ability to analyze large datasets and communicate insights clearly to both technical and non-technical audiences
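The SQL proficiency asked for above (CTEs, window functions, performance-minded queries) can be sketched with a small runnable example. SQLite stands in here for the Databricks/warehouse SQL dialect, and the `orders` table and its columns are illustrative assumptions, not from the posting.

```python
import sqlite3

# In-memory database standing in for a warehouse table (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', '2024-01-05', 120.0),
  ('alice', '2024-02-11',  80.0),
  ('bob',   '2024-01-20', 200.0);
""")

# A CTE plus a window function: running spend per customer, ordered by date.
rows = conn.execute("""
WITH ranked AS (
    SELECT customer,
           order_date,
           amount,
           SUM(amount) OVER (
               PARTITION BY customer
               ORDER BY order_date
           ) AS running_total
    FROM orders
)
SELECT customer, order_date, running_total
FROM ranked
ORDER BY customer, order_date;
""").fetchall()
```

The same pattern (a CTE feeding a partitioned window aggregate) translates directly to Spark SQL on Databricks, where `PARTITION BY` also determines how the computation is distributed.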
Nice to Have
- Experience with cloud data platforms (AWS, Azure, or GCP)
- Familiarity with version control tools (Git) and CI/CD concepts for analytics workflows
- Exposure to data governance, metric standardization, and semantic layers
- Prior experience in enterprise-scale data platforms or COE environments
Skills
- Python
- SQL
- AWS
- Azure
- Tableau
- Git
- CI/CD
- APIs
- Databricks
- Kafka
- GCP
- pandas
- NumPy
- ADF
- Spark
- Cognos
- DataStage
- SSAS cubes
- ThoughtSpot
- Delta tables

All of the skills listed above are required.