Senior Data Engineer
Contract
Senior Level
Posted 4 weeks ago
About This Role
Join our dynamic data and analytics team as a Senior Data Engineer, responsible for building and maintaining scalable data pipelines that support ingestion from multiple sources. You will ensure data integrity and availability across various systems, working closely with data scientists, analysts, and engineering teams to enable real-time and batch data processing.
Responsibilities
- Design, develop, and maintain robust data pipelines to ingest, transform, and deliver data across internal and external platforms.
- Write and optimize complex SQL queries for data extraction, transformation, and loading (ETL/ELT).
- Implement data ingestion frameworks using batch and streaming technologies.
- Develop data integration workflows and scripts using Python, Shell scripting, or other scripting languages.
- Ensure high performance, reliability, and data quality across all stages of the pipeline.
- Collaborate with cross-functional teams (data science, analytics, product) to understand data needs and deliver scalable solutions.
- Monitor data jobs, identify bottlenecks, and troubleshoot issues in real time.
- Handle large and intricate datasets, perform data profiling, and ensure conformance to data quality standards.
- Apply problem-solving skills to identify root causes of data issues and suggest long-term fixes or enhancements.
- Work in Agile/Scrum environments, participating in planning, reviews, and delivery cycles.
Requirements
- Strong hands-on experience with SQL (writing complex joins, window functions, CTEs, aggregations, etc.).
- Proven experience with data ingestion, integration, and pipeline design across multiple data sources.
- Proficiency in Python, Shell, or other scripting languages for automation and orchestration tasks.
- Familiarity with data processing tools and frameworks such as Apache Airflow, Spark, Kafka, or similar.
- Experience working with relational databases (PostgreSQL, Oracle, MySQL).
- Ability to work with complex and messy data: cleansing, validating, and transforming to ensure consistency.
- Strong analytical and problem-solving skills with attention to detail and data accuracy.
- Exposure to CI/CD practices and version control (Git).
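As a rough illustration of the SQL skills listed above (CTEs and window functions, orchestrated from Python), here is a minimal, self-contained sketch. The table and column names are invented for the example and do not come from this posting:

```python
import sqlite3

# Hypothetical example data: a tiny "raw events" table to query against.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id INTEGER, amount REAL, ts TEXT);
    INSERT INTO raw_events VALUES
        (1, 10.0, '2024-01-01'),
        (1, 25.0, '2024-01-02'),
        (2,  5.0, '2024-01-01');
""")

# A CTE plus a window function: running total of spend per user over time.
query = """
    WITH cleaned AS (
        SELECT user_id, amount, ts
        FROM raw_events
        WHERE amount IS NOT NULL        -- basic data-quality filter
    )
    SELECT user_id,
           ts,
           SUM(amount) OVER (
               PARTITION BY user_id ORDER BY ts
           ) AS running_total
    FROM cleaned
    ORDER BY user_id, ts;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [(1, '2024-01-01', 10.0), (1, '2024-01-02', 35.0), (2, '2024-01-01', 5.0)]
```

The same pattern scales up: in practice the query would run against PostgreSQL, Oracle, or BigQuery, with a scheduler such as Apache Airflow invoking the Python step.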
Nice to Have
- Experience with cloud data platforms (GCP BigQuery).
- Knowledge of data modeling principles and schema design.
- Experience in handling data from APIs, flat files, and event streams.
- Background in financial services, payments, or customer analytics.
- Familiarity with data governance practices, metadata management, and PII data handling.
- Understanding of data security, encryption, and masking techniques.
Skills
Python* · SQL* · Oracle* · Git* · CI/CD* · Apache Airflow* · Kafka* · PostgreSQL* · Shell Scripting* · MySQL* · Spark* · Agile/Scrum* · GCP BigQuery*
* Required skills