Data Architect
The Value Maximizer
Austin, TX
Full Time
Senior Level
6+ years
Posted 2 months ago Expired
About This Role
This role involves architecting and building scalable, real-time streaming data pipelines using AWS and Kafka. It focuses on data ingestion, validation, enrichment, and reconciliation for financial applications, ensuring data quality and meeting operational KPIs.
Responsibilities
- Provide technical solution discovery for new capabilities
- Assist Product Owners with technical user stories to maintain a healthy backlog
- Lead the development of real-time data pipelines using AWS DMS, MSK, Kafka, or Glue Streaming for CDC ingestion from SQL Server sources
- Build and optimize streaming and batch data pipelines using AWS Glue (PySpark) to validate, transform, and normalize data to Iceberg and DynamoDB
- Define and enforce data quality, lineage, and reconciliation logic for streaming and batch use cases
- Integrate with S3 Bronze/Silver layers and implement efficient schema evolution and partitioning strategies using Iceberg
- Collaborate with architects, analysts, and downstream teams to design API and file-based egress layers
- Implement monitoring, logging, and event-based alerting using CloudWatch, SNS, and EventBridge
- Mentor junior developers and enforce best practices for modular, secure, and scalable data pipeline development
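The reconciliation duties listed above can be sketched in a few lines. This is an illustrative example only, not part of the posting: the function name, table names, and tolerance parameter are hypothetical, and real pipelines would compare checksums and late-arriving records as well as counts.

```python
# Minimal reconciliation sketch: compare per-table record counts between a
# CDC source and a lake target, flagging tables whose mismatch exceeds a
# relative tolerance. All names and figures here are illustrative.

def reconcile_counts(source_counts, target_counts, tolerance=0.0):
    """Return {table: (source_count, target_count)} for out-of-tolerance tables."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)  # missing table counts as 0 rows
        allowed = src * tolerance          # tolerance=0.0 demands exact agreement
        if abs(src - tgt) > allowed:
            mismatches[table] = (src, tgt)
    return mismatches

source = {"orders": 1_000, "payments": 500}
target = {"orders": 1_000, "payments": 498}
print(reconcile_counts(source, target))                   # exact check flags payments
print(reconcile_counts(source, target, tolerance=0.01))   # 1% tolerance passes
```

In practice a job like this would run the same comparison against counts pulled from the SQL Server source and the Iceberg target, and publish any mismatches to an alerting channel such as SNS.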
Requirements
- 6+ years of hands-on data engineering experience in cloud-based environments (AWS preferred) with event-driven implementation
- Strong experience with Apache Kafka / AWS MSK including topic design, partitioning, and Kafka Connect/Debezium
- Proficiency in AWS Glue (PySpark) for both batch and streaming ETL
- Working knowledge of AWS DMS, S3, Lake Formation, DynamoDB, and Iceberg
- Solid grasp of schema evolution, CDC patterns, and data reconciliation frameworks
- Experience with infrastructure-as-code (CDK/Terraform) and DevOps practices (CI/CD, Git)
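The "topic design, partitioning" requirement above turns on one idea: Kafka's default partitioner routes each record by a hash of its key, so same-key records always land on the same partition and keep their order. A rough sketch of the principle (Kafka itself hashes the serialized key with murmur2; `zlib.crc32` below is a stand-in, and the partition count is hypothetical):

```python
# Sketch of key-based partition routing, in the spirit of Kafka's default
# partitioner. Kafka uses murmur2 on the serialized key; zlib.crc32 here is
# a stand-in to show the principle, not the real algorithm.
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    """Map a record key to a partition; same key -> same partition."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# All events for a given account stay on one partition, preserving per-key order.
assert partition_for("acct-1", 6) == partition_for("acct-1", 6)
```

This is why key choice is a design decision: keys that are too coarse create hot partitions, while keys that split related records break ordering guarantees.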
Skills
- AWS *
- DevOps *
- SQL Server *
- Git *
- CI/CD *
- Terraform *
- DynamoDB *
- Apache Kafka *
- S3 *
- AWS Glue *
- CloudWatch *
- SNS *
- EventBridge *
- PySpark *
- CDK *
- Flink *
- AWS DMS *
- Iceberg *
- AWS MSK *

* Required skills