Snowflake Architect

NuStar Technologies · Dallas, TX · $120,000 - $130,000
Full Time · Mid-Level

Posted 3 weeks ago


About This Role

Design, build, and optimize scalable cloud-based data platforms using Snowflake, dbt, Snowpark, and various cloud data services. This role involves leading architecture decisions, ensuring best practices, and enabling analytics and data science teams with reliable data solutions.

Responsibilities

  • Design and implement end-to-end Snowflake-based data architectures for analytics, reporting, and advanced data use cases
  • Define data modeling strategies (dimensional, data vault, and analytical models) optimized for Snowflake
  • Establish standards for data ingestion, transformation, storage, and consumption
  • Architect and manage Snowflake features including Warehouses, Databases, Schemas, Cloning, Time Travel, Secure Data Sharing, Data Clean Rooms, and Resource Monitors
  • Optimize performance and cost using warehouse sizing, clustering, caching, and query optimization
  • Implement security best practices including RBAC, masking policies, row access policies, and data governance
  • Lead ELT pipeline development using dbt (models, macros, tests, documentation, and deployments)
  • Design and implement ETL/ELT pipelines using cloud-native Snowpark and third-party tools
  • Architect solutions leveraging cloud data services (AWS, Azure, or GCP) such as object storage, messaging, and orchestration services
  • Develop data processing and automation solutions using Python
  • Partner with business stakeholders, analytics, and data science teams to translate requirements into scalable solutions
  • Mentor data engineers and analysts on Snowflake, dbt, Snowpark, and data engineering best practices

Requirements

  • Strong hands-on experience with Snowflake architecture and performance tuning
  • Expertise in dbt (models, testing, macros, documentation, environments)
  • Solid experience with ETL/ELT frameworks and data integration patterns
  • Proficiency in Python for data engineering and automation
  • Experience with Snowpark implementations
  • Strong knowledge of cloud data services (AWS, Azure, or GCP)
  • Advanced SQL and data modeling skills

Skills

Python*, SQL*, AWS*, Azure*, CI/CD*, Snowflake*, Databricks*, Apache Spark*, dbt*, ETL/ELT*, GCP*, Snowpark*

* Required skills

About NuStar Technologies

Technology