Snowflake Architect
Full Time
Mid Level
Posted 3 weeks ago
About This Role
Design, build, and optimize scalable cloud-based data platforms using Snowflake, DBT, Snowpark, and various cloud data services. This role involves leading architecture decisions, ensuring best practices, and enabling analytics and data science teams with reliable data solutions.
Responsibilities
- Design and implement end-to-end Snowflake-based data architectures for analytics, reporting, and advanced data use cases
- Define data modeling strategies (dimensional, data vault, and analytical models) optimized for Snowflake
- Establish standards for data ingestion, transformation, storage, and consumption
- Architect and manage Snowflake features including Warehouses, Databases, Schemas, Cloning, Time Travel, Secure Data Sharing, Data Clean Rooms, and Resource Monitors
- Optimize performance and cost using warehouse sizing, clustering, caching, and query optimization
- Implement security best practices including RBAC, masking policies, row access policies, and data governance
- Lead ELT pipeline development using DBT (models, macros, tests, documentation, and deployments)
- Design and implement ETL/ELT pipelines using cloud-native Snowpark and third-party tools
- Architect solutions leveraging cloud data services (AWS, Azure, or GCP) such as object storage, messaging, and orchestration services
- Develop data processing and automation solutions using Python
- Partner with business stakeholders, analytics, and data science teams to translate requirements into scalable solutions
- Mentor data engineers and analysts on Snowflake, DBT, Snowpark, and data engineering best practices
Requirements
- Strong hands-on experience with Snowflake architecture and performance tuning
- Expertise in DBT (models, testing, macros, documentation, environments)
- Solid experience with ETL/ELT frameworks and data integration patterns
- Proficiency in Python for data engineering and automation
- Hands-on experience implementing Snowpark solutions
- Strong knowledge of cloud data services (AWS, Azure, or GCP)
- Advanced SQL and data modeling skills
Skills
- Python *
- SQL *
- AWS *
- Azure *
- CI/CD *
- Snowflake *
- Databricks *
- Apache Spark *
- dbt *
- ETL/ELT *
- GCP *
- Snowpark *
* Required skills