About This Role
The architect in this role will design modern, AI-ready data architectures for clients, focusing on data modeling, semantic layer design, feature engineering, and AI enablement. The goal is to build systems that make data reliable and production-ready for business intelligence, machine learning, and artificial intelligence initiatives.
Responsibilities
- Design and implement end-to-end data architectures in Snowflake, from ingestion through staging, fact/dimension modeling, and semantic layer design
- Define data models that balance flexibility for analysts with performance and scalability for production
- Partner with engineering teams to integrate data from source applications and operational systems
- Establish versioned modeling standards and documentation to ensure consistency across domains
- Build or refine semantic layers that unify metric definitions across BI tools like Tableau, Power BI, or Looker
- Collaborate with business owners to define KPIs, approve new metrics, and monitor adoption
- Architect feature pipelines and data contracts that support point-in-time correctness for machine learning models
- Collaborate with data scientists and AI engineers to implement reusable feature stores for both training and deployment use
- Partner with AI teams to integrate structured and unstructured data into generative and agentic workflows
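Point-in-time correctness, named in the feature-pipeline responsibility above, means each training row sees only feature values known at the label's timestamp, so the model never trains on information it would not have had in production. A minimal sketch using pandas `merge_asof`; the table and column names (`customer_id`, `orders_90d`, etc.) are illustrative, not from the posting:

```python
import pandas as pd

# Label events: one row per prediction target, timestamped when the label was observed.
labels = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "label_ts": pd.to_datetime(["2024-03-01", "2024-06-01", "2024-04-15"]),
    "churned": [0, 1, 0],
}).sort_values("label_ts")

# Feature snapshots: values timestamped when they became known.
features = pd.DataFrame({
    "customer_id": [1, 1, 1, 2],
    "feature_ts": pd.to_datetime(["2024-01-10", "2024-05-20", "2024-07-01", "2024-04-01"]),
    "orders_90d": [3, 5, 6, 2],
}).sort_values("feature_ts")

# For each label, join the latest feature row at or before label_ts —
# never a later one, which would leak future information into training.
training = pd.merge_asof(
    labels, features,
    left_on="label_ts", right_on="feature_ts",
    by="customer_id", direction="backward",
)
```

The same backward-looking join is what a feature store's point-in-time retrieval API performs at training time, while serving reads the latest value.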
Requirements
- 7+ years in data engineering/analytics engineering with ownership of production pipelines and BI at scale
- Demonstrated success owning and stabilizing production data platforms and critical pipelines
- Strong grasp of modern data platforms (e.g., Snowflake), orchestration (Airflow), and transformation frameworks (dbt or equivalent)
- Competence with data integration (ELT/ETL), APIs, cloud storage, and SQL performance tuning
- Practical data reliability experience: observability, lineage, testing, and change management
- Operates effectively in ambiguous, partially documented environments; creates order quickly through documentation and standards
- Prior ownership of core operations and reliability for business-critical pipelines with defined SLOs and incident response
- Demonstrated client-facing experience and outstanding written/verbal communication
Qualifications
- Bachelor's degree or equivalent experience
Nice to Have
- Deep interest in Generative AI and Machine Learning
- Basic scripting ability in Python
- Practical Generative AI experience: shipped at least one end-to-end workflow (e.g., RAG)
- Working knowledge of LLM behavior
- Comfort with vector search and hybrid retrieval patterns
- Evaluation & safety basics for AI
- MLOps for LLMs
- Familiarity with BI tools (Power BI/Tableau) and semantic layer design
- Exposure to streaming, reverse ETL, and basic MDM/reference data management
- Security & governance awareness
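Hybrid retrieval, mentioned above, typically blends a lexical score (keyword overlap or BM25) with vector similarity. A toy sketch of the blending step, with hand-rolled scoring standing in for a real search engine and embedding model (all function names and the `alpha` weight are illustrative assumptions):

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query, doc):
    # Fraction of query terms present in the document (stand-in for BM25).
    terms = query.lower().split()
    doc_terms = set(doc.lower().split())
    return sum(t in doc_terms for t in terms) / len(terms)

def hybrid_rank(query, query_vec, docs, alpha=0.5):
    # docs: list of (text, embedding); alpha weights vector vs. lexical score.
    scored = [
        (alpha * cosine(query_vec, vec) + (1 - alpha) * keyword_score(query, text), text)
        for text, vec in docs
    ]
    return [text for _, text in sorted(scored, reverse=True)]
```

In production the two scores usually come from separate indexes (e.g., a keyword index and a vector index) and are fused after retrieval; only the fusion logic is sketched here.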
Skills
- Power BI *
- Python *
- SQL *
- AWS *
- Tableau *
- Looker *
- Snowflake *
- dbt *
- Airflow *

* Required skills
About Human Agency
Human Agency partners with organizations to explore, design, and implement AI strategies that are secure, scalable, and human-centered, focusing on amplifying human potential.
Professional Services