Human Agency

Data Architect

Location: Remote (U.S.)
Type: Full-Time

About Human Agency

We’re scaling rapidly and have a growing pipeline of opportunities that demand exceptional talent across disciplines. Our mission is to bring on individuals, from creative producers to technical experts to entrepreneurial leaders, who can help us realize this next chapter of growth.

We are a company of doers. Leaders roll up their sleeves, teams work flat, and everyone contributes to what ships. Titles don’t insulate us from feedback or the basics. We invite critique, learn quickly, and keep raising the bar. The best ideas win here, no matter where they come from, because clients trust us to deliver the strongest outcomes every time.

Our clients’ missions, products, and bottom lines are sacred. We immerse ourselves in their world, becoming stewards of their goals and partners in solving big problems. Every product, strategy, or asset we create must be both beautiful and functional: practical, usable, and designed for real-world impact.

Humans are our most valuable resource, and we only grow by hiring people who push us forward. Across strategy, engineering, design, data, and operations, we seek out teammates who raise the bar and make us better. Always hire up, never down.

We partner with organizations of all sizes to explore, design, and implement AI strategies that are secure, scalable, and human-centered. We believe AI should amplify human potential, not replace it, and we build with that conviction in every engagement. From advisory and tooling to implementation and education, we meet clients where they are and help them integrate AI in ways that align with their mission and values. Our goal is to empower teams to work smarter, move faster, and unlock new possibilities through thoughtful, responsible innovation.

And through it all, we lead with purpose, love, and adventure. We do meaningful work with people we care about, and we make the ride an adventure worth taking. Because at Human Agency, who we are and how we work are one and the same.

The Opportunity

We’re seeking a Data Architect to design modern, AI-ready data architectures across multiple client engagements.

This role sits at the intersection of data modeling, semantic layer design, feature engineering, and AI enablement. You’ll architect systems that make data reliable, reusable, and production-ready for business intelligence, machine learning, and artificial intelligence.

You should be equally comfortable designing the data backbone for AI-driven products, writing SQL or Python to unblock a model pipeline, or guiding teams through tradeoffs between flexibility, cost, and responsible automation.

Key Responsibilities

Data Modeling & Architecture

  • Design and implement end-to-end data architectures in Snowflake — from raw ingestion through staging, fact/dimension modeling, and semantic layer design.
  • Define data models that balance flexibility for analysts with performance and scalability for production.
  • Partner with engineering teams to integrate data from source applications and operational systems.
  • Establish versioned modeling standards and documentation to ensure consistency across domains.

Semantic Layer & Metric Governance

  • Build or refine semantic layers that unify metric definitions across BI tools like Tableau, Power BI, or Looker.
  • Collaborate with business owners to define KPIs, approve new metrics, and monitor adoption.
  • Implement versioned datasets and definitions to support reliable analytics and reporting.

Feature Store & ML Readiness

  • Architect feature pipelines and data contracts that support point-in-time correctness for machine learning models.
  • Collaborate with data scientists and AI engineers to implement reusable feature stores for both training (offline) and deployment (online) use.
  • Monitor data quality and prevent data leakage that could affect model performance.
  • Support event-driven architectures that bridge predictive models with operational systems.

AI & Agentic Workflow Enablement

  • Partner with AI teams to integrate structured and unstructured data into generative and agentic workflows (e.g., RAG, copilots, automated evaluation agents).
  • Design APIs or event structures that serve predictions and triggers in near real time.
  • Measure adoption and value of AI-driven workflows through data instrumentation.

Qualifications

Required

  • 7+ years in data engineering/analytics engineering with ownership of production pipelines and BI at scale.
  • Demonstrated success owning and stabilizing production data platforms and critical pipelines.
  • Strong grasp of modern data platforms (e.g., Snowflake), orchestration (Airflow), and transformation frameworks (dbt or equivalent).
  • Competence with data integration (ELT/ETL), APIs, cloud storage, and SQL performance tuning.
  • Practical data reliability experience: observability, lineage, testing, and change management.
  • Operates effectively in ambiguous, partially documented environments; creates order quickly through documentation and standards.
  • Prior ownership of core operations and reliability for business-critical pipelines with defined SLOs and incident response.
  • Demonstrated client-facing experience (consulting/agency or internal platform teams with cross-functional stakeholders) and outstanding written/verbal communication (executive briefings, workshops, decision memos).

Preferred

  • Deep interest in Generative AI and Machine Learning.
  • Basic scripting ability in Python.
  • Practical Generative AI experience: shipped at least one end-to-end workflow (e.g., RAG) including ingestion, embeddings, retrieval, generation, and evaluation.
  • Working knowledge of LLM behavior (tokens, context windows, temperature/top-p, few-shot/tool use) and how to tune for quality/cost/latency.
  • Comfort with vector search (e.g., pgvector or a hosted vector store) and hybrid retrieval patterns.
  • Evaluation & safety basics: offline evaluation harnesses, lightweight online A/B tests, and guardrails for PII and prompt-injection.
  • MLOps for LLMs: experiment tracking, versioning of prompts/configs, CI/CD for data & retrieval graphs, and production monitoring (latency, cost, drift).
  • Python scripting for data/LLM utilities and service integration (APIs, batching, retries).
  • Familiarity with BI tools (Power BI/Tableau) and semantic layer design.
  • Exposure to streaming, reverse ETL, and basic MDM/reference data management.
  • Security & governance awareness (role-based access, least privilege, data retention).

Considerations

  • Education: Bachelor’s degree or equivalent experience.
  • Ethics: Commitment to ethical practices and responsible AI.
  • Travel: Occasional (10–30%) for client activities and events.
  • Location: Remote-friendly, with a preference for candidates in St. Louis, MO, and major tech hubs.

Compensation

This role offers a competitive base salary with performance-based bonuses. Final compensation will vary based on experience, performance, and location.

Why Work With Human Agency

Join a team of thinkers and builders creating meaningful impact across sectors—with autonomy to lead, the resources to succeed, and room to grow.

Equal Opportunity Commitment

Human Agency is an Equal Opportunity Employer. We value diverse backgrounds and strive to build an inclusive culture where everyone feels welcomed and empowered.