Heartflow, Inc.

Senior Data Engineer - AI Empowerment

Rohnert Park, California | Full Time

Heartflow is a medical technology company advancing the diagnosis and management of coronary artery disease, the #1 cause of death worldwide, using cutting-edge technology. The flagship product—an AI-driven, non-invasive cardiac test called the Heartflow FFRCT Analysis, supported by the ACC/AHA Chest Pain Guidelines—provides a color-coded, 3D model of a patient’s coronary arteries indicating the impact blockages have on blood flow to the heart. Heartflow offers the first AI-driven, non-invasive, integrated heart care solution across the CCTA pathway, helping clinicians identify stenoses in the coronary arteries (RoadMap™ Analysis), assess coronary blood flow (FFRCT Analysis), and characterize and quantify coronary atherosclerosis (Plaque Analysis). Our pipeline of products is growing, and so is our team; join us in helping to revolutionize precision heartcare.

Heartflow is a publicly traded company (HTFL) that has received international recognition for exceptional strides in healthcare innovation, is supported by medical societies around the world, is cleared for use in the US, UK, Europe, Japan, and Canada, and has been used for more than 500,000 patients worldwide.

As we continue to revolutionize precision heartcare through our AI-driven integrated heart care solutions, we are seeking a proactive, self-starting Senior Data Engineer - AI Empowerment. This role is focused on architecting the infrastructure necessary to empower business processes with AI, specifically leveraging the power of our data lake to fuel advanced models.

Role Overview

In this role, you will move beyond simple ETL to build robust, observable data products that serve as the foundation for AI enablement. You will architect data pipelines that integrate with Google Vertex AI and Claude to automate complex business workflows. You will act as a technical lead, balancing high-level AWS infrastructure management with the mission of making our data lake assets "AI-ready" for the entire organization.

Job Responsibilities

  • AI Pipeline Orchestration: Build and manage complex data pipelines utilizing the Dagster framework to orchestrate data flows into Google Vertex AI for model training and deployment.
  • Stakeholder Engagement: Work closely with stakeholders, from requirements gathering through training and ongoing support.
  • LLM Integration: Design and maintain specialized pipelines that leverage Claude to automate business processes, ensuring high accuracy and context-aware outputs from our data lake.
  • AI-Driven Business Enablement: Enable departments to leverage AI-powered insights by providing clean, governed, and structured data sets optimized for LLM consumption.
  • Semantic Layer & BI: Support a unified semantic model in Cube (Cube.js) to provide consistent metrics that fuel both AI agents and human-centric dashboards.
  • Modern Data Stack Management: Manage and optimize high-performance storage and query layers using Iceberg, Amazon Redshift, and Athena to support the data-intensive needs of AI applications.
  • Mentorship & Governance: Lead data governance initiatives to ensure "AI-readiness" and provide technical mentorship to junior engineers on best practices for AI-centric data engineering.
  • 24/7 Support: Provide on-call support for critical business processes that run around the clock.

Skills Needed

  • Languages & Libraries: Expert proficiency in Python (specifically pyiceberg, boto3, polars, and pandas) and SQL.
  • AI Platforms: Hands-on experience integrating data pipelines with Google Vertex AI and utilizing Claude for natural language processing tasks.
  • AWS Ecosystem: Deep hands-on experience with ECS (Fargate/EC2), Redshift, and Athena.
  • Pipeline Concepts: Mastery of Dagster data pipeline concepts, including Software-Defined Assets and declarative orchestration.
  • Modeling: Strong experience in Cube (Cube.js) semantic modeling and building reporting layers for complex business workflows.
  • Domain Knowledge: Proven track record in developing data systems that enable AI/ML capabilities or advanced business process automation.

Educational Requirements & Work Experience

  • Education: B.S. or M.S. in Computer Science, Data Engineering, or a related field.
  • Experience: 5+ years of Data Engineering experience, with significant time spent in AWS environments and a background in enabling AI/ML workflows.
  • Leadership: Demonstrated ability as a self-starter with experience mentoring peers and leading technical projects.

A reasonable estimate of the base salary range is $160,000 to $200,000 per year, plus bonus. #LI-IB1

Heartflow is an Equal Opportunity Employer. We are committed to a work environment that supports, inspires, and respects all individuals and do not discriminate against any employee or applicant because of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, veteran status, or any other status protected under federal, state, or local law. This policy applies to every aspect of employment at Heartflow, including recruitment, hiring, training, relocation, promotion, and termination.
 
Positions posted for Heartflow are not intended for or open to third-party recruiters/agencies. Submission of unsolicited resumes for these positions will be considered a free referral.
 
Heartflow has become aware of a fraud scheme in which unknown entities pose as Heartflow recruiters in an attempt to obtain personal information from individuals as part of our application or job offer process. Before providing any personal information to outside parties, please verify the following: A) all legitimate Heartflow recruiter email addresses end with “@heartflow.com”, and B) the position described is listed on our careers site at www.heartflow.com/about/careers/.