As an Associate Director at Alexion, you will be a hands-on technical leader shaping the design and delivery of our next-generation, patient-centric data and AI platforms. You will build highly scalable, reliable, and secure data products and pipelines, and you will do so with AI embedded throughout the lifecycle to increase speed, quality, and resilience.
You will champion responsible AI adoption in daily workflows: AI copilots to accelerate development, automated testing to improve quality, AI-driven observability to detect issues early, and AI-assisted governance to ensure compliance (GDPR, HIPAA, and FAIR/TRUSTed data product principles). You’ll raise the bar on engineering excellence while mentoring others in safe and effective AI use.
You will be responsible for:
Solution delivery
Design and build cloud-native ELT/ETL data pipelines and domain-oriented data products on AWS and Snowflake that are scalable, cost-efficient, and resilient.
Define and implement patterns for batch, micro-batch, and event-driven integrations; optimize for performance, reliability, and security.
AI-accelerated development
Use AI copilots to scaffold SQL/Python/dbt code, generate unit/integration tests, suggest query optimizations, and infer schemas/mappings.
Employ AI to auto-generate technical docs, lineage summaries, and code comments; integrate prompt standards and review checkpoints into PR workflows.
Data quality, observability, and reliability
Implement data quality frameworks and SLAs/SLOs with AI-enabled anomaly and drift detection and root-cause suggestions; create self-healing runbooks where feasible.
Instrument pipelines with metrics, logs, and traces; leverage AI to correlate incidents across orchestration, warehouse, and source systems.
Governance, privacy, and compliance
Operationalize data governance and privacy controls (RBAC/PBAC, encryption, retention) with AI-assisted PII detection, policy checks, and automated audit artifacts.
Ensure alignment with FAIR and TRUSTed data product principles; contribute to catalog metadata, semantic tags, and discoverability with AI-supported enrichment.
Performance, cost, and platform optimization
Tune Snowflake warehouses, queries, and dbt models; apply AI-driven recommendations to balance cost, performance, and concurrency.
Contribute reusable components and templates to “golden paths” that embed best practices and AI guardrails.
Collaboration and enablement
Partner with data science and analytics teams on data contracts, feature-ready datasets, and reproducible pipelines; support containerized/serverless runtimes where needed.
Mentor engineers on modern data engineering and responsible AI usage, including prompt engineering, validation patterns, and bias/quality checks.
You will need to have:
Master’s degree in Computer Science, Information Systems, Engineering, or a related field.
10+ years of experience in data engineering, data management, and analytics, with a track record of delivering large-scale, secure, and resilient solutions, ideally in life sciences.
Strong hands-on expertise:
SQL and Python; building robust ETL/ELT and orchestration (Apache Airflow, AWS Glue).
Snowflake: resource monitors, RBAC, warehouse sizing, performance tuning, zero-copy cloning, data sharing, Time Travel, Streams/Tasks, Snowpipe; tooling such as SnowSQL, Streamlit, and Cortex.
dbt and Fivetran; designing modular, testable transformations with version control and CI/CD.
AI in data engineering:
Practical experience using AI copilots for code/test generation with human review; AI-assisted schema mapping, documentation, and lineage.
AI-enabled data quality/observability (anomaly/drift detection, incident triage) and self-healing playbooks.
Automated PII detection/tagging and policy checks to support GDPR/HIPAA compliance.
Data governance and reliability:
Familiarity with FAIR and TRUSTed data product principles; experience with data catalogs and metadata standards.
Knowledge of data quality and observability methods and tools; ability to integrate telemetry across pipelines and platforms.
Cloud and platform skills:
AWS architecture patterns (certification preferred), GitHub-based CI/CD, and secrets management.
Experience with containerization and serverless patterns; ability to support DS/ML-adjacent workloads.
Experience implementing Infrastructure as Code with Terraform (or CloudFormation).
Communication and leadership:
Ability to explain complex technical concepts to varied audiences and to mentor engineers on best practices and responsible AI.
We would prefer you to have:
5+ years in biotech/pharma with exposure to R&D and/or commercial analytics use cases; understanding of compliance contexts (e.g., GxP exposure helpful).
Experience across multiple clouds or stacks (Azure, GCP, Databricks).
Familiarity with Kubernetes/Docker for data workloads and knowledge graph concepts.
Date Posted
02-Mar-2026

Closing Date
05-Mar-2026

Alexion is proud to be an Equal Employment Opportunity and Affirmative Action employer. We are committed to fostering a culture of belonging where every single person can belong because of their uniqueness. The Company will not make decisions about employment, training, compensation, promotion, and other terms and conditions of employment based on race, color, religion, creed or lack thereof, sex, sexual orientation, age, ancestry, national origin, ethnicity, citizenship status, marital status, pregnancy (including childbirth, breastfeeding, or related medical conditions), parental status (including adoption or surrogacy), military status, protected veteran status, disability, medical condition, gender identity or expression, genetic information, mental illness, or other characteristics protected by law. Alexion provides reasonable accommodations to meet the needs of candidates and employees. To begin an interactive dialogue with Alexion regarding an accommodation, please contact accommodations@Alexion.com. Alexion participates in E-Verify.