McKesson is an impact-driven, Fortune 10 company that touches virtually every aspect of healthcare. We are known for delivering insights, products, and services that make quality care more accessible and affordable. Here, we focus on the health, happiness, and well-being of you and those we serve – we care.
What you do at McKesson matters. We foster a culture where you can grow, make an impact, and are empowered to bring new ideas. Together, we thrive as we shape the future of health for patients, our communities, and our people. If you want to be part of tomorrow’s health today, we want to hear from you.
US Oncology – Decision Intelligence Organization is seeking a highly skilled Senior Data Engineering Specialist to join our dynamic team within the USON Technology Team. As a Senior Data Engineering Specialist, you will lead the design, development, and optimization of complex data pipelines and architectures that support enterprise-wide analytics and decision intelligence initiatives. You will work closely with cross-functional teams to ensure scalable, secure, and high-performance data solutions across McKesson Technology and USON platforms.
The ideal candidate will have deep technical expertise in the Microsoft Azure ecosystem, including Azure Databricks, Azure Data Factory (ADF), and Python, along with experience architecting and implementing robust data ingestion, transformation, and integration workflows from diverse sources, including relational databases, flat files, APIs, and unstructured data, into cloud-based data platforms such as Azure, Databricks, and Snowflake. Proficiency in data visualization tools such as Power BI and Tableau is essential to enable intuitive, actionable analytics on top of ingested and transformed data assets.
Key Responsibilities:
Data Architecture & Modeling
Architect and implement scalable data pipelines for ingestion, transformation, and integration across cloud and on-prem environments, using Azure Databricks, Azure Data Factory (ADF), and medallion architecture principles to support structured, semi-structured, and unstructured data.
Design and maintain conceptual and physical data models aligned with business reporting and analytics needs, ensuring compatibility with bronze, silver, and gold layers of the medallion framework for optimized data quality, performance, and governance.
Collaborate with application development teams to define data flows and ensure smooth integration into production systems, using layered data refinement to support analytics and BI tools.
Performance Optimization & Automation
Optimize performance of data solutions through indexing, partitioning, caching, and parallel processing techniques.
Drive automation and self-healing capabilities in data workflows using orchestration tools and metadata-driven design.
Governance, Security & Compliance
Lead the development of reusable frameworks for data quality, lineage, and governance to ensure trustworthy analytics.
Ensure compliance with data privacy, security, and regulatory standards (e.g., HIPAA, SOC 2).
Analytics Enablement & Reporting
Build and support Power BI solutions to enable data-driven decision-making across the organization.
Develop executive-level dashboards and status reports to communicate key insights and progress.
Leadership & Mentorship
Mentor junior engineers and contribute to technical leadership across the data engineering practice.
Promote best practices in data engineering, DevOps, and agile delivery.
Minimum Requirements:
Degree or equivalent experience; typically requires 7+ years of relevant experience.
Critical Skills:
Extensive hands-on experience in data engineering, including cloud data platforms like Databricks, Snowflake, and Azure Synapse.
Strong proficiency in Python, SQL, and Spark for data processing and transformation.
Deep knowledge of the Microsoft Azure ecosystem and experience with Azure Data Factory and Azure Databricks.
Experience with streaming and batch data processing frameworks such as Kafka and Delta Lake.
Proven track record in designing and implementing medallion and lakehouse architectures.
Expertise in data modeling for both transactional and analytical systems.
Familiarity with CI/CD pipelines, version control (Git), and infrastructure-as-code (Terraform, ARM templates).
Excellent communication and stakeholder engagement skills.
Additional Skills:
Experience with enterprise-scale data integration and modernization programs.
Knowledge of data cataloging and metadata management tools (e.g., Alation).
Exposure to AI/ML pipelines and feature engineering workflows.
Strong problem-solving skills with a focus on performance tuning and cost optimization.
Ability to lead cross-functional initiatives and influence architectural decisions.
Experience in building and presenting executive dashboards and reports using Power BI.
Bachelor’s degree or Master’s degree (in Computer Science, Information Systems, or related field) preferred.
We are proud to offer a competitive compensation package at McKesson as part of our Total Rewards. This is determined by several factors, including performance, experience and skills, equity, regular job market evaluations, and geographical markets. The pay range shown below is aligned with McKesson's pay philosophy, and pay will always be compliant with any applicable regulations. In addition to base pay, other compensation, such as an annual bonus or long-term incentive opportunities may be offered. For more information regarding benefits at McKesson, please click here.
Our Base Pay Range for this position
$89,700 - $149,500
McKesson is an Equal Opportunity Employer
McKesson provides equal employment opportunities to applicants and employees and is committed to a diverse and inclusive environment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability, age or genetic information. For additional information on McKesson’s full Equal Employment Opportunity policies, visit our Equal Employment Opportunity page.
Join us at McKesson!