This Department of War enterprise data and analytics program delivers mission-critical capabilities that enable leaders across the Department to make faster, better-informed decisions using trusted data at scale. The Leidos Digital Modernization sector is seeking an experienced Senior Data Engineer Lead to support the delivery, enhancement, and adoption of enterprise data and analytics products used across multiple DoD organizations.
In this role, you will work alongside government partners, engineers, and other industry teammates to translate operational and strategic requirements into scalable, production-ready solutions. You will contribute directly to product planning, execution, and continuous improvement, helping ensure capabilities are delivered efficiently, aligned to mission priorities, and positioned for sustained success.
This position offers the opportunity to work on a high-visibility, enterprise program at the intersection of data, analytics, and emerging AI technologies. Ideal candidates are motivated by mission impact, comfortable operating in complex stakeholder environments, and interested in building deep domain expertise while delivering capabilities with real-world national security outcomes.
Primary Responsibilities:
Design, build, and maintain data pipelines and architectures to support data ingestion, transformation, integration, storage, and dissemination.
Apply software engineering and ETL principles to ensure data accuracy, quality, consistency, and scalability.
Integrate COTS and customer-developed tools within existing data frameworks to meet operational and analytical requirements.
Collaborate with DataOps teams to prepare, automate, and optimize data workflows for real-time analytics.
Implement and enforce data security policies, including data encryption and access controls.
Monitor data quality and implement proactive alerting on data pipelines.
Develop and maintain documentation for data pipelines, models, and governance policies.
Provide Tier-2 and Tier-3 support for enterprise data products and services.
Conduct root cause analysis for recurring issues and implement solutions to prevent future occurrences.
Develop and maintain training materials and a centralized knowledge repository for data operations.
Proactively communicate with customers regarding known issues and new features.
Collect customer feedback to identify areas for improvement and enhance customer satisfaction.
Manage a team of 8–15 direct reports, providing guidance and support for their professional development.
Foster a collaborative team environment that encourages innovation and continuous improvement.
Ensure compliance with all applicable regulations related to data privacy and security.
Basic Qualifications:
Active Top Secret (TS) clearance with SCI eligibility.
Bachelor’s degree in Computer Science, Data Science, Engineering, Information Systems, or related technical discipline and 8–12 years of relevant experience OR Master’s degree in a related field and 6–10 years of relevant experience.
Minimum of 8 years of experience in data engineering or related roles.
Proven experience in designing and implementing data pipelines and architectures.
Experience with data quality monitoring tools and processes.
Excellent communication and interpersonal skills.
Experience developing and maintaining enterprise-scale data pipelines in cloud environments (AWS, Azure, or GCP).
Experience implementing ETL/ELT processes, data integration techniques, and data orchestration frameworks.
Experience working with structured and unstructured data sources, including APIs, streaming platforms, and relational/non-relational databases.
Experience integrating data pipelines into DevSecOps CI/CD environments.
Demonstrated experience leading and mentoring technical teams.
Preferred Qualifications:
Active TS/SCI clearance.
Experience with cloud-based data solutions and architectures.
Familiarity with data governance frameworks and best practices.
Experience operating within SAFe or large-scale Agile frameworks supporting enterprise systems.
Experience supporting data platforms across NIPRNet, SIPRNet, and JWICS environments.
Experience implementing data governance controls, lineage tracking, and metadata management frameworks.
Experience supporting AI/ML data preparation and feature engineering workflows.
Knowledge of machine learning and AI principles.
Experience working with containerized environments (e.g., Docker, Kubernetes).
Relevant cloud or data engineering certifications (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer, Azure Data Engineer, or equivalent).
If you're looking for comfort, keep scrolling. At Leidos, we outthink, outbuild, and outpace the status quo because the mission demands it. We're not hiring followers; we're recruiting people who challenge assumptions, take initiative, and refuse to fail.
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.