SG Lottery

Head of Lottery Data Engineering & Data Warehousing

Alpharetta, GA | Full time

Scientific Games:

Scientific Games is the global leader in lottery games, sports betting and technology, and the partner of choice for government lotteries. From cutting-edge backend systems to exciting entertainment experiences and trailblazing retail and digital solutions, we elevate play every day. We push game designs to the next level and are pioneers in data analytics and iLottery. Built on a foundation of trusted partnerships, Scientific Games combines relentless innovation, legendary performance, and unwavering security to responsibly propel the global lottery industry ever forward.

Position Summary

The Head of Lottery Data Engineering & Data Warehousing leads the strategy, architecture, and delivery of SG's lottery data platform. This leader will ensure data from operational systems flows into trusted, contract-compliant, and scalable warehouse environments across jurisdictions, products, and customers.

The role requires deep business curiosity, sound data engineering judgment, and close partnership with DBA, IT, data science, core engineering, and business teams. It defines data contracts, improves timeliness and quality SLAs, reduces KTLO through automation, and leads the transition from current platforms to a more scalable cloud- and AI-ready architecture. This role is a key member of the Product Platform and AI org leadership team.


Scope

  • Leads lottery data ingestion, warehousing, and platform modernization across jurisdictions, games, and data sources.

  • Builds repeatable patterns that lower the time and effort required to add new jurisdictions and new games.

  • Defines the reference architecture for acquisition, storage, resilience, governance, and access.

  • Improves timeliness, accuracy, contractual conformity, and cost-to-serve while reducing KTLO work.


Job Duties / Key Accountabilities

  • Set the strategy, roadmap, and operating model for lottery data engineering and data warehousing.

  • Lead and develop the team; set clear standards for delivery, quality, and operational ownership.

  • Partner with DBA, IT, data science, core engineering, and product teams so source systems produce usable, trusted data.

  • Define and negotiate data contracts covering schema, quality, timeliness, lineage, storage, and access constraints.

  • Establish standard ingestion patterns, tooling, and infrastructure as code that make high-quality data delivery the default.

  • Set, publish, and improve SLAs for timeliness, completeness, accuracy, and availability.

  • Reduce KTLO through automation, observability, resilience, and platform simplification.

  • Build and maintain a catalog of priority data sources and systems of record; drive onboarding plans for new jurisdictions, games, and sources.

  • Lead the move from legacy approaches to a more scalable cloud- and AI-ready platform while protecting business continuity.


Qualifications / Skills / Knowledge

Required

  • 10+ years of data engineering, data platform engineering, data warehousing, or related enterprise engineering experience.

  • 5+ years leading engineering teams, including managers or senior engineers.

  • Experience designing, building, and operating reliable data pipelines and warehouse platforms.

  • Experience with batch and streaming ingestion, ETL/ELT, orchestration, data modeling, data quality, lineage, and observability.

  • Experience defining data contracts and governance standards in environments with regulatory, contractual, or customer-specific constraints.

  • Experience working across DBA, IT, product, application engineering, and analytics or data science teams.

  • Experience modernizing legacy data platforms and planning phased transitions to cloud-based architectures.

  • Clear written and verbal communication skills; able to turn business needs into technical plans and technical constraints into business tradeoffs.

  • Ability to work with teams across multiple geographies.

Desired (experience with any of the following)

  • Lottery, gaming, payments, or another regulated transactional environment.

  • Production data platforms supporting analytics, BI, AI, or data science.

  • Infrastructure as code and platform engineering practices.

  • Multi-tenant, jurisdiction-specific, or contractually segmented data environments.

  • Disaster recovery, redundancy, and resilient data platform design.

  • Configurable or metadata-driven onboarding for new jurisdictions, games, or data sources.

  • AWS technologies.

  • Databricks.

Work Conditions

Scientific Games, LLC and its affiliates (collectively, “SG”) are engaged in highly regulated gaming and lottery businesses. As a result, certain SG employees may, among other things, be required to obtain a gaming or other license(s), undergo background investigations or security checks, or meet certain standards dictated by law, regulation or contracts. In order to ensure SG complies with its regulatory and contractual commitments, as a condition to hiring and continuing to employ its employees, SG requires all of its employees to meet those requirements that are necessary to fulfill their individual roles. As a prerequisite to employment with SG (to the extent permitted by law), you shall be asked to consent to SG conducting a due diligence/background investigation on you.

This job description should not be interpreted as all-inclusive; it is intended to identify major responsibilities and requirements of the job. The employee in this position may be requested to perform other job-related tasks and responsibilities than those stated above. 

SG is an Equal Opportunity Employer and does not discriminate against applicants on the basis of race, color, sex, age, national origin, religion, sexual orientation, gender identity, status as a veteran, disability, or any other federal, state, or local protected class. If you’d like more information about your equal employment opportunity rights as an applicant under the law, please click here for the EEOC Poster.