About Betfair Romania Development:
Betfair Romania Development is the largest technology hub of Flutter Entertainment, with over 2,000 people powering the world’s leading sports betting and iGaming brands. Exciting, immersive and safe experiences are delivered to over 18 million customers worldwide from our office in Cluj-Napoca. Driven by relentless innovation and a commitment to excellence, we operate an unbeatable portfolio of diverse proprietary brands such as FanDuel, PokerStars, SportsBet, Betfair, Paddy Power, and Sky Betting & Gaming.
Our Values:
The values we share at Betfair Romania Development define what makes us unique as a team. They empower us by giving meaning to our contributions, and they ensure that we consistently strive for excellence in everything we do. We are looking for passionate individuals who align with our values and are committed to making a difference.
Win together | Raise the bar | Got your back | Own it | Positive impact
About the Brand: FanDuel
FanDuel is a leading force in the sports-tech entertainment industry, redefining how fans engage with their favourite sports, teams, and leagues. As the premier gaming destination in North America, FanDuel operates across multiple verticals, including sports betting, daily fantasy sports, online gaming, advance-deposit wagering, and media.
The Role
We are looking for a Senior Data Engineer to design and build scalable, reliable, and high-quality data solutions that power analytics, machine learning, and business operations. You’ll be a hands-on contributor who plays a key role in shaping the technical foundation of our data ecosystem.
As a senior member of the team, you’ll work closely with engineers, analysts, product managers, and data scientists to deliver data pipelines and platforms that drive meaningful business outcomes. The ideal candidate is a strong problem solver, a team player, and someone who brings both technical depth and a collaborative mindset to every challenge.
If you’re excited by this challenge and want to work within a dynamic company, then we’d love to hear from you.
Key Accountabilities & Responsibilities:
Build & Maintain Data Infrastructure
Design, develop, and maintain scalable ETL/ELT pipelines to support data products, analytics, and operational systems.
Work with large-scale, complex datasets and ensure data is clean, well-modeled, and accessible across the organization.
Develop and maintain batch and streaming data pipelines and data platforms using modern tools and technologies.
Deliver High-Quality, Reusable Solutions
Build modular, reusable data models and assets using tools such as Spark, dbt, or similar frameworks.
Collaborate with stakeholders to understand business needs and translate them into reliable, production-ready data solutions.
Ensure data quality, accuracy, and consistency through validation frameworks and monitoring.
Contribute to Engineering Excellence
Follow and promote software engineering best practices, including testing, version control, documentation, and code reviews.
Actively participate in technical design discussions and architecture reviews.
Help evolve our data platform and tooling to improve performance, developer experience, and scalability.
Collaborate & Communicate
Partner with data scientists, analysts, and product managers to support data-driven decision-making.
Clearly communicate technical concepts, project status, and risks to stakeholders.
Share knowledge and mentor junior engineers through code reviews and collaborative problem-solving.
Skills, Capabilities & Experience Required:
5+ years of experience in data engineering or software engineering with a focus on building data pipelines and platforms.
Strong SQL skills and fluency in at least one programming language (e.g., Python, Scala, or Java).
Hands-on experience with modern data tools such as Airflow, Spark, dbt, Kafka, or Databricks.
Deep understanding of data modeling, data warehousing, and best practices for data quality and observability.
Experience with cloud-based data platforms (e.g., AWS, GCP, Azure).
Strong problem-solving skills and ability to work independently in a fast-paced environment.
Preferred Skills
Experience supporting analytics and machine learning workflows.
Familiarity with data governance, privacy, and compliance frameworks.
Experience working in a product-led or customer-centric organization.
Familiarity with infrastructure-as-code or DevOps practices related to data systems.
Benefits:
Hybrid & remote working options
€1,000 per year for self-development
Company share scheme
25 days of annual leave per year
30 days of annual leave after 3 years with the company
20 days per year to work abroad
5 personal days/year
Flexible benefits: travel, sports, hobbies
Extended health, dental and travel insurance
Customized well-being programmes
Career growth sessions
Thousands of online courses through Udemy
A variety of engaging office events
Disclaimer:
We are an inclusive employer. By embracing diverse experiences and perspectives, we create a lasting, positive impact for our employees, customers, and the communities we’re part of. You don't have to meet all the requirements listed to apply for this role. If you need any adjustments to make this role work for you, let us know, and we’ll see how we can accommodate them.
We thank all applicants for their interest; however, only the candidates who best meet the job requirements will be contacted for an interview.
By submitting your application online, you agree that your details will be used to progress your application for employment. If your application is successful, your details will be used to administer your personnel record. If your application is unsuccessful, we will retain your details for a period no longer than three years, to consider you for prospective roles within the company.