We’re looking for motivated, engaged people to help make everyone’s journeys better.
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure that support analytics, reporting, and data-driven decision-making. The ideal candidate has strong experience in data architecture, ETL processes, cloud platforms, and database technologies. You will work closely with data analysts, data scientists, and business stakeholders to deliver high-quality, reliable, and accessible data solutions.
Data Architecture & Pipeline Development
- Design, develop, and maintain scalable data pipelines and workflows
- Build and manage ETL/ELT processes to ingest, transform, and load data from multiple sources
- Develop and maintain data warehouses and data lakes
- Integrate data from APIs, internal systems, third-party platforms, and databases
- Automate manual data processes to improve efficiency and scalability
Data Quality, Governance & Security
- Ensure data accuracy, integrity, and consistency across systems
- Implement and enforce data governance, security, and compliance standards
- Maintain clear documentation of data models, architecture, and processes
- Establish data validation, monitoring, and alerting mechanisms
Performance Optimization & Operations
- Monitor, troubleshoot, and optimize data workflows and pipeline performance
- Optimize database queries and improve overall system efficiency
- Support continuous improvement of data infrastructure
- Participate in code reviews and follow best practices in version control and CI/CD
Collaboration & Business Support
- Work closely with analysts, data scientists, and business stakeholders to translate requirements into technical solutions
- Provide clean, structured datasets to support analytics and reporting
- Support business intelligence initiatives and advanced analytics use cases
Education
- Bachelor’s degree in Information Management, Computer Science, Data Science, or a related field, or equivalent work experience
Work Experience
- 2+ years of experience in data engineering or a related role
- Strong proficiency in SQL
- Experience with Python, Scala, or Java
- Experience with ETL/ELT tools (e.g., Fivetran, Airflow, dbt, Talend, Informatica)
- Hands-on experience with Snowflake, AWS, Azure, or GCP
- Experience with relational and NoSQL databases
- Understanding of data warehousing and dimensional modelling
Certifications, Licenses & Registrations
- Industry certifications are a plus (e.g., a professional-level AWS, Azure, GCP, or Snowflake certification)
Core Competencies Required
- Strong analytical and problem-solving skills, with the ability to diagnose complex data issues and develop scalable solutions
- High attention to detail, ensuring data accuracy, integrity, and reliability
- Clear and effective communication skills, translating technical concepts for non-technical stakeholders
- Collaborative mindset, working effectively across cross-functional teams
- Strong sense of ownership and accountability for data quality and pipeline performance
- Adaptability and continuous learning orientation in a fast-evolving technology environment
If you want to be part of a team that helps make travel and culinary memories, join us!