
Group Data Cloud Platform Architect

Porto, Full time

Euronext

Euronext is the leading pan-European market infrastructure, shaping capital markets for future generations. Its mission is to connect European economies to global capital markets, to accelerate innovation and sustainable growth. Euronext is present in 18 countries across Europe, the US and Asia, with regulated exchanges in Belgium, France, Ireland, Italy, the Netherlands, Norway and Portugal. The group has grown organically and through external growth, with revenue rising from €458 million in 2014 to €1.5 billion in 2022, and now counts 2,200 employees of 55 nationalities.

Role profile

Euronext is seeking a Data Cloud Platform Architect with more than 10 years of expertise in AWS data services, extensive exposure to data engineering, and experience in European corporate environments. The role is based in Porto. Exposure to Azure technologies is a plus. The ideal candidate will have experience with Infrastructure as Code and a proven ability to implement and enforce data governance and security standards, as well as a strong track record of leading large-scale cloud data projects and hands-on coding skills in Python and SQL for prototyping and setting technical standards.

You will be responsible for designing, implementing, and maintaining scalable data solutions, primarily on AWS, ensuring alignment with defined data architecture and governance frameworks. You will collaborate with cross-functional teams to develop and optimise data pipelines, ETL processes, and orchestration workflows using tools such as Apache Airflow, AWS Step Functions, Lambda, and Glue. Experience with Iceberg tables and managing large datasets is required. Exposure to Azure platform technologies, including Databricks, is considered an advantage.
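
By way of illustration only, the sketch below shows the kind of pipeline orchestration this role covers: a minimal Airflow DAG that waits for a raw file in S3 and then triggers an AWS Glue job. It assumes Airflow 2.4+ with the apache-airflow-providers-amazon package installed; the DAG, bucket, and job names are hypothetical and not part of Euronext's actual stack.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor

with DAG(
    dag_id="daily_trades_load",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",           # Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    # Wait for the day's raw file to land in S3 before transforming it.
    wait_for_raw_file = S3KeySensor(
        task_id="wait_for_raw_file",
        bucket_name="example-raw-bucket",            # hypothetical bucket
        bucket_key="trades/{{ ds }}/trades.parquet",
    )

    # Run an AWS Glue job that cleans and loads the data.
    run_glue_transform = GlueJobOperator(
        task_id="run_glue_transform",
        job_name="example-trades-transform",         # hypothetical Glue job
    )

    wait_for_raw_file >> run_glue_transform
```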

As a Data Cloud Platform Architect, you will translate business requirements into technical solutions, distribute tasks effectively within your team, and select the most appropriate tools for cost and performance. You will provide technical leadership, mentorship, and guidance, ensuring adherence to best practices in production awareness, troubleshooting, and security.

Key Responsibilities

  • Demonstrate deep expertise in AWS data services, including Lambda, Glue, Step Functions, and Redshift, with exposure to Azure data services.
  • Lead the implementation and enforcement of data governance and security standards across cloud platforms.
  • Oversee the implementation of Infrastructure as Code (IaC) solutions using Terraform and CloudFormation.
  • Lead optimisation projects, ensuring minimal disruption and maximum efficiency.
  • Apply hands-on coding skills in Python and SQL to prototype solutions and establish coding standards.
  • Develop robust solution architectures with a focus on scalability, performance, security, and cost optimisation.
  • Design efficient data models and optimise query performance for large datasets.
  • Manage ETL processes and data integration into Redshift, DuckDB, and PostgreSQL.
  • Set up and manage logging and tracing mechanisms in AWS using services such as CloudTrail and X-Ray.
  • Implement orchestration solutions using Apache Airflow and AWS Step Functions.
  • Utilise Athena for interactive query analysis of large datasets in Amazon S3 (a short Python sketch follows this list).
  • Provide technical leadership and act as a subject matter expert in cloud data engineering.
  • Write comprehensive solution and technical documentation.
  • Stay updated on emerging technologies and industry trends.
  • Challenge business requirements and propose innovative solutions for efficiency and performance improvement.
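
As referenced in the Athena bullet above, here is a minimal Python sketch of running an interactive Athena query with boto3 and polling for its results. The region, database name, query, and S3 output location are illustrative assumptions only.

```python
import time

import boto3

# Athena client; the region is an assumption for illustration.
athena = boto3.client("athena", region_name="eu-west-1")

# Start an interactive query over data in S3; the database and output
# location below are hypothetical.
execution = athena.start_query_execution(
    QueryString="SELECT symbol, COUNT(*) AS trades FROM trades GROUP BY symbol",
    QueryExecutionContext={"Database": "example_market_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

# Print the result rows (the first row is the column header).
if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```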

The core skills we are looking for are:

  • Deep expertise in AWS data services, with exposure to Azure data services.
  • Hands-on experience developing Infrastructure as Code templates (e.g., Terraform) for platform deployment.
  • Proven ability to define and enforce data governance and security standards.
  • Demonstrated experience leading large-scale data migration and optimisation projects.
  • Strong programming skills in Python and SQL, with experience in prototyping and setting coding standards.
  • Ability to define data engineering standards by creating PoCs and production-grade prototypes.
  • Experience establishing and optimising CI/CD pipelines for data workloads.
  • Experience with Iceberg tables and managing large datasets efficiently (see the sketch after this list).
  • Proficiency in designing scalable and efficient data solutions on AWS, following best practices for cloud architecture and infrastructure.
  • Experience with orchestration tools such as Apache Airflow and AWS Step Functions.
  • Knowledge of ETL tools and experience working with large volumes of data; Kafka experience is preferred.
  • Proactive approach to production monitoring and troubleshooting.
  • Excellent communication and teamwork skills, with the ability to provide technical leadership and mentorship.
  • Strong analytical and problem-solving skills, with the ability to analyse requirements and propose innovative solutions.
  • Experience in writing solution documents and technical documentation.
  • Familiarity with Azure Databricks for data engineering and analytics tasks is an advantage.
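
As referenced in the Iceberg bullet above, the sketch below shows one common way to read an Iceberg table from Python, using the pyiceberg library with AWS Glue as the catalog. It assumes pyiceberg with pandas/pyarrow installed and AWS credentials configured; the catalog configuration, table name, filter, and columns are hypothetical.

```python
from pyiceberg.catalog import load_catalog

# Load an Iceberg catalog backed by AWS Glue; the catalog name is arbitrary.
catalog = load_catalog("glue_catalog", type="glue")

# Load a hypothetical table and scan it with a pushed-down filter and column
# projection, so Iceberg reads only the matching data files and columns.
table = catalog.load_table("example_db.trades")
df = table.scan(
    row_filter="trade_date >= '2024-01-01'",
    selected_fields=("symbol", "price", "quantity"),
).to_pandas()

print(df.head())
```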

Key Behaviours

Partnership

Collaborative and open-minded.

Works effectively in a team, building strong relationships with other team members and with other teams.

Innovation

Open to adopt new processes / approaches / ways of working.

Excellence

Oral and written communications are tailored to their audience.

Willing to go the extra mile, putting in the effort to ensure activities are completed on time and to the expected quality.

Pro-actively seeks information/inputs from colleagues/clients.

Candidate Profile

  • Must have:
    • BS/MS degree in Computer Science, Engineering, or equivalent working experience
    • English (C1 or higher level)
    • Passion for technology and support.
    • Problem-solving skills.

  • Nice to have:
    • Apache Flink, Kafka, and other streaming data technologies knowledge
    • Certification in AWS, Azure, or similar technologies
    • Experience with Dataiku

Euronext Values

Unity

  • We respect and value the people we work with
  • We are unified through a common purpose
  • We embrace diversity and strive for inclusion

Integrity

  • We value transparency, communicate honestly and share information openly
  • We act with integrity in everything we do
  • We don’t hide our mistakes, and we learn from them

Agility

  • We act with a sense of urgency and decisiveness
  • We are adaptable, responsive and embrace change
  • We take smart risks

Energy

  • We are positively driven to make a difference and challenge the status quo
  • We focus on and encourage personal leadership
  • We motivate each other with our ambition

Accountability

  • We deliver maximum value to our customers and stakeholders
  • We take ownership and are accountable for the outcome
  • We reward and celebrate performance

We are proud to be an equal opportunity employer. We do not discriminate against individuals on the basis of race, gender, age, citizenship, religion, sexual orientation, gender identity or expression, disability, or any other legally protected factor. We value the unique talents of all our people, who come from diverse backgrounds with different personal experiences and points of view, and we are committed to providing an environment of mutual respect.

Additional Information

This job description only describes the main activities of the role and is not exhaustive. Additional tasks and projects may be assigned.