Company Background
Our client started out building a smart recommendation engine that suggested personalized gift ideas. Over time, they expanded it into a custom enterprise solution. Today, our client collaborates with major telecom and media brands across Europe and the United States. Their platform supports millions of confident customer decisions daily for brands such as Verizon, O2, Tesco, and Vodafone.
Project Description
Our client has released a next-gen AI sales agent that combines agentic AI with proven sales psychology. It leads customer dialogues, gives tailored suggestions, handles objections, and helps users decide what to buy, narrowing the gap between e-commerce and in-store experiences.
Technologies
- Python
- ELT pipelines
- AWS
- Snowflake
- NoSQL
- Kafka
What You'll Do
- Contributing to our data engineering roadmap;
- Collaborating with senior data engineers on data architecture plans;
- Collaborating with cross-functional teams to develop and implement robust, scalable solutions;
- Supporting the elicitation and development of technical requirements;
- Building, maintaining and improving data pipelines and self-service tooling to provide clean, efficient results;
- Developing automated tests and monitoring to ensure data quality and data pipeline reliability;
- Implementing best practices in data governance through documentation, observability and controls;
- Using version control and contributing to code reviews;
- Supporting the adoption of tools and best practices across the team;
- Mentoring junior colleagues where appropriate.
Job Requirements
- 5+ years of solid commercial experience in a Data Engineering role;
- Excellent production-grade Python skills;
- Previous experience with real-time data streaming platforms such as Kafka/Confluent/Google Cloud Pub/Sub;
- Experience handling and validating real-time data;
- Experience with stream processing frameworks such as Faust/Flink/Kafka Streams, or similar;
- Comfortable with database technologies such as Snowflake/PostgreSQL and NoSQL technologies such as Elasticsearch/MongoDB/Redis or similar;
- Proficient with ELT pipelines and the full data lifecycle, including managing data pipelines over time;
- Good communication skills and the ability to collaborate effectively with engineers, product managers and other internal stakeholders;
- English proficiency at Upper-Intermediate level or above.
Nice to Have
- Understanding of JavaScript/TypeScript;
- Understanding of Docker;
- Experience with Terraform;
- Experience with EKS/Kubernetes;
- Experience developing APIs.
What We Offer
The global benefits package includes:
- Technical and non-technical training for professional and personal growth;
- Internal conferences and meetups to learn from industry experts;
- Support and mentorship from an experienced colleague to help you grow professionally;
- Internal startup incubator;
- Health insurance;
- English courses;
- Sports activities to promote a healthy lifestyle;
- Flexible work options, including remote and hybrid opportunities;
- Referral program for bringing in new talent;
- Work anniversary program and additional vacation days.