* Design, build, and maintain efficient, scalable data architectures and pipelines.
* Process and transform large-scale datasets, ensuring data quality and integrity.
* Optimize ETL processes to enhance data processing efficiency and reliability.
* Collaborate with the data analytics team to provide the data support they need, ensuring timely and accurate delivery.
* Work with product and engineering teams to drive data-driven decision-making and product optimization.
* Design and implement data warehouse solutions, supporting both data lake and data warehouse architectures.
* Ensure data security, privacy, and compliance, adhering to best practices in data management.
* Bachelor’s degree or higher in Computer Science, Information Technology, Data Science, or a related field.
* Proficient in SQL, with the ability to handle complex queries and optimizations.
* Familiarity with big data technologies such as Spark, Hive, Flink, and Kafka.
* Proficient in programming languages such as Python or Java, with good coding practices.
* Experience with database design, data warehousing, and ETL/workflow orchestration tools (such as Airflow or Luigi).
* Familiarity with cloud platforms (AWS, Azure, GCP) and data lake table formats (Iceberg, Delta Lake).
* Strong communication and teamwork skills, with the ability to collaborate across functions and drive projects forward.
* Bonus: Experience in data quality, data governance, or related fields.
As an equal opportunity employer, we firmly believe that diverse voices fuel our innovation and allow us to better serve our users and the community. We foster an environment where every employee of Tencent feels supported and inspired to achieve individual and common goals.