Maxis

Data Engineer

Menara Maxis | Full time

Are you ready to get ahead in your career?

  • We want to empower you to turn your ambitions into achievements.
  • We thrive on inclusiveness and diversity, and embrace close collaboration so you can create impact for yourself and others.
  • Together, we aim to bring the best of technology to help people, businesses and the nation get ahead in a changing world.
  • To realise our vision to become Malaysia’s leading converged solutions company, we are looking for new talent to innovate and grow with us in a culture that values commitment, performance and possibilities.

Why does this job exist and why is it critical?

The Data Engineer works collaboratively with end users to develop robust data products and reporting systems that provide accessible information for decision-making. This role focuses on engineering efficient data pipelines and semantic layers within Google Cloud Platform (GCP) to power high-performance dashboards in Microsoft Fabric (Power BI).

The Data Engineer is responsible for the design, development, testing, implementation, and support of insightful BI solutions, utilizing BigQuery for data warehousing and Python for complex data manipulation. Together with broader teams, the Data Engineer will also continuously drive the adoption of Fabric/Power BI tools and self-service analytics in Maxis to empower data users with faster insights and informed decision-making.

What are you accountable for?

  • Requirements Engineering: Work with cross-functional internal stakeholders to gather and document dashboard requirements; understand, analyze, and translate business requirements into technical specifications for self-serve dashboards.

  • Data Engineering & Visualization: Design, develop, test, and implement reports and dashboards in Microsoft Fabric/Power BI that utilize curated datasets from GCP BigQuery.

  • Documentation & Transition: Transition developed pipelines and dashboards to the Operations & Support team with detailed briefings and documentation. Create and maintain user requirements documents, data lineage, and data dictionaries to ensure they remain accurate whenever there are enhancements or changes.

  • Data Integration & Scripting: Analyze large volumes of data from various sources (e.g., on-premises databases, cloud-based solutions) to define data requirements. Use advanced SQL and Python to perform data transformation, cleaning, and automation within BigQuery or Cloud Functions to prepare data for BI consumption.

  • UAT & Collaboration: Work closely with data stewards, data factory (ISD), business users, and vendors during user acceptance testing (UAT) and operational matters to ensure data pipelines and dashboards are delivered as per baseline specifications.

  • Performance & Support: Provide support and maintenance of dashboards to ensure availability and query performance. Build automated data quality checks (using SQL/Python) to constantly monitor data integrity and ensure accurate reporting.
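The automated data quality checks described above can be sketched in Python with pandas. This is a minimal illustration only; the dataset, column names (`customer_id`, `revenue`), and checks are hypothetical, not Maxis's actual pipeline:

```python
# Minimal sketch of automated data quality checks on a curated
# dataset, using pandas. Column names are hypothetical examples.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list:
    """Return a list of failed checks for a daily extract."""
    failures = []
    if df.empty:
        failures.append("dataset is empty")
    if df["customer_id"].isna().any():
        failures.append("null customer_id found")
    if df["customer_id"].duplicated().any():
        failures.append("duplicate customer_id found")
    if (df["revenue"] < 0).any():
        failures.append("negative revenue found")
    return failures

sample = pd.DataFrame(
    {"customer_id": [1, 2, 2], "revenue": [10.0, -5.0, 3.0]}
)
print(run_quality_checks(sample))
```

In practice such checks would run on a schedule (e.g., after each pipeline load) and alert the team when the returned list is non-empty.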

  • Adoption & Self-Service: Promote self-service and drive Microsoft Fabric/Power BI adoption to enable data exploration, analysis, and visualization, making analytics results available to business users for operational decision-making and strategic planning.

  • Demand Management: Assist in data service demand management, translating business requirements into specifications for the Big Data COE team. Coordinate user acceptance tests to ensure data requests are delivered as per target timelines.

What do you need to have to fit this role?

  • Education: Possess a Bachelor’s Degree in Computer Science, Information Technology, Data Engineering, or a related field.

  • Experience: 5 to 8 years of progressive experience in Data Engineering, BI Development, or Analytics roles.

  • Technical Skills (SQL & Python):

    • Proficient in advanced SQL (window functions, nested queries, stored procedures), specifically within Google BigQuery.

    • Competent in Python for data manipulation (pandas) and scripting automation tasks.
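The window-function skill listed above can be illustrated with a small example. This sketch uses Python's stdlib sqlite3 as a stand-in engine (BigQuery accepts the same `OVER()` syntax for basic window functions); the table and columns are invented for illustration:

```python
# Illustrative SQL window-function query, run via stdlib sqlite3
# as a stand-in for BigQuery. Table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE usage (account TEXT, month TEXT, gb REAL)")
conn.executemany(
    "INSERT INTO usage VALUES (?, ?, ?)",
    [("A", "2024-01", 10.0), ("A", "2024-02", 12.0),
     ("B", "2024-01", 5.0), ("B", "2024-02", 4.0)],
)

# Rank each account's months by data usage -- a typical
# partitioned window query.
rows = conn.execute(
    """
    SELECT account, month, gb,
           RANK() OVER (PARTITION BY account ORDER BY gb DESC) AS rnk
    FROM usage
    ORDER BY account, rnk
    """
).fetchall()
print(rows)
```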

  • Visualization Skills:

    • Strong knowledge in dashboard creation using Microsoft Fabric / Power BI (including DAX and Power Query).

    • Experience with other tools (Tableau, Looker) is an added advantage but not mandatory.

  • Cloud & Database Knowledge:

    • Strong understanding of GCP architecture, specifically BigQuery datasets, partitioning, and clustering for optimization.

    • Experience in data modeling (Star Schema, Snowflake Schema) to support analytical initiatives.

  • General Competencies:

    • Strong analytical and troubleshooting skills, particularly in debugging SQL queries and pipeline failures.

    • Demonstrated detailed knowledge of BI areas including ETL/ELT design, analytics, and reporting.

    • Ability to work cohesively within the Information Technology division and cross-functional divisions.

    • Telco industry experience would be an added advantage.

What’s next?

  • Once you’ve applied online, our team will carefully review your application. Due to a high volume of applications, we appreciate your patience to allow for a fair and timely review process.
  • Should you be shortlisted for the role, we will send you an invitation via email for a digital interview. You can also check on your application status by logging into your candidate account.

Maxis values diverse voices & people. We hire and reward our employees based on capability & performance — regardless of ethnicity, gender, age, education, religion, nationality or physical ability.