M&G PLC

CFT - AVP Test Manager

Pune | Full time

We are M&G Global Services Private Limited (formerly known as 10FA India Private Limited, and prior to that Prudential Global Services Private Limited). We are a fully owned subsidiary of the M&G group of companies, operating as a Global Capability Centre providing a range of value-adding services to the Group since 2003.

Our purpose is to give everyone real confidence to put their money to work. With a heritage dating back more than 175 years, we have a long history of innovation in savings and investments, combining asset management and insurance expertise to offer a wide range of solutions. 

Our two distinct operating segments, Asset Management and Life, work together to provide access to balanced, long-term investment and savings solutions.

M&G Global Services has rapidly transformed itself into a powerhouse of capability that is playing an important role in M&G’s ambition to be the best loved and most successful savings and investments company in the world.

Our diversified service offerings, extending from Digital Services (Digital Engineering, AI, Advanced Analytics, RPA, and BI & Insights) to Business Transformation, Management Consulting & Strategy, Finance, Actuarial, Quants, Research, Information Technology, Customer Service, Risk & Compliance, and Audit, provide our people with exciting career growth opportunities. Through our behaviours of telling it like it is, owning it now, and moving it forward together with care and integrity, we are creating an exceptional place to work for exceptional talent.

Key Responsibilities:

  • To ensure testing at a programme level is efficiently and effectively delivered within time, quality and cost constraints
  • To engage with and manage a broad range of stakeholders to deliver testing programmes across CFT
  • To manage teams of testers, including permanent staff, contractors and third parties who support the planning and execution of testing activity
  • Lead, and where necessary perform, integration testing, API testing, and E2E acceptance testing within Scrum and Kanban delivery models
  • Support data validation across ETL pipelines, data warehouses, and reporting systems
  • Execute SQL queries to verify data completeness, accuracy, consistency, and transformation logic
  • Provide the test strategy/test plan and meet agreed deadlines for testing activities, including production of test scripts and test cases
  • Implement an integrated Quality Approach in a DevOps / legacy ecosystem 
  • Take ownership of Test Automation, building automated tests to support our continuous deployment environment 
  • Take ownership of quality checks for production deployments of new changes, enhancements and bug fixes
  • Compile and publish accurate and timely information to support decision making, prioritisation and status assessment
  • Continuously analyse and interpret testing MI, highlighting anomalies and identifying patterns requiring remediation
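As an illustration of the SQL-based validation work described above, the following is a minimal sketch of a completeness and accuracy reconciliation between a source and target table after an ETL load. Table and column names (`src_transactions`, `tgt_transactions`, `amount`) are hypothetical, and SQLite stands in for whatever warehouse platform is in use.

```python
import sqlite3

# Hypothetical example: reconcile row counts and a summed measure between a
# source staging table and the target warehouse table after an ETL load.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_transactions (id INTEGER, amount REAL);
    CREATE TABLE tgt_transactions (id INTEGER, amount REAL);
    INSERT INTO src_transactions VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_transactions VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

# Completeness: every source row should have landed in the target.
src_count = cur.execute("SELECT COUNT(*) FROM src_transactions").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_transactions").fetchone()[0]

# Accuracy: an aggregate measure should reconcile within a small tolerance.
src_sum = cur.execute("SELECT SUM(amount) FROM src_transactions").fetchone()[0]
tgt_sum = cur.execute("SELECT SUM(amount) FROM tgt_transactions").fetchone()[0]

assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"
assert abs(src_sum - tgt_sum) < 0.01, f"Amount mismatch: {src_sum} vs {tgt_sum}"
print("reconciliation passed")
```

In practice such checks would also cover key-level matching and transformation-rule validation, not just aggregates.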

Additional Responsibilities:

  • Define and execute the enterprise-wide ETL & Data Quality testing strategy aligned to data and finance transformation goals.
  • Standardize and horizontalize reusable ETL test frameworks, SQL libraries, reconciliation accelerators, and DQ rule repositories.
  • Enable continuous data validation by integrating DQ checks and automated tests into CI/CD and orchestration pipelines.
  • Ensure governance, compliance, and audit readiness for all finance-related data transformations and reporting flows.
  • Manage and mature the Data Testing Community of Practice, uplifting skills, capability, and reusability across teams.
  • Partner with Data Engineering, Finance, Product Owners, and SMEs to ensure accurate validation of business‑critical data pipelines.
  • Lead the definition and tracking of KPIs covering data accuracy, DQ scores, defect leakage, pipeline stability, and test efficiency.
  • Build and promote metadata-driven, schema-aware, and lineage-based testing practices for modern data platforms.
  • Oversee test environment readiness, data refreshes, and synthetic test data creation for controlled and compliant testing.
  • Drive innovation by adopting AI/ML‑based validation techniques, anomaly detection, self‑healing data tests, and predictive DQ insights.
  • Guide teams through testing of structured, semi‑structured, and unstructured datasets across cloud and on-premises platforms.
  • Ensure effective stakeholder communication, aligning cross-functional teams and providing transparent reporting on test readiness and data quality.
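To make the "continuous data validation in CI/CD" responsibility concrete, here is a minimal sketch of the kind of reusable data-quality rules that could run as automated checks in a pipeline. The rule names, dataset shape, and thresholds are all hypothetical; real implementations would typically use a DQ framework rather than hand-rolled functions.

```python
# Hypothetical sketch: simple, reusable data-quality rules of the kind that
# could be wired into a CI/CD or orchestration pipeline as automated gates.
def null_rate(rows, column):
    """Fraction of rows where the given column is None."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

def is_unique(rows, column):
    """True if the column's non-null values are unique (e.g. a business key)."""
    values = [r[column] for r in rows if r.get(column) is not None]
    return len(values) == len(set(values))

# Example batch with one missing account_id.
batch = [
    {"account_id": "A1", "balance": 120.0},
    {"account_id": "A2", "balance": 80.5},
    {"account_id": None, "balance": 10.0},
]

# Named checks with illustrative thresholds; a failing check would fail the
# pipeline stage in a CI/CD setup.
checks = {
    "account_id null rate below 50%": null_rate(batch, "account_id") < 0.5,
    "account_id unique": is_unique(batch, "account_id"),
}
failed = [name for name, passed in checks.items() if not passed]
if failed:
    print("FAILED:", failed)
else:
    print("all data-quality checks passed")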

Key Stakeholder Management

Internal

  • UK Finance Accounting
  • UK Finance Actuarial
  • UK Corporate Functions technology

External

  • Software Suppliers
  • External Consultants

Knowledge, Skills, Experience & Educational Qualification

Knowledge & Skills:

  • Deep expertise in ETL, data warehousing, data lakes, data mesh, and large-scale data transformation ecosystems across cloud platforms such as Azure.
  • Advanced SQL proficiency with strong ability to validate complex data transformations, reconcile large datasets, and interpret financial data mappings.
  • Strong understanding of data modelling concepts including star schema, snowflake schema, dimensional modelling, and curated semantic layers.
  • Expertise in enterprise-grade data quality frameworks covering dimensions such as accuracy, completeness, consistency, timeliness, and validity.
  • Ability to implement metadata-driven, schema-aware, and lineage-based testing strategies across structured and unstructured data.
  • Hands-on experience with ETL and orchestration tools such as ADF, Databricks, Airflow, Informatica, Talend, dbt, or similar.
  • Strong knowledge of data governance, data controls, GDPR, and regulatory reporting requirements.
  • Ability to interpret, validate, and troubleshoot finance, investment, and regulatory datasets with strong domain understanding.
  • Familiarity with data observability concepts including freshness, volume, distribution, lineage, and anomaly detection.
  • Experience with cloud-native data platforms such as Databricks, Snowflake, Synapse, Redshift, or BigQuery.
  • Strong scripting capability in Python, PySpark, or Shell to build validation utilities or automation accelerators.
  • Ability to evaluate and integrate data quality, metadata, and lineage tools within enterprise ecosystems.
  • Experience creating horizontal testing assets such as SQL validation libraries, reconciliation engines, and reusable rule repositories.
  • Knowledge of AI and ML techniques applied to quality engineering, including anomaly detection, data drift monitoring, and predictive scoring.
  • Understanding of DevOps, CI/CD pipelines, and continuous data validation practices.
  • Strong leadership abilities to mentor teams, run a Data QA Community of Practice, and drive capability uplift.
  • Excellent stakeholder management skills with the ability to engage senior leaders and cross-functional teams.
  • Strong analytical and problem-solving skills with the ability to navigate complex data landscapes.
  • High attention to detail, strong documentation skills, and commitment to quality, accuracy, and governance.

Experience:

  • 13–15 years of total experience in Quality Engineering, with 8–10 years specialising in Data / ETL / DWH / Data Quality testing in enterprise environments.
  • Proven experience leading large‑scale data transformation programs across Finance, Investments, Risk, Operations, or enterprise data platforms.
  • Strong background in validating data lakes, data warehouses, ETL pipelines, data mesh architectures, and cloud-native data systems (Azure/AWS/GCP).
  • Hands‑on expertise in complex SQL, transformation logic validation, data reconciliation, and large‑volume dataset verification.
  • Experience managing teams of Data Test Engineers, including multi‑vendor and multi‑location delivery teams.
  • Demonstrated capability in building and scaling Data Quality Frameworks, metadata-driven validation, and reusable accelerators for ETL testing.
  • Proven track record establishing test governance, audit compliance, lineage-based controls, and regulatory‑ready test evidence.
  • Experience evaluating, selecting, and integrating testing tools for DQ, ETL automation, orchestration validation, schema evolution, lineage verification, and data observability.
  • Exposure to AI/ML‑enabled data quality techniques such as anomaly detection, drift monitoring, and predictive DQ scoring.
  • Experience collaborating with senior stakeholders such as Finance Data Owners, Product Owners, CDO Office, Data Engineering Leads, Enterprise Architects, and Transformation Heads.
  • Experience in modern engineering environments with Agile, DevOps, CI/CD, and continuous data validation pipelines.
  • Strong track record in driving horizontalization, standardization, and CoE capability uplift.
  • Experience working in global organizations, preferably GCC, BFSI, Asset Management, or enterprise-scale data-driven environments.

Educational Qualification:

  • Graduate in any discipline, preferably Computer Science or a related field

We have a diverse workforce and an inclusive culture at M&G Global Services. Regardless of gender, ethnicity, age, sexual orientation, nationality, disability or long-term condition, we are looking to attract, promote and retain exceptional people. We also welcome those who take part in military service and those returning from career breaks.