PwC

IN - Senior Associate - GCP Dataplex - D&A Advisory - Bangalore

Bengaluru Millenia | Full time

Line of Service

Advisory

Industry/Sector

Not Applicable

Specialism

Data, Analytics & AI

Management Level

Senior Associate

Job Description & Summary

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth.

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations in order to keep pace with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:

• Design, develop, and implement data engineering solutions on GCP aligned with data governance and data mesh principles.
• Architect and manage secure, scalable data pipelines using Dataflow, BigQuery, Pub/Sub, Dataproc, Dataplex, Data Catalog, Cloud Composer, and Cloud Storage.
• Develop ETL/ELT pipelines using DBT and work with databases such as Snowflake.
• Implement data mesh architecture (domain-oriented ownership, data-as-a-product, self-serve infrastructure, federated governance).
• Define and enforce data standards, lineage, metadata management, and data quality frameworks with governance teams.
• Establish data governance policies including data classification, IAM access controls, cataloging, auditing, discoverability, traceability, and compliance.
• Implement automation and CI/CD pipelines for data workflows using Terraform or Deployment Manager (IaC).
• Translate business requirements into scalable technical solutions aligned with governance and mesh frameworks.

Mandatory skill sets:

• GCP Data Services: BigQuery, Dataflow, Pub/Sub, Dataproc, Dataplex, Cloud Storage, Data Catalog, Cloud Composer
• Data Engineering & Data Architecture
• Data Governance Frameworks: lineage, metadata management, data quality, security controls
• Data Mesh architecture and domain-driven design
• ETL/ELT development (DBT) and databases such as Snowflake
• Programming: SQL, Python and/or Java
• Orchestration: Apache Airflow / Cloud Composer
• Infrastructure as Code: Terraform / Deployment Manager
• DevOps & CI/CD practices for data workflows

Preferred skill sets:

• Experience with data catalog and metadata management tools beyond GCP-native tools (e.g., Apache Atlas, Amundsen).
• Knowledge of data privacy frameworks such as GDPR and CCPA.
• Familiarity with containerization (Docker, Kubernetes) for data infrastructure.

Years of experience required:

7-11 years

Education Qualification

B.Tech / M.Tech / MBA / MCA

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering

Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills

Cloud Storage, Dataproc

Optional Skills

Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Desired Languages (If blank, desired languages not specified)

Travel Requirements

Not Specified

Available for Work Visa Sponsorship?

No

Government Clearance Required?

No

Job Posting End Date