PwC

IN-Manager_Data Architect_Data and Analytics_Advisory_Bangalore

Bengaluru Millenia | Full time

Line of Service

Advisory

Industry/Sector

Not Applicable

Specialism

Data, Analytics & AI

Management Level

Manager

Job Description & Summary

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth.

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary:  

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. 

Responsibilities: 

 

About the Role: 

 

We are hiring a sharp, hands-on Data Architect to lead the design and implementation of scalable, high-performance data solutions across both traditional and cloud-based data platforms. This role demands deep expertise in PySpark, SQL, Python, and data modelling, along with a strong understanding of cloud platforms and modern data engineering practices.

 

What you will do:  

  • Architect, design, and implement end-to-end data solutions, ensuring scalability, performance, and cost-efficiency. 

  • Build and deploy batch and near-real-time use cases in cloud environments. 

  • Develop PySpark and Python scripts for large-scale data processing and ETL workflows. 

  • Write optimized, complex SQL for data transformation and analysis. 

  • Optimize existing PySpark and SQL scripts over large-scale datasets (TBs) with a focus on performance and cost-efficiency. 

  • Create and maintain data models, ensuring data quality and consistency. 

  • Leverage AI/ML models in data transformations and analytics. 

  • Implement data governance and security best practices in cloud environments. 

  • Collaborate across teams to translate business requirements into robust technical solutions. 

Mandatory skill sets: 

 

‘Must have’ Primary skills and experiences  

  • 7+ years of hands-on experience in data engineering. 

  • Strong command over SQL, Python, and PySpark for data manipulation and analysis  

  • Deep experience with data, analytics, and warehousing solutions and their implementation in cloud environments (Azure/AWS). 

  • Proficiency in data modeling techniques for cloud-based systems (Databricks, Snowflake). 

  • Solid understanding of ETL/ELT processes and best practices in cloud architectures 

  • Experience with dimensional modeling, star schemas, and data mart design  

  • Performance optimization techniques for cloud-based data warehouses  

  • Strong analytical thinking and problem-solving skills 

 

 

Secondary Skills: 

  • Airflow (Workflow Design and Orchestration) 

  • Apache Kafka (real-time streaming) 

  • CI/CD (Automation, GitOps, DevOps for Data) 

  • Understanding of warehousing tools like Teradata, Netezza, etc. 

 

Preferred skill sets: 

 

‘Good to have’ knowledge, skills and experiences  

  • Familiarity with data lake architectures and delta lake concepts  

  • Data Warehouse experience using Databricks/Snowflake 

  • Knowledge of data warehouse migration strategies to cloud  

  • Experience with real-time data streaming technologies (e.g., Apache Kafka, Azure Event Hubs). 

  • Exposure to data quality and data governance tools and methodologies  

  • Understanding of  

  • Certifications in Azure or AWS or Databricks 

Years of experience required: 

 

Experience  

  • 7–10 years 

Certifications 

  • Spark Certified 

  • Databricks DE Associate/Professional Certified 

 

Good to Have: 

  • Snowflake SnowPro Core Certified 

Education qualification: 

  • BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above) 

 

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering

Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills

Structured Query Language (SQL)

Optional Skills

Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling

Desired Languages (If blank, desired languages not specified)

Travel Requirements

Not Specified

Available for Work Visa Sponsorship?

No

Government Clearance Required?

No

Job Posting End Date