PwC

IN - Senior Associate | Azure DevOps | Data Analytics | Advisory | Bangalore

Bengaluru Millenia | Full time

Line of Service

Advisory

Industry/Sector

Not Applicable

Specialism

Data, Analytics & AI

Management Level

Senior Associate

Job Description & Summary

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth.

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

 

Responsibilities: 

 

  • The DevOps Lead Engineer is responsible for leading the end-to-end (E2E) setup of enterprise-scale platforms from the ground up, including GitHub Enterprise, JFrog Artifactory, Azure Kubernetes Service (AKS), Argo CD, and shared Azure networking, ensuring security, scalability, resiliency, and reliability.

  • Architect and implement enterprise-ready platform foundations, defining network topology, identity and access management, security controls, high availability, disaster recovery, and governance standards across all platforms.

  • Design, build, and operate AKS platform architectures, including cluster strategy, nodes, ingress, networking, secrets management, workload isolation, GitOps integration, and lifecycle management to support large-scale adoption.

  • Apply strong Linux administration and operations expertise, using advanced command-line tools to provision, configure, secure, and operate AKS nodes at scale.

  • Establish and configure GitOps-based deployment models using Argo CD, including application onboarding, environment promotion strategies, RBAC, multi-cluster management, and integration with GitHub and AKS.

  • Set up and govern source control and artifact management platforms (GitHub Enterprise and JFrog Artifactory), defining organization structures, access models, security scanning, retention policies, and CI/CD integrations.

  • Drive CI/CD and deployment strategies for containerized Kubernetes workloads, enabling blue-green and zero-downtime releases.

  • Automate infrastructure provisioning and operations using Terraform, PowerShell, Python, and Bash, ensuring consistency and operational efficiency.

  • Design observability for availability, performance, and reliability across platforms. 

  • Partner with global stakeholders to align platform capabilities with product needs, ensuring secure Azure environments using industry best practices. 

  • Create a healthy team culture aligned with the organization's mission and values: Integrity, Compassion, Dignity, Respect, Justice.
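The blue-green, zero-downtime release model named in the responsibilities above can be sketched in a few lines of Python. This is a hypothetical illustration of the decision rule only, not PwC tooling or any specific Azure API: two environments exist, one is live, and traffic is cut over to the idle environment only after it passes health checks.

```python
# Minimal sketch of a blue-green cutover rule (hypothetical illustration).
# "active" is the environment currently receiving traffic; "health" maps
# each environment name to whether its health checks passed.

def cutover(active: str, health: dict) -> str:
    """Return the environment that should receive traffic next."""
    idle = "green" if active == "blue" else "blue"
    # Zero-downtime rule: never switch traffic to an unhealthy environment;
    # if the new (idle) deployment is unhealthy, stay on the current one.
    return idle if health.get(idle, False) else active

print(cutover("blue", {"blue": True, "green": True}))   # new version healthy -> "green"
print(cutover("blue", {"blue": True, "green": False}))  # failed checks -> stay on "blue"
```

The same rule gives rollback for free: flipping traffic back to the previous environment is just another cutover call.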

Required Skills:

  • 5+ years of experience in Microsoft Azure-based services and DevOps tools.

  • Demonstrated problem-solving and customer service skills.

  • Strong technical problem-solving and analytical skills to define architecture and set up enterprise-grade platform foundations such as AKS, GitHub, JFrog, and Argo CD.

  • Expert in designing, building, and operating AKS clusters from scratch, including node pools, ingress, services, autoscaling, upgrades, security, and reliability.

  • Deep hands-on experience in AKS networking and in integrating Kubernetes with Azure services, JFrog Artifactory, GitOps, Snowflake, SQL, Databricks, and ADF.

  • Experience in Azure administration across services such as AKS, Azure SQL, Azure VMs, and ADF, including network configuration, deployment best practices, automation using an infrastructure-as-code (IaC) approach, and troubleshooting production issues.

  • Good experience in setting up blue-green deployment strategies on Kubernetes.

  • Optimize cluster utilization, runner capacity, and artifact storage for cost and performance.

  • Expertise in GitHub Enterprise setup, governance, security policies, and self-hosted GitHub Actions runners at scale; set up and integrate JFrog Artifactory for CI/CD.

  • Set up Argo CD with GitOps, environment promotion, and CI/CD deployments for AKS.

  • Ability to drive E2E setup and configuration independently with minimal supervision.

  • Strong Linux administration and operations expertise to build and operate AKS nodes.

  • Experience in scripting (Terraform, PowerShell, Python, Bash) for automation.

  • Good understanding of Azure concepts including Networking, Routing, Load Balancing, Disaster Recovery, Firewalls, ADF Pipelines, Databricks, AKS, Docker, etc. 

  • Experience in cloud technologies and tools, service models, and deployment models.

  • Experience in CDW (Snowflake) deployments, DataOps, and MLOps is a plus.
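The environment-promotion model that the GitOps skills above refer to can be sketched as an ordered-stage rule. This is a hypothetical Python illustration of the gating concept only, not Argo CD's actual API: a release advances one stage at a time, in order, and never skips ahead.

```python
# Hypothetical sketch of a GitOps environment-promotion gate.
# STAGES is an assumed dev -> test -> prod pipeline for illustration.

STAGES = ["dev", "test", "prod"]

def next_stage(current: str):
    """Return the next promotion target, or None if already in prod."""
    i = STAGES.index(current)
    return STAGES[i + 1] if i + 1 < len(STAGES) else None

def can_promote(current: str, target: str) -> bool:
    """A promotion is valid only to the immediately following stage."""
    return next_stage(current) == target

print(can_promote("dev", "test"))   # True: next stage in order
print(can_promote("dev", "prod"))   # False: skipping test is not allowed
```

In an actual GitOps setup the "promotion" would typically be a Git change (e.g., updating an image tag in the target environment's manifests) that the CD tool then reconciles; the rule above only captures the ordering constraint.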

 

 

Mandatory skill sets: 

 

  • Experience in Microsoft Azure-based services and DevOps tools.

  • Strong technical problem-solving and analytical skills to define architecture and set up enterprise-grade platform foundations such as AKS, GitHub, JFrog, and Argo CD.

  • Expert in designing, building, and operating AKS clusters from scratch, including node pools, ingress, services, autoscaling, upgrades, security, and reliability.

  • Deep hands-on experience in AKS networking and in integrating Kubernetes with Azure services, JFrog Artifactory, GitOps, Snowflake, SQL, Databricks, and ADF.


 

 

Preferred skill sets:

 

Good understanding of Azure concepts, including networking, routing, load balancing, and disaster recovery.

Years of experience required: 

 

Experience: 5-10 Years 
 

Education qualification: 

 

B.Tech / M.Tech / MCA / M.E.

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required: Bachelor of Engineering

Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills

Azure DevOps

Optional Skills

Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Desired Languages (If blank, desired languages not specified)

Travel Requirements

Available for Work Visa Sponsorship?

Government Clearance Required?

Job Posting End Date

May 20, 2026