Line of Service
Advisory

Industry/Sector
Not Applicable

Specialism
Data, Analytics & AI

Management Level
Manager

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals.

As the Staff Data Platform Engineer in the Cyber Data Platform team, you will be key in designing, developing, and deploying a scalable data platform that empowers users to create advanced analytics, machine learning, and GenAI solutions to enhance our security defences. You will lead architectural decisions, guide technical direction, and mentor the engineering team to deliver high-quality data services.
Responsibilities:
Technical Leadership: Take charge of the design, development, and delivery of cyber data platform solutions, and provide technical guidance and mentorship across all cyber data platform teams.
Data Architecture: Design, build, and manage real-time, near-real-time, and batch data architectures that support threat detection, incident response, and reporting through advanced analytics, machine learning, and GenAI capabilities.
Coding: Contribute to raising the quality bar of the team's codebase by producing high-quality code, conducting thorough peer reviews, and proactively providing constructive feedback to other team members on their code.
Data Security and Compliance: Implement and enforce data security best practices, ensuring the protection of sensitive information and compliance with relevant regulations.
Automation and DevOps: Implement automation and DevOps practices to streamline the deployment, configuration, and management of data platform components.
You will need:
Proven experience leading teams to deliver resilient systems and providing technical guidance for optimal product solutions.
Experience building data platforms on cloud services like Databricks on Azure or GCP BigQuery, with the goal of delivering a self-service data platform for users.
Proficient in Python and SQL, with a solid grasp of architectural patterns, coding standards, code reviews, version control, and CI/CD practices.
Expertise in ETL and ELT frameworks for large-scale real-time and batch data processing, with hands-on experience in Kafka, Flink, Airflow, and dbt, as well as containerisation technologies like Docker and Kubernetes.
Ability to provide clear input, guidance, and opportunities to help engineers develop and advance.
Knowledge of cybersecurity principles and practices.
Mandatory Skill sets:
Databricks on Azure or GCP BigQuery, Python, DevOps, CI/CD
Preferred skill sets:
Gen AI
Years of Experience Required:
6-9 years
Education Qualification
B.E., B.Tech., M.E., MCA, M.Tech.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)
Required Skills
Azure DevOps

Optional Skills
Accepting Feedback, Active Listening, Algorithm Development, Alteryx (Automation Platform), Analytical Thinking, Analytic Research, Big Data, Business Data Analytics, Coaching and Feedback, Communication, Complex Data Analysis, Conducting Research, Creativity, Customer Analysis, Customer Needs Analysis, Dashboard Creation, Data Analysis, Data Analysis Software, Data Collection, Data-Driven Insights, Data Integration, Data Integrity, Data Mining, Data Modeling {+ 43 more}

Desired Languages (If blank, desired languages not specified)
Travel Requirements
Not Specified

Available for Work Visa Sponsorship?
No

Government Clearance Required?
No

Job Posting End Date
April 7, 2026