Job Title: Senior Data Engineer
Location: Pune, India
Corporate Title: AVP
Role Description
We are looking for a Senior Data Engineer to lead the design, development, and optimization of our data infrastructure across Oracle, PostgreSQL, and Google Cloud Platform (GCP). In this role, you'll build scalable data pipelines, mentor junior engineers, and collaborate cross-functionally to drive data-driven decision-making across the organization. The ideal candidate brings deep data engineering expertise across Oracle, PostgreSQL, and GCP.
What we’ll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best-in-class leave policy
Gender-neutral parental leave
100% reimbursement under the childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for employees aged 35 and above
Your key responsibilities
Strong experience with PostgreSQL, including query optimization, vacuum tuning, logical and physical replication, partitioning and indexing strategies, and connection pooling (e.g., PgBouncer); see the first sketch after this list.
Strong hands-on experience in Oracle with SQL, PL/SQL, performance optimization, data security, encryption, and auditing; see the Oracle sketch after this list.
Experience with BigQuery, including advanced SQL, schema design (partitioning, clustering), performance tuning, cost optimization, and features like BigQuery ML and external tables; see the BigQuery sketch after this list.
Proficiency in Python for data processing, scripting, and development of custom Dataflow pipelines, Cloud Functions, and Airflow DAGs; a minimal DAG sketch follows this list.
Experience with change data capture (CDC) mechanisms for efficient incremental data loading; a watermark-based sketch follows this list.
Design robust, scalable data models and data warehouse structures to support analytical and operational use cases
Optimize data pipeline performance and cost-efficiency across GCP services (BigQuery, Dataflow, Cloud SQL) through diligent monitoring, tuning, and resource management.
Collaborate with stakeholders to translate business needs into technical solutions
Champion best practices in CI/CD, infrastructure as code (Terraform), and observability
Mentor junior data engineers and conduct technical reviews
Stay current with emerging GCP features and tools to keep the stack modern and efficient
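To illustrate the PostgreSQL work above, here is a minimal sketch of declarative range partitioning with a partition-local index, verified with EXPLAIN ANALYZE via psycopg2. The events table, connection string, and date ranges are illustrative assumptions, not a prescribed design:

```python
# Minimal sketch: range partitioning plus a partition-local index.
# The table, connection string, and ranges are assumptions for illustration.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS events (
    event_id   bigint      NOT NULL,
    event_time timestamptz NOT NULL,
    payload    jsonb
) PARTITION BY RANGE (event_time);

CREATE TABLE IF NOT EXISTS events_2024_q1
    PARTITION OF events
    FOR VALUES FROM ('2024-01-01') TO ('2024-04-01');

CREATE INDEX IF NOT EXISTS idx_events_2024_q1_time
    ON events_2024_q1 (event_time);
"""

with psycopg2.connect("dbname=analytics user=etl") as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
        # Confirm that partition pruning limits the scan to one partition
        cur.execute(
            "EXPLAIN ANALYZE SELECT count(*) FROM events "
            "WHERE event_time >= '2024-02-01' AND event_time < '2024-03-01'"
        )
        for (line,) in cur.fetchall():
            print(line)
```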
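For the Oracle side, a minimal sketch of bind-variable usage with the python-oracledb driver; bind variables let the database reuse one parsed plan instead of hard-parsing every literal. The credentials, service name, and orders table are hypothetical:

```python
# Minimal sketch: bind variables in Oracle via python-oracledb.
# Credentials, DSN, and the orders table are illustrative assumptions.
import os

import oracledb

conn = oracledb.connect(
    user="etl",
    password=os.environ["ORACLE_PWD"],  # never hard-code credentials
    dsn="dbhost/ORCLPDB1",              # hypothetical service name
)
with conn.cursor() as cur:
    # :cust_id is a bind variable, so one cached plan serves every value
    cur.execute(
        "SELECT order_id, total FROM orders WHERE customer_id = :cust_id",
        cust_id=42,
    )
    for order_id, total in cur:
        print(order_id, total)
conn.close()
```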
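A minimal sketch of BigQuery schema design with day partitioning and clustering via the google-cloud-bigquery client; the project, dataset, and columns are assumptions. Partitioning bounds the bytes scanned per query (and therefore cost), while clustering sorts data on common filter keys:

```python
# Minimal sketch: a day-partitioned, clustered BigQuery table.
# Project, dataset, table, and column names are illustrative assumptions.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

schema = [
    bigquery.SchemaField("event_date", "DATE", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.transactions", schema=schema)
# Partition on the date column so queries scan only the days they touch
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
# Cluster on the most common filter/join key
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```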
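A minimal Airflow DAG sketch (Airflow 2.4+ syntax) of the kind of orchestration this role owns; the task bodies and names are placeholders rather than an actual pipeline:

```python
# Minimal sketch: a daily extract -> load DAG (Airflow 2.4+).
# Task names and bodies are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull the previous day's rows from the source system
    print("extracting for", context["ds"])


def load(**context):
    # Placeholder: write the extracted batch into the warehouse
    print("loading for", context["ds"])


with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```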
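Finally, a minimal sketch of watermark-based incremental loading, one of the simpler CDC mechanisms (log-based tools such as Debezium or GCP Datastream are heavier-weight alternatives). The tables, columns, and connection strings are assumptions:

```python
# Minimal sketch: watermark-based incremental loading (a simple CDC form).
# Tables, columns, and connection strings are illustrative assumptions;
# the upsert relies on customers.id being a primary or unique key.
import psycopg2

SOURCE_DSN = "dbname=source user=etl"
TARGET_DSN = "dbname=warehouse user=etl"


def incremental_load(last_watermark):
    """Copy only rows changed since the previous run's watermark."""
    with psycopg2.connect(SOURCE_DSN) as src, src.cursor() as cur:
        cur.execute(
            "SELECT id, name, updated_at FROM customers "
            "WHERE updated_at > %s ORDER BY updated_at",
            (last_watermark,),
        )
        rows = cur.fetchall()
    if not rows:
        return last_watermark  # nothing changed; keep the old watermark

    with psycopg2.connect(TARGET_DSN) as tgt, tgt.cursor() as cur:
        cur.executemany(
            "INSERT INTO customers (id, name, updated_at) "
            "VALUES (%s, %s, %s) "
            "ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name, "
            "updated_at = EXCLUDED.updated_at",
            rows,
        )
    return rows[-1][2]  # max updated_at in this batch becomes the watermark
```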
Your skills and experience
6+ years of professional experience in data engineering
3+ years of hands-on experience with the GCP data ecosystem
Advanced proficiency in SQL and programming languages like Python or Java
Strong understanding of ETL/ELT patterns, data warehousing, and streaming data architecture
Experience with workflow orchestration tools (e.g., Airflow / Cloud Composer)
Proven experience in schema design, performance tuning, and big data optimization
Familiarity with DevOps practices, including Docker, CI/CD pipelines, monitoring, and IaC (Terraform)
Strong stakeholder management skills and the ability to communicate at a senior level
Proven experience delivering results in matrixed organizations under pressure and to tight timescales
Excellent verbal, interpersonal, and written communication skills
Bachelor's degree in Computer Science or a related field
How we’ll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs
About us and our teams
Please visit our company website for further information:
https://www.db.com/company/company.html
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.