Capco Poland

GCP Data Engineer

India - Pune | Full Time

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry, on projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

 

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.




In this role, you will:

 

· Provide technical leadership for a team of engineers focused on development, deployment and operations

· Lead and contribute to multiple pods with moderate resource requirements, risk, and/or complexity

· Interface technically with a range of stakeholders on work that has customer and business impact

· Lead others to solve complex problems

· Work within an agile, multidisciplinary DevOps team

· Migrate and re-engineer existing services from on-premises data centers to Cloud (GCP/AWS)

· Perform system development work as an ETL engineer, drawing on a background in various data models and data warehousing concepts.

· Write, analyse, review, and rewrite programs to departmental and Group standards

· Understand business requirements and provide real-time solutions

· Use project development tools such as JIRA, Confluence and Git

· Build and maintain operations tools for monitoring, notifications, trending, and analysis.

· Enhance programs to increase operating efficiency or adapt to new requirements

· Review code from team members (analysts/developers) as part of the quality assurance process

· Produce unit test plans with detailed expected results to fully exercise the code.

· Work closely with the solution architect, business analyst and technology lead to contribute to achieving the final outcome.

· Demonstrate continuous improvement in the adoption of engineering practices.

 

To be successful in this role, you should meet the following requirements:

 

· Mandatory Skills:

o Senior Data Engineer with Cloud (GCP) experience (8 to 12 years)

o Hands-on Google Cloud BigQuery scripting or Cloud SQL experience (a minimal illustrative sketch follows this list)

o SQL and PL/SQL scripting experience (high-level expertise and hands-on work)

o Shell scripting or Python skills (high-level expertise and hands-on work)

o Experience with at least one RDBMS

o Leadership experience.
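The snippet below is a minimal, hypothetical sketch of the kind of hands-on BigQuery scripting from Python this role involves. The project, dataset, table and column names are placeholders rather than details from this posting, and it assumes the google-cloud-bigquery client library.

import datetime

from google.cloud import bigquery


def daily_transaction_totals(project_id: str, dataset: str) -> None:
    """Run a parameterised aggregation query and print one row per day."""
    client = bigquery.Client(project=project_id)

    # Hypothetical table and columns, used only to illustrate parameterised BigQuery SQL.
    query = f"""
        SELECT DATE(created_at) AS txn_date, SUM(amount) AS total_amount
        FROM `{project_id}.{dataset}.transactions`
        WHERE DATE(created_at) >= @start_date
        GROUP BY txn_date
        ORDER BY txn_date
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start_date", "DATE", datetime.date(2024, 1, 1)),
        ]
    )

    for row in client.query(query, job_config=job_config).result():
        print(row.txn_date, row.total_amount)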

 

· Good to Have:

o A GCP Data Engineer certification is an added advantage.

o Experience with or knowledge of GCP components such as GCS, BigQuery, Airflow, Cloud SQL and the Google Cloud SDK

o Knowledge of a scheduling or ETL tool, preferably Control-M, UC4/Automic or Airflow (Cloud Composer); see the DAG sketch after this list

o Awareness of the Juniper ingestion process is preferred

o Experience working within an agile, multidisciplinary DevOps team.

o Change, incident and problem management
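
For the Airflow / Cloud Composer item above, the following is a hedged, minimal sketch of what a daily scheduled DAG looks like; the DAG id, schedule and task are illustrative assumptions only, not requirements taken from this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal illustrative DAG: one daily task standing in for a real load step.
with DAG(
    dag_id="daily_load_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_to_bigquery = BashOperator(
        task_id="load_to_bigquery",
        bash_command="echo 'a bq load or SQL step would run here'",
    )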