DentsuAegis

Data Engineer

DGS India - Pune - Kharadi EON Free Zone Full time

Job Description:

Business Title    

Data Engineer
Years of Experience    

Minimum 3 and maximum 7 years.
Job Description

Looking for a hands‑on Senior Data Engineer – AWS with experience developing, building, and maintaining scalable, secure, and high‑performance data platforms on AWS.
This is an individual contributor role focused on data pipeline development, cloud data engineering, and analytics enablement. The role requires strong hands‑on skills in AWS data services, SQL, and Python, along with experience building reliable batch and streaming data pipelines in a global delivery environment.
Must have skills    

Cloud & Data Engineering (AWS)
Strong hands‑on experience with AWS data services, including:
- Amazon S3
- AWS Glue
- Amazon Athena
- Amazon Redshift
- Amazon EMR

Experience designing cloud‑native data lakes and data warehouse architectures
Solid understanding of batch data pipelines and basic exposure to streaming concepts

SQL & Python (Mandatory)
Strong SQL skills (mandatory)
Writing complex queries, joins, aggregations, and transformations
Experience working with large datasets in Redshift / Athena

Strong Python skills (mandatory)
Python for data engineering and ETL use cases
Experience with PySpark / Spark is a strong plus

Good understanding of data modeling, transformations, and performance tuning

Data Processing & Engineering
Hands‑on experience with distributed data processing frameworks (Spark / PySpark)
Experience handling structured and semi‑structured data
Understanding of schema evolution, data quality checks, and validation logic

DevOps & Platform Basics

Working knowledge of Infrastructure as Code (Terraform and/or CloudFormation)
Basic experience with CI/CD pipelines for data workloads
Understanding of logging and monitoring using CloudWatch

Collaboration

Ability to work closely with architects, DevOps, QA, and business stakeholders
Good communication skills to explain technical concepts clearly
Good to have skills

Exposure to streaming technologies such as Amazon Kinesis or Kafka
Familiarity with Lakehouse and modern data platform patterns
Experience integrating AWS data platforms with BI / reporting tools
Basic knowledge of data governance, data quality, and metadata concepts
Awareness of AWS cost optimization best practices
Experience working in Agile delivery models, with global clients
Exposure to AI / ML
Key responsibilities

Data Engineering & Development
Design and build scalable ETL / ELT pipelines on AWS
Develop SQL‑based data transformations and Python‑based data pipelines
Implement data ingestion pipelines using AWS services such as S3, Glue, EMR
Build data models optimized for analytics, performance, and cost efficiency

Platform & Operations
Support deployment and execution of data pipelines across environments
Monitor pipeline performance, reliability, and data quality
Troubleshoot data pipeline issues and perform root‑cause analysis
Apply best practices for security, reliability, and scalability

Collaboration & Delivery
Work closely with architects and product teams to understand requirements
Translate business and analytics needs into working AWS data solutions
Contribute to documentation, code reviews, and engineering standards
Education Qualification

    1. Bachelor’s or Master’s degree, or equivalent
Certifications (if any)

1. AWS Certified Solutions Architect / DevOps – Professional
2. Snowflake Core
Shift timing

12 PM to 9 PM and/or 2 PM to 11 PM (IST)

Location:

DGS India - Pune - Kharadi EON Free Zone

Brand:

Merkle

Time Type:

Full time

Contract Type:

Permanent