Berkadia Services India Pvt Ltd
-----------------------------------------------------------------------------------------------
Integrity is Everything | We Take the Long View | We Believe People Matter | We Stand for Excellence | We Love our Jobs | We Innovate
Job title: Senior Data Engineer
Department: Innovation Technology
Location: Bangalore
Terms: Full Time – Hybrid work model (subject to Berkadia policy changes)
Working Hours: Split Shift (3:00pm to 12:00am)
Reports To: Philip Valent
The Opportunity
As a member of the Data Pipeline team, you’ll be at the center of building the platform that enables us to redefine the Commercial Real Estate industry. We’re building a data pipeline that merges the tested design of data warehousing with the latest big data technologies. You will design and build data projects while securing core data elements.
In this role, the individual will work on one or more aspects of the Innovation verticals, with the following key responsibilities:
- Research new technologies and share knowledge with the team and peers
- Design, implement, and release data applications in AWS using big data technologies
- Coordinate development efforts across the organization
- Influence the team’s development standards and processes
Your Qualifications
Education:
- Bachelor’s degree in Information Technology, Computer Science, Engineering, Information Systems, or a closely related technology field; or equivalent practical experience.
- Postgraduate degree (e.g., MBA/PGDBM) preferred
Qualifications:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the experience required:
- 5+ years of professional experience in data engineering or a closely related field.
- 5–8 years of hands-on Apache Spark experience (PySpark preferred)
- Proven experience building and maintaining ETL/ELT pipelines using tools/frameworks such as PostgreSQL, Docker, Apache Airflow, AWS Step Functions, or similar.
- 1–2 years of experience building high-performance algorithms in scalable languages such as Scala, Python, or R
- Strong expertise in SQL (complex queries, performance tuning, analytical functions).
- Excellent interpersonal, verbal, and written communication skills
- Experience mentoring junior engineers and performing code reviews
- Strong logical, analytical, problem-solving, and reporting skills
- AWS Elastic MapReduce (EMR) experience
- Experience with modern data warehousing and/or lakehouse architecture.
- Familiarity with CI/CD, Git, and agile development practices.
- Experience managing large-scale products with a passion for quality and efficiency.