Assurant

Data Engineer

Hyderabad · Full time

Data Engineer, Assurant, GCC-India

The Data Engineer reports to the Manager of Data Services and is responsible for developing data engineering solutions that conform to group standards, budgets, and agreed timelines. This role wrangles data in support of Data Science projects and constructs, tests, and maintains scalable data solutions for structured and unstructured data to support reporting, analytics, ML, and AI.

The Data Engineer assists with optimizing the performance of big data ecosystems while helping to research and build proofs of concept to test theories recommended by Senior and Lead Data Engineers.

This position will be based in Hyderabad at our India location.

Working Hours: 3:30 PM to 12:30 AM

What will be my duties and responsibilities in this job?

  • Gains a thorough understanding of the requirements and ensures that work products align with customer needs
  • Works within the established development guidelines, standards, methodologies, and naming conventions
  • Builds processes to ingest, process, and store massive amounts of data
  • Assists with optimizing the performance of big data ecosystems
  • Wrangles data in support of data science projects
  • Performs productionization of ML and statistical models for Data Scientists and Statisticians
  • Constructs, tests, and maintains scalable data solutions for structured and unstructured data to support reporting, analytics, ML, and AI
  • Assists with researching and building proofs of concept to test theories recommended by Senior and Lead Data Engineers
  • Collaborates and contributes to identifying project risks, designing mitigation plans, and developing estimates
  • Contributes to the design and development of data pipelines and to feature engineering for data solutions
  • Relies on established processes and methods, but will be required to devise creative solutions to problems
  • Receives oversight from senior teammates on solutions to complex problems
     

What are the requirements needed for this position?

  • Bachelor of Science in a related field required.
  • 5 years of design and development experience.
  • Experience with Azure Databricks, Azure Data Factory, and Azure Storage (ADLS).
  • Knowledge of the CRISP-DM methodology as it relates to Data Engineering, i.e., data preparation and deployment.
  • Knowledge of Big Data technologies, concepts, and their applications for data processing.
  • Advanced knowledge of Business Intelligence and Data Warehousing.
  • Knowledge of the fundamentals of Machine Learning and Artificial Intelligence using Microsoft technologies.
  • Performance tuning and code optimization in SQL.
  • Experience with data profiling, dimensional modeling techniques, and creation of logical and physical data models.
  • Experience working with job scheduling tools.
  • Experience with SQL, C#, .NET, Python, Linux shell scripting or PowerShell, R or SAS, Spark, Hive, Pig, and NoSQL.
  • Experience with cloud service models: PaaS, IaaS, and SaaS.