Target

Sr Engineer - Ad Tech

Bangalore, India - Full time

About us:

Working at Target means helping all families discover the joy of everyday life. We bring that vision to life through our values and culture. Learn more about Target here.


The Data Engineer plays a key role in delivering scalable, reliable, and high-quality data solutions that empower business teams with timely, accurate, and actionable insights. This role combines hands-on engineering expertise with technical leadership across a global data engineering team.

  • The Data Engineer is responsible for designing, developing, and maintaining robust data pipelines, distributed systems, and large-scale data platforms using modern Big Data technologies. The role involves building and optimizing foundational datasets that support reporting, analytics, and advanced data use cases across business units and internal stakeholders.

  • Key responsibilities include architecting and implementing efficient data workflows, writing high-quality production-grade code, configuring and integrating data platforms, debugging complex systems, and ensuring performance, scalability, and reliability of data solutions.

  • This role requires close collaboration with cross-functional teams, including business stakeholders, analysts, and technical partners, to translate requirements into well-designed data products. The ideal candidate is a highly motivated data engineering professional who thrives in a collaborative environment, contributes hands-on to development efforts, and supports the growth and mentoring of junior and lead engineers within a growing global team.


As a Senior Data Engineer at Roundel, you’ll take the lead as you…

  •    Assess client needs and convert business requirements into a business intelligence (BI) solutions roadmap for complex issues spanning long-term or multiple work streams.

  •     Analyze technical issues and questions, identifying data needs and delivery mechanisms

  •    Implement data structures using best practices in data modeling, ETL/ELT processes, Spark, Scala, SQL, and database and OLAP technologies

  •     Build data caches and performant APIs following internal standards and best practices.

  •     Embrace the DevOps mentality (CI/CD) by building solutions designed for availability and scalability in an iterative manner

  •     Manage the overall development cycle, driving best practices and ensuring development of high-quality code for common assets and framework components

  •     Provide technical guidance and contribute heavily to a team of high-caliber Data Engineers by developing test-driven solutions and BI applications that can be deployed quickly and in an automated fashion.

  •     Manage and execute against agile plans and set deadlines based on client, business, and technical requirements

  •    Drive resolution of technology roadblocks including code, infrastructure, build, deployment, and operations

  •    Advocate for technologies, frameworks, design patterns, processes and guiding values of Data Engineering

  •    Ensure all code adheres to development & security standards


About you:

   • 4-year degree or equivalent experience
   • 5+ years of software development experience, preferably in data engineering/Hadoop development (Hive, Pig, Sqoop, Spark, etc.)
   • Hands-on experience in object-oriented or functional programming languages such as Scala or Python
   • Knowledge of or experience with a variety of database technologies (HBase, Postgres, Teradata, Cassandra, SQL Server, Oracle)
   • Knowledge of data integration design using APIs and streaming technologies (e.g., Kafka), as well as ETL and other data integration patterns
   • Knowledge of or experience in designing data models that ensure scalability and performance of data usage
   • Experience with CI/CD toolchain (Drone, Jenkins, Vela, Kubernetes) a plus
   • Maintains technical knowledge within areas of expertise
   • Constant learner and team player who enjoys solving technical challenges with a global team
   • Experience working with Druid, Postgres, or related technologies
   • Hands-on experience building complex data pipelines and optimizing data flows
   • Ability to understand data, draw insights, make recommendations, and identify data quality issues upfront
   • Experience with test-driven development and software test automation
   • Follows best coding practices and engineering guidelines as prescribed
   • Strong written and verbal communication skills, with the ability to present complex technical information clearly and concisely to a variety of audiences
   • Stays current with new and evolving technologies through formal training and self-directed education