Bottomline Technologies

Data Engineer

India – Full Time

Why Choose Bottomline?

Are you ready to transform the way businesses pay and get paid? Bottomline is a global leader in business payments and cash management, with over 35 years of experience and moving more than $16 trillion in payments annually. We're looking for passionate individuals to join our team and help drive impactful results for our customers. If you're dedicated to delighting customers and promoting growth and innovation - we want you on our team!

Position Summary:

Bottomline is looking for a Data Engineer to grow with us.

The Data Engineer is responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, and analysis. They play a crucial role in building, managing, and testing (QA) the data pipelines that enable efficient and reliable data integration, transformation, and delivery for all data users across the enterprise. They will also bring innovation and add value to the organisation by implementing AI-driven solutions across Data Engineering projects.

The Data Engineer will implement data flows that make data from systems of record and operational data stores available in the Enterprise Data Warehouse. They will get to work with best-in-class cloud technologies (Snowflake, Fivetran, AWS, Azure, Salesforce, Airflow, etc.) and on AI projects within Data Engineering.

Along with making data available in the Enterprise Data Warehouse, the Data Engineer will work with Data Analysts to implement data models and calculate key business KPIs for reporting and analytics across the wider business.

The Data Engineer will have the opportunity to learn and develop their skills by working on assignments as part of a Scrum Team. They should be delivery focused and driven to problem solve.

 

How you’ll contribute:

  • Design and develop data pipelines that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems. 
  • Collaborate with data scientists and analysts to optimize models and algorithms for data quality, security, and governance. 
  • Integrate data from different sources, including databases, data warehouses, APIs, and external systems. 
  • Ensure data consistency and integrity during the integration process, performing data validation and cleaning as needed. 
  • Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques. 
  • Optimize data pipelines and data processing workflows for performance, scalability, and efficiency. Propose solutions that utilise AI tools such as GitHub Copilot to improve efficiency and scalability.
  • Monitor and tune data systems, identify and resolve performance bottlenecks, and implement caching and indexing strategies to enhance query performance.
  • Implement data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data. Utilise AI tools such as GitHub Copilot to automate querying data from plain-English user inputs when assessing accuracy, consistency, and completeness.
  • Carry out Data QA to validate pipelines and post-integration data, supporting the QA team.
  • Take authority, responsibility, and accountability for exploiting the value of enterprise information assets and of the analytics used to render insights for decision making, automated decisions, and augmentation of human performance.
  • Collaborate with leaders to establish the vision for managing data as a business asset. 
  • Establish the governance of data and algorithms used for analysis, analytical applications, and automated decision making. 
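The pipeline and data quality responsibilities above can be sketched in miniature. The example below is an illustrative, self-contained extract-transform-load flow with a completeness check; sqlite3 stands in for a warehouse such as Snowflake, and all table, column, and function names are hypothetical.

```python
# Minimal ETL sketch with a data quality (completeness) check.
# sqlite3 is a stand-in for a warehouse; all names are hypothetical.
import sqlite3

def extract():
    # Stand-in for pulling rows from a system of record or a REST API.
    return [
        {"invoice_id": "INV-1", "amount": "120.50", "currency": "usd"},
        {"invoice_id": "INV-2", "amount": "80.00", "currency": "EUR"},
        {"invoice_id": "INV-3", "amount": "", "currency": "USD"},  # incomplete row
    ]

def transform(rows):
    # Cleanse and enrich: drop incomplete rows, cast amounts,
    # and normalise currency codes.
    clean = []
    for row in rows:
        if not row["amount"]:  # data quality check: completeness
            continue
        clean.append({
            "invoice_id": row["invoice_id"],
            "amount": float(row["amount"]),
            "currency": row["currency"].upper(),
        })
    return clean

def load(rows, conn):
    # Load the cleaned rows into the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS invoices "
        "(invoice_id TEXT PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO invoices VALUES (:invoice_id, :amount, :currency)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
count = conn.execute("SELECT COUNT(*) FROM invoices").fetchone()[0]
```

In a production pipeline the same extract/transform/load stages would typically be orchestrated by a tool such as Airflow rather than run inline.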

 

What will make you successful: 

  • A bachelor’s degree in computer science, data science, software engineering, information systems, or related quantitative field
  • At least four (4) years of work experience in data management disciplines, including data integration, optimization and data quality, or other areas directly relevant to data engineering responsibilities and tasks. 
  • Proven project experience developing and maintaining data warehouses (Snowflake experience is preferable)
  • Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support AI, ML, and BI 
  • Strong proficiency in programming languages such as Java or Python, as well as other scripting languages
  • Previous experience with languages/tools such as SQL
  • Significant experience with the ETL process and building pipelines for data retrieval using REST APIs; knowledge of an ETL tool such as Talend or Informatica is preferable
  • Proficiency in OLAP, Star, Dimensional, and Snowflake schemas.
  • Basic knowledge of BI Tools – Power BI, Tableau.
  • Basic knowledge of DevOps tools – GitHub, Atlassian tools, VS Code, etc.
  • Experience working in a structured development environment (i.e., environment with the standard SDLC process).
  • Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure) and modern data warehouse tools (Snowflake, Databricks)
  • Experience with database technologies such as SQL, NoSQL, Oracle, or Teradata
  • Experience using AI in Data Engineering/Data Analysis work, and experience in Data QA.
  • Knowledge of Apache technologies such as Kafka and Airflow to build scalable and efficient data pipelines (nice to have).
  • Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products.
  • Expert problem-solving skills, including debugging skills, allowing the determination of sources of issues in unfamiliar code or systems, and the ability to recognize and solve repetitive problems.
  • Excellent business acumen and interpersonal skills; able to work across business lines at a senior level to influence and effect change to achieve common goals.
  • Ability to describe business use cases/outcomes, data sources and management concepts, and analytical approaches/options.
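The dimensional-modelling and KPI skills listed above can be illustrated with a minimal star schema: a fact table joined to a dimension and rolled up into a business metric. This is an illustrative sketch only; sqlite3 stands in for a warehouse, and all table and column names are hypothetical.

```python
# Minimal star schema: one fact table, one dimension, one KPI rollup.
# sqlite3 stands in for a warehouse; all names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_payment (
        payment_id    INTEGER PRIMARY KEY,
        customer_key  INTEGER REFERENCES dim_customer(customer_key),
        amount        REAL
    );
    INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO fact_payment VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 70.0);
""")

# KPI: total payment volume by region (fact joined to its dimension).
kpi = conn.execute("""
    SELECT c.region, SUM(f.amount) AS total_amount
    FROM fact_payment f
    JOIN dim_customer c USING (customer_key)
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
```

The same fact/dimension shape generalises to snowflake schemas by further normalising the dimension tables.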

 

#LifeAtBottomline

#LI-DNI

We welcome talent at all career stages and are dedicated to understanding and supporting additional needs. We're proud to be an equal opportunity employer, committed to creating an inclusive and open environment for everyone.