Job Summary
Synechron is seeking a dedicated Business Data Analyst & Data Engineer to bridge business requirements and technical solutions in enterprise data environments. In this role, you will gather detailed business requirements and design and develop scalable ETL pipelines, data models, and data governance frameworks that support analytics, reporting, and data-driven decision-making. You will collaborate with stakeholders, data teams, and QA to ensure data accuracy, security, and compliance, enabling the organization to leverage its data assets effectively for strategic initiatives.
Software Requirements
Required:
Strong SQL skills (5+ years) for data validation, analysis, and data management in relational and data warehouse environments
Experience designing and deploying ETL pipelines using tools such as Informatica, Talend, or similar (5+ years)
Knowledge of big data tools and cloud data platforms supporting enterprise analytics (AWS, Azure, GCP)
Familiarity with data modeling, metadata management, and data governance principles
Experience supporting Agile projects using tools like JIRA and Confluence
Preferred:
Experience supporting cloud-native data solutions or data lake architectures
Knowledge of NoSQL databases like MongoDB, Cassandra
Exposure to AI/ML integration in data pipelines
Overall Responsibilities
Gather and analyze business requirements, translating them into data models, ETL pipelines, and data architectures that support analytics and reporting
Design, develop, and optimize scalable data workflows across cloud and on-premise platforms
Collaborate with business, data scientists, and technical teams to leverage data assets securely and efficiently
Conduct data validation, reconciliation, and quality checks to ensure data integrity and compliance standards are met
Support data migration and consolidation efforts aligned with enterprise data strategies
Implement metadata management and data governance frameworks for operational and regulatory compliance
Document data flow diagrams, data models, and governance policies
Monitor data pipeline performance, troubleshoot issues, and implement process improvements
Drive automation of data processes to increase efficiency and reduce manual intervention
Technical Skills (By Category)
Languages & Data Tools (Essential):
SQL (Oracle, MySQL, PostgreSQL) for data validation and query optimization (5+ years)
ETL tools: Informatica, Talend, or similar (5+ years) for large enterprise data pipelines
Preferred:
Python scripting for automation and data transformation
Data & Architecture:
Data modeling, metadata management, and data governance frameworks aligned with enterprise standards
Cloud & Infrastructure:
Cloud environments: AWS, Azure, or GCP for data migration and pipelines (preferred)
Databases & Storage:
Relational databases and NoSQL data stores (MongoDB, Cassandra) (preferred)
Tools & Frameworks:
Data lineage, data cataloging, and data quality tools for enterprise compliance
Experience Requirements
5+ years in data engineering, data migration, and data warehousing within large-scale organizations
Proven experience designing and managing scalable ETL pipelines and data governance frameworks
Experience working within enterprise or regulated industries (finance, healthcare, or government experience is advantageous)
Knowledge of AI/ML data pipelines and automation (preferred)
Day-to-Day Activities
Develop, optimize, and monitor ETL pipelines supporting analytics, BI, and data warehousing needs
Collaborate with business stakeholders, data scientists, and BI teams to ensure data quality and availability
Conduct data validation, data reconciliation, and metadata management tasks
Automate manual data processes to improve operational efficiency
Support data migration, system upgrades, and infrastructure deployment activities
Troubleshoot data pipeline issues, security concerns, and performance bottlenecks
Maintain documentation of data flow architecture and compliance protocols
Participate in reviews, design discussions, and continuous process improvements
Qualifications
Bachelor’s or Master’s degree in Data Engineering, Computer Science, or related fields
5+ years of experience with enterprise data environments, data migration, or warehousing initiatives
Certifications in cloud data services or enterprise data management (preferred)
Proven ability to support large, secure, and compliant enterprise data assets
Professional Competencies
Strong analytical and troubleshooting expertise for complex data workflows
Effective communication skills for requirements gathering and stakeholder engagement
Ability to work independently and collaborate in multidisciplinary teams
Adaptability to evolving data needs, standards, and regulatory environments
Continuous learning mindset toward emerging data technologies and governance standards
Results-oriented focus on data quality, security, and operational efficiency
SYNECHRON’S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.