Job Title:
Sr. Analyst, Global Analytic Insights (TCF)
Job Description
Concentrix Catalyst is the driving force behind Concentrix’s transformation and technology services. We integrate world-class digital engineering, creativity, and a deep understanding of human behaviour to find and unlock value through tech-powered and intelligence-fuelled solutions.
We take our clients’ visions from concept to release in as little as a few months and often work together to evolve products for years. We're spread across 12 regions around the world and are now expanding our data engineering team.
Required Skills & Experience
- 5–8 years’ experience in data engineering in cloud-based environments such as Databricks and the Azure platform.
- Strong experience in designing and building ETL (Extract, Transform, Load) pipelines.
- Advanced proficiency in Python and PySpark for data manipulation and modelling.
- Strong experience working with the medallion architecture.
- Solid understanding of, and hands-on experience with, Unity Catalog.
- Good experience with data modelling.
- Solid understanding of data warehouse architecture.
- Strong work experience with data lake and data warehouse concepts.
- Ability to understand a PHP codebase and reverse-engineer its logic.
- Proficiency with SQL and working with large datasets.
- Experience with data integration tools and technologies.
- Familiarity with business intelligence tools (Power BI).
- Ability to collaborate with cross-functional teams, including application development teams, business stakeholders, and data scientists.
- Experience with version control (e.g., Git) and agile development methodologies.
Key Responsibilities
- Data Migration and Integration: Lead the migration of data from existing host systems into a new Data Lake/Data Warehouse architecture on Databricks.
- Data Consolidation: Consolidate data from multiple sources and ensure that it is integrated into a unified data repository for reporting and analytics.
- Development of Databricks Workflows: Build, maintain, and optimize workflows and data pipelines on Databricks, ensuring reliable and timely data delivery.
- Data Quality Assurance: Ensure data quality and integrity throughout the migration process and proactively address any data quality issues.
- Technical Documentation: Create and maintain clear technical documentation for all data engineering tasks, including workflows, pipelines, and any new data models.
Location:
IND Hyderabad - Unit No. 601 6th Flr Maximus Building 2A Mindspace
Language Requirements:
Time Type:
Full time
If you are a California resident, by submitting your information, you acknowledge that you have read and have access to the Job Applicant Privacy Notice for California Residents.