Fractal

Digi Ops L1 Support Engineer

Bengaluru | Full time

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

About Fractal:

Founded in 2000, Fractal Analytics (www.fractal.ai) is a strategic analytics partner to the most admired Fortune 500 companies globally and helps them power every human decision in the enterprise. Fractal currently has 5500+ employees across 18 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has been recognized as 'Great Workplace' and 'India's Best Workplaces for Women' in the top 100 (large) category by The Great Place to Work® Institute; featured as a leader in Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020 & Specialized Insights Service Providers Wave™ 2020 by Forrester Research Inc., a leader in Analytics & AI Services Specialists Peak Matrix 2022 by Everest Group and recognized as an 'Honorable Vendor' in 2022 Magic Quadrant™ for data & analytics by Gartner Inc. For more information, visit https://fractal.ai/

Job Description:

We need someone with a strong Data Engineering skill set to ensure production (operations/support) activities are delivered per SLA. The role involves working on issues/requests, bug fixes, and minor changes, coordinating with the development team when issues arise, and working on enhancements.

Role Details
You will be part of the operations team providing L1 support to a client, working either specified business hours or in a 24x7 support model.

You will provide level 1 (L1) operational support for data pipelines and platforms built on Azure Data Factory (ADF), Databricks, SQL, and Python. The role focuses on data ingestion, monitoring, first-level debugging & troubleshooting, and ensuring stable, reliable data operations aligned with ITIL processes (Incident, Service Request, Problem, Change).

Key Responsibilities

Operations & Monitoring

  • Monitor ADF pipelines, Databricks jobs, SQL jobs, and scheduled tasks across environments (Dev/Test/Prod).
  • Proactively identify failures, performance degradation, data delays, and SLA risks using Azure Monitor, Log Analytics, and platform alerts.
  • Execute standard runbooks/SOPs for common failure scenarios (e.g., ADF activity retry, Databricks job restart, dependency checks).
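To illustrate the runbook-driven retry pattern described above, here is a minimal Python sketch. The function names, status values, and retry policy are hypothetical, for illustration only; a real runbook would query the ADF REST API or Azure Monitor for run status.

```python
import time

# Hypothetical retry policy for the sketch; real SOPs define these values.
MAX_RETRIES = 2
RETRY_DELAY_SECONDS = 1  # kept short here; real runbooks use longer backoffs

def run_with_runbook_retry(run_pipeline, check_dependencies):
    """Execute a pipeline per SOP: verify dependencies, retry on failure,
    and escalate when retries are exhausted or dependencies are unhealthy."""
    for attempt in range(1, MAX_RETRIES + 2):
        if not check_dependencies():
            return {"status": "Escalated", "reason": "dependency check failed"}
        status = run_pipeline()  # hypothetical callable returning a status string
        if status == "Succeeded":
            return {"status": "Succeeded", "attempts": attempt}
        if attempt <= MAX_RETRIES:
            time.sleep(RETRY_DELAY_SECONDS)
    return {"status": "Escalated", "reason": "retries exhausted"}
```

The key design point is that L1 retries only within the bounds the SOP allows and escalates with a clear reason otherwise, rather than retrying indefinitely.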

Incident & Request Handling (ITIL)

  • Log, categorize, and prioritize incidents/requests in the ITSM tool (e.g., ServiceNow/Jira) per defined SLAs.
  • Perform first-level triage: review logs, identify error patterns, validate inputs, check dependencies (linked services, secrets, storage, compute).
  • Resolve L1 incidents within scope; escalate to L2/L3 with complete diagnostics (error traces, pipeline run IDs, job run links, steps taken).
  • Participate in Major Incident bridge calls to provide status, recovery steps, and ETA updates.
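The first-level triage step can be sketched as pattern-matching logs against known errors and assembling the diagnostics bundle that accompanies an escalation. The error patterns and field names below are illustrative; in practice the known-error patterns come from the KEDB.

```python
import re

# Illustrative error patterns only; real entries live in the KEDB.
KNOWN_PATTERNS = {
    "timeout": re.compile(r"timed? ?out", re.IGNORECASE),
    "auth": re.compile(r"(401|403|unauthorized|forbidden)", re.IGNORECASE),
    "missing_file": re.compile(r"(FileNotFound|404)", re.IGNORECASE),
}

def triage(log_lines, pipeline_run_id):
    """First-level triage: match log lines against known error patterns and
    build the diagnostics bundle handed to L2/L3 if escalation is needed."""
    matches = []
    for line in log_lines:
        for category, pattern in KNOWN_PATTERNS.items():
            if pattern.search(line):
                matches.append({"category": category, "evidence": line})
    return {
        "pipeline_run_id": pipeline_run_id,
        "categories": sorted({m["category"] for m in matches}),
        "evidence": matches,
        "action": "resolve_at_L1" if matches else "escalate_with_diagnostics",
    }
```

Known errors map to an L1 runbook action; anything unmatched is escalated with the run ID and collected evidence attached, so L2/L3 never start from a bare "it failed".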

Troubleshooting & Recovery

  • Validate data ingestion sources (APIs, files, DB connectors) and destination readiness (tables, schemas, storage paths).
  • Execute approved fixes: pipeline re-runs, job restarts, cache clear, refresh credentials (via Key Vault), validate connection strings, and re-queue tasks.
  • Perform data sanity checks (row counts, schema drift, partition presence) and raise data quality issues to L2.
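The sanity checks above can be sketched as a simple comparison of row counts and column sets. The tolerance and schema inputs are hypothetical; in practice the counts and schemas would come from SQL queries against source and target tables.

```python
# Hypothetical threshold for illustration; real checks are defined per dataset.
ROW_COUNT_TOLERANCE = 0.10  # flag if counts deviate more than 10% from source

def sanity_check(source_count, target_count, expected_schema, actual_schema):
    """Compare row counts and column sets; return a list of issues to raise to L2."""
    issues = []
    if source_count == 0:
        issues.append("source returned zero rows")
    elif abs(source_count - target_count) / source_count > ROW_COUNT_TOLERANCE:
        issues.append(f"row count drift: source={source_count}, target={target_count}")
    missing = set(expected_schema) - set(actual_schema)
    extra = set(actual_schema) - set(expected_schema)
    if missing or extra:
        issues.append(f"schema drift: missing={sorted(missing)}, extra={sorted(extra)}")
    return issues
```

An empty list means the load looks healthy; any non-empty result becomes a data quality ticket for L2 rather than something L1 attempts to fix.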

Documentation & Continuous Improvement

  • Maintain and update SOPs/runbooks, known error database (KEDB), and shift handover notes.
  • Contribute to post-incident reviews and problem records with clear root-cause context from L1 perspective.
  • Suggest monitoring and alerting enhancements (thresholds, dashboards, auto-recovery).

Governance & Compliance

  • Adhere to ITIL processes for Incident, Problem, Change, and Release management.
  • Follow access controls and least privilege policies; raise access requests via established channels.
  • Ensure compliance with data security and privacy requirements (masking, secure secrets, audit logging).

Technical skills

  • Azure Data Factory (ADF): Monitoring pipeline runs, integration runtimes, triggers, activity logs; basic retries and dependency checks.
  • Databricks: Job run monitoring, cluster status awareness, log inspection; ability to restart jobs per SOP.
  • SQL (Azure SQL/Databricks SQL/Synapse): Basic querying for validation (counts, schema checks), job status tables, and data completeness.
  • Python: Read logs, understand error messages, execute/trigger simple scripts per runbook; familiarity with notebooks is a plus.
  • Azure fundamentals: Storage (ADLS), Key Vault, RBAC, Log Analytics, Alerts, and Dashboards.
  • Monitoring/ITSM: Azure Monitor, Log Analytics/KQL basics, ServiceNow/Jira ticketing.

Non-technical skills

  • Drive Incident/Problem resolution by assisting the operations team with key activities around delivery, fixes, and supportability.
  • Assist with change ticket review, approval, and planning, working with internal teams.
  • Assist with transitioning projects from project teams to support teams.
  • Act as an escalation point for operations-related issues.
  • Experience working in ServiceNow is preferred.
  • Attention to detail is a must, with a focus on quality and accuracy.
  • Able to handle multiple tasks with appropriate priority and strong time management skills.
  • Flexible about work content and enthusiastic to learn.
  • Knowledge of service support, operations, and design processes (ITIL).
  • Strong relationship skills to work with multiple stakeholders across organizational and business boundaries at all levels.

Education Qualifications and Certifications (if any)
Certifications (Preferred):

  • Microsoft Certified: Azure Data Fundamentals
  • Microsoft Certified: Azure Data Engineer Associate


If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit?  Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!