Design, develop, and maintain robust data integrations from various source systems into our Google Cloud Platform (GCP) environment.
Design, develop, and maintain web application components and APIs using Python (particularly Django or similar frameworks), including backend and frontend development tasks, to support data access, integration, and visualization needs.
Implement and continuously optimize query performance in BigQuery to ensure timely and efficient data retrieval for ETL/ELT processes and analytical purposes.
Collaborate with cross-functional teams to understand their data requirements for ETL/ELT and reporting purposes and translate them into effective data models and BigQuery solutions.
Monitor the health and performance of GCP infrastructure components and data solutions using tools like Datadog, ensuring expected behavior and reliability.
Respond to and help troubleshoot alerts and incidents surfaced via monitoring systems and PagerDuty to minimize downtime and ensure data integrity.
Utilize Git for version control and participate in CI/CD pipelines for data pipeline deployments.
Stay current with best practices and emerging technologies in data integration, BigQuery optimization, and data quality management relevant to supporting applications in a cloud environment.
Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or a related subject.
3+ years of practical experience in data engineering or related fields, with a strong focus on Python and SQL (writing complex queries and optimizing performance).
2+ years of experience using Python web development frameworks such as Django, along with a solid understanding of backend, frontend, and full-stack development principles.
2+ years of hands-on experience with Google Cloud Platform (GCP) or Amazon Web Services (AWS) for data engineering tasks.
1+ years of experience with cloud infrastructure monitoring tools (e.g., Datadog, GCP Cloud Monitoring) and incident alerting platforms (e.g., PagerDuty).
Conceptual understanding of, or at least 1 year of experience with, Machine Learning and its potential application in data processes.
Proven experience developing and deploying full-stack applications using Python and Django, including handling frontend components (e.g., using Django templates, integrating with plain JavaScript, or a lightweight frontend library where applicable).
Familiarity with containerization technologies (e.g., Docker) and orchestration (e.g., Kubernetes) within a GCP context.
Experience automating incident response procedures.
Primary Location: CRI-Sabana
Function: Data and Analytics
Schedule: Full time