This is a remote-based position in the US.
We, the Cloud Platform Engineering team at Calix, are responsible for the platforms, tools, and CI/CD pipelines at Calix. Our mission is to enable Calix engineers to accelerate the delivery of world-class products while ensuring high availability.
We are seeking a skilled and experienced Staff Cloud Platform Engineer to join the Cloud Platform team. The Staff Cloud Platform Engineer will design, implement, and manage cloud infrastructure and data pipelines using Google Cloud Platform (GCP) services such as Datastream, Dataflow, and Dataproc. The ideal candidate will have a strong background in DevOps practices, cloud infrastructure automation, and big data technologies. You will collaborate with data engineers, developers, and operations teams to ensure seamless deployment, monitoring, and optimization of data solutions. This role also involves ensuring the smooth operation of Looker, supporting business intelligence (BI) initiatives, and enabling data-driven decision-making across the organization.
Responsibilities:
Design and implement cloud infrastructure using Infrastructure as Code (IaC) with Terraform.
Automate provisioning and management of Dataproc clusters, Dataflow jobs, and other GCP resources.
Integrate CI/CD tools such as GitLab CI/CD or Cloud Build for automated testing and deployment.
Deploy and manage real-time and batch data pipelines using Dataflow or Datastream.
Ensure seamless integration of data pipelines with other GCP services such as BigQuery, Cloud Storage, and Kafka or Pub/Sub.
Monitor performance, reliability, and cost of Dataproc clusters, Dataflow jobs, and streaming applications.
Optimize cloud infrastructure and data pipelines for performance, scalability, and cost-efficiency.
Implement security best practices for GCP resources, including IAM policies, encryption, and network security.
Ensure observability is an integral part of the infrastructure platforms, providing adequate visibility into their health, utilization, and cost.
Collaborate extensively with cross-functional teams to understand their requirements; educate them through documentation and training to improve adoption of the platforms and tools.
Qualifications:
10+ years of overall engineering experience.
5+ years of experience in DevOps, cloud engineering, or data engineering.
Proficiency in Google Cloud Platform (GCP) services, including Dataflow, Datastream, Dataproc, BigQuery, and Cloud Storage.
Expertise in Infrastructure as Code (IaC) tools like Terraform or Cloud Deployment Manager.
Strong experience with Looker, Tableau, or ThoughtSpot administration.
Knowledge of real-time data streaming technologies (e.g., Apache Kafka, Pub/Sub).
Familiarity with data orchestration tools like Apache Airflow or Cloud Composer.
Strong proficiency in SQL query optimization.
Experience with CI/CD tools like Jenkins, GitLab CI/CD, or Cloud Build.
Knowledge of containerization and orchestration tools like Docker and Kubernetes.
Strong scripting skills for automation (e.g., Bash, Python).
Experience with monitoring tools like Cloud Monitoring, Prometheus, and Grafana.
Familiarity with logging tools like Cloud Logging or ELK Stack.
Strong problem-solving and analytical skills.
Excellent communication and collaboration abilities.
Ability to work in a fast-paced, agile environment.
#LI-Remote
The base pay range for this position varies based on the geographic location. More information about the pay range specific to candidate location and other factors will be shared during the recruitment process. Individual pay is determined based on location of residence and multiple factors, including job-related knowledge, skills and experience.
San Francisco Bay Area:
156,400 - 265,700 USD Annual
All Other US Locations:
As a part of the total compensation package, this role may be eligible for a bonus. For information on our benefits click here.