At PointClickCare our mission is simple: to help providers deliver exceptional care. And that starts with our people. As a leading health tech company that’s founder-led and privately held, we empower our employees to push boundaries, innovate, and shape the future of healthcare.
With the largest long-term and post-acute care dataset and a Marketplace of 400+ integrated partners, our platform serves over 30,000 provider organizations, making a real difference in millions of lives. We also reinvest a significant percentage of our revenue back into research and development, ensuring our employees have the resources to innovate and make a lasting impact. Recognized by Forbes as a top private cloud company and honored as one of Canada’s Most Admired Corporate Cultures, we offer flexibility, growth opportunities, and meaningful work.
At PointClickCare, we empower our people to be the architects of a smarter healthcare future; one that is human-first and accelerated by AI to create meaningful and lasting change. Employees harness AI as a catalyst for creativity, productivity, and thoughtful decision-making. By integrating AI tools into our daily workflows, collaboration is enhanced, outcomes are improved, and every team member has the proficiency to maximize their impact. It all starts with our hiring practices, where we uncover AI expertise that complements our mission, and we continue to invest in training and development to nurture innovation throughout the employee journey.
**Travel to Office expectations**
For Remote Roles: If this role is remote, there will be in-office events that require travel to and from the Mississauga and/or Salt Lake City offices. These will include, but are not limited to, onboarding, team events, and semi-annual and annual team meetings.
For Hybrid Roles: If this role is hybrid, you will be expected to reside within commuting distance of the office/location specified in the job listing. This will include, but is not limited to, weekly/bi-weekly/monthly events in the office with your specific team. This is a requirement for this role.
PointClickCare is searching for a Principal Data Engineer who will contribute to best-practice data engineering by designing and delivering production-grade streaming pipelines, while championing technical excellence across an empowered team. This is a hands-on leadership role: you will enhance and implement batch and real-time data solutions already in progress, mentor other team members, and deliver both business and technical objectives amid ambiguity and uncertainty.
This is an opportunity to shape the future of our data ecosystem. You’ll work with a passionate team and modern technologies to drive innovation that impacts the entire organization. The ideal candidate thrives as an individual contributor, while making a significant technical impact and elevating the team’s capabilities.
To succeed as a Principal Data Engineer at PointClickCare, you need to be collaborative, adventurous, and passionate. Collaborative means that you’re enthusiastic about jumping in to help achieve the team’s top priorities; no self-promoting politicians allowed. Adventurous means that you’re not afraid to dive into uncharted technical territory and get your own hands dirty, while supporting and driving delivery of complex features through a dedicated Scrum team. Passionate means that you’re eager to learn and share knowledge that drives the team forward, and excited to be part of a movement that is positively impacting the lives of seniors and their caregivers all over North America.
What your day-to-day will look like:
- Lead and guide the design and implementation of scalable streaming data pipelines
- Engineer and optimize real-time data solutions using frameworks such as Apache Kafka, Flink, and Spark Streaming
- Collaborate cross-functionally with product, analytics, and AI teams to ensure data is a strategic asset
- Advance ongoing modernization efforts, deepening adoption of event-driven architectures and cloud-native technologies
- Drive adoption of best practices in data governance, observability, and performance tuning for streaming workloads
- Embed data quality in processing pipelines by defining schema contracts, implementing transformation tests and data assertions, enforcing backward-compatible schema evolution, and automating checks for freshness, completeness, and accuracy across batch and streaming paths before production deployment
- Establish robust observability for data pipelines by implementing metrics, logging, and distributed tracing for streaming jobs, defining SLAs and SLOs for latency and throughput, and integrating alerting and dashboards to enable proactive monitoring and rapid incident response
- Foster a culture of quality through peer reviews, providing constructive feedback and seeking input on your own work
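To give a flavor of the data-quality responsibilities above, here is a minimal sketch of record-level freshness and completeness assertions. The field names and the five-minute lag threshold are illustrative assumptions, not a description of PointClickCare's actual pipelines or stack.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical schema contract: each event carries an id, a value, and an event timestamp.
REQUIRED_FIELDS = {"id", "value", "event_ts"}

def check_completeness(record: dict) -> bool:
    """Every required field must be present and non-null."""
    return all(record.get(f) is not None for f in REQUIRED_FIELDS)

def check_freshness(record: dict, max_lag: timedelta = timedelta(minutes=5)) -> bool:
    """The event must be no older than the allowed lag (illustrative threshold)."""
    event_ts = datetime.fromisoformat(record["event_ts"])
    return datetime.now(timezone.utc) - event_ts <= max_lag

def validate(record: dict) -> bool:
    """Gate a record on completeness first, then freshness."""
    return check_completeness(record) and check_freshness(record)

# A fresh, complete record passes; one missing a required field is rejected.
fresh = {"id": 1, "value": 42.0,
         "event_ts": datetime.now(timezone.utc).isoformat()}
partial = {"id": 2, "event_ts": datetime.now(timezone.utc).isoformat()}

print(validate(fresh))    # True
print(validate(partial))  # False: "value" is missing
```

In production such checks would typically live in a validation framework like Great Expectations or dbt tests rather than hand-rolled functions, and would run in CI and on the streaming path before deployment.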
What qualifications we’re looking for:
- Principal Data Engineer with at least 10 years of professional experience in software or data engineering, including a minimum of 4 years focused on streaming and real-time data systems
- Proven experience driving technical direction and mentoring engineers while delivering complex, high-scale solutions as a hands-on contributor
- Deep expertise in streaming and real-time data technologies, including frameworks such as Apache Kafka, Flink, and Spark Streaming
- Strong understanding of event-driven architectures and distributed systems, with hands-on experience implementing resilient, low-latency pipelines
- Practical experience with cloud platforms (AWS, Azure, or GCP) and containerized deployments for data workloads
- Fluency in data quality practices and CI/CD integration, including schema management, automated testing, and validation frameworks (e.g., dbt, Great Expectations)
- Operational excellence in observability, with experience implementing metrics, logging, tracing, and alerting for data pipelines using modern tools
- Solid foundation in data governance and performance optimization, ensuring reliability and scalability across batch and streaming environments
- Experience with Lakehouse architectures and related technologies, including Databricks, Azure ADLS Gen2, and Apache Hudi
- Strong collaboration and communication skills, with the ability to influence stakeholders and evangelize modern data practices within your team and across the organization
Additional qualities we value:
- Strong analytical and problem-solving mindset
- Ability to learn quickly and adapt to new technologies, even when uncomfortable
- Self-starter who thrives with minimal supervision and collaborates effectively as a team player
- Excellent organizational and critical-thinking skills
- Comfortable leveraging AI tools to accelerate development