Job Title
Data Integrations Lead
Job Description Summary
Reporting to the Global Head of Data Architecture and Engineering, you will be the primary custodian of our data platform standards, integration patterns, and engineering best practices. A high-level strategy and architectural guardrails are already in place; your role is to translate them into a living North Star – defining the standards and platform capabilities that all data engineering teams build against. This is a player-coach position: you will remain hands-on with complex platform work while mentoring other engineers and driving consistency across a global organisation. Our primary stack is Microsoft Azure and Databricks, and deep expertise in both is essential.
Job Description
Key Responsibilities:
· Own and evolve our data platform reference architectures and standards, spanning ingestion, batch and real-time integration, event-driven patterns, lakehouse design, and AI/ML enablement.
· Define, socialise, and enforce engineering standards across multiple teams – including metadata-driven patterns, code reuse, and software engineering best practice.
· Champion DataOps and CI/CD practices organisation-wide, including pipeline standards, automated testing, and release governance within the Azure and Databricks ecosystem.
· Define the approach to metadata management, data cataloguing, lineage, and data quality, ensuring compliance and data residency requirements are met across global jurisdictions.
· Shape the platform's AI/ML enablement strategy, ensuring data is accessible and well-governed for AI teams without creating ungoverned, bespoke patterns.
· Mentor and develop data engineers across teams, setting a high technical bar and fostering a culture of quality and continuous improvement.
Knowledge & Experience:
· Extensive hands-on experience across the data platform spectrum – ingestion, batch and real-time streaming, event-driven integration, lakehouse architecture, and AI/ML data enablement – typically gained over 10+ years, including at least 3 years working with Azure and Databricks.
· Deep practical expertise with Microsoft Azure data services in secure private VNet environments, including Event Hubs, Service Bus, ADLS, Function Apps, API Management, Cosmos DB, and Azure Monitor.
· Expert-level, hands-on Databricks experience – including Lakeflow, Spark Declarative Pipelines (DLT), Delta Lake, Unity Catalog, Lakehouse architecture, and cluster management & optimisation.
· Proven DataOps and CI/CD expertise using Azure DevOps, Databricks Asset Bundles (DABs), or similar, with strong instincts for metadata-driven, reusable engineering patterns.
· Demonstrable track record defining and enforcing architecture standards across large, distributed engineering organisations, with the ability to influence without direct authority.
· Exposure to Corporate Real Estate data domains would be welcome but not essential.
· Familiarity with Terraform for Azure data platform provisioning would be beneficial but not required.
· Background contributing to a data platform community of practice or centre of excellence is helpful.
In compliance with the Americans with Disabilities Act Amendments Act (ADAAA), if you have a disability and would like to request an accommodation in order to apply for a position at Cushman & Wakefield, please call the ADA line at 1-888-365-5406 or email HRServices@cushwake.com. Please refer to the job title and job location when you contact us.