We are looking for an experienced Data Engineer to support the delivery of a large-scale enterprise systems integration programme for a leading facilities management client. Working alongside .NET Integration Engineers, you will be responsible for the data layer of the integration, connecting to source systems, profiling and transforming data, and ensuring clean, well-structured payloads flow through the event-driven Azure Integration Hub. 
In addition to adapter-level data work, you will build batch ingestion pipelines into the client's Databricks-based data platform and help establish the data interfaces required for the enterprise MDM implementation. The ideal candidate combines strong hands-on data engineering skills with practical experience connecting to complex enterprise application landscapes and working within structured delivery programmes.
Responsibilities 
Source Connectivity & Data Profiling 
• Establish and validate connections to in-scope enterprise source systems spanning HR, payroll, recruitment, ERP, CRM, procurement, CAFM, field service, fleet, and QHSE platforms, covering a range of connectivity patterns including REST APIs, SOAP/XML, database connectors, and file-based extracts
• Conduct data profiling across source systems to assess data quality, volumes, formats, and structures, documenting findings and working with business stakeholders to define and implement automated data quality tests
• Identify and escalate data quality issues that could impact integration or MDM readiness, and track remediation progress against agreed thresholds prior to go-live
Adapter Data Layer & Transformation 
• Design and implement the data transformation logic within integration adapters, including field-level mappings, canonical format conversions, data type handling, and enrichment rules as defined in approved Integration Design Documents
• Build and maintain reusable transformation components that support consistent data handling across multiple integration events and domain waves, reducing duplication and ensuring alignment with agreed data models
• Implement data validation rules within adapters to enforce mandatory field checks, referential integrity, and format compliance before payloads are published to the Service Bus, supporting robust error handling and exception workflows
Batch Ingestion into the Data Platform 
• Build and maintain batch ingestion pipelines from in-scope source systems into the client's Databricks-based data platform, covering Bronze (raw), Silver (cleansed and standardised), and Gold (business-ready) layers as required
• Configure pipeline orchestration, scheduling, incremental load patterns, and error handling to ensure reliable, repeatable data delivery into the lakehouse environment
• Implement data quality checks within the ingestion pipeline using the client's established data quality framework, ensuring test coverage across ingested datasets and flagging exceptions for steward review
MDM Data Interfaces 
• Design and implement data feeds between source systems and the enterprise MDM platform, supporting the ingestion of master data records for domains including Customer, Supplier, Employee, Site, and Project
• Work with the MDM workstream and data stewards to align source data structures with MDM domain models, supporting match and merge configuration, survivorship rule testing, and the propagation of mastered data back to consuming systems
• Support the reference data wave by preparing and loading initial reference datasets into the MDM platform, ensuring data is cleansed, mapped, and validated prior to ingestion
Collaboration & Governance 
• Work closely with .NET Integration Engineers to ensure the data layer of each adapter is consistent with the approved integration design, and collaborate with solution architects and the MDM workstream to maintain alignment across the platform
• Contribute to CI/CD pipelines, source control, and documentation standards, ensuring all data engineering artefacts are production-grade and handed over to the client team with appropriate runbooks and operational guides