We are looking for an experienced QA Engineer to lead the testing approach across a large-scale enterprise systems integration programme for a leading facilities management client. Working closely with .NET Integration Engineers and Data Engineers, you will be responsible for designing and implementing a structured test strategy that covers the full integration lifecycle, from unit and component testing through to system integration testing, UAT coordination, and regression. 
This is a hands-on role that requires both technical depth and strong organisational skills. You will own the testing workstream end-to-end across seven domain delivery waves, ensuring that integrations are thoroughly validated before deployment and that the client's business users are well supported through UAT. The ideal candidate brings proven experience testing event-driven integrations in Azure environments and is comfortable working at the intersection of data engineering, API integration, and enterprise application connectivity. 
Responsibilities 
Test Strategy & Planning 
• Define and maintain the overall test strategy for the integration programme, covering test objectives, scope, approach, tooling, environments, entry/exit criteria, and responsibilities across all test phases and domain waves
• Collaborate with .NET Integration Engineers and Data Engineers during the design phase to ensure integration designs are testable, and that field-level mappings, transformation logic, error handling, and data validation rules are clearly defined and verifiable
• Plan and sequence test activities across domain waves, managing dependencies between integration events, shared reference data, and MDM interfaces to ensure a coherent and risk-based testing approach
Test Design & Execution 
• Design and write test scripts for unit, system integration testing (SIT), and UAT phases, covering happy path, negative, boundary, and edge case scenarios for each integration event, including create, update, and delete variants
• Execute SIT end-to-end across source-to-target system pairs, validating data integrity, field-level mapping accuracy, transformation correctness, error handling behaviour, retry logic, and performance against agreed latency thresholds
• Validate that integration adapters correctly handle exception scenarios, including malformed payloads, missing mandatory fields, duplicate events, and downstream system unavailability, and that dead-letter and alerting mechanisms function as designed
• Test data quality validation rules within adapters and ingestion pipelines, ensuring that invalid or non-compliant data is correctly identified, routed to exception workflows, and reported for steward review
UAT Coordination 
• Prepare UAT test scripts tailored to business user scenarios, ensuring coverage of end-to-end workflows across integrated systems and alignment with the business processes each integration supports
• Coordinate and manage UAT execution with client business users, including scheduling, onboarding participants, tracking progress, and maintaining a clear view of pass/fail status across all test cases
• Triage and manage defects raised during UAT, working with the .NET Integration and Data Engineering teams to prioritise fixes, validate resolutions, and confirm readiness for production deployment
Regression & Automation 
• Design and implement an automated regression test suite to support ongoing validation of deployed integrations, prioritising coverage of high-volume and business-critical events
• Maintain and evolve the regression suite across waves as new integrations are introduced, ensuring previously deployed integrations remain stable and that cross-domain dependencies are covered
Defect Management & Reporting 
• Own defect management throughout the programme, maintaining a complete and up-to-date defect log in Azure DevOps with clear severity classification, root cause attribution, and resolution status
• Produce test reports at the conclusion of each test phase and wave, summarising test execution results, defect metrics, outstanding risks, and a clear recommendation on go-live readiness
• Contribute to lessons learned and continuous improvement of the test approach across waves, refining scripts, tooling, and processes based on delivery experience