Role Overview
We are looking for an experienced Automated Tester to build and evolve a robust, scalable automation capability across APIs, data pipelines, and our graph‑based identity platform. This role spans CI/CD gating, automated data and contract validation, graph integrity testing, performance and regression testing, and full end‑to‑end API testing within AWS. The successful candidate will ensure that our identity platform consistently meets quality, compliance, and operational standards.
Key Responsibilities
• Design, implement, and maintain automated test suites across API, data workflow, and graph components, integrating them into CI/CD pipelines with quality gates (unit, static analysis/security, data quality, contract/compliance, and final integrity checks).
• Own CI/CD quality enforcement, including gated releases, automated evidence of test results, and publication blocks when validation fails (e.g., consent, schema, integrity, scoring).
• Develop automated contract and compliance checks, including schema validation, required‑field mapping, TCF/consent validation, and PII‑risk checks aligned to governance requirements.
• Build and execute automated end‑to‑end API tests using tools such as Bruno, Insomnia, or Postman, validating API behaviour through API Gateway, Lambda integrations, and downstream data/graph operations.
• Build data quality monitors (nulls/ranges, volume deltas, distribution checks) and statistical tests suited to identity clusters; operationalize them as part of routine runs (see the sketch after this list).
• Create graph integrity tests in Neo4j, covering node/edge parity, constraints, traversal path validation, merge/split detection, and evidence‑path reproducibility checks.
• Implement performance and resilience testing, including query latency targets, batch window completion for rebuilds, and regression suites for every deployment.
• Contribute to operational guardrails by instrumenting monitoring, alerting, and cost‑aware test strategies; collaborate with engineering to ensure predictable weekly rebuild cadence and controlled ad‑hoc extracts.
• Partner with product, data science, and platform teams to define test coverage for deterministic and probabilistic identity resolution, confidence scoring, stability indexing, and market‑calibrated thresholds.
• Provide clear test reporting and audit‑ready evidence, including human‑readable traces that support regulatory review and operational triage.
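As an illustration of the data quality monitors described above, here is a minimal Python sketch of null‑rate, range, and volume‑delta checks; the column names and thresholds are hypothetical, not taken from the platform.

```python
# Minimal sketch of a data quality monitor (hypothetical columns/thresholds),
# illustrating null-rate, range, and volume-delta checks on an identity dataset.
import pandas as pd

def run_quality_checks(df: pd.DataFrame, prev_row_count: int) -> list[str]:
    failures = []

    # Null-rate check: flag columns whose null share exceeds a tolerance.
    for col, max_null_rate in {"identity_id": 0.0, "match_score": 0.01}.items():
        null_rate = df[col].isna().mean()
        if null_rate > max_null_rate:
            failures.append(f"{col}: null rate {null_rate:.3f} > {max_null_rate}")

    # Range check: confidence scores must stay within [0, 1].
    out_of_range = df["match_score"].dropna().between(0.0, 1.0).eq(False).sum()
    if out_of_range:
        failures.append(f"match_score: {out_of_range} values outside [0, 1]")

    # Volume-delta check: row count should not swing more than 20% run-to-run.
    delta = abs(len(df) - prev_row_count) / max(prev_row_count, 1)
    if delta > 0.20:
        failures.append(f"volume delta {delta:.1%} exceeds 20% threshold")

    return failures

if __name__ == "__main__":
    df = pd.DataFrame({"identity_id": ["a", "b", "c"], "match_score": [0.9, 0.7, None]})
    for failure in run_quality_checks(df, prev_row_count=3):
        print("FAIL:", failure)
```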
Must‑Have Skills & Experience:
Automation & CI/CD
• Strong CI/CD automation with gated release controls and the ability to block publication when gates fail (e.g., consent, integrity, scoring). Experience integrating automated suites into Harness CI/CD pipelines (a minimal gate sketch follows).
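One hedged sketch of how such a publication block can be enforced: a gate script aggregates check results and exits non‑zero, which any CI/CD tool (including Harness) treats as a failed stage, halting the release. The gate names and their internals are placeholders.

```python
# Sketch of a release gate: each gate is a callable returning True on pass.
# A non-zero exit code is what lets the CI/CD stage block publication.
import sys

def consent_gate() -> bool:
    # Placeholder: would validate consent flags on the release candidate.
    return True

def integrity_gate() -> bool:
    # Placeholder: would verify graph/data integrity checks all passed.
    return True

def scoring_gate() -> bool:
    # Placeholder: would confirm score distributions are within tolerance.
    return True

GATES = {"consent": consent_gate, "integrity": integrity_gate, "scoring": scoring_gate}

def main() -> int:
    failed = [name for name, gate in GATES.items() if not gate()]
    if failed:
        print(f"Release blocked; failing gates: {', '.join(failed)}")
        return 1
    print("All gates passed; release may proceed.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```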
API Testing
• Ability to design and implement end‑to‑end automated API tests using Bruno, Insomnia, Postman, or similar frameworks (see the sketch after this section).
• Strong understanding of REST API behaviours, request chaining, environment variables, authentication flows, and test assertions.
• Experience testing APIs running on AWS API Gateway with Lambda backends, including understanding of routing, payload transformations, throttling, and logging.
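To make this concrete, a small requests‑based sketch in the spirit of a Postman/Bruno collection: it exercises an auth flow, chains requests, and asserts on responses. The base URL, token exchange, and field names are assumptions for illustration, not the platform's real API.

```python
# Hypothetical end-to-end API test: authenticate, create a record, then fetch
# it back and assert on the response. URL and fields are illustrative only.
import requests

BASE_URL = "https://api.example.com/v1"  # placeholder API Gateway endpoint

def test_identity_roundtrip() -> None:
    session = requests.Session()

    # Auth flow: exchange credentials for a bearer token (illustrative).
    auth = session.post(f"{BASE_URL}/auth/token", json={"client_id": "ci-tester"})
    assert auth.status_code == 200, auth.text
    session.headers["Authorization"] = f"Bearer {auth.json()['access_token']}"

    # Request chaining: create a record, capture its id for the next call.
    created = session.post(f"{BASE_URL}/identities", json={"email_hash": "abc123"})
    assert created.status_code == 201, created.text
    identity_id = created.json()["id"]

    # Downstream validation: the record is retrievable and well-formed.
    fetched = session.get(f"{BASE_URL}/identities/{identity_id}")
    assert fetched.status_code == 200
    body = fetched.json()
    assert body["id"] == identity_id
    assert 0.0 <= body["confidence"] <= 1.0

if __name__ == "__main__":
    test_identity_roundtrip()
```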
AWS
• Hands‑on experience with:
• S3 (structured landing/prepared zones and artefact validation; see the sketch after this list).
• IAM (roles/permissions for automated testing and least‑privilege CI/CD).
• Airflow/MWAA (end‑to‑end workflow testing, scheduled rebuilds, and validation tasks).
• Logging/monitoring (instrumentation for quality gates, alerts, and evidence capture).
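A brief boto3 sketch of the artefact validation mentioned for S3, assuming hypothetical bucket and prefix names: it confirms a rebuild left at least one non‑empty artefact in the prepared zone.

```python
# Hedged sketch: verify that a rebuild wrote non-empty artefacts into the
# prepared zone. Bucket name and prefix are hypothetical placeholders.
import boto3

def validate_prepared_zone(bucket: str, prefix: str, min_objects: int = 1) -> None:
    s3 = boto3.client("s3")
    # First page only; a fuller check would paginate via ContinuationToken.
    response = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    objects = response.get("Contents", [])

    assert len(objects) >= min_objects, (
        f"expected at least {min_objects} artefacts under s3://{bucket}/{prefix}"
    )
    empties = [obj["Key"] for obj in objects if obj["Size"] == 0]
    assert not empties, f"zero-byte artefacts found: {empties}"

if __name__ == "__main__":
    validate_prepared_zone("example-identity-prepared", "rebuilds/latest/")
```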
Data & API Testing
• Data contract and schema validation (see the sketch after this section); automated data quality checks (null/range, volume deltas, distribution/consistency) for identity datasets.
• Performance testing for query latency and batch‑process windows; regression testing for rebuilds and deployments.
• Compliance validation (consent enforcement, PII risk, audit‑ready outputs) integrated into automated pipelines.
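A minimal contract‑validation sketch using the jsonschema library; the contract shown (required fields, consent enum, score bounds) is illustrative rather than the real data contract.

```python
# Sketch of a data-contract check using jsonschema. The contract itself
# (required fields, types, consent flag) is illustrative, not the real one.
from jsonschema import Draft202012Validator

IDENTITY_CONTRACT = {
    "type": "object",
    "required": ["identity_id", "consent_status", "match_score"],
    "properties": {
        "identity_id": {"type": "string", "minLength": 1},
        "consent_status": {"enum": ["granted", "withdrawn"]},
        "match_score": {"type": "number", "minimum": 0, "maximum": 1},
    },
}

def contract_violations(record: dict) -> list[str]:
    validator = Draft202012Validator(IDENTITY_CONTRACT)
    return [error.message for error in validator.iter_errors(record)]

if __name__ == "__main__":
    bad_record = {"identity_id": "", "match_score": 1.4}
    for message in contract_violations(bad_record):
        print("VIOLATION:", message)
```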
Graph & Identity Context
• Neo4j/graph integrity testing: node–edge constraints, path traversal validation, evidence‑path verification, and merge/split detection (see the sketch after this section).
• Understanding of deterministic and probabilistic identity resolution, confidence scoring, and stability indexing to design meaningful tests for linkage accuracy and drift.
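To illustrate the graph integrity checks above, a sketch using the official neo4j Python driver; the labels, relationship type, property names, and connection details are all assumptions.

```python
# Hedged sketch of graph integrity checks with the neo4j driver. Labels,
# relationship types, and connection settings are hypothetical.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"  # placeholder connection details
AUTH = ("neo4j", "password")

ORPHAN_EDGE_QUERY = """
MATCH (a:Identity)-[r:SAME_AS]->(b:Identity)
WHERE a.identity_id IS NULL OR b.identity_id IS NULL
RETURN count(r) AS orphan_edges
"""

DUPLICATE_NODE_QUERY = """
MATCH (n:Identity)
WITH n.identity_id AS id, count(n) AS copies
WHERE copies > 1
RETURN count(id) AS duplicate_ids
"""

def check_graph_integrity() -> None:
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            orphans = session.run(ORPHAN_EDGE_QUERY).single()["orphan_edges"]
            duplicates = session.run(DUPLICATE_NODE_QUERY).single()["duplicate_ids"]

    assert orphans == 0, f"{orphans} SAME_AS edges reference unidentifiable nodes"
    assert duplicates == 0, f"{duplicates} identity_id values map to multiple nodes"

if __name__ == "__main__":
    check_graph_integrity()
```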
Languages & Tooling
• Python for test automation, data validations, and CI tooling integrations; familiarity with test frameworks and contract‑testing approaches as applicable.
Should‑Have Skills (Phase‑2+ / Production Maturity):
• Drift detection and score distribution monitoring for identity resolution outcomes; alerting on behavioural/statistical shifts.
• Consent and governance checks that mirror operational policies (e.g., consent‑only processing at node/edge level, version‑controlled thresholds).
• Advanced Neo4j testing patterns: traversal‑based scoring verification, stability index checks across rebuilds, and auditable result reproducibility.
• Prior testing experience with identity resolution, data science, or probabilistic systems, including calibration/threshold validation and explainability.
• Familiarity with observability (alerts, dashboards) to uphold guardrails on performance, resilience, and cost visibility.
Nice‑to‑Have Experience:
• Data‑driven SaaS platforms; regulated data processing environments
• Contract‑testing frameworks (e.g., Pact) and property‑based testing approaches (see the sketch after this list)
• Cloud observability stacks (e.g., CloudWatch/OpenSearch/Grafana) for pipeline and API SLOs
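As a pointer to the property‑based testing mentioned in the list above, a short Hypothesis sketch; the normalisation function under test is a hypothetical stand‑in for real platform code.

```python
# Sketch of property-based testing with Hypothesis. The function under test
# (a hypothetical identifier normaliser) stands in for real platform code.
from hypothesis import given, strategies as st

def normalise_identifier(raw: str) -> str:
    # Hypothetical behaviour: trim whitespace and lower-case the identifier.
    return raw.strip().lower()

@given(st.text())
def test_normalise_is_idempotent(raw: str) -> None:
    once = normalise_identifier(raw)
    # Property: normalising twice must equal normalising once.
    assert normalise_identifier(once) == once

if __name__ == "__main__":
    test_normalise_is_idempotent()
```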
Personal Attributes:
• Analytical, detail‑oriented, and systematic about evidence and traceability
• Strong communicator; can translate complex identity/graph concepts into clear test plans
• Proactive at identifying quality risks, gaps, and remediation strategies
• Comfortable in fast‑paced environments with weekly rebuild cadences and tight release gates