About HighLevel:
HighLevel is an AI-powered, all-in-one white-label sales & marketing platform that empowers agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. We are proud to support a global and growing community of over 1 million businesses, comprising agencies, consultants, and businesses of all sizes and industries. HighLevel empowers users with all the tools needed to capture, nurture, and close new leads into repeat customers. As of mid-2025, HighLevel processes over 4 billion API hits and handles more than 2.5 billion message events every day. Our platform manages over 470 terabytes of data distributed across five databases, operates a network of over 250 microservices, and supports over 1 million hostnames.
Our People
With over 1,500 team members across 15+ countries, we operate in a global, remote-first environment. We are building more than software; we are building a global community rooted in creativity, collaboration, and impact. We take pride in cultivating a culture where innovation thrives, ideas are celebrated, and people come first, no matter where they call home.
Our Impact
As of mid-2025, our platform powers over 1.5 billion messages, helps generate over 200 million leads, and facilitates over 20 million conversations for the more than 1 million businesses we serve each month. Behind those numbers are real people growing their companies, connecting with customers, and making their mark, and we get to help make that happen.
About the Role:
We are looking for a Senior Product Data Engineer to own the event ingestion and identity layer that connects product instrumentation to downstream analytical systems.
This role focuses on the operational reliability and correctness of event and identity data as it moves through the data platform. You will design and operate pipelines, schema validation, and replay workflows that ensure product events remain consistent and safe to use for analytics and customer-facing reporting.
You will work closely with product engineering teams on instrumentation patterns, with the CDP team on event contracts and definitions, and with platform teams to ensure event infrastructure and analytical systems scale reliably. This role builds the foundational event and identity datasets required for reliable downstream modeling. Behavioral models, canonical entities, and business analytics datasets are owned by the analytics engineering team.
Responsibilities:
• Define event schemas, required fields, and compatibility rules in collaboration with the CDP team
• Implement automated validation and contract enforcement to prevent breaking schema changes
• Maintain versioning and compatibility guarantees for event producers and downstream consumers
• Build and maintain pipelines that ingest, validate, and process high-volume product events
• Ensure event streams are deduplicated, ordered correctly, and safe for downstream consumption
• Partner with platform teams to ensure ingestion pipelines scale with product growth
• Define and maintain identity stitching logic across anonymous and authenticated users
• Handle identity merges, splits, and corrections while preserving tenant boundaries
• Ensure identity resolution remains explainable, deterministic, and safe for downstream datasets
• Design workflows that allow event datasets and identity graphs to be replayed or rebuilt safely
• Build tooling for historical corrections, schema evolution, and dataset reprocessing
• Ensure downstream models can be rebuilt without manual intervention when definitions evolve
• Provide guidance and tooling that help product teams emit events consistently
• Maintain validation checks and schema enforcement that catch instrumentation issues early
• Collaborate with engineering teams to evolve instrumentation safely over time
• Ensure deletion and suppression requests propagate correctly through event and identity pipelines
• Partner with governance and security teams to support policy requirements
• Define requirements and interfaces for event infrastructure and downstream analytical systems
• Work with platform teams to ensure pipelines remain reliable, scalable, and observable
Requirements:
• 4+ years of experience in data engineering, platform engineering, or product data roles
• Strong experience building and operating event ingestion or streaming pipelines
• Experience implementing schema validation, data contracts, or event governance frameworks
• Strong SQL and Python, with experience building data processing or validation tooling
• Familiarity with identity resolution, entity resolution, or customer identity systems
• Experience operating analytical data systems or large-scale event datasets
EEO Statement:
The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government record-keeping, reporting, and other legal requirements. Providing this information is voluntary, and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.
#LI-Remote #LI-NJ1