About DeleteMe:
DeleteMe is the leader in proactive privacy protection. We help individuals, families, businesses, and security teams reduce their human attack surface by continuously monitoring and removing exposed personal data (PII) from the open web: the very data threat actors use to launch social engineering, phishing, Gen-AI deepfake, and doxxing campaigns, physical threats, and identity fraud.
Operating as a fast-growing, global SaaS company, DeleteMe serves both consumers and enterprises. DeleteMe has completed over 100 million opt-out removals, helping customers reduce risks associated with identity theft, spam, doxxing, and other cybersecurity threats. We deliver detailed privacy reports, continuous monitoring, and expert support to ensure ongoing protection.
DeleteMe acts as a scalable, managed defense layer for your most vulnerable attack vector: your people. That's why 30% of the Fortune 100, top tech firms, major banks, federal agencies, and U.S. states rely on DeleteMe to protect their workforce.
DeleteMe is led by a passionate and experienced team and driven by a powerful mission to empower consumers with privacy.
Job Summary:
This position is a key partner across the organization, sitting within the Data Warehouse team to bridge the gap between raw data engineering and business strategy. The Data & Analytics Engineer is responsible for designing, building, and optimizing scalable data models in Snowflake using dbt, ensuring data integrity and high performance. This role balances technical warehouse architecture with the ability to translate complex business requirements into actionable data products.
Job Responsibilities:
• Data Modeling & Development: Architect and maintain robust, modular data models in Snowflake using dbt, following industry-standard modeling methodologies (e.g., Kimball).
• Warehouse Optimization: Write and tune advanced SQL to ensure optimal query performance, cost-efficiency, and resource management within the Snowflake environment.
• Data Observability & Quality: Implement and manage automated testing, monitoring, and alerting frameworks to ensure data accuracy, freshness, and lineage.
• Stakeholder Collaboration: Partner with business units to define KPIs, capture requirements, and translate business logic into technical data specifications.
• End-to-End Delivery: Own the full data lifecycle, from ingestion through production-grade data marts, BI visualizations, and dashboards.
• Engineering Excellence: Apply software engineering best practices to data development, including version control (Git), CI/CD, and detailed technical documentation.
• Process Improvement: Continuously refactor legacy code and data structures to improve the maintainability and scalability of the analytics stack.
Job Requirements:
• Mastery of complex SQL, including window functions, CTEs, and performance tuning for large-scale datasets.
• Proven experience building production-grade dbt projects, including macros, seeds, and testing suites.
• Strong understanding of Snowflake-specific features such as clustering, virtual warehouses, and zero-copy cloning.
• Deep knowledge of dimensional modeling, fact/dimension design, and data warehousing principles.
• Availability during US Eastern Time (ET) hours.
• Ability to understand organizational drivers and communicate technical details effectively to non-technical stakeholders.
• Strong problem-solving skills, with the ability to identify root causes of data discrepancies and performance bottlenecks.
Qualifications:
• Bachelor's degree in Computer Science, Data Science, Statistics, Business, or a related field.
• 3+ years of experience in Analytics Engineering, Data Engineering, or a highly technical BI role.
• Proficiency in Snowflake, dbt (with strong SQL), and data architecture.
• Proven track record of delivering end-to-end data solutions in a cloud warehouse environment.
• Strong data storytelling and presentation skills.
• Experience supporting business functions such as Finance, Operations, Sales, and Marketing, preferably in SaaS.
Nice to Have:
• Experience with Python for data scripting or automation.
• Familiarity with data observability tools (e.g., Monte Carlo, Elementary).
• Experience in a high-growth startup environment.
• Cybersecurity experience.
What We Offer:
• Comprehensive health benefits: Medical, Vision, Dental
• Flexible work schedule
• Generous 401k matching up to 6%
• 20 days paid time off
• 15 sick days
• 12 company-paid holidays
• Childcare expense reimbursement
• Fitness and cell phone reimbursement
• Birthday time off
• Competitive salary: We publish salary ranges to promote transparency and ensure fair compensation. Final offers are based on skills, experience, and internal equity.
• This role may require occasional domestic and international travel. All standard travel expenses will be covered in accordance with the company's travel reimbursement policy.