BUILD, SCALE & OPERATE LEADING DTC BRANDS ALONGSIDE A-PLAYERS
MANEUVER MARKETING
Our Vision, Mission & Success are fuelled by our commitment to being a driving force of positive change in the health of everyday consumers, providing conscious, high-quality & innovative supplement products.
In just 5 years, we launched our own DTC Health & Wellness brand from scratch and scaled it to US$100M+ in annual sales, serving more than 3,000,000 customers worldwide with an average of 4,000 daily orders across 9 SKUs.
These results caught the attention of The Financial Times, which ranked us among APAC's top High-Growth Companies. We have also been awarded 2nd place in the E50 Awards, jointly organised by The Business Times and KPMG in Singapore.
This is just the beginning of our journey, and you could be part of the next stage of our growth!
YOUR NEXT ROLE
We are seeking a Data Engineer to provide ongoing operational support for our data warehouse infrastructure. This role is focused on data reliability, proactive monitoring, incident response, and continuous platform improvement, ensuring business teams can confidently rely on data for decision-making.
This role is open to both full-time and part-time candidates. For part-time engagement, we are looking for a consistent commitment of 15–20 hours per week, with preferred overlap during Singapore business hours for collaboration and operational response.
WHAT YOU'LL DO
Data Reliability & Pipeline Monitoring
Ensure data pipelines run reliably and data is fresh, accurate, and available as expected.
- Build, monitor, and respond to Daton pipeline notifications and alerts
- Track data latency, freshness, and completeness across all source systems
- Design, build, and maintain QC processors for all source data and custom reports
- Monitor job execution, investigate failures, and perform root cause analysis at:
  - Pipeline level
  - QC / validation level
  - API / source system level
- Create and enhance data pipelines, onboard new platform integrations, and implement logic changes to existing pipelines
- Coordinate with source system owners and vendors when issues originate upstream
- Monitor alerts from source systems and custom reports
- Ensure overall data reliability through proactive monitoring and validation
- Optimize query performance and warehouse costs
- Maintain documentation for all logic, schema, and pipeline changes, with a continuously updated change log
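To give candidates a concrete picture of the freshness monitoring described above, here is a minimal sketch (the table names, SLA thresholds, and alert format are illustrative assumptions, not our actual tooling):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical per-source freshness SLAs (table names and thresholds are illustrative).
FRESHNESS_SLA = {
    "shopify_orders": timedelta(hours=2),
    "ga4_sessions": timedelta(hours=24),
}

def check_freshness(table: str, last_loaded_at: datetime,
                    now: Optional[datetime] = None) -> dict:
    """Compare a table's latest load time against its SLA and report the lag."""
    now = now or datetime.now(timezone.utc)
    lag = now - last_loaded_at
    return {
        "table": table,
        "lag_hours": round(lag.total_seconds() / 3600, 2),
        "breached": lag > FRESHNESS_SLA[table],
    }
```

A check like this would typically run on a schedule, with any `breached` result routed to the team's alerting channel.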
Data Quality & Validation
Implement and maintain automated data quality checks (source + reports) to build trust and confidence in data across the organization.
- Monitor and respond to data quality and test failures
- Implement automated validation checks, including: null checks, duplicate detection, range & boundary checks, valid-value checks, and referential integrity checks
- Implement business-logic validations for key KPIs
- Perform daily validation of critical metrics against source UIs (Shopify, GA4, Meta, Klaviyo, Google Ads, Loop, etc.)
- Ensure KPI consistency across raw, transformed, and reporting layers
- Implement anomaly detection for key tables and metrics
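The validation checks above can be sketched in a few lines; this is an illustrative example only (the row shape, key column, and range thresholds are assumptions):

```python
def validate_rows(rows, key, required, valid_ranges):
    """Run basic quality checks: nulls, duplicate keys, and range/boundary violations.

    Returns a list of (row_index, column, issue) tuples for downstream alerting.
    """
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        # Null checks on required fields
        for col in required:
            if row.get(col) is None:
                issues.append((i, col, "null"))
        # Duplicate detection on the primary key
        k = row.get(key)
        if k in seen:
            issues.append((i, key, "duplicate"))
        seen.add(k)
        # Range & boundary checks
        for col, (lo, hi) in valid_ranges.items():
            v = row.get(col)
            if v is not None and not (lo <= v <= hi):
                issues.append((i, col, "out_of_range"))
    return issues
```

In practice these checks live in dbt tests or QC processors rather than ad-hoc scripts, but the logic is the same.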
Cost Optimization
Optimize warehouse performance and manage costs proactively to ensure sustainable data operations.
- Monitor and respond to billing alerts for BigQuery, dbt, and ETL tools
- Maintain cost monitoring dashboards
- Implement and optimize table partitioning and clustering
- Optimize incremental loads and expensive queries
- Proactively flag high-cost queries via Slack
- Optimize query performance where applicable
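As an illustration of the proactive cost flagging above, a minimal sketch (the job-record fields mimic BigQuery job metadata but are simplified, and the GiB threshold is an assumption):

```python
def flag_expensive_queries(jobs, threshold_gib=100.0):
    """Return Slack-style alert lines for jobs that billed more than `threshold_gib`.

    `jobs` is a list of dicts loosely modelled on BigQuery job metadata
    (field names here are illustrative, not the exact schema).
    """
    GIB = 1024 ** 3
    alerts = []
    for job in jobs:
        billed_gib = job["total_bytes_billed"] / GIB
        if billed_gib > threshold_gib:
            alerts.append(
                f":warning: query `{job['job_id']}` by {job['user_email']} "
                f"billed {billed_gib:.1f} GiB"
            )
    return alerts
```

A scheduled job would pull recent query stats, run this filter, and post the alert lines to Slack.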
Source System Monitoring & (API) Integration Management
Proactively manage issues originating from upstream systems and maintain healthy integrations.
- Monitor and respond to source schema and data-type changes
- Handle source delays caused by API limits, downtime, or auth failures
- Coordinate with vendors and internal teams to resolve upstream issues
- Assess business impact and classify incidents as P0/P1 when required
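The P0/P1 classification mentioned above follows a simple rubric; the sketch below is a toy version (the criteria and thresholds are illustrative assumptions, not our actual incident policy):

```python
def classify_incident(affects_revenue_reporting: bool,
                      tables_down: int,
                      has_workaround: bool) -> str:
    """Classify an upstream incident by business impact (toy rubric).

    P0: revenue reporting is broken with no workaround.
    P1: revenue reporting is affected, or the outage is broad.
    P2: everything else.
    """
    if affects_revenue_reporting and not has_workaround:
        return "P0"
    if affects_revenue_reporting or tables_down >= 3:
        return "P1"
    return "P2"
```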
Security & Compliance
Ensure data access and handling align with regulatory requirements and security best practices.
- Maintain GDPR, CCPA, and related compliance controls
- Manage RBAC and column-level security in BigQuery
- Ensure PII masking and access restrictions
- Respond to security incidents related to data access or credentials
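For a concrete sense of the PII masking work above, a minimal sketch (the column names and masking rules are illustrative; in BigQuery this is typically done with column-level policies rather than application code):

```python
import re

def mask_email(value: str) -> str:
    """Mask the local part of an email, keeping the first character and the domain."""
    local, _, domain = value.partition("@")
    if not domain:
        return "***"
    return local[:1] + "***@" + domain

def mask_pii(row: dict, pii_columns=("email", "phone")) -> dict:
    """Return a copy of `row` with PII columns masked (column names illustrative)."""
    masked = dict(row)
    for col in pii_columns:
        v = masked.get(col)
        if isinstance(v, str):
            # Emails keep first char + domain; phone numbers keep the last 4 digits.
            masked[col] = mask_email(v) if "@" in v else re.sub(r"\d(?=\d{4})", "*", v)
    return masked
```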
Documentation & Change Management
- Maintain documentation for pipelines, tables, and business logic
- Update test cases for logic or schema changes
- Document incident RCA and resolutions
- Maintain operational runbooks
- Manage logic and schema change requests from the business team
WHAT YOU BRING
- Strong Google BigQuery expertise (SQL optimization, partitioning, clustering)
- Experience with ETL tools (Daton, Fivetran, or similar)
- Pipeline monitoring and alerting experience
- Strong SQL for debugging and validation
- E-commerce data experience (Shopify, GA, ad platforms preferred)
- Experience maintaining production data systems
- Strong troubleshooting and RCA skills
- Clear communication with technical and non-technical stakeholders
- Proactive, ownership-driven mindset
- Ability to work independently in a remote setup
- Strong documentation discipline
TIME COMMITMENT & AVAILABILITY
Full-time
5 days per week, based on our standard full-time working schedule.
Part-time
- Expected commitment: 15–20 hours per week
- Flexible schedule, with preference for consistent weekly availability
- Preferred availability: Singapore business hours (9:00 AM – 6:00 PM SGT) for real-time collaboration
Response Time Expectations
- P0 (Critical): Acknowledgement within 2 hours on business days
- P1 (High): Acknowledgement within 4 hours on business days
- P2 (Standard): Acknowledgement within 24 hours