About ShyftLabs
At ShyftLabs, we live and breathe data. Since 2020, we've been helping Fortune 500 companies unlock growth with cutting-edge digital solutions that transform industries and create measurable business impact. We're growing fast, and we're looking for passionate problem-solvers who are ready to turn big ideas into real outcomes.
The Opportunity
We're looking for a Data Architect to lead the design and evolution of a Customer Data Mart that powers CDP, analytics, marketing activation, and reporting use cases. This role is for a senior technical leader who brings architectural clarity to complex customer data ecosystems, owns execution end-to-end, and sets the technical direction for scalable, secure, and resilient data platforms.
As a Data Architect, you will partner closely with client stakeholders, marketing teams, product leaders, and engineering teams to design and implement modern, cloud-native data architectures. You will operate as both a hands-on architect and a strategic advisor, shaping long-term data strategy while remaining deeply involved in technical implementation.
This role is best suited to a candidate with a strong foundation in data engineering and architecture, extensive experience with enterprise-scale data systems, and a proven ability to manage ambiguity while delivering structured, long-term solutions.
What You'll Be Doing
• Own the technical vision and architecture for the Unified Customer Data Mart, ensuring solutions are scalable, secure, compliant, and aligned with enterprise standards.
• Design and implement end-to-end data pipeline architectures, including raw data ingestion (Bronze), data cleaning and standardization (Silver), and curated data marts (Gold) that serve CDP, reporting, and activation use cases.
• Define and evolve data modeling standards for customer data, including customer dimensions, transaction facts, engagement events, web behavior, support interactions, and loyalty activity. Establish naming conventions, schema patterns, and architectural best practices across the customer data domain.
• Decompose complex business requirements into structured technical solutions and drive alignment with client stakeholders.
• Formulate, compare, and present multiple architectural approaches for data ingestion, transformation, identity resolution, and consumption patterns, guiding clients and internal teams toward optimal long-term solutions that balance speed, maintainability, and scalability.
• Architect and build production-grade data pipelines using DBT and Airflow that support customer analytics, segmentation, and reporting at scale.
• Partner directly with client stakeholders to understand business objectives, translate customer journey requirements into robust technical designs, and act as a trusted technical advisor on data architecture decisions.
• Lead and mentor cross-functional teams, including Analytics Engineers, Data Engineers, and BI developers, setting a high bar for technical quality, code review standards, and documentation practices.
• Influence and contribute to data governance initiatives, including PII handling, data quality frameworks, identity resolution strategies, and platform reliability. Define RLS policies, data retention rules, and compliance patterns.
• Drive the development of reusable frameworks and accelerators for customer data pipelines, including DBT macros, Airflow DAG patterns, and data quality tests that can be applied across multiple sources and brands.
• Contribute to technical strategy and roadmap planning for the data mart build-out, including source prioritization and integration with downstream systems (CDP, ESP, reporting tools).
What You Bring
• Deep expertise in SQL and Python, with demonstrated ability to design, optimize, and troubleshoot complex distributed data systems.
• 5+ years of experience in data engineering and/or data architecture, with a proven track record of building and scaling enterprise-level data platforms.
• Extensive experience designing and implementing data lakes, cloud data warehouses, and modern analytics architectures in production environments.
• Hands-on experience with DBT for transformations and modular data modeling.
• Hands-on experience with Google BigQuery (mandatory) or equivalent cloud warehouses (Snowflake, Databricks).
• Hands-on experience with Airflow (or a similar orchestration framework).
• Proven experience implementing medallion or layered data architectures, including raw ingestion, conformed layers, and curated marts.
• Strong foundation in dimensional modeling, star/snowflake schemas, conformed dimensions, and designing for both analytical and operational use cases.
• Experience with Customer Data Platforms (CDPs) and multi-channel customer data integration, including identity resolution (deterministic and probabilistic matching).
• Experience designing for security and compliance, including PII masking, access controls, RLS policies, encryption, and privacy regulations (GDPR/CCPA).
• Strong understanding of cloud architecture principles, including storage optimization, cost management, security patterns, and scalability in GCP environments.
• Demonstrated ability to operate independently with full architectural ownership while influencing senior stakeholders in client-facing environments.
• Experience leading and mentoring engineers, setting architectural standards, and driving technical governance.
Nice to Have
• Exposure to identity resolution solutions: understanding of PII handling, privacy compliance (GDPR/CCPA), and secure identity mapping.
• Experience with legacy enterprise systems integration, including Oracle, SQL Server, MySQL, and Hadoop. Understanding of CDC (Change Data Capture) patterns and batch vs. real-time ingestion trade-offs.
• Background in data governance, metadata management, or data observability practices. Experience with data catalogs (e.g., Alation, Collibra), data quality frameworks, and lineage tracking.
• Experience defining platform standards, reference architectures, or internal best practices for data engineering teams. Ability to create reusable patterns and accelerators.
• Background in building internal tools, shared data platforms, or accelerators used across multiple teams or clients. Experience with CI/CD for data pipelines and infrastructure-as-code.
• Understanding of BI tools (Power BI, Tableau) and how they consume data from data warehouses. Experience designing semantic layers or datasets for business-user self-service.
• Experience with retail or e-commerce customer data use cases, including transaction history, loyalty programs, marketing engagement, and cross-channel customer journeys.
Salary Range
• $120,000 - $160,000 (CAD)
Why You'll Love Working at ShyftLabs
Hybrid Flexibility: 4 days per week in our downtown Toronto office.
Comprehensive Benefits: 100% coverage for health, dental, and vision insurance for you and your dependents from day one.
Real Impact: Build AI systems used by Fortune 500 companies.
Growth & Learning: Continuous learning opportunities and influence over technical direction.
Ownership: Shape applied research and AI strategy in a fast-growing, product-focused data company.
Inclusion at ShyftLabs
We're building something big, and we want you on the journey with us. If you're ready to use data and innovation to make an impact, apply today and let's grow together.
ShyftLabs is an equal-opportunity employer committed to creating a safe, diverse, and inclusive environment. We encourage applicants of all backgrounds including ethnicity, religion, disability status, gender identity, sexual orientation, family status, age, and nationality to apply. If you require accommodation during the interview process, let us know and weβll be happy to support you.