The FreedomPay Commerce Platform is the technology of choice for many of the largest companies across the globe in retail, hospitality, lodging, gaming, sports and entertainment, foodservice, education, healthcare and financial services. FreedomPay’s technology has been purpose-built to deliver rock-solid performance in the highly complex environment of global commerce. The company maintains a world-class security environment and was the first in North America to earn the coveted validation by the PCI Security Standards Council against the Point-to-Point Encryption with EMV standard. FreedomPay’s robust solutions across payments, security, identity and data analytics are available in-store, online and on mobile, and are supported by rapid API adoption. The award-winning FreedomPay Commerce Platform operates on a single, unified technology stack across multiple continents, allowing enterprises to deliver a consistent, repeatable experience on a global scale. FreedomPay is a fast-paced, high-growth company with a great culture, competitive benefits and compensation, and a business casual atmosphere.
Responsibilities
• Design and evolve data models and database objects for operational and analytical workloads in Microsoft SQL Server and Snowflake (schemas, roles, warehouses, performance and cost optimization).
• Build and maintain ELT/ETL pipelines (batch and near-real-time), leveraging Snowflake capabilities (Snowpipe, Streams/Tasks) and orchestration tools (e.g., Airflow or Azure Data Factory) as appropriate.
• Implement and support data streaming and event-driven ingestion patterns using technologies such as Kafka and Azure Event Hubs (topics/streams, schemas, consumers, and replay strategies).
• Leverage Redis and other low-latency data stores for caching and real-time access patterns; partner with application teams to define fit-for-purpose SLAs and data freshness targets.
• Develop and maintain curated datasets and self-service analytics in Sigma Computing (workbooks, datasets, governance and performance), and support legacy reporting where needed (e.g., SSRS).
• Collaborate with engineering, analytics, and product teams to deliver data solutions that meet business requirements.
• Automate deployments using Git-based workflows and CI/CD (e.g., Azure DevOps), including database migration/versioning (e.g., Flyway).
• Use Claude Code (AI-assisted development) to accelerate data pipeline delivery (design, implementation, refactoring, documentation, and troubleshooting) while adhering to security, quality, and SDLC standards.
• Participate in Agile ceremonies and contribute to continuous improvement of data engineering processes and standards.
• Establish data quality, testing, and observability practices (e.g., unit/integration tests for pipelines, data validation, lineage, alerting, SLAs) to ensure reliable delivery.
• Partner with engineering, analytics, and product teams to define and deliver data products (source-to-target mappings, contracts, SLAs), enabling trustworthy analytics and operational use cases.
• Ensure data security, governance, and compliance across platforms (PII handling, encryption, auditing, retention), including Snowflake RBAC, secure data sharing, and access controls.
• Troubleshoot and resolve performance, reliability, and scalability issues across data platforms; instrument pipelines with logging/metrics and on-call-friendly runbooks.
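As a rough illustration of the batch ELT work described above, the following minimal Python sketch shows a watermark-driven incremental load. It is a sketch only: the row shape, the in-memory "target" list, and the watermark column are hypothetical stand-ins for the Snowflake/SQL Server objects a real pipeline would use.

```python
from datetime import datetime, timezone

def incremental_load(source_rows, target, watermark):
    """Load only rows newer than the last watermark (illustrative ELT step).

    source_rows: list of dicts, each with an 'updated_at' datetime.
    target: list acting as a stand-in for a warehouse table.
    watermark: datetime of the last successful load.
    Returns the new watermark to persist for the next run.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    target.extend(new_rows)  # a real pipeline would MERGE/COPY instead
    if new_rows:
        watermark = max(r["updated_at"] for r in new_rows)
    return watermark

# Example: two runs over the same source
t0 = datetime(2024, 1, 1, tzinfo=timezone.utc)
source = [
    {"id": 1, "updated_at": datetime(2024, 1, 2, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
]
target = []
wm = incremental_load(source, target, t0)
wm = incremental_load(source, target, wm)  # rerun loads nothing new
print(len(target))  # only the two distinct rows were loaded
```

Persisting the returned watermark between runs is what makes a rerun safe: replaying the same batch advances nothing and loads nothing twice.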
Qualifications
• Strong understanding of modern data engineering practices and tools (cloud data platforms, orchestration, testing/observability, DataOps, and AI-assisted development with Claude Code).
• Strong English reading and writing communication skills, with an ability to express and understand complex technical concepts. If other languages are required for a role, this will be noted explicitly during the recruitment process.
• Strong analytical, problem-solving, and conceptual skills.
• Hands-on experience with Snowflake and integrating it into production data pipelines.
• Experience enabling governed self-service analytics with Sigma Computing (datasets, workbooks, access controls, and performance best practices).
• Experience with streaming/event platforms such as Kafka or Azure Event Hubs, including schema/versioning considerations and operational support.
• Proficiency with Python for data engineering automation and/or building pipeline components; experience with orchestration (Airflow and/or Azure Data Factory) is strongly preferred.
• Experience using Claude Code to develop, test, and iterate on data pipeline solutions (e.g., generating boilerplate, improving SQL/Python, and speeding up root-cause analysis) with appropriate human review.
• Ability to work in teams, with strong interpersonal skills.
• Ability to work under pressure and meet tight deadlines.
• Ability to anticipate potential problems and determine and implement solutions.
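The streaming qualifications above include replay strategies and operational support; a key pattern behind them is idempotent consumption. Here is a minimal, library-free Python sketch (the event shape and per-key sequence number are hypothetical, standing in for a Kafka/Event Hubs consumer with its offset store):

```python
def consume(events, state):
    """Apply events idempotently, tolerating duplicates and replays.

    events: iterable of dicts with 'key', 'seq' (monotonic per key), 'value'.
    state: dict mapping key -> (last_seq, value); the consumer's store.
    Duplicate or stale events (seq <= last seen) are skipped, so replaying
    a stream from an earlier offset leaves the final state unchanged.
    """
    for e in events:
        last_seq, _ = state.get(e["key"], (-1, None))
        if e["seq"] <= last_seq:
            continue  # duplicate or late/out-of-order event: ignore
        state[e["key"]] = (e["seq"], e["value"])
    return state

stream = [
    {"key": "order-1", "seq": 1, "value": "created"},
    {"key": "order-1", "seq": 2, "value": "paid"},
    {"key": "order-1", "seq": 1, "value": "created"},  # replayed duplicate
]
state = consume(stream, {})
state = consume(stream, state)  # a full replay is a no-op
print(state["order-1"])  # highest seq wins: (2, 'paid')
```

Tracking the highest sequence seen per key is one simple way to turn at-least-once delivery into effectively-once processing.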
Education/Experience
• Relevant training in the principles and techniques of database development and modeling; familiarity with systems concepts, design, and standards; ability to provide expertise in software usage, functionality, performance, security, aesthetics, resilience, reuse, comprehensibility, and economic and technological tradeoffs.
• Bachelor’s degree in Computer Science, Software Engineering, MIS, or a related discipline; or equivalent practical experience.
• 7+ years of experience in data engineering and/or database engineering, including building and operating production data pipelines.
• Experience working within an Agile Scrum or SAFe software development environment.
• Strong written and verbal interpersonal communication skills in the English language.
Technical Expertise
• Expert in designing, optimizing, and scaling relational and cloud-native data platforms (SQL Server, Snowflake, Cosmos DB, Redis).
• Experience with streaming architectures and tooling (Kafka/Event Hubs/Kinesis), including delivery semantics, late/out-of-order events, and operational monitoring.
• Strong proficiency in Python and SQL for building pipeline components, automation, and data transformations; familiarity with modern ELT patterns and reusable frameworks.
• Experience with orchestration, DataOps, Git-based workflows, and CI/CD.
• Experience with data quality and observability practices.
• Analytics enablement experience with Sigma Computing, including modeling curated datasets for performance and supporting governed self-service.
• Comfortable using Claude Code for AI-assisted development to improve engineering velocity and consistency across SQL/Python codebases, tests, and documentation.
• Proven track record of architecting multi-terabyte, high-performance data solutions.
• Advanced proficiency in T-SQL, query optimization, indexing strategies, and database security.
• Experience implementing DataOps, CI/CD, and automated testing for database deployments.
• Five or more years of experience developing the full range of objects in Microsoft SQL Server 2019 (or later) databases using SQL and T-SQL.
• Five or more years of experience creating and maintaining ETL packages with SQL Server Integration Services (SSIS); at least three years of SSIS experience with SQL Server 2019 (or later) preferred.
• Five or more years collaborating with team members to resolve SQL, T-SQL, SSIS, and SSRS performance tuning problems.
• Working knowledge of deployments utilizing Database Projects, ISPACs, and DACPACs preferred.
• Basic applied (non-academic) C# programming experience is a plus.
• Familiarity or experience with Azure SQL and Azure Data Factory is a plus.
• Experience with version control, preferably Git.
• Experience with work management tools such as Jira or Azure DevOps.
• Proficient in the SDLC and able to follow a process through its full lifecycle.
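To make the automated-testing expectation above concrete, here is a minimal sketch of the kind of data-quality assertions a pipeline test might run before promoting a deployment. It is pure Python and purely illustrative: the column names, the 'id' primary key, and the checks themselves are assumptions, not a prescribed framework.

```python
def validate_batch(rows, required_cols, min_rows=1):
    """Run basic data-quality checks on a loaded batch (illustrative only).

    Returns a list of human-readable failure messages; an empty list means
    the batch passed. Checks: minimum row count, no nulls in required
    columns, and unique primary keys ('id').
    """
    failures = []
    if len(rows) < min_rows:
        failures.append(f"expected at least {min_rows} rows, got {len(rows)}")
    for col in required_cols:
        missing = [i for i, r in enumerate(rows) if r.get(col) is None]
        if missing:
            failures.append(f"column '{col}' null/absent in rows {missing}")
    ids = [r.get("id") for r in rows]
    if len(ids) != len(set(ids)):
        failures.append("duplicate primary keys detected")
    return failures

good = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
bad = [{"id": 1, "amount": None}, {"id": 1, "amount": 3.0}]
print(validate_batch(good, ["id", "amount"]))  # [] -> batch passes
print(len(validate_batch(bad, ["id", "amount"])))  # 2 failures: null + duplicate
```

In a CI/CD flow these checks would gate the deployment step, failing the run (and alerting) rather than loading a bad batch downstream.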
Attributes
• Strong collaborative approach with team members inside and outside of the database development group to develop solutions and solve problems.
• Professional, positive, and self-motivated approach.
• Strong relationship-building and written and verbal communication skills at all levels.
• Ability to work independently and as part of a team, managing multiple tasks and priorities.
• Commitment to continuous learning and adapting to new technologies and best practices.
• Exceptional communication skills, able to articulate complex concepts to technical and non-technical audiences.
• Commitment to continuous learning and driving organizational excellence.
As the fastest-growing commerce company in the industry, we offer the opportunity for tremendous upward mobility within the company as well as development and professional growth opportunities. FreedomPay's full-time roles provide exceptional benefits including medical, prescription, dental and vision coverage, Life Insurance, Retirement Plans with company match, a commission sharing plan, a flexible hybrid working environment, and great parental and other leave programs. All positions must be able to successfully pass a background check as well as a credit check.
FreedomPay is an Equal Opportunity Employer, including Disability/Veterans. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.