The central Business Intelligence team builds and maintains the SIXT Data Platform and its Data Catalogue, serving thousands of users globally. Our flagship product, the SIXT Data Shop, ingests tens of millions of events daily, processes them to meet diverse consumer needs, and delivers insights through a comprehensive toolbox, from self-service batch processes to real-time ML model endpoints.
Our current focus areas include advancing operational automation, expanding self-service capabilities for data ingestion, ETL workflows, and distribution, and enabling innovative use cases for our customers. Our team of 20 Data Engineers and Business Intelligence Engineers is hands-on, collaborative, and constantly exploring cutting-edge technologies. At SIXT, we value success, agility, ownership, and intercultural collaboration. Join us and become part of the SIXT family.
YOUR ROLE AT SIXT
• Drive Innovation: Explore and implement the latest AWS and big data technologies to uncover hidden opportunities, enable new capabilities, and build integrations for the SIXT Data Shop.
• Collaborate Cross-Functionally: Partner with Data Engineers, BI Analysts, and Data Scientists to architect optimal solutions for diverse analytical use cases, including dashboarding, ad hoc analytics, data-as-a-product, and machine learning.
• Shape Platform Strategy: Contribute to the Data Platform vision and roadmap through your expertise, innovative ideas, and intellectual curiosity.
• Ensure Data Integrity: Demonstrate exceptional work ethics and integrity when handling sensitive customer data, maintaining the highest standards of data protection.
• Architect Agentic Pipelines for Analytics: Design and implement multi-agent workflows that automate end-to-end data operations, from ingestion and transformation to quality validation and distribution, defining clear specs, acceptance criteria, and quality gates so agents execute reliably with minimal human intervention.
• Enable AI-Powered Data Products: Drive the adoption of LLM-augmented data capabilities within the SIXT Data Shop, including RAG pipelines, semantic search over the Data Catalogue, and AI-assisted self-service experiences for internal consumers.
YOUR SKILLS MATTER
• Education & Professional Experience: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field preferred. At least 5 years of professional experience as a senior data engineer.
• Data Architecture Expertise: Strong experience with hybrid Data Lake and Data Warehouse architectures, including end-to-end automation of data models from source systems to analytical dashboards using ELT methodologies.
• Cloud Data Warehousing: Proven expertise with analytical cloud data warehouses (Redshift, Snowflake), data transformation using dbt, and orchestrating interdependent workflows with Apache Airflow.
• AI Proficiency: Hands-on experience with AI coding assistants and agentic workflow automation. Ability to leverage LLMs with contextual data, a solid understanding of RAG systems, and demonstrated experience designing agentic architectures in which LLM-driven agents are orchestrated across coding, validation, and data quality stages with structured quality gates. Experience with vector and/or graph databases feeding RAG is a strong plus.
• Agentic Development Maturity: You define specs and acceptance criteria that autonomous agents can execute end-to-end, review diffs rather than line-by-line output, and encode quality standards as automated pipeline gates. Experience building or evaluating multi-agent orchestration (coding → QA → validation agents) is a strong plus.
• Specification & Direction Skills: Ability to decompose complex data problems into structured, measurable specs (including dependencies, non-functional requirements, and edge cases) that agent systems can consume and execute with high reliability. Capable of identifying architectural drift in agent-generated outputs and course-correcting before it compounds.
• Engineering Excellence: Strong foundation in engineering best practices throughout the development lifecycle, including agile methodologies, code reviews, source control (GitHub), CI/CD pipelines (Jenkins), testing, and operations. Deep understanding of data management fundamentals, distributed systems, and data storage/compute principles.
• Programming Fluency: Advanced proficiency in Python and SQL (required).
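To make the recurring idea of "specs with acceptance criteria and automated quality gates" concrete, here is a minimal, hypothetical Python sketch. All names and thresholds (the `Spec` class, the `daily_bookings_ingest` example, the specific criteria) are illustrative assumptions, not part of the SIXT stack; the point is only that a spec becomes machine-checkable, so agent-generated output is validated before promotion rather than reviewed line by line.

```python
from dataclasses import dataclass, field
from typing import Callable

# A minimal model of a spec whose acceptance criteria are
# machine-checkable predicates over an agent's output.
@dataclass
class Spec:
    name: str
    acceptance_criteria: list[tuple[str, Callable[[dict], bool]]] = field(
        default_factory=list
    )

    def evaluate(self, output: dict) -> list[str]:
        """Return the labels of all failed criteria; an empty list means the gate passes."""
        return [label for label, check in self.acceptance_criteria if not check(output)]

# Illustrative spec for an agent tasked with building a daily ingestion table.
spec = Spec(
    name="daily_bookings_ingest",
    acceptance_criteria=[
        ("row_count_positive", lambda out: out["row_count"] > 0),
        ("no_null_keys", lambda out: out["null_key_count"] == 0),
        ("freshness_under_24h", lambda out: out["max_lag_hours"] < 24),
    ],
)

# Metrics reported by the (hypothetical) agent run.
agent_output = {"row_count": 1_250_000, "null_key_count": 0, "max_lag_hours": 3}
failures = spec.evaluate(agent_output)
assert failures == []  # gate passes; the output may be promoted
```

In a real pipeline the same pattern would run as a CI/CD or orchestration step, blocking promotion whenever `evaluate` returns a non-empty list of failed criteria.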