Orion Innovation is a premier, award-winning, global business and technology services firm. Orion delivers game-changing business transformation and product development rooted in digital strategy, experience design, and engineering, with a unique combination of agility, scale, and maturity. We work with a wide range of clients across many industries including financial services, professional services, telecommunications and media, consumer products, automotive, industrial automation, professional sports and entertainment, life sciences, ecommerce, and education.
Job Overview:
We are seeking an experienced Senior Data Engineer to build and optimize scalable data platforms using Microsoft Fabric, Databricks (and/or Snowflake). The role focuses on designing reliable data pipelines, lakehouse and warehouse models, and semantic layers that enable enterprise analytics, BI, and AI/Gen AI use cases.
You will work closely with analytics, BI, and data science teams to deliver high-quality, performant, and governed data solutions, while driving best practices in data engineering, optimization, and platform design.
Key Responsibilities:
• Design, build, and maintain end-to-end data solutions on Microsoft Fabric and Databricks, including Pipelines, Notebooks, Lakehouse, Data Warehouse, and Semantic Models.
• Implement scalable data ingestion, transformation, and loading (ETL/ELT) using Fabric Pipelines and PySpark.
• Develop robust data models and schemas optimized for analytics, reporting, and AI-driven consumption.
• Create and maintain semantic models to support Power BI and enterprise BI solutions.
• Engineer high-performance data solutions that meet requirements for throughput, scalability, quality, and security.
• Author efficient PySpark and SQL code for large-scale data transformation, data quality management, and business rule processing.
• Build reusable framework components for metadata-driven pipelines and automation.
• Optimize Lakehouse and Data Warehouse performance, including partitioning, indexing, Delta optimization, and compute tuning.
• Develop and maintain stored procedures and advanced SQL logic for operational workloads.
• Design and prepare feature-ready datasets for AI and GenAI applications.
• Collaborate with data scientists and ML engineers to productionize AI pipelines.
• Implement data governance and metadata practices required for responsible AI.
• Leverage Fabric and Databricks capabilities to orchestrate and monitor AI-related data workflows.
• Apply data governance, privacy, and security standards across all engineered assets.
• Implement monitoring, alerting, and observability best practices for pipelines and compute workloads.
• Drive data quality initiatives, including validation frameworks, profiling, and anomaly detection.
• Partner with analytics, BI, data science, and product teams to understand requirements and translate them into technical solutions.
• Mentor junior engineers and contribute to engineering standards and patterns.
• Participate in architecture reviews, technical design sessions, and roadmap planning.
• Develop and deliver dashboards and reports using Power BI and Tableau.
Required Skills:
• Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or equivalent experience.
• 7+ years of professional experience in data engineering or a related discipline.
• Mandatory hands-on expertise with Microsoft Fabric and Databricks or Snowflake, including:
  • Pipelines
  • Notebooks
  • Lakehouse
  • Data Warehouse
  • Semantic Models
• Advanced proficiency in PySpark for distributed data processing.
• Strong command of SQL for analytical and operational workloads.
• Experience developing and optimizing stored procedures and complex SQL transformations.
• Understanding of GenAI architectures, vectorization, embedding pipelines, and data preparation for LLM use cases.
• Strong knowledge of data modeling, ETL/ELT patterns, and modern data lakehouse principles.
• Experience designing and optimizing large-scale data pipelines in cloud environments (Azure preferred).
• Excellent problem-solving skills with the ability to analyze complex data workflows.
• Proficiency in creating interactive dashboards and reports using Power BI and Tableau.
Orion is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, citizenship status, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Candidate Privacy Policy
Orion Systems Integrators, LLC and its subsidiaries and affiliates (collectively, "Orion," "we," or "us") are committed to protecting your privacy. This Candidate Privacy Policy (orioninc.com) ("Notice") explains:
• What information we collect during our application and recruitment process and why we collect it;
• How we handle that information; and
• How to access and update that information.
Your use of Orion services is governed by any applicable terms in this notice and our general Privacy Policy.