Overview

Model N is seeking a highly skilled Finance Data Analyst to join our Business Intelligence (BI) team in India. This role partners closely with U.S. Finance and cross-functional teams to deliver data-driven insights that support financial planning, forecasting, and performance management. The ideal candidate combines strong analytical, technical, and business acumen with a passion for transforming data into clear, actionable insights.
Responsibilities

Partner with Finance and cross-functional teams to translate business questions into clear, data-driven insights that support planning, forecasting, and performance management.
Execute end-to-end analytics projects: define requirements, explore financial and operational datasets, build data models, and deliver insights through dashboards, reports, and presentations.
Develop, enhance, and maintain Power BI dashboards and reporting assets to track key financial and go-to-market metrics (e.g., pipeline coverage, ARR/NRR, revenue retention, quota attainment, customer health).
Analyze customer, revenue, and product usage data to identify trends related to growth, churn, expansion, pricing, and performance that inform strategic decision-making.
Support development of predictive or statistical models (e.g., retention/expansion propensity, forecast drivers, segmentation) using Python or other analytical tools.
Validate data accuracy and reconcile financial and operational metrics across CRM, ERP, and data warehouse sources, partnering closely with BI/IT and Data Engineering teams.
Use SQL, Python, and Azure Fabric (or similar cloud data tools) to prepare datasets, automate reporting workflows, and scale financial analytics processes.
Ensure alignment on KPI definitions, data sources, and metric governance across Finance and GTM teams to drive consistent reporting and performance transparency.
Continuously improve analytics workflows, data visualization standards, and self-service reporting capabilities to enable efficient, insight-driven decision support across the organization.
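As a concrete illustration of the retention metrics named above (ARR/NRR), the sketch below computes Net Revenue Retention with pandas. It is illustrative only; the table, column names, and figures are hypothetical, not Model N data.

```python
import pandas as pd

# Hypothetical quarterly ARR snapshots per customer (illustrative data).
arr = pd.DataFrame({
    "customer": ["A", "B", "C", "A", "B", "C"],
    "period":   ["2024-Q1"] * 3 + ["2024-Q2"] * 3,
    "arr":      [100.0, 200.0, 50.0, 120.0, 180.0, 0.0],  # customer C churned in Q2
})

start = arr[arr["period"] == "2024-Q1"].set_index("customer")["arr"]
end = arr[arr["period"] == "2024-Q2"].set_index("customer")["arr"]

# Net Revenue Retention: end-of-period revenue from the starting cohort
# divided by that cohort's starting revenue (captures expansion,
# contraction, and churn, but excludes new logos).
nrr = end.reindex(start.index).fillna(0).sum() / start.sum()
print(f"NRR: {nrr:.1%}")  # (120 + 180 + 0) / 350 -> NRR: 85.7%
```

The same cohort framing extends naturally to gross revenue retention (clipping each customer's ending ARR at its starting value).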

Requirements
5–8 years of experience in business intelligence, data analytics, or a related role.
Proven ability to deliver actionable insights from large, complex, and diverse datasets.
Experience working cross-functionally to support strategic initiatives.
Bachelor’s degree in a quantitative, technical, or business field (e.g., Statistics, Mathematics, Economics, Computer Science, Data Science, Business Analytics). Master’s degree preferred in Business Analytics, Data Science, or a related discipline.
Technical Skills
Advanced proficiency in Power BI, DAX, data modelling, and dashboard design.
Strong proficiency in Python for data analysis, statistical modelling, and automation.
Experience working with Azure Fabric or other modern cloud-based data platforms.
Strong SQL skills for querying, joining, and transforming structured data.
Understanding of data warehousing concepts (e.g., star schema, dimensional modelling); experience with platforms like Azure Synapse, Snowflake, or Redshift.
Exposure to CRM platforms (e.g., Salesforce) and familiarity with version control tools (e.g., Git).
Analytical & Business Acumen
Solid foundation in descriptive and predictive analytics, including statistical methods and machine learning.
Ability to integrate, clean, and analyze data from disparate systems.
Skilled in translating complex data into clear narratives, actionable insights, and strategic recommendations.

Job Responsibilities
• Architect and Build Pipelines: Design, develop, and maintain automated ETL/ELT pipelines to ingest data from diverse sources (ERP, CRM, Billing systems).
• Data Modeling: Design and implement scalable data models (Star Schema, Data Vault, or OBT) that support complex financial reporting, ensuring high performance and data integrity.
• Workflow Orchestration: Lead the transition from legacy manual processes to robust, automated pipelines. Use Python and AWS native orchestration to engineer scalable infrastructure that powers high-availability data products.
• Optimization & Scaling: Continuously improve data ingestion throughput and query performance to handle increasing volumes of Financial, Sales, and Marketing data.
• Data Governance & Quality: Implement custom Python-based validation frameworks and CloudWatch monitoring to ensure gold-standard accuracy for financial metrics like ARR, NRR, and Churn.
• Cross-Functional Collaboration: Partner with BI Analysts and functional teams to translate business requirements into technical data specifications and architectural designs.
• DevOps Integration: Maintain and promote code quality through version control (Git), CI/CD pipelines, and rigorous documentation of the data lineage.
• Semantic Layer: Institutionalize KPI definitions and metric governance by building a unified semantic layer; ensure data consistency across Finance and GTM systems to eliminate reporting silos and maintain a single source of truth.
• Security & Compliance: Ensure all financial data pipelines adhere to strict security standards, encryption, and access control policies.
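To make the "custom Python-based validation frameworks" responsibility concrete, here is a minimal sketch of the pattern: named rules applied to a financial dataset, returning pass/fail results that could feed a monitoring system. All names and data are hypothetical; a production framework would also emit results to CloudWatch or a similar monitor.

```python
from dataclasses import dataclass
from typing import Callable

import pandas as pd

@dataclass
class Check:
    """A named data-quality rule applied to a DataFrame (illustrative)."""
    name: str
    rule: Callable[[pd.DataFrame], bool]

def run_checks(df: pd.DataFrame, checks: list[Check]) -> dict[str, bool]:
    # Returns pass/fail per rule; in production these results could be
    # published as custom CloudWatch metrics for alerting.
    return {c.name: bool(c.rule(df)) for c in checks}

# Hypothetical ARR fact table.
facts = pd.DataFrame({"customer": ["A", "B"], "arr": [100.0, 250.0]})

checks = [
    Check("arr_non_negative", lambda d: (d["arr"] >= 0).all()),
    Check("customer_unique", lambda d: d["customer"].is_unique),
    Check("no_null_arr", lambda d: d["arr"].notna().all()),
]

results = run_checks(facts, checks)
print(results)  # e.g. {'arr_non_negative': True, 'customer_unique': True, ...}
```

The dataclass-of-rules design keeps each metric check declarative and testable, so new rules for NRR or Churn reconciliation can be added without touching the runner.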
Job Qualifications
• 4+ years of experience in data engineering, backend development, or data architecture.
• Proven track record of building and scaling production-grade data pipelines.
• Experience working cross-functionally to support strategic initiatives.
• Bachelor’s degree in computer science, software engineering, or a related technical field. Master’s degree in a technical discipline preferred.
Technical Skills
• Advanced SQL: Expert-level ability to write complex, performant queries and stored procedures.
• Programming: Strong proficiency in Python for data engineering and API integrations.
• AWS Mastery: Strong hands-on experience building and scaling production-grade pipelines using the AWS stack (S3, Glue, Redshift, Lambda, or Athena).
• Data Architecture: Mastery of data warehousing concepts, dimensional modeling, and Lakehouse architecture.
• Data Pipeline Automation: Proven experience designing and managing complex task dependencies and distributed workflows. Proficiency in using industry-standard orchestration engines to ensure resilient, scalable, and observable data movement.
• BI Support: Expertise in developing robust backend data models to support enterprise reporting. Proficiency in optimizing analytical query performance, managing tabular schemas, and establishing unified metric definitions to ensure data consistency across visualization tools.
• DevOps: Solid experience with Git and an understanding of CI/CD practices for data deployments.
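The task-dependency management described under Data Pipeline Automation can be sketched with Python's standard-library topological sorter. This is only an illustration of dependency resolution; real pipelines would run on an orchestration engine (e.g., Airflow or AWS Step Functions), and the task names here are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical ETL task graph: each task maps to the tasks it depends on.
tasks = {
    "extract_crm": set(),
    "extract_erp": set(),
    "stage": {"extract_crm", "extract_erp"},
    "model_star_schema": {"stage"},
    "publish_dashboards": {"model_star_schema"},
}

# static_order() yields a valid execution order: every task appears
# only after all of its dependencies.
order = list(TopologicalSorter(tasks).static_order())
print(order)  # extracts first, then stage, modeling, and publishing
```

An orchestration engine adds retries, scheduling, and observability on top of exactly this dependency-resolution core.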
About Model N
Model N is the leader in revenue optimization and compliance for pharmaceutical, medtech, and high-tech innovators. For more than 25 years, we have helped customers maximize revenue, streamline operations, and maintain compliance through cloud-based software, value-add services, and data-driven insights. With a focus on innovation and customer success, Model N empowers life sciences and high-tech manufacturers to bring life-changing products to the world more efficiently and profitably. Model N is trusted by over 150 of the world’s leading companies across more than 120 countries. For more information, visit www.modeln.com.