About Us:
Alfit is a new insurance venture being launched by AEGI, designed to establish a modern, customer-centric platform within the insurance sector. The business will initially focus on health, life, and travel insurance products, with a clear roadmap for future expansion across additional lines.
As we approach our official launch, we are entering a critical growth phase and actively building a high-calibre team. We are seeking professionals who are motivated to be part of a next-generation insurance company that combines innovation, operational excellence and customer focus, led by a leadership team of highly respected insurance-industry veterans.
This is a unique opportunity to contribute to building a business from the ground up and to play a meaningful role in shaping its long-term success.
Job Purpose:
The Lead Data Engineer is responsible for the end-to-end design, build and operation of Alfit Insurance's data platform, ensuring the reliable, secure and cost-efficient flow of data from source systems into the analytical, reporting and AI layers used across the organisation. As the sole data engineering specialist within the Data & Insights team, the role owns the data architecture, ingestion pipelines, transformation logic and storage layers, and partners closely with the Lead Business Intelligence, Lead Data Science & AI and Lead Governance & Compliance to enable data-driven decision-making across Alfit.
The position plays a key role in laying the technical foundations for advanced analytics, regulatory reporting and AI use cases from inception through growth phases. The role is initially designed as a senior technical individual contributor with end-to-end ownership of the data engineering domain. As the Data & Insights function scales and additional data engineers are hired over time, the role may evolve to include people-management responsibilities.
Job Responsibilities:
1. Data Architecture, Platform & Pipeline Engineering:
• Own the design, build and continuous evolution of Alfit's data architecture, including the Azure Databricks Lakehouse, Medallion layering (Bronze, Silver, Gold) and storage strategy.
• Build and maintain robust, scalable and cost-efficient data pipelines that ingest data from core source systems (policy administration system (PAS), third-party administrator (TPA), CRM, telephony, claims, finance and third-party feeds) through batch, micro-batch and streaming patterns.
• Develop and maintain transformation logic using dbt, PySpark and SQL, and orchestrate workflows through Azure Data Factory and streaming pipelines via Azure Event Hubs.
• Design and implement data models (raw, curated, semantic) optimised for downstream BI, analytics and AI consumption.
• Own the ETL/ELT (extract-transform-load / extract-load-transform) processes end-to-end, including source-system integration patterns, error handling, retries and recovery mechanisms.
• Act as the technical authority on data engineering matters within the Data & Insights team, in close collaboration with the Head of Data & Insights and peer Leads.
2. Internal Stakeholder Orientation:
• Partner with the Lead Business Intelligence, Lead Data Science & AI and Lead Governance & Compliance to understand their data needs and deliver fit-for-purpose datasets, pipelines and platform capabilities.
• Engage with business stakeholders across Operations, Underwriting, Actuarial, Finance, Risk and Compliance to translate business needs into robust data products and prioritise the engineering backlog accordingly.
• Promote a service-oriented engineering mindset and advocate for the business perspective in technical discussions.
3. Data Governance & Compliance Orientation:
• Implement data engineering practices that meet CBUAE, PDPL, ADHICS and applicable health-regulator data-protection and retention requirements, as well as Alfit's internal data governance standards.
• Implement and maintain data classification, lineage, cataloguing and access controls through Microsoft Purview and the relevant Azure security services, in coordination with the Lead Governance & Compliance.
• Apply data minimisation, masking, encryption and audit-logging principles by design across all pipelines and datasets, with particular attention to personal and health-sensitive data.
• Support internal and external audits by providing documentation, addressing queries and executing corrective actions on data flows, lineage and access controls.
4. Performance & Quality:
• Define, monitor and report on data platform service-level indicators, including pipeline freshness, completeness, accuracy, latency and uptime, taking corrective action where targets are at risk.
• Implement and maintain a data quality framework, including automated tests, anomaly detection and reconciliation routines against source systems, in coordination with the Lead Governance & Compliance.
• Monitor platform consumption and cost (Databricks DBUs, storage, compute, egress) and proactively optimise workloads to stay within agreed budgets.
• Maintain comprehensive technical documentation, runbooks and knowledge-base content covering pipelines, data models, lineage and incident response procedures.
• Maintain clear communication with internal stakeholders (Data & Insights peers, IT, Enterprise Architecture, Information Security, business sub-units) and external parties (Bytesforce, Nagarro, Microsoft and other technology partners).
• Provide regular performance and operational reports to the Head of Data & Insights to inform decision-making.
5. Continuous Improvement:
• Contribute to the development of Business Requirements and lead the technical design and delivery of new data products, integrations and platform enhancements.
• Lead User Acceptance Testing (UAT) for new data pipelines, models and platform components, ensuring business requirements are met before production release.
• Continuously evaluate emerging tools, frameworks and patterns in the data engineering ecosystem, and propose innovations that improve reliability, performance, cost efficiency or developer productivity.
• Contribute to the evolution of the overall data architecture in collaboration with Enterprise Architecture, Information Security and the wider IT team, including the roadmap for advanced analytics, machine learning and AI enablement.
6. Overarching Accountabilities:
• Adhere to Alfit's standards, policies and regulatory requirements at all times.
• Embody Alfit Insurance's core values and act as a role model for the team.
• Support the Head of Data & Insights in the execution of departmental objectives and deputise on data engineering matters as required.
• Undertake additional administrative and operational tasks as required to ensure business continuity.
Job Requirements:
• Bachelor's or Master's degree in Computer Science, Software Engineering, Information Systems, Data Engineering or a related field.
• Relevant certifications (e.g., Microsoft Certified: Azure Data Engineer Associate (DP-203), Databricks Certified Data Engineer Professional) are an advantage.
• Minimum 7 years of hands-on data engineering experience, with proven end-to-end ownership of production-grade data platforms.
• Proven track record delivering cloud-based data platforms, ideally on Azure.
• Prior experience in the insurance, banking or healthcare industry is an advantage, with exposure to regulatory data and reporting.
• Experience operating as a sole or principal data engineer in a small team is an advantage.
• Strong, hands-on proficiency in Python (including PySpark) for production-grade data engineering, with demonstrated experience building ETL/ELT pipelines at scale.
• Deep, hands-on expertise in the Azure data stack, including Azure Databricks (Lakehouse, Delta Lake, Unity Catalog), Azure Data Factory and Azure Event Hubs.
• Strong proficiency in SQL and a solid understanding of data modelling techniques (dimensional, data vault, normalised) and their trade-offs.
• Experience with the Medallion architecture pattern (Bronze, Silver, Gold) in a production environment.
• Excellent oral and written communication skills in English.