Dyson - UAE

Job Details

About us

At Dyson, we’re driven by a relentless pursuit of innovation—pushing boundaries in engineering, AI, and robotics. Our new Data Intelligence team sits at the heart of this mission: shaping Dyson’s future through data. Here, we blend creativity, precision, and audacity to power intelligent products. We craft data strategies and pipelines that fuel the next generation of connected devices.


You’ll work alongside brilliant minds from Dyson’s global engineering team and external software and hardware partners, in an environment built for exploration, discovery, delivery, and impact.


About the role

We are seeking a Data Intelligence MLOps Engineer to design, build, and maintain the backbone of our Machine Learning lifecycle. You will be responsible for the "industrialization" of AI, moving models from experimental notebooks into robust, production-grade pipelines. Your mission is to automate the journey from raw data curation to model deployment, ensuring our CI/CD cycles are fast, observable, and reproducible.


Key Responsibilities
  • End-to-End Pipeline Orchestration: Build and manage automated workflows for data preparation, feature engineering, model training, and evaluation.
  • Machine Learning CI/CD/CT Implementation: Develop Continuous Integration (code testing), Continuous Deployment (model serving), and Continuous Training (retraining triggers) systems.
  • Infrastructure as Code (IaC): Manage scalable Machine Learning infrastructure using tools such as MLflow.
  • Model Monitoring & Observability: Implement dashboards and alerts for model drift, data skew, and system performance (latency/throughput).
  • Registry Management: Maintain the Model Registry and Feature Store to ensure versioning and lineage across all experiments.
  • Security & Compliance: Ensure data privacy and secure access controls throughout the ML lifecycle.


About you
  • 3+ years in DevOps, Data Engineering, or MLOps roles.
  • Proven Track Record: You have taken at least one ML project from a research phase to a high-availability production environment.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related technical field.
  • Orchestration Tools: Expertise in Kubeflow, Airflow, Dagster, or Prefect.
  • Containerization: Mastery of Docker and Kubernetes (K8s) for managing distributed training and inference.
  • Cloud Platforms: Deep experience with AWS (SageMaker), GCP (Vertex AI), or Azure ML.
  • Version Control: Advanced Git workflows and experience with DVC (Data Version Control) or MLflow.
  • CI/CD Frameworks: Experience with GitHub Actions, GitLab CI, or Jenkins, specifically for ML artifacts.
  • Scripting: High proficiency in Python and Bash for automation.



Dyson is an equal opportunity employer. We know that great minds don’t think alike, and it takes all kinds of minds to make our technology so unique. We welcome applications from all backgrounds, and employment decisions are made without regard to race, colour, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status, or any other dimension of diversity.

