- Company Name
- Scalian
- Job Title
- DataOps / Azure – M/F
- Job Description
**Job Title**: DataOps Engineer – Azure & Big Data
**Role Summary**
Lead the industrialisation, deployment, and operation of data pipelines across hybrid environments (on‑premise, Azure, and Databricks). Apply DevOps practices to automate CI/CD, ensure data governance, and maintain system reliability while collaborating closely with data science, data, and infrastructure teams.
**Expectations**
- Proven experience in a distributed, hybrid technical ecosystem.
- Strong command of Azure, Databricks, Hadoop/Cloudera, Kafka, Spark, and container platforms.
- Ability to design and optimise data pipelines that meet performance, security, and compliance requirements, including GDPR.
**Key Responsibilities**
- Architect, build, and maintain end‑to‑end data pipelines on Azure, Databricks, and on‑premise clusters.
- Automate deployment and integration workflows using CI/CD tools (Jenkins, GitLab, Azure DevOps).
- Provision and manage cloud, big‑data, and containerised infrastructures (VMware, OpenShift, Kubernetes, Docker).
- Implement and monitor data flows, leveraging tools such as Airflow, Prometheus, Grafana, and Centreon.
- Collaborate with data science, data, and infrastructure teams to translate requirements into scalable solutions.
- Enforce DevOps best practices, version control, code reviews, and testing.
- Ensure compliance with GDPR, security policies, and vulnerability-management practices (e.g., Cyberwatch).
- Troubleshoot performance issues and optimise resource utilisation across environments.
**Required Skills**
- Big Data & Cloud: Azure, Databricks, Hadoop/Cloudera, Kafka, Spark, NiFi, SQL (Hive, MySQL, PostgreSQL, Oracle, MariaDB).
- Development & Integration: Python, Scala, Shell/Linux, Ansible, Jenkins, Git/GitLab, Docker, Kubernetes, Nexus, Nginx.
- Data formats: JSON, CSV, XML, Parquet.
- Orchestration & Monitoring: Airflow, Centreon, Dynatrace, Prometheus, Grafana.
- Familiarity with data tools: Dataiku, Tableau, Hue, Dremio.
- Knowledge of GDPR, data security, and cyber‑security practices.
**Required Education & Certifications**
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or related discipline.
- Professional certifications preferred: Microsoft Azure (AZ‑900, DP‑200/DP‑201 or equivalent), Cloudera Certified Associate/Professional, Certified Kubernetes Administrator (CKA), and/or DevOps Engineer certifications.
- Continuous learning and up‑to‑date knowledge of emerging cloud and big‑data technologies.
Neuilly-sur-Seine, France
Remote
28-09-2025