Company Name: UNOWHY
Job Title: DataOps Engineer (F/M)
Job Description:
Role Summary:
Engineer scalable data pipelines, automate ingestion and processing, and productionize data solutions. Collaborate closely with Product, DevOps, and Integration teams to ensure data quality and security and to bring analytics and machine‑learning models smoothly into production.
Expectations:
- Minimum 3 years’ experience as a DataOps engineer, Data Engineer, or data‑oriented DevOps professional.
- Proven ability to design, deploy, and maintain robust data pipelines (batch and real‑time).
- Proficiency in Python; knowledge of Go is a plus.
- Strong familiarity with cloud environments, CI/CD, automation, and container orchestration (Docker/Kubernetes).
- Demonstrated understanding of data quality, monitoring, and process documentation.
- Excellent analytical skills, autonomy, and a collaborative mindset.
Key Responsibilities:
- Develop, maintain, and monitor scalable data pipelines (batch or streaming with Kafka).
- Integrate data from disparate sources (e.g., CPaaS, Salesforce) to feed business analytics.
- Ensure data quality, security, and traceability across ingestion, transformation, storage, and exposure layers.
- Contribute to the architecture and upkeep of the data warehouse and related data platforms.
- Implement CI/CD practices for data projects: testing, versioning, integration, and deployment.
- Containerize data processes with Docker/Kubernetes and manage orchestration.
- Collaborate on productionizing machine‑learning models and analytical outputs.
- Optimize performance in cloud environments and document productionization workflows.
- Provide technical support to Product, Marketing, and IT teams for data access and usage.
- Lead continuous improvement of data practices, including documentation, testing, monitoring, and security.
- Set up monitoring and alerting for critical data processes.
Required Skills:
- Data pipeline design and implementation (ETL/ELT).
- Real‑time streaming technologies (Kafka or equivalent).
- Cloud platforms (public or hybrid).
- CI/CD tooling for data (Git, Jenkins/Argo Workflows, unit/integration testing).
- Containerization and orchestration (Docker, Kubernetes).
- Python programming; Go knowledge advantageous.
- Data quality strategies, monitoring frameworks, and alerting mechanisms.
- Strong analytical, troubleshooting, and documentation capabilities.
Required Education & Certifications:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field (or equivalent professional experience).
- No certifications are mandatory; relevant industry certifications (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer) are considered a plus.
Neuilly-sur-Seine, France
Hybrid
Junior
24-10-2025