- Company Name: Doctolib
- Job Title: Senior Data Ops Engineer (x/f/m)
- Job Description:
Job Title: Senior Data Ops Engineer
Role Summary: Design, build, and maintain scalable, secure data infrastructure that supports the acquisition, transformation, and analysis of large-scale datasets, enabling data-driven decisions across the organization.
Expectations: A minimum of five years of professional experience as a DataOps Engineer or in an equivalent role, delivering complex, production-grade data pipelines and infrastructure. Proven track record in automation, performance tuning, and cross-functional collaboration.
Key Responsibilities:
- Architect and implement reliable data warehousing solutions (e.g., Redshift, BigQuery) and associated storage layers.
- Develop, test, and deploy end-to-end ETL/ELT pipelines using orchestration tools (e.g., Airflow) and containerization (Docker); a minimal pipeline sketch follows this list.
- Apply infrastructure‑as‑code (Terraform) to provision and manage cloud resources on AWS, Azure, or GCP.
- Automate data workflows, monitoring, alerting, and cost optimization across the data platform.
- Troubleshoot pipeline failures, performance bottlenecks, and data quality issues; implement proactive remedies.
- Collaborate with data engineers, scientists, product owners, and security teams to translate requirements into technical solutions.
- Communicate progress, risks, and technical decisions clearly to non‑technical stakeholders.
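
To make the orchestration and data-quality responsibilities above concrete, here is a minimal sketch of an Airflow DAG with an extract -> validate -> load flow. It assumes Airflow 2.4+ (for the `schedule` argument); the DAG id, sample records, and warehouse step are illustrative placeholders, not an actual Doctolib pipeline.

```python
"""Minimal Airflow DAG sketch: extract -> validate -> load."""
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(ti):
    # Placeholder: a real task would query a source API or database.
    rows = [{"id": 1, "status": "ok"}, {"id": 2, "status": "ok"}]
    ti.xcom_push(key="rows", value=rows)


def validate(ti):
    # Data-quality gate: fail the task (triggering retries/alerts) on an empty batch.
    rows = ti.xcom_pull(task_ids="extract", key="rows")
    if not rows:
        raise ValueError("Data-quality check failed: empty batch")


def load(ti):
    # Placeholder: a real task would COPY the batch into Redshift/BigQuery.
    rows = ti.xcom_pull(task_ids="extract", key="rows")
    print(f"Loading {len(rows)} rows into the warehouse")


with DAG(
    dag_id="example_elt_pipeline",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
):
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_validate >> t_load
```

In practice, the failing validate task would feed the platform's alerting (e.g., via retries and on-failure callbacks) rather than simply raising, which is where the automation and monitoring responsibilities above come in.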
Required Skills:
- Python programming (data manipulation, API integration); see the sketch after this list.
- Experience with modern data warehouses (Redshift, BigQuery, Snowflake, etc.).
- Proficiency with Docker, Airflow, and CI/CD pipelines for data.
- Hands-on experience with cloud services (AWS, Azure, or GCP) and IAM/security best practices.
- Infrastructure‑as‑code with Terraform or equivalent.
- Strong analytical, problem‑solving, and debugging capabilities.
- Excellent written and verbal communication; ability to work cross‑functionally.
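
As an illustration of the first skill above (Python for data manipulation and API integration), here is a small self-contained sketch; the endpoint URL and field names are hypothetical assumptions, not a real source system.

```python
"""Sketch: Python data manipulation plus API integration."""
import requests


def fetch_users(base_url: str) -> list[dict]:
    # Pull a resource over HTTP, raising on 4xx/5xx responses.
    resp = requests.get(f"{base_url}/users", timeout=10)
    resp.raise_for_status()
    return resp.json()


def normalize(users: list[dict]) -> list[dict]:
    # Keep only the fields downstream models need; lowercase emails.
    return [
        {"id": u["id"], "email": u.get("email", "").lower()}
        for u in users
        if u.get("id") is not None
    ]


if __name__ == "__main__":
    raw = fetch_users("https://api.example.com")  # hypothetical endpoint
    print(normalize(raw))
```

Separating the fetch from the transform keeps each step independently testable, which supports the debugging and data-quality work described in the responsibilities.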
Bonus/Preferred Skills:
- API design experience (FastAPI, Flask, etc.); a brief FastAPI sketch follows this list.
- Knowledge of data governance, compliance, and security frameworks.
- Continuous integration & deployment expertise for data workflows.
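
For the API-design bonus skill, a minimal FastAPI sketch serving a metric lookup; the route, model, and data below are hypothetical stand-ins for a real warehouse-backed service.

```python
"""Minimal FastAPI sketch for a data-serving endpoint."""
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

# Stand-in for a warehouse or feature-store lookup.
_METRICS = {"daily_active_users": 1234.0}


class Metric(BaseModel):
    name: str
    value: float


@app.get("/metrics/{name}", response_model=Metric)
def read_metric(name: str) -> Metric:
    # Return the named metric, or 404 if it is unknown.
    if name not in _METRICS:
        raise HTTPException(status_code=404, detail="metric not found")
    return Metric(name=name, value=_METRICS[name])
```

Assuming the file is saved as `metrics_api.py`, it can be served locally with `uvicorn metrics_api:app`.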
Required Education & Certifications:
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field (or equivalent practical experience).
- Certifications such as AWS Certified Solutions Architect, Azure Data Engineer Specialty, or Google Cloud Professional Data Engineer are a plus.