SOFTEAM Expertise Data & IA

www.softeamgroup.fr

3 Jobs

62 Employees

About the Company

Softeam is the Consulting and Services brand of Docaposte. Its organisation is built around a broad range of Data & AI expertise to support you in digitalising your organisations, combining business know-how with technological innovation. We have unique expertise in data intelligence, data management, performance management, robotic process automation, data science, machine learning, and deep learning.

Listed Jobs

Company Name
SOFTEAM Expertise Data & IA
Job Title
Senior AI / ML Ops Engineer – Banking / Permanent Contract (M/F)
Job Description
**Job Title**

Senior AI / ML Ops Engineer – Banking (Permanent Contract, M/F)

**Role Summary**

Lead the technical delivery of large-scale, strategic Data & AI projects in a banking environment. Own end-to-end model development, production pipelines, and governance to ensure robust, explainable, and compliant AI solutions.

**Expectations**

- 5+ years of operational AI/ML experience, preferably in finance.
- Deep expertise in ML Ops tooling (Kubernetes, MLflow, Argo, Docker, Terraform, Databricks).
- Strong background in ML/deep learning frameworks (scikit-learn, PyTorch, TensorFlow).
- Proven CI/CD, orchestration, monitoring, and automation skills.
- Knowledge of AI ethics and regulatory requirements (AI Act, GDPR).
- Excellent technical communication, mentorship, and cross-functional collaboration.
- Experience with data-centric banking systems is an advantage.

**Key Responsibilities**

- Define business needs and translate them into technical specifications.
- Coach data scientists on ML, deep learning, generative AI, and agentic AI best practices.
- Design and implement production-ready, explainable AI models.
- Build, test, and maintain ML Ops pipelines (CI/CD, versioning, automated tests).
- Enforce ML Ops standards: reproducibility, governance, security, data quality, infrastructure as code.
- Ensure compliance with regulations (AI Act, GDPR, internal norms).
- Document and promote AI best practices.
- Lead production deployment and post-deployment monitoring.
- Act as technical liaison for AI validation and audits.
- Continuously scan for innovative solutions and upsell opportunities.
- Mentor and develop junior team members.

**Required Skills**

- ML Ops technologies: Kubernetes, MLflow, Argo, Docker, Terraform, Databricks.
- ML/deep learning: scikit-learn, PyTorch, TensorFlow, generative AI, agentic AI.
- CI/CD, Git, automated testing, performance tuning.
- Monitoring and observability for ML models.
- Data governance, security, and regulatory compliance.
- Strong problem-solving, analytical, and communication abilities.
- Leadership and a knowledge-sharing mindset.

**Required Education & Certifications**

- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or equivalent.
- Professional certifications in AI/ML, DevOps, or cloud platforms preferred (e.g., TensorFlow, Kubernetes, MLflow).
Paris, France
On site
Senior
04-11-2025
Company Name
SOFTEAM Expertise Data & IA
Job Title
Artificial Intelligence Specialist
Job Description
**Job Title**

Senior AI / ML Ops Engineer

**Role Summary**

Lead the end-to-end development and deployment of AI/ML solutions for strategic financial projects. Serve as the technical champion for a portfolio of AI use cases, ensuring robust, explainable, and production-grade models through rigorous ML Ops practices.

**Expectations**

- 5–8 years of proven experience in AI/ML engineering with a strong focus on ML Ops.
- Demonstrated leadership in technical delivery and in mentoring junior staff.
- Deep knowledge of data, cloud, and regulatory compliance (GDPR, AI Act).
- Operational fluency in English.

**Key Responsibilities**

- Translate business requirements into technical AI and data specifications.
- Guide model creation, validation, explainability, and industrialization.
- Define and enforce the ML Ops framework: automation, versioning, reproducibility, CI/CD, governance.
- Design and implement CI/CD pipelines for model deployment and monitoring.
- Ensure quality, compliance, and traceability of models and deliverables.
- Promote best practices in feature engineering, data validation, code quality, security, and reusability.
- Audit and validate models produced by data scientists.
- Mentor junior engineers in AI/ML Ops practices.
- Maintain an active technical watch on emerging frameworks, architectures, and tools.

**Required Skills**

- **Programming & Scripting** – Python, SQL, Bash
- **AI/ML Frameworks** – MLflow, Kubeflow, SageMaker, Vertex AI, Airflow, DVC
- **CI/CD & Containerization** – Docker, Kubernetes, GitHub Actions, Jenkins
- **DataOps & Streaming** – Spark, PySpark, Databricks, data pipelines
- **Cloud & Data Environments** – AWS, GCP, Azure (experience with relevant AI/ML services)
- **Governance & Compliance** – GDPR, AI Act, internal data governance
- **Soft Skills** – Leadership, technical rigor, clear communication, teamwork, problem-solving

**Required Education & Certifications**

- Bachelor's degree (or higher) in Computer Science, Data Engineering, or a related field.
- Professional certifications in AI/ML or cloud platforms (e.g., GCP Professional ML Engineer, AWS ML Specialty) are advantageous.
Paris, France
Hybrid
Mid level
06-11-2025
Company Name
SOFTEAM Expertise Data & IA
Job Title
DATA Engineer Big Data
Job Description
Job Title: Data Engineer Big Data

Role Summary: Senior Java Big Data consultant responsible for designing, implementing, and administering large-scale data platforms. Focus on data ingestion, processing, quality control, and performance optimisation using Hadoop, Spark, and Scala. Contribute to architecture definition and support end-to-end data governance, visualisation, and AI workflows.

Expectations:

* Minimum 5 years of professional experience as a Java Big Data consultant.
* Strong technical expertise in Java, Hadoop, Spark, and Scala.
* Proven ability to lead data engineering projects from requirements gathering through deployment and maintenance.
* Fluency in English (written and spoken).

Key Responsibilities:

* Analyze business requirements and translate them into technical solutions.
* Participate in architecture design and the definition of data platform components.
* Develop and maintain data pipelines (Spark jobs, Java services) for ingestion, transformation, and loading into the data lake.
* Ensure data quality through validation, monitoring, and testing (unit and load tests).
* Perform performance tuning and optimisation of Spark workflows.
* Conduct capacity planning, load tests, and performance benchmarking on big data clusters.
* Maintain documentation and support knowledge transfer to team members.

Required Skills:

* Java development (core, streams, concurrency).
* Big Data ecosystem: Hadoop, Spark, Scala, and related libraries (e.g., Spark SQL, DataFrames, Datasets).
* Data ingestion tools (Kafka, Flume, Sqoop, or equivalent).
* Cluster administration (YARN, Mesos, or Kubernetes, cloud or on-premises).
* Data quality verification and testing frameworks.
* Familiarity with data governance concepts and metadata management.
* Strong problem-solving, debugging, and optimisation capabilities.
* Excellent communication and collaboration skills.

Required Education & Certifications:

* Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
* Professional certifications in Big Data or cloud (e.g., Cloudera Certified Associate, Hortonworks, Microsoft Azure Data Engineer, or equivalent) are desirable.
Paris, France
Hybrid
Mid level
03-12-2025