Aubay

www.aubay.com

11 Jobs

3,923 Employees

About the Company

Present in seven European countries, Aubay has more than 7,800 employees and generated revenue of €513.5 million in 2022, with consistent year-on-year growth.
From consulting to technological projects of all types, we provide expertise in the transformation and modernization of information systems.
We operate and add value in specific markets: banking, finance, insurance, industry, networks, telecom, services, utilities, and distribution, among others.

Listed Jobs

Company Name
Aubay
Job Title
AI Internship – Deep Learning – Computer Vision / Recognizing Corals with AI
Job Description
**Job Title:** Deep Learning Computer Vision Intern – Coral Recognition via AI

**Role Summary:** Develop AI systems that identify coral species and assess bleaching levels from images to support marine conservation. Focus on deploying deep learning models for biodiversity monitoring in coral reef ecosystems.

**Expectations:** Deliver a machine learning solution that enhances coral reef data collection, analysis, and public engagement through scalable AI tools.

**Key Responsibilities:**
- Research and implement deep learning/computer vision techniques for coral species recognition and bleaching quantification.
- Design and optimize algorithms for automated bleaching-percentage calculation.
- Curate, clean, and annotate datasets for model training and validation.
- Develop web/mobile integration for a participatory coral monitoring platform (e.g., Flask/FastAPI).

**Required Skills:**
- Proficiency in Python, TensorFlow/PyTorch, OpenCV, and data labeling techniques.
- Strong understanding of deep learning architectures and image processing workflows.
- Experience with web frameworks (Flask/FastAPI) for deploying ML models.
- Analytical problem-solving in AI model development for environmental datasets.

**Required Education & Certifications:**
- Master’s degree (or equivalent) in Computer Science, Data Science, or a related field.
- Certifications in deep learning/computer vision methodologies (e.g., Deep Learning Specialization, OpenCV) preferred.
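The responsibilities above include automated bleaching-percentage calculation. As a minimal, hypothetical sketch of the idea (a real pipeline would run OpenCV/NumPy on model-segmented coral regions; the brightness threshold of 200 is an arbitrary placeholder, not a method from this posting):

```python
def bleaching_percentage(pixels, threshold=200):
    """Estimate coral bleaching as the share of near-white pixels.

    `pixels` is a list of (r, g, b) tuples for pixels already segmented
    as coral; a pixel counts as bleached when every channel exceeds
    `threshold` (placeholder heuristic for illustration only).
    """
    if not pixels:
        return 0.0
    bleached = sum(
        1 for r, g, b in pixels
        if r > threshold and g > threshold and b > threshold
    )
    return 100.0 * bleached / len(pixels)

# Two healthy (colorful) pixels and two bleached (near-white) ones.
sample = [(120, 60, 40), (90, 110, 70), (250, 250, 245), (230, 228, 226)]
print(bleaching_percentage(sample))  # 50.0
```

In practice the threshold would be calibrated per species and lighting conditions, which is exactly the kind of tuning the internship's annotated datasets would support.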
Boulogne-billancourt, France
Hybrid
05-01-2026
Company Name
Aubay
Job Title
DevOps Engineer – Nantes (M/F)
Job Description
**Job Title:** DevOps Engineer

**Role Summary:** Design, deploy, and manage cloud-native infrastructures and CI/CD pipelines to ensure reliable, secure, and performant application delivery across cloud platforms (AWS, GCP). Drive the end-to-end project lifecycle from planning through post-production support in an Agile squad environment.

**Expectations:**
- Lead technical initiatives and provide expert guidance on DevOps best practices.
- Collaborate closely with development, security, and operations teams to align on continuous-delivery goals.
- Mentor squad members, share knowledge, and foster a culture of continuous improvement.

**Key Responsibilities:**
- Conduct feasibility studies, design architecture, automate deployments, and monitor post-deployment performance.
- Build, maintain, and scale cloud infrastructures (AWS, GCP) using IaC (Terraform, Ansible).
- Create and maintain CI/CD pipelines with GitLab CI/CD, Jenkins, Docker, and an artifact repository (Nexus/Artifactory).
- Manage containerization (Docker, Kubernetes, Helm) and the image lifecycle.
- Ensure production stability, resilience, security, and incident resolution.
- Deploy and enhance monitoring, observability, and logging solutions (Dynatrace, ServiceNow, ELK, SonarQube).
- Produce clear, up-to-date documentation for stakeholders.
- Participate in Agile ceremonies (Scrum, SAFe) and contribute to process optimization.

**Required Skills:**
Technical:
- Cloud platforms: AWS, GCP.
- Infrastructure as Code: Terraform, Ansible.
- CI/CD: GitLab CI/CD, Jenkins, Digital.ai, Artifactory, Nexus.
- Containerization: Docker, Kubernetes (including GKE), Helm.
- Scripting: Python, Shell.
- Monitoring & logging: Dynatrace, ServiceNow, ELK, SonarQube.
- Basic familiarity with Java/Maven ecosystems and CI/CD pipelines.

Soft: Agile mindset, strong communication, teaching ability, autonomous problem-solving, diligence, and teamwork.

**Required Education & Certifications:**
- Bachelor’s degree or higher in Computer Science, Engineering, or a related field.
- Professional certifications (e.g., AWS Certified DevOps Engineer, GCP Professional Cloud DevOps Engineer, HashiCorp Terraform Associate) are highly desirable.
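One responsibility above is managing the container image lifecycle. A minimal sketch of a tag-retention policy, assuming images are tagged with plain `MAJOR.MINOR.PATCH` versions (the tag names are illustrative; a real registry such as Artifactory or Nexus would be queried through its API):

```python
def tags_to_prune(tags, keep=3):
    """Return image tags to delete, keeping the `keep` newest semver tags.

    Tags are expected as 'MAJOR.MINOR.PATCH' strings; anything else
    (e.g. 'latest') is never pruned.
    """
    def parse(tag):
        try:
            major, minor, patch = (int(p) for p in tag.split("."))
            return (major, minor, patch)
        except ValueError:
            return None  # not a plain semver tag

    versioned = [t for t in tags if parse(t) is not None]
    versioned.sort(key=parse, reverse=True)  # newest first
    return versioned[keep:]

tags = ["1.0.0", "1.2.0", "latest", "1.10.1", "1.9.3", "2.0.0"]
print(tags_to_prune(tags))  # ['1.2.0', '1.0.0']
```

Note that tags are compared as integer tuples, not strings, so `1.10.1` correctly sorts above `1.9.3`.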
Nantes, France
Hybrid
06-01-2026
Company Name
Aubay
Job Title
Backend Java Developer – Nantes (F/M)
Job Description
**Job Title:** Backend Java Developer

**Role Summary:** Design, develop, and maintain robust, scalable backend services for strategic applications using Java 21, Spring Boot, PostgreSQL, and Google Cloud Platform. Work within an Agile Scrum team, leveraging Docker, Kubernetes, and GitLab CI/CD pipelines to deliver high-quality software.

**Expectations:**
- Proven experience in backend Java development (Java 11+).
- Strong grasp of clean-code principles, testability, and security best practices.
- Ability to operate autonomously while collaborating effectively in an Agile environment.
- Technical curiosity about modern architectures (microservices, cloud).
- Professional proficiency in English (preferred).

**Key Responsibilities:**
- Design, implement, and document backend features that are robust and maintainable.
- Use the established tech stack (Java 21, Spring Boot, GCP, Docker, Kubernetes) and adhere to client development standards.
- Package and deploy deliverables to test environments; manage version control and CI/CD pipelines in GitLab.
- Identify, analyze, and resolve defects detected during testing or in production.
- Collaborate with Product Owners, architects, and developers to ensure technical coherence and challenge design choices.
- Participate actively in Agile ceremonies: daily stand-ups, sprint planning, reviews, retrospectives, and demos.
- Write and execute SQL queries (PostgreSQL) to meet functional requirements.
- Drive continuous improvement of development practices, including testing, performance, and documentation.

**Required Skills:**
- Java (11+, preferably 21) and Spring Boot (REST APIs, batch processing).
- PostgreSQL and advanced SQL querying.
- Google Cloud Platform (GCP) services.
- Containerization with Docker and orchestration with Kubernetes.
- Git and GitLab CI/CD pipeline management.
- Agile Scrum methodology.
- Clean code, unit/integration testing, and security best practices.
- Strong problem-solving, communication, and teamwork abilities.
- Professional-level English (optional but advantageous).

**Required Education & Certifications:**
- Bachelor’s degree in Computer Science, Software Engineering, or a related field (or equivalent experience).
- Relevant certifications (e.g., Oracle Certified Professional: Java, Google Cloud, DevOps) are a plus.
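The posting emphasizes advanced SQL querying against PostgreSQL. As an illustrative sketch only (Python's stdlib `sqlite3` stands in for PostgreSQL here to keep the example self-contained, and the `orders` schema is invented for the example), an aggregate query with filtering and ordering might look like:

```python
import sqlite3

# In-memory stand-in database; a real service would connect to PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("alice", 120.0, "paid"), ("alice", 80.0, "refunded"),
     ("bob", 200.0, "paid"), ("bob", 50.0, "paid")],
)

# Total paid amount per customer, highest spenders first.
rows = conn.execute(
    """
    SELECT customer, SUM(amount) AS total
    FROM orders
    WHERE status = 'paid'
    GROUP BY customer
    ORDER BY total DESC
    """
).fetchall()
print(rows)  # [('bob', 250.0), ('alice', 120.0)]
```

The same `WHERE`/`GROUP BY`/`ORDER BY` shape carries over directly to PostgreSQL, which adds richer features (window functions, CTEs) that "advanced SQL querying" typically implies.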
Nantes, France
Hybrid
12-01-2026
Company Name
Aubay
Job Title
Data Engineer - Lille (H/F)
Job Description
**Job Title:** Data Engineer

**Role Summary:** Design, build, and maintain data storage and ingestion solutions. Create scalable pipelines that transform raw data into production-ready formats, expose data via APIs, and support machine-learning initiatives. Drive continuous improvement of data quality, performance, and automation while collaborating closely with stakeholders and data scientists.

**Expectations:**
- Deliver reliable, high-performance data pipelines and storage solutions on schedule.
- Apply DevOps practices, including CI/CD, to ensure rapid, repeatable deployments.
- Communicate clearly with technical and non-technical audiences.
- Work in Agile teams, actively participating in sprint planning, stand-ups, and retrospectives.
- Mentor junior engineers and contribute to knowledge sharing within the community.

**Key Responsibilities:**
1. Define the storage strategy and implement appropriate technologies (SQL, NoSQL, distributed file systems).
2. Design and build data ingestion processes (batch, micro-batch, real-time) for structured, semi-structured, and unstructured data.
3. Develop and maintain end-to-end data pipelines covering data cleansing, enrichment, quality checks, and metric calculations.
4. Create and expose APIs for data consumption by internal and external applications.
5. Collaborate with Data Scientists to industrialize and optimize ML algorithms.
6. Conduct workshops and requirement-gathering sessions with business stakeholders.
7. Ensure pipeline performance, scalability, and reliability through monitoring, testing, and optimization.

**Required Skills:**
- Proficient in Java, Scala, and Python.
- Strong experience with Hadoop, Spark, and Kafka.
- Expertise in database technologies (SQL and NoSQL) and distributed storage.
- Familiarity with cloud platforms (Microsoft Fabric, Google Cloud, Snowflake, Databricks).
- Knowledge of DevOps principles and CI/CD tooling.
- Understanding of data quality, governance, and metadata management.
- Agile methodology experience and effective communication skills.

**Required Education & Certifications:**
- Five years of higher education (Bac+5: Master’s degree or engineering school diploma) in Computer Science or a related field.
- Minimum 2 years of professional data-engineering experience in Big Data or Cloud environments.
- Relevant certifications (e.g., Microsoft Certified: Azure Data Engineer Associate, Databricks Certified Data Engineer, Snowflake SnowPro Core) are a plus.
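The responsibilities above describe pipelines that chain cleansing, quality checks, and metric calculation. A minimal pure-Python sketch of that shape (the sensor/temperature schema and validity rules are invented for illustration; a production pipeline would run on Spark or a similar engine):

```python
def run_pipeline(records):
    """Cleanse records, apply a quality check, and compute a metric.

    Each record is a dict with 'sensor' and 'temp_c' keys (illustrative
    schema). Returns (valid_records, average_temperature).
    """
    # Cleansing: drop records with missing fields, normalize sensor ids.
    cleaned = [
        {"sensor": r["sensor"].strip().lower(), "temp_c": float(r["temp_c"])}
        for r in records
        if r.get("sensor") and r.get("temp_c") is not None
    ]
    # Quality check: keep only physically plausible temperatures.
    valid = [r for r in cleaned if -50.0 <= r["temp_c"] <= 60.0]
    # Metric calculation: average temperature over valid records.
    avg = sum(r["temp_c"] for r in valid) / len(valid) if valid else None
    return valid, avg

raw = [
    {"sensor": " Lille-01 ", "temp_c": "12.5"},
    {"sensor": "lille-02", "temp_c": 999},   # fails the quality check
    {"sensor": None, "temp_c": 10.0},        # dropped during cleansing
    {"sensor": "lille-03", "temp_c": 7.5},
]
valid, avg = run_pipeline(raw)
print(len(valid), avg)  # 2 10.0
```

Each stage is a pure function of the previous stage's output, which is the same structure a distributed framework would parallelize across partitions.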
Lille, France
Hybrid
Junior
26-01-2026