SQLI

www.sqli.com

23 Jobs

2,744 Employees

About the Company

Founded in 1990, SQLI Digital Experience is a European full-service digital company that defines, builds and grows the digital business value of international A-brands.
Their teams of technical and creative thinkers are committed to delivering meaningful and engaging experiences, leveraging technologies, methodologies, skills and creativity to get closer to the customer or user and capture their attention. They design, develop and deploy solid, high-performing architectures that improve business agility, increase efficiency and facilitate business growth.
Their 2,100 employees are located in 13 countries: France, Switzerland, Luxembourg, Belgium, The United Kingdom, Germany, Sweden, The Netherlands, Denmark, Spain, Morocco, Mauritius and Dubai. In 2021, the SQLI Group achieved revenue of €226m.

SQLI has been listed on Euronext Paris (SQI) since 21 July 2000.

For more information, visit their website: https://www.sqli.com

Listed Jobs

Company Name
SQLI
Job Title
Java Technical Expert - Build & Development
Job Description
Job title: Java Build Development Expert

Role Summary:
Senior backend consultant within a digital factory, supporting a leading retail client. Responsible for defining project architectures, overseeing development teams, autonomously managing research and development studies, delivering complex modules, and ensuring adherence to best practices and quality standards.

Expectations:
- Provide technical leadership and expertise in backend development.
- Ensure timely, high-quality delivery of features and services.
- Collaborate across teams, communicate clearly, and drive continuous improvement.
- Remain proactive, autonomous, and maintain rigorous discipline in all tasks.

Key Responsibilities:
- Define and document project architectures in collaboration with technical leads.
- Pilot and monitor ongoing development activities, enforcing coding standards and best practices.
- Own upstream R&D studies and refactor existing project architectures independently.
- Respond to technical inquiries (AVV) and provide expert guidance.
- Monitor emerging technologies (technology watch) and develop proofs of concept.
- Design and implement microservice and event-driven solutions following Agile methodologies.
- Champion software quality through automated testing, code coverage, and static analysis (SonarQube).

Required Skills:
- Proficiency in Java (17+) and Spring Boot.
- Strong experience with Kafka.
- Optional front-end: Angular 18+, React JS, or Vue JS.
- Relational database experience: PostgreSQL, SQL.
- Version control and CI/CD: Git, GitLab CI/CD pipelines.
- Cloud platforms: GCP (BigQuery, BigTable) and Azure.
- Containerization: Docker.
- Architecture: Clean Architecture, SOLID, YAGNI.
- Testing: JUnit, Mockito, Karate, Gatling (performance).
- Agile practices: Scrum.

Required Education & Certifications:
- Five-year higher-education degree (Bac+5, equivalent to an engineering school diploma) in Computer Science or a related field, or equivalent professional experience.
Rabat, Morocco
Remote
11-09-2025
Company Name
SQLI
Job Title
Expert Data & Cloud GCP
Job Description
**Job Title:** Expert Data & Cloud (GCP)

**Role Summary:**
Senior data professional to design, develop, and deploy forecasting models and analytics tools within an agile Scrum team. Works closely with data scientists, data engineers, and operational teams to extract insights from client data, implement statistical and machine-learning solutions, and support warehouse and supply-chain forecasting needs on Google Cloud Platform.

**Expectations:**
- Minimum 7 years of hands-on experience in Python and SQL for data modeling.
- Proven ability to deliver end-to-end forecasting solutions in an agile environment.
- Strong collaboration and communication skills to interact with cross-functional stakeholders.
- Ability to translate business requirements into technical specifications and testable deliverables.

**Key Responsibilities:**
1. Explore, clean, and analyze large datasets from client datacenters.
2. Define and scope new forecasting functionalities with product owners.
3. Design, develop, and validate statistical and machine-learning forecasting models (e.g., demand, warehouse load); a minimal sketch follows this description.
4. Implement models and related pipelines on GCP, preferably using BigQuery.
5. Write and execute unit and integration tests to ensure model quality.
6. Collaborate with operational teams to generate hypotheses and refine model assumptions.
7. Participate in Scrum ceremonies (sprint planning, daily stand-up, retrospectives).

**Required Skills:**
- Advanced Python (pandas, scikit-learn, statsmodels, etc.)
- Expert SQL proficiency (query optimization, data transformation)
- Experience with Google Cloud Platform services; BigQuery knowledge a plus
- Solid understanding of statistical modeling, time-series forecasting, and ML algorithms
- Agile/Scrum methodology familiarity
- Strong problem-solving, analytical, and communication abilities

**Required Education & Certifications:**
- Master's degree (Bac+5) in Computer Science, Engineering, Data Science, or a related field
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, Scrum Master) are advantageous but not mandatory.
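To give a concrete sense of the forecasting work this posting describes, the sketch below fits a plain lag-feature regression over a daily demand series, a minimal version of the "demand / warehouse load" models the role mentions. The column names ("day", "units_shipped"), the choice of lags, and the use of scikit-learn's LinearRegression are illustrative assumptions, not requirements from the posting; in the GCP setup described, the input frame would typically come from a BigQuery query rather than a local DataFrame.

```python
# Minimal lag-feature forecasting sketch (assumed columns: "day", "units_shipped").
import pandas as pd
from sklearn.linear_model import LinearRegression

LAGS = (1, 2, 7)  # previous day, two days back, and one week back (weekly seasonality)

def forecast_next_day(history: pd.DataFrame) -> float:
    """Fit a simple lag regression on a daily series and predict the next value."""
    df = history.sort_values("day").reset_index(drop=True)
    for lag in LAGS:
        df[f"lag_{lag}"] = df["units_shipped"].shift(lag)
    train = df.dropna()

    X = train[[f"lag_{lag}" for lag in LAGS]]
    y = train["units_shipped"]
    model = LinearRegression().fit(X, y)

    # Features for "tomorrow": today's value, yesterday's, and the value one week back.
    s = df["units_shipped"]
    next_row = pd.DataFrame(
        [[s.iloc[-lag] for lag in LAGS]],
        columns=[f"lag_{lag}" for lag in LAGS],
    )
    return float(model.predict(next_row)[0])

if __name__ == "__main__":
    demo = pd.DataFrame({
        "day": pd.date_range("2025-01-01", periods=60, freq="D"),
        "units_shipped": [100 + (i % 7) * 5 + i for i in range(60)],
    })
    print(f"Forecast for next day: {forecast_next_day(demo):.1f}")
```

In practice the role calls for richer statsmodels or scikit-learn pipelines plus the unit and integration tests listed above; this only shows the shape of the task.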
Rabat, Morocco
Remote
25-09-2025
Company Name
SQLI
Job Title
Pre-Employment Internship - Data/AI Engineer (F/M)
Job Description
Job title: Pre-Employment Internship – Data/AI Engineer (F/M)

Role Summary:
A 6-month internship focused on data engineering, AI model design, MLOps, data integration, and visualization. The candidate will participate in technical design, build and deploy data pipelines, develop and deploy AI/LLM models, and create dashboards using Power BI or equivalent tools, primarily on Azure and Talend environments.

Expectations:
* Current student in a final-year engineering program (Data, AI, Computer Science, or related).
* Working knowledge of relational (SQL) and non-relational (NoSQL) databases.
* Experience with data modeling (conceptual, logical, and AI model design).
* Familiarity with cloud platforms (Azure or GCP) and data services.
* Basic understanding of generative AI concepts (RAG, LLM).
* Intermediate English (B1+) and proactive communication skills.
* Willingness to learn new technologies, propose ideas, and collaborate in an agile team.

Key Responsibilities:
1. Technical design & data modeling – develop data dictionaries, flow maps, architecture diagrams, and AI model specifications.
2. Data integration – design and implement ETL workflows using Azure Data services, Talend, and Python scripting (workflows, mappings, KPI calculations, aggregations); see the sketch after this description.
3. MLOps/LLMOps – train, configure, and deploy AI services (Azure Vision, OpenAI, Mistral AI), set up monitoring, and manage the model lifecycle in production.
4. Data visualization – build and maintain dashboards and reports with Microsoft Power BI or other visualization tools.

Required Skills:
* SQL & NoSQL database systems
* Python, Spark, R (data modeling)
* Azure Data services, Talend, or equivalent ETL tools
* Power BI, Qlik Sense, Looker (data visualization)
* Azure Vision, OpenAI, Mistral AI (AI services)
* MLOps concepts, model deployment, monitoring
* Cloud platform experience (Azure or GCP)
* Familiarity with generative AI (RAG, LLM)
* Basic scripting for data processing and KPI calculations

Required Education & Certifications:
* Enrolled in or recently graduated from an engineering program with a focus on Data Science, AI, or a related field.
* Certifications preferred but not mandatory: Microsoft Azure Data Engineer Associate, Azure AI Engineer Associate, or similar.
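As a concrete illustration of the "KPI calculations, aggregations" part of the data-integration responsibilities, the sketch below computes a per-region on-time delivery rate with pandas. The table and column names (order_id, region, promised_at, delivered_at) and the KPI itself are assumptions made for the example; the internship's actual pipelines would run on Azure Data services or Talend against their own schemas.

```python
# Illustrative KPI aggregation (assumed order schema; not taken from the posting).
import pandas as pd

def on_time_delivery_kpi(orders: pd.DataFrame) -> pd.DataFrame:
    """Compute order counts and the on-time delivery rate per region."""
    return (
        orders
        .assign(on_time=orders["delivered_at"] <= orders["promised_at"])
        .groupby("region", as_index=False)
        .agg(total_orders=("order_id", "count"),
             on_time_rate=("on_time", "mean"))
    )

if __name__ == "__main__":
    demo = pd.DataFrame({
        "order_id": [1, 2, 3, 4],
        "region": ["EU", "EU", "MA", "MA"],
        "promised_at": pd.to_datetime(["2025-01-05"] * 4),
        "delivered_at": pd.to_datetime(
            ["2025-01-04", "2025-01-06", "2025-01-05", "2025-01-03"]),
    })
    print(on_time_delivery_kpi(demo))
```

A step like this would typically feed a Power BI dashboard or be folded into a Talend/Azure pipeline rather than run standalone.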
Le Grand-Quevilly, France
On site
26-09-2025
Company Name
SQLI
Job Title
DevOps Engineer (M/F)
Job Description
Job title: DevOps Engineer (M/F)

Role Summary:
Responsible for designing, implementing, and maintaining CI/CD pipelines, infrastructure automation, and cloud environments. Collaborates with development and operations teams to enhance deployment quality, reliability, and security across multiple cloud platforms.

Expectations:
- Current Master's student or recent graduate in Computer Science, DevOps, Cloud, or Security.
- Minimum practical exposure through internships or student projects in DevOps.
- 12–24 month apprenticeship, with an alternating 3 weeks on-site / 1 week study schedule.

Key Responsibilities:
- Build and manage CI/CD pipelines using GitLab CI, Jenkins, and GitHub Actions.
- Automate infrastructure provisioning with Terraform and configuration management via Ansible.
- Provision and manage resources on AWS, Azure, or GCP.
- Deploy monitoring and observability solutions (Grafana, Prometheus, ELK stack).
- Perform DevSecOps activities: vulnerability analysis, secret management, and process hardening (a small policy-check sketch follows this description).
- Work closely with development and ops teams to improve deployment workflows, quality, and reliability.
- Document best practices and maintain configuration repositories.

Required Skills:
Technical:
- Proficiency in Bash and Python scripting.
- Hands-on experience with Infrastructure as Code (Terraform, Ansible).
- Familiarity with CI/CD tooling (GitLab CI, Jenkins, GitHub Actions).
- Understanding of cloud services (AWS, Azure, GCP).
- Knowledge of monitoring stacks (Grafana, Prometheus, ELK) and basic security practices (DevSecOps).

Functional:
- Strong teamwork and communication abilities.
- Quick learning aptitude and adaptability.
- Self-sufficiency, meticulousness, and a service-oriented mindset.

Required Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Relevant certifications such as AWS Certified Solutions Architect, Azure Administrator, or Kubernetes Administrator are preferred but not mandatory.
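To make the DevSecOps and Infrastructure-as-Code responsibilities more tangible, here is a small Python policy check of the kind such a role might script: it reads the JSON produced by `terraform show -json <planfile>` and flags planned resources that lack required tags. The required tag names and the assumption that tags appear under a "tags" attribute are illustrative (they match common AWS provider resources but vary by provider); treat this as a sketch under those assumptions, not an established tool.

```python
# Sketch of a tag-policy check over `terraform show -json` output (stdlib only).
import json
import sys

REQUIRED_TAGS = {"owner", "environment"}  # hypothetical tagging policy

def missing_tag_findings(plan_path: str) -> list[str]:
    """Return one finding per planned resource that lacks a required tag."""
    with open(plan_path) as fh:
        plan = json.load(fh)

    findings = []
    for rc in plan.get("resource_changes", []):
        after = (rc.get("change") or {}).get("after") or {}
        tags = after.get("tags") or {}  # attribute layout assumed; varies by provider
        absent = REQUIRED_TAGS - set(tags)
        if absent:
            findings.append(f"{rc.get('address')}: missing tags {sorted(absent)}")
    return findings

if __name__ == "__main__":
    problems = missing_tag_findings(sys.argv[1])  # path to the exported plan JSON
    print("\n".join(problems) if problems else "All planned resources are tagged.")
    sys.exit(1 if problems else 0)
```

A check like this would normally run as a CI step after `terraform plan`, failing the pipeline when the tagging policy is violated.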
Lyon, France
On site
01-10-2025