ENGIE - International Supply & Energy Management

gems.engie.com

3 Jobs

1,470 Employees

About the Company

ENGIE supports global businesses in their decarbonization journey through reliable, flexible, and affordable energy solutions. By connecting clients to energy assets via global markets and local operations, ENGIE makes the energy transition tangible, scalable, and impactful.
Through long-term supply, risk and flexibility management, and sustainable energy services, ENGIE helps clients navigate complex markets, reduce emissions, and secure their energy future, leveraging deep market expertise and an extensive international footprint.

Key figures - International Supply & Energy Management perimeter:
* 500 TWh of energy delivered annually
* 59 GW of renewable & flexible assets managed
* 200,000+ global business clients
* 4.3 GW of green PPAs signed
* 250 LNG cargoes handled per year
* 19 offices around the world

Our commitments
* Deliver 24/7 carbon-free electricity for global business needs
* Help businesses manage energy risks and market exposure
* Enable smarter energy systems through data, digital tools, and expertise

Listed Jobs

Company Name
ENGIE - International Supply & Energy Management
Job Title
DevOps Engineer
Job Description
Job Title: DevOps Engineer

Role Summary: Support the design, development, and deployment of data pipeline applications and infrastructure for large-scale forecasting, trading models, and cloud-based solutions. Ensure scalable, secure, and high-performing systems through automation and collaboration with software, data, and operations teams.

Expectations:
- Automate provisioning and testing processes using Terraform and CI/CD pipelines.
- Troubleshoot technical issues in development and production environments.
- Enforce cloud security, compliance standards, and operational best practices.
- Collaborate cross-functionally to deliver reliable software solutions.

Key Responsibilities:
- Design, develop, and deploy data pipeline application code and infrastructure.
- Automate deployment and provisioning using Terraform and automation tools.
- Optimize application testing and deployment processes.
- Troubleshoot and resolve technical issues across environments.
- Collaborate with development, operations, and security teams for efficient software delivery.
- Ensure adherence to cloud security and compliance standards.

Required Skills:
- Proficiency in AWS and Linux-based systems.
- Experience building CI/CD pipelines.
- Strong working knowledge of Terraform and automation tools.
- Proficiency in Python or an equivalent programming language.
- Familiarity with containerization, monitoring, and version control.
- Advanced troubleshooting and problem-solving skills.
- Strong communication and team collaboration abilities.

Required Education & Certifications:
- Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or a related field (or an Associate’s degree with three years’ industry experience in DevOps/cloud infrastructure).
- Minimum two years’ professional experience in DevOps or cloud infrastructure roles.
Chicago, United States
Hybrid
Junior
02-10-2025
Company Name
ENGIE - International Supply & Energy Management
Job Title
Senior Data Scientist
Job Description
**Job Title** Senior Data Scientist

**Role Summary**
Lead advanced quantitative and machine‑learning projects to optimise short‑term power trading in the Western European markets (Belgium, France, Germany, Netherlands). Own the full model lifecycle, from data ingestion and feature engineering to algorithm deployment and performance monitoring, while driving cross‑functional collaboration with traders, analysts and IT to align solutions with business objectives.

**Expectations**
- Deliver robust, high‑frequency predictive models for price, volume and imbalance signals.
- Translate market and asset data into actionable trading strategies and optimisation tools.
- Maintain rigorous version control, documentation and CI/CD pipelines to ensure reproducible results.
- Communicate complex model insights clearly to stakeholders with limited technical background.

**Key Responsibilities**
- Design, train and evaluate statistical/ML models using large, real‑time market datasets.
- Extend and refine optimisation and trading algorithms to improve portfolio performance.
- Monitor model performance, trigger alerts, and initiate model retraining when drift is detected.
- Coordinate project milestones, risk assessments and deliverables with cross‑functional teams.
- Engage traders and analysts to capture business requirements and translate them into analytical solutions.
- Collaborate with IT and architecture teams on deployment, scaling and integration of models into production environments.

**Required Skills**
- Proficiency in Python (pandas, NumPy, scikit‑learn, PyTorch/TensorFlow).
- Strong background in statistical modelling, time‑series analysis and optimisation techniques.
- Experience with version control (Git), CI/CD (Azure DevOps, GitHub Actions) and cloud deployment.
- Excellent written and verbal communication, capable of explaining technical concepts to non‑technical audiences.
- Knowledge of power markets, electricity trading, and asset optimisation.

**Required Education & Certifications**
- MSc or PhD in Computer Science, Applied Mathematics, Statistics, Physics, or a related quantitative field.
- Minimum 5 years of industry experience in data science, quantitative research or a similar technical role.
Brussels, Belgium
On site
Senior
10-10-2025
Company Name
ENGIE - International Supply & Energy Management
Job Title
Data Engineer
Job Description
Job Title: Data Engineer

Role Summary: Design, develop, and maintain large‑scale data pipelines and data architecture for energy analytics. Ensure data availability, quality, and performance while collaborating with data scientists, business, and IT teams. Deploy solutions on cloud platforms (AWS, Azure, GCP) and optimize processing with big‑data frameworks (Spark, Databricks).

Expectations:
* Deliver robust, scalable pipelines that support real‑time and historical asset data analysis.
* Maintain high data quality and governance standards.
* Resolve incidents swiftly and implement continuous improvements.
* Adapt pipeline designs to evolving data scientist requirements.
* Uphold coding standards, documentation, and CI/CD practices.

Key Responsibilities:
* Architect and implement ETL/ELT processes using Python, SQL, Spark, and Databricks.
* Configure and manage cloud services (AWS S3, Glue, Athena, or equivalents).
* Build and optimize data warehouses, ensuring performance and reliability.
* Collaborate with data scientists to translate analytical needs into technical solutions.
* Monitor production pipelines, troubleshoot issues, and conduct root‑cause analysis.
* Apply DevOps principles: Infrastructure as Code, CI/CD, and automated testing.
* Participate in Agile ceremonies and sprint planning.
* Enforce data governance and security best practices.

Required Skills:
* Strong programming in Python and SQL.
* Experience with Spark and Databricks.
* Proven knowledge of ETL tools and pipeline orchestration.
* Cloud proficiency (AWS, Azure, or GCP): S3, Glue, Athena, or equivalents.
* Data warehousing concepts and best practices.
* DevOps, IaC, and CI/CD fundamentals.
* Agile methodology and team collaboration.
* Analytical problem‑solving and attention to detail.
* Excellent communication in English; French required.

Required Education & Certifications:
* Master’s degree in engineering or equivalent with an IT focus.
* Minimum 3 years of professional experience designing and maintaining data pipelines.
* Demonstrated proficiency in big‑data and cloud technologies.
Brussels, Belgium
On site
28-11-2025