Exsolvæ

www.exsolvae.com

1 Job

12 Employees

About the Company

Data has changed our world and the way we, as professionals, interact with industry. The days when expertise in a single field was sufficient are gone. The Data Era has arrived, bringing opportunities but also new problems: today, solving an issue with a monofocal expertise at one end of the Data Flow can open a dramatic gap at the other.

Exsolvæ provides an environment of knowledge sharing for data-related professionals, building a solid understanding of what other experts do and the issues they face while working on the same data flow, so that a new generation of pluridisciplinary engineers can emerge.

From the Data Generating Process at the shop-floor level to its Valorization through Dashboarding or Visualization tools for Top Management, the Automation Engineers, Data Scientists, Data Analysts, Computer Scientists, Machine Learning Engineers, and Artificial Intelligence Engineers on our teams are not just consultants; they are Solvers.

Their expertise doesn't come just from books.

Their expertise comes from the experience and knowledge they shared.

Their expertise comes from each and every one of them and the solutions they deployed.

Their expertise comes, EX-SOLVÆ.

Listed Jobs

Company Name
Exsolvæ
Job Title
Consultant - Data Engineering
Job Description
Role Summary:
Design, build, and optimize Azure‑native data solutions for enterprise clients, enabling scalable lakehouse and analytics environments. Lead end‑to‑end ETL/ELT pipelines, ensure data quality, and collaborate with cross‑functional teams to deliver AI‑ready data products.

Expectations:
* Deliver production‑grade data pipelines and lakehouse models in a fast‑paced, client‑focused setting.
* Demonstrate expertise in Microsoft Azure data services, performance tuning, and governance.
* Lead or support data‑engineering initiatives from concept through deployment, continuously improving automation and security.

Key Responsibilities:
1. Architect and develop Azure‑based data pipelines using Data Factory, Azure Synapse, Azure Databricks, and Data Lake.
2. Build and optimize ETL/ELT workflows with PySpark and SQL.
3. Design and implement lakehouse solutions using Synapse and Delta Lake, ensuring scalability and performance.
4. Establish data quality, lineage, and reliability through automated testing and monitoring frameworks.
5. Integrate large‑scale datasets across hybrid and multi‑cloud environments.
6. Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions.
7. Apply best practices for security, compliance, and cost efficiency in cloud architecture.
8. Stay current with Azure, Databricks, and Microsoft Fabric trends; evaluate new tools and approaches for automation and governance.
9. Mentor junior engineers and contribute to knowledge sharing within the team.

Required Skills:
* Deep experience with Azure Data Factory, Azure Synapse Analytics, Azure Databricks, and Azure Data Lake.
* Proficiency in PySpark, SQL, and ETL/ELT design.
* Strong understanding of lakehouse architecture, Delta Lake, and performance tuning.
* Ability to build automated data quality and monitoring pipelines.
* Familiarity with cloud security best practices and governance frameworks.
* Excellent communication and collaboration skills for cross‑functional projects.
* Knowledge of streaming data frameworks (Kafka, Azure Event Hub) and multi‑cloud proficiency (AWS, GCP) is a plus.

Required Education & Certifications:
* Bachelor’s degree in Computer Science, Electrical Engineering, or a related field (or equivalent professional experience).
* Microsoft Fabric certification is preferred; hands‑on experience with Fabric components (OneLake, Data Factory, Synapse, Power BI) is desirable.
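The "automated data quality" responsibility above can be illustrated with a minimal, library-free sketch of a quality gate that validates records before loading. The rule types and field names (`sensor_id`, `reading`) are hypothetical examples for illustration, not Exsolvæ's actual framework or any specific Azure service:

```python
# Minimal data-quality gate sketch: validate records against simple rules
# before loading them downstream. Field names and rules are illustrative only.

def check_record(record, required_fields, non_negative_fields):
    """Return a list of rule violations for one record."""
    violations = []
    for field in required_fields:
        # Treat missing keys, None, and empty strings as absent values.
        if record.get(field) in (None, ""):
            violations.append(f"missing required field: {field}")
    for field in non_negative_fields:
        value = record.get(field)
        if isinstance(value, (int, float)) and value < 0:
            violations.append(f"negative value in {field}: {value}")
    return violations

def run_quality_gate(records, required_fields, non_negative_fields):
    """Split a batch into passing records and rejected (record, reasons) pairs."""
    passed, rejected = [], []
    for record in records:
        violations = check_record(record, required_fields, non_negative_fields)
        if violations:
            rejected.append((record, violations))
        else:
            passed.append(record)
    return passed, rejected

# Example batch with one valid row and one bad row
# (missing sensor_id, negative reading).
batch = [
    {"sensor_id": "A1", "reading": 42.0},
    {"sensor_id": "", "reading": -3.5},
]
passed, rejected = run_quality_gate(batch, ["sensor_id"], ["reading"])
```

In a production pipeline the same pattern would typically run as a step inside the orchestration tool, with rejected rows routed to a quarantine table and violation counts fed into monitoring.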
Brussels, Belgium
On site
04-11-2025