Job Specifications
Job Description
adesso Belgium, who are we? We are an IT start-up provider working under the adesso umbrella, with a proven track record across different industries. The new Belgian entity of adesso is part of the adesso group's expansion and growth strategy. With more than 10,000 employees spread across 60 locations, we are entering an exciting period with our presence in Brussels, the capital of Europe.
As part of the adesso group's "grow together" mission, we want to attract talent ready for shared growth. This means supporting the company's vision, strategy, and growth across customers, industries, and, of course, the adessi community in Belgium.
Do you want to make data truly work? At adesso, we are looking for Data Architects who know how to translate complex data challenges into scalable, efficient solutions: people who embrace cloud technology but also understand that good data solutions start with understanding the process behind the numbers.
Job requirements
As a Data Architect at adesso, you design and guide the implementation of modern data platforms on Azure, with a strong focus on Lakehouse architectures using Databricks and Microsoft Fabric. You translate business goals into pragmatic data architectures, guard the overall coherence, and help teams deliver scalable, secure and future‑proof solutions.
You will join our growing Data & AI team in Belgium, working closely with experienced colleagues from adesso in the Netherlands and across Europe. You get the opportunity to help build a new practice, while benefiting from the knowledge, assets and stability of an established international company.
Your Role In Projects
Define data architectures & roadmaps
Design end‑to‑end data architectures on Azure, covering ingestion, storage, processing, serving and governance.
Shape Lakehouse / data platform blueprints using technologies such as Azure Databricks, Microsoft Fabric, Azure Data Lake, Azure Synapse, Azure Data Factory.
Translate business objectives and requirements into architectural roadmaps, target architectures and transition steps.
Choose and align technologies
Evaluate and position Databricks vs Microsoft Fabric vs other Azure components depending on the client context and use cases.
Define reference patterns (e.g. medallion architecture, CDC patterns, data sharing, semantic layers) and promote them across projects.
Collaborate with cloud architects, security, and domain experts to ensure solutions are aligned with enterprise standards and constraints.
Guide implementation & support teams
Work closely with data engineers, analytics engineers, BI developers and data scientists to make architecture real.
Provide guidance on data modeling (dimensional modeling / star schema, data vault, operational models) and data product design.
Review solution designs, pipelines and models, and help troubleshoot complex issues when needed.
Data governance, security & quality
Define and promote data governance principles (ownership, cataloguing, lineage, metadata, classification).
Work with security teams to implement secure-by-design architectures (RBAC, least privilege, PII/PHI handling, encryption).
Advocate and help implement data quality and observability practices (DQ checks, monitoring, alerting).
Hands-on architecture & prototyping
Stay close to the technology by building proofs of concept, reference implementations and accelerators.
Use tools such as Databricks (PySpark, SQL), Fabric Lakehouses, Notebooks, Data Pipelines, and Power BI to validate patterns and decisions.
Contribute to internal frameworks and reusable components that speed up delivery and increase quality.
Community & pre-sales
Actively contribute to the Data & AI community within adesso (Belgium and Netherlands): knowledge sharing, talks, brown bags, internal documentation.
Support pre-sales activities: contribute to proposals, estimates and solution designs; join client workshops and architecture discussions.
Mentor data engineers and other team members who want to grow towards architecture roles.
Required Technical Skills & Knowledge
Azure Databricks, Microsoft Fabric, Power BI, Apache Spark, Azure Synapse, Azure Data Lake, Azure Data Factory, Python, PySpark, SQL
Your profile
You don’t need to tick every box, but you recognize yourself in most of the following:
Technical & Architectural Skills
Solid experience designing and implementing data solutions on Azure, with exposure to Databricks and/or Microsoft Fabric.
Strong understanding of modern data platform and Lakehouse concepts (medallion architecture, Delta/Parquet, OneLake, Lakehouses, semantic models).
Strong data modeling skills for analytics: dimensional modeling / star schema is a must; data vault or 3NF is a plus.
Hands‑on background in data engineering (SQL, Python/PySpark, ETL/ELT, orchestration) and comfort diving into technical details when needed.
Familiarity with Power BI or similar analytics tools, and with how architecture impacts reporting and self‑service BI.