**Company:** KPMG
**Job title:** Expert Data Architect (F/H)
**Role Summary:** Design, implement, and deploy durable, scalable data solutions for diverse clients, shaping data strategies and steering technical architecture across cloud and on‑prem environments.
**Expectations:**
- Act as Tech Lead on technical projects, guiding multi‑disciplinary teams.
- Provide deep expertise in data strategy, architecture, and engineering to align with business objectives.
- Maintain awareness of evolving cloud offerings, big‑data frameworks, and regulatory requirements.
- Foster skill development in data engineering and analytics teams.
**Key Responsibilities:**
- Audit existing data platforms; identify improvements and recommend target architectures (data warehouse, lake, lakehouse, data mesh, hub‑and‑spoke).
- Define and enforce design principles and compliance with performance, cost, security, confidentiality, and regulatory constraints.
- Own end‑to‑end data pipelines (code‑based, ETL/ELT), orchestrating with Airflow, Prefect, Azure Data Factory, etc.
- Implement CI/CD pipelines for automated deployment of data workflows.
- Advise on data governance, access policies, FinOps cost controls, encryption, masking, and anonymisation.
- Mentor and elevate competencies of Data Engineers and Analytics Engineers.
- Evaluate and advise on emerging cloud services, big‑data tools, and frameworks.
**Required Skills:**
- Advanced knowledge of reference data architectures (data warehouse, data lake, lakehouse, Lambda/Kappa architectures).
- Proficiency with at least one major cloud provider (AWS, Azure, GCP) and associated managed data services.
- Strong grasp of data engineering principles, pipeline construction, version control, and best coding practices.
- Experience with scheduling/orchestration tools (Apache Airflow, Prefect, Azure Data Factory).
- Competence in setting up CI/CD for data pipelines.
- Familiarity with GDPR, ISO standards, encryption, masking, and anonymisation techniques.
- Exposure to on‑premise solutions (RDBMS, proprietary ETL, big‑data clusters).
- Excellent communication and presentation skills, teamwork, curiosity, autonomy, and English proficiency.
**Required Education & Certifications:**
- Minimum of a Master’s degree (Bac +5) from an engineering school or technical program in statistics, mathematics, computer science, or Data/AI.
- 5+ years in data architecture, engineering, or big‑data roles, with multiple years as Tech Lead; experience in consulting or multi‑client environments preferred.