**Company:** JAKALA
**Job Title:** Senior Data Engineer
**Role Summary:**
Lead the design, development, and industrialization of data platforms for enterprise clients within a Data & AI practice. Collaborate with data scientists, MLE/MLOps engineers, and stakeholders to deliver scalable, secure, and high‑performance ETL/ELT solutions, data pipelines, and data‑warehouse/lake architectures.
**Expectations:**
- Minimum of 5+ years of professional data engineering experience.
- Proven ability to architect end‑to‑end data pipelines and cloud‑native data platforms.
- Strong problem‑solving mindset with autonomy to drive projects from concept to production.
- Commitment to technical documentation, knowledge sharing, and continuous learning.
**Key Responsibilities:**
- Contribute to the firm’s data‑service offering and industrialize client data platforms.
- Analyze business requirements and propose optimal technical solutions for digital platforms and internal projects.
- Design and define ETL/ELT architectures (UML, API specs) in collaboration with peers.
- Build connectors to ingest data from internal/external sources and develop batch & real‑time processing pipelines (Spark, Kafka, etc.).
- Implement data storage, modeling, cleansing, and governance in Data Lakes, Warehouses, and Lakehouse environments.
- Produce technical documentation (diagrams, API docs) and mentor team members.
- Perform data mapping, flow cataloguing, and R&D for emerging technologies.
- Ensure platform scalability, security, stability, and availability; set up monitoring, sequencing, and edge‑case handling.
- Deliver dashboards and reporting automations for business insights.
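As a small illustration of the batch side of these responsibilities, the sketch below implements a minimal extract–cleanse–load pass in plain Python. This is only a dependency-free stand-in for what would normally run on Spark or behind a Kafka connector; the `id`/`amount` field names and the `sales` table are illustrative assumptions, and SQLite stands in for a warehouse.

```python
import json
import sqlite3

def cleanse(records):
    """Drop records missing required fields and de-duplicate by id.
    The 'id'/'amount' field names are illustrative assumptions."""
    seen = set()
    clean = []
    for rec in records:
        if rec.get("id") is None or rec.get("amount") is None:
            continue  # edge-case handling: skip incomplete records
        if rec["id"] in seen:
            continue  # de-duplicate on the primary key
        seen.add(rec["id"])
        clean.append(rec)
    return clean

def load(records, conn):
    """Load cleansed records into a warehouse-style table (SQLite stands in here)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO sales (id, amount) VALUES (:id, :amount)", records
    )
    conn.commit()

# Extract: in practice this would come from a connector (REST API, Kafka topic, file drop).
raw = json.loads(
    '[{"id": 1, "amount": 9.5}, {"id": 1, "amount": 9.5}, {"id": 2, "amount": null}]'
)
conn = sqlite3.connect(":memory:")
load(cleanse(raw), conn)
print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # → 1
```

In a production pipeline each stage would be a separate, monitored component (connector, transformation job, warehouse loader); collapsing them here just keeps the shape of the flow visible.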
**Required Skills:**
- **Programming & Scripting:** Python (Pandas, REST APIs, FaaS), Java (Kafka Connect, SOAP), PySpark/Databricks.
- **ELT Tooling:** dbt, targeting Snowflake and PostgreSQL.
- **Data Warehousing & Lakehouse:** Snowflake, BigQuery, PostgreSQL, Delta Lake.
- **Messaging:** Kafka, RabbitMQ.
- **Cloud & Containerization:** AWS, GCP or Azure; Kubernetes; Docker; Terraform (IaC).
- **Architecture:** Design and sizing of managed cloud services, data‑platform architecture, data mapping.
- **Analytics:** Strong analytical skills and the ability to work with large-scale datasets (Big Data, Data Lakes).
**Required Education & Certifications:**
- Bachelor’s or higher degree in Computer Science, Information Systems, Big Data, Engineering, or equivalent.
- Relevant certifications (e.g., AWS Certified Data Analytics, Google Cloud Professional Data Engineer, SnowPro, Kubernetes) are a plus.