- Company Name: Believe
- Job Title: Senior Data Engineer
- Job Description:
**Job Title**
Senior Data Engineer
**Role Summary**
Lead the design, development, and operation of end‑to‑end data pipelines within the Data Platform squad, using an AWS and Snowflake stack to ingest, transform, and stage large volumes of data across the Bronze, Silver, and Gold layers. Ensure performance, cost efficiency, and security, and document best practices.
**Expectations**
- 6+ years of professional data engineering experience with AWS and Snowflake.
- Strong proficiency in Python, Spark (Scala or PySpark), SQL, and Databricks.
- Demonstrated experience designing automated ingestion pipelines using AWS Step Functions, S3, and related services.
- Proven ability to model data marts, write unit and integration tests, and implement monitoring and alerting.
- Familiarity with scaling, cost‑control, and security practices in cloud data environments.
- Comfortable working in a Scrum/Agile team, contributing to both Build and Run responsibilities.
- Fluent in English (written and spoken).
**Key Responsibilities**
- Architect and develop scalable ingestion pipelines on AWS (Step Functions, EC2, Lambda, S3).
- Execute data transformations using Python, Spark/Scala, Databricks notebooks, and SQL.
- Implement data quality checks and unit tests (Python unittest/pytest, Spark testing frameworks).
- Model and maintain Bronze, Silver, and Gold data layers in Snowflake, applying proper data modeling techniques.
- Monitor pipeline performance, error rates, and data freshness; troubleshoot and optimize.
- Track and report on cloud costs and enforce security best practices (IAM, encryption, network isolation).
- Produce and maintain technical documentation (Data Galaxy, Confluence).
**Required Skills**
- AWS services: Step Functions, S3, Lambda, IAM, CloudWatch.
- Snowflake data warehouse.
- Python (ETL, scripting).
- Apache Spark (Scala and/or PySpark) or equivalent.
- Databricks notebook development.
- SQL (Snowflake dialect).
- Data modeling (star/snowflake schemas).
- Testing frameworks (Python unittest/pytest, Spark testing utilities).
- Monitoring/alerting (CloudWatch, Snowflake alerting).
- Cost optimization and security controls in cloud.
- Agile/Scrum collaboration.
- Strong written and verbal communication in English.
**Required Education & Certifications**
- Bachelor’s (BAC+3) to Master’s (BAC+5) degree in Computer Science, Engineering, Data Science, or a related field.
- Minimum 6 years of relevant industry experience.
---