- Company Name: 1inch
- Job Title: Senior Data Engineer
- Job Description:
**Job Title**: Senior Data Engineer
**Role Summary**:
Lead the development, optimization, and maintenance of high‑performance data pipelines and warehouse solutions for a fast‑growing DeFi analytics organization. Provide reliable, audit‑ready data to product, business, and strategic teams, ensuring data quality and actionable insights across on‑chain and off‑chain streams.
**Expectations**:
- Design and deliver robust, scalable ETL pipelines that ingest, transform, and store complex financial data.
- Champion data quality, implementing validation, monitoring, and audit processes to guarantee accuracy and compliance.
- Work cross‑functionally to transform business requirements into technical data solutions, presenting results to non‑technical stakeholders.
**Key Responsibilities**:
- Build and maintain scalable ETL pipelines for on‑chain and off‑chain data.
- Optimize data storage and retrieval in BigQuery, Trino, PostgreSQL, and other managed data platforms.
- Implement validation rules, monitoring dashboards, and alerting for data quality.
- Conduct data quality audits and resolve any identified issues.
- Collaborate with product, engineering, and business teams to design and deliver data solutions.
- Present complex data concepts and insights to non‑technical audiences.
- Evaluate, adopt, and standardize new tools and best practices within the Data & Analytics team.
**Required Skills**:
- Minimum of 5 years' experience in data engineering or a related role.
- Deep expertise in SQL (BigQuery, Trino, PostgreSQL).
- Proficiency in Python and TypeScript for data transformation and tooling.
- Experience with dbt, Airbyte, and modern ELT pipelines.
- Strong understanding of data modeling, ETL design, and performance tuning.
- Familiarity with cloud data platforms (AWS, GCP) and data warehouse concepts.
- Ability to document and present technical solutions to diverse audiences.
**Required Education & Certifications**:
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, Statistics, Data Engineering, or a closely related field.
- No certifications required; professional knowledge of big‑data technologies (Spark, Hadoop, Snowflake) and data quality tools is advantageous.
**Nice to Have**:
- Experience with crypto/Web3 analytical stacks (e.g., Dune, Flipside).
- Understanding of low‑level blockchain data (traces, events).
- Exposure to Hadoop, Spark, Snowflake, or other big‑data ecosystems.
- Experience building dashboards and reports that drive business insights.