GemPool Recruitment

www.gempool.ie

2 Jobs

33 Employees

About the Company

GemPool is a high-end IT recruitment company. GemPool™ was founded by established IT entrepreneurs who recognised that IT skills are at a premium, to serve a neglected candidate segment in the Irish IT market: high-end IT professionals in the software, telecoms, gaming, IT and R&D sectors.

The Irish software industry has moved up the “value chain” of the global IT industry, with greater emphasis on proof-of-concept and time-to-market IT solutions. This in turn requires today’s IT professionals not only to be good coders but also to carry additional responsibilities such as customer and project relationship management, pre-sales, and leadership. As an experienced technology professional, you understand the value of expertise; so does GemPool, and we’re putting trust back into the recruitment industry.

Listed Jobs

Company Name
GemPool Recruitment
Job Title
Senior Data Engineer
Job Description
Role Summary:
Lead the design, development, and maintenance of enterprise-scale data pipelines and analytics solutions. Own end-to-end data flows from ingestion to data-quality validation, ensuring reliable, performant, and scalable data architecture across cloud platforms.

Expectations:
- Deliver robust, production-ready ETL/ELT pipelines that support business analytics and reporting.
- Maintain clean, documented, and reproducible code with CI/CD pipelines.
- Collaborate with data scientists, architects, and business stakeholders to translate requirements into technical specifications.
- Ensure data integrity, security, and compliance across all stages of the data lifecycle.

Key Responsibilities:
- Build and maintain data pipelines using Airflow, dbt, Prefect, and Python (pandas, pyarrow, pyspark).
- Design and implement data lake/warehouse solutions on AWS (S3, Athena, Snowflake).
- Model data using star, snowflake, and dimensional designs; create detailed metadata and lineage documentation.
- Implement data validation, quality checks, and anomaly detection to guarantee trustworthy data sets (a sketch of this pattern follows the listing).
- Use version control (Git) and CI/CD practices to automate workflow deployments and tests.
- Mentor junior engineers and promote knowledge sharing within the team.
- Evaluate and recommend tooling and architecture changes to improve performance and scalability.

Required Skills:
- Snowflake: 3+ years core experience.
- Data engineering, data migration, database technologies, advanced SQL, Python: 5+ years each.
- Strong proficiency in Python (pandas, pyarrow, pyspark).
- Expertise with Airflow, dbt, Prefect.
- Cloud: AWS (S3, Athena, Snowflake).
- Data lake/warehouse architecture (S3 + Athena, Delta Lake).
- Data modelling (star/snowflake, dimensional).
- Data quality, validation, anomaly detection.
- Git, CI/CD for data workflows.
- Excellent communication, problem-solving, documentation, and collaboration skills.

Required Education & Certifications:
- Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Certifications such as Snowflake Certified Professional or AWS Certified Data Analytics - Specialty (or equivalent) are highly desirable.
Paris, France
Hybrid
Senior
18-11-2025
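
To ground the validation and orchestration responsibilities above, here is a minimal sketch of an Airflow DAG that gates a daily load on basic pandas quality checks. It assumes a recent Airflow 2.x; the file path, column names, and row-count threshold are hypothetical placeholders, not details from the role.

```python
# Minimal sketch: Airflow DAG with a pandas-based data-quality gate.
# Paths, column names, and thresholds are illustrative placeholders.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

EXTRACT_PATH = "/data/raw/orders.csv"  # hypothetical source extract


def validate_extract() -> None:
    """Fail the task (and halt downstream loads) if basic checks fail."""
    df = pd.read_csv(EXTRACT_PATH)

    # Check 1: the extract must not be empty.
    if df.empty:
        raise ValueError("Extract is empty")

    # Check 2: key columns must be present and non-null.
    for col in ("order_id", "order_date", "amount"):
        if col not in df.columns:
            raise ValueError(f"Missing required column: {col}")
        if df[col].isna().any():
            raise ValueError(f"Null values found in column: {col}")

    # Check 3: crude anomaly guard on row volume (threshold is illustrative).
    if len(df) < 1000:
        raise ValueError(f"Row count {len(df)} below expected minimum")


with DAG(
    dag_id="orders_quality_gate",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="validate_extract", python_callable=validate_extract)
```

In practice, checks like these often live in a dedicated layer such as dbt tests, but the pattern is the same: a failed check raises, the task fails, and downstream loads do not run.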
Company Name
GemPool Recruitment
Job Title
Data Engineer
Job Description
Role Summary:
Senior Data Analytics Engineer responsible for designing, building, and maintaining scalable ETL/ELT pipelines that ingest, transform, and deliver large volumes of data to the Snowflake data warehouse and associated reporting layers. Works across cross-functional teams to ensure data quality, integrity, and accessibility for analytics and BI workloads.

Expectations:
- Minimum 5 years overall data engineering experience, including at least 3 years with Snowflake.
- Proven track record of full-stack data migration and lake/warehouse architecture using S3, Athena, and Delta Lake (see the lake-landing sketch after this listing).
- Strong expertise in Python (pandas, pyarrow, pyspark) and advanced SQL.
- Demonstrated ability to orchestrate data flows with Airflow, dbt, or Prefect.
- Hands-on CI/CD for data workflows (Git, automated testing, documentation).

Key Responsibilities:
- Design and implement robust, repeatable ETL/ELT pipelines that move data from source to Snowflake.
- Develop and maintain data models (star/snowflake, dimensional) to support reporting and analytics.
- Ensure data quality through validation, anomaly detection, and monitoring.
- Create and maintain documentation, runbooks, and reproducibility artifacts.
- Collaborate with data scientists, BI teams, and stakeholders to clarify requirements and deliver solutions.
- Optimize pipeline performance and cost efficiency in a cloud environment (AWS).
- Conduct code reviews, enforce best practices, and maintain version control.

Required Skills:
- Snowflake (≥3 years)
- Data migration (≥5 years)
- Database technologies (≥5 years)
- Advanced SQL (≥5 years)
- Python programming (≥5 years)
- Airflow / dbt / Prefect orchestration
- AWS cloud services: S3, Athena, Delta Lake
- Data modelling (star/snowflake, dimensional)
- Data quality, validation, anomaly detection
- CI/CD, Git, automated testing, documentation

Required Education & Certifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
- Certifications in Snowflake, AWS Data Analytics, or related data engineering technologies are a plus.
Paris, France
Hybrid
Mid level
19-11-2025
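
To make the lake-side half of this stack concrete, the sketch below shows an ELT landing step that writes a raw extract as Hive-partitioned Parquet on S3, where Athena can query it in place. It assumes pandas and pyarrow with AWS credentials configured in the environment; the bucket, prefix, and column names are hypothetical placeholders.

```python
# Minimal sketch: land a raw extract as partitioned Parquet on S3 for Athena.
# Bucket, prefix, and column names are illustrative placeholders.
import pandas as pd
import pyarrow as pa
import pyarrow.dataset as ds
from pyarrow import fs

RAW_PATH = "/data/raw/events.csv"        # hypothetical source file
LAKE_URI = "example-bucket/lake/events"  # hypothetical S3 bucket/prefix


def land_to_lake() -> None:
    # Read the raw extract and derive a partition column from the timestamp.
    df = pd.read_csv(RAW_PATH, parse_dates=["event_ts"])
    df["event_date"] = df["event_ts"].dt.date.astype("string")

    table = pa.Table.from_pandas(df, preserve_index=False)

    # Write Hive-style partitions (event_date=YYYY-MM-DD/) that Athena can
    # register via a Glue crawler or an explicit CREATE EXTERNAL TABLE.
    s3 = fs.S3FileSystem()  # picks up credentials from the environment
    ds.write_dataset(
        table,
        base_dir=LAKE_URI,
        filesystem=s3,
        format="parquet",
        partitioning=["event_date"],
        partitioning_flavor="hive",
        existing_data_behavior="overwrite_or_ignore",
    )


if __name__ == "__main__":
    land_to_lake()
```

Hive-style partition directories let Athena prune partitions at query time; the same files could later be loaded into Snowflake from an external stage.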