Fountain

fountain.com

1 Job

370 Employees

About the Company

Fountain’s all-in-one high volume hiring platform empowers the world’s leading enterprises to find the right people through smart, fast, and seamless recruiting. Candidates can apply anytime, anywhere in minutes, right from their phone. Automated and customizable processes streamline the candidate experience and save time for recruitment teams so they can scale with growing hiring needs. Advanced analytics provide end-to-end process visibility so managers can make swift, data-driven decisions. Throughout the candidate journey, the openly integrated platform enables companies to find, qualify and convert more applicants. Fountain’s global customers hire over 1.2 million workers annually in 78 countries.

Why Work with Us:

- Generous compensation
- 100% Remote and Flexible Hours
- Unlimited PTO
- Equity grants
- Comprehensive healthcare benefits
- 16 weeks paid parental leave for all parents
- Wellness and gym reimbursement
- Home office stipend
- Phone stipend
- 401(k) plan
- Learning and development reimbursement
- Company events
- FSA & HSA
- Life Insurance & Long-Term Disability/Short-Term Disability

Visit us at https://get.fountain.com to learn more.

Listed Jobs

Company Name
Fountain
Job Title
Senior Data Engineer
Job Description
**Job Title:** Senior Data Engineer

**Role Summary:**
Build, optimize, and maintain scalable data infrastructure to support embedded analytics, internal BI, and product-driven analytics workflows.

**Expectations:**
Lead data pipeline development, collaborate on data modeling, migrate to streaming architectures, and ensure compliance with data governance standards.

**Key Responsibilities:**
- Design and optimize ETL pipelines that move data from Postgres/MongoDB to Iceberg/S3 and ClickHouse using CDC (Debezium/ClickPipes).
- Develop dbt models across ClickHouse, BigQuery, Snowflake, and Redshift to support analytic datasets.
- Orchestrate transformations with Dagster, dbt, and custom Python ETL workflows.
- Collaborate with teams to define data requirements and deliver datasets for analytics and product features.
- Migrate from Fivetran to Kafka-based streaming pipelines; configure Kafka/Debezium connectors.
- Implement GDPR-compliant data retention, anonymization, and backup workflows.
- Monitor pipeline health, troubleshoot issues, and optimize query performance.
- Use Terraform or similar infrastructure-as-code tools across AWS, GCP, and Azure.

**Required Skills:**
- 5+ years in data engineering/ETL roles.
- Proficiency in SQL and Python; experience with dbt and orchestration tools (Dagster/Airflow/Prefect).
- Expertise in relational (PostgreSQL) and NoSQL (MongoDB) databases.
- Hands-on experience with data lakes (Iceberg) and warehouses (BigQuery, Snowflake, Redshift, ClickHouse).
- Knowledge of streaming technologies (Kafka, Debezium) and CDC pipelines.
- Familiarity with cloud platforms (AWS, GCP, Azure) and storage services (S3/GCS/Azure Storage).
- Terraform (or other IaC tools) for infrastructure management.
- Git for version control and collaboration.

**Required Education & Certifications:** Not specified.
Paris, France
Remote
Senior
16-10-2025