Sifted

sifted.eu

1 Job

66 Employees

About the Company

Sifted is the critical friend of the European startup ecosystem – providing startup founders, senior management teams, investors and service providers with the curated insights, in-depth analysis and networking opportunities they need to grow their business. Whatever your role is within the ecosystem, however you like to stay up to speed, we've got something for you:

- Sifted Daily: Our daily newsletter is the essential morning read for founders, investors and corporates who want a quick digest of the news that matters.
- Sifted Pro: Our premium subscription unlocks access to quality, verified startup and deals data, an exclusive Deals newsletter, plus unlimited access to all our journalism.
- Sifted for Startups: Our new programme for startup founders and senior management teams. Join to get a free subscription for your entire business, plus other exclusive perks.
- Specialist Newsletters: Get a weekly roundup of the latest news, scoops and insights on climate tech, fintech and venture capital. We've also got a weekly Startup Life newsletter - essential for startup operators in need of practical resources and advice.
- Startup Europe Podcast: Monthly interviews with the founders, operators and investors behind Europe's fastest-growing tech companies on how they've built their businesses.

Ready to join the sharpest minds in the European tech ecosystem?

Sifted Newsletters: https://sifted.eu/newsletters
Sifted Subscriptions: https://sifted.eu/subscribe
Sifted for Startups: https://sifted.eu/sifted-for-startups
Startup Europe podcast: https://podcast.sifted.eu/

P.S. Interested in partnering with us? Contact us here: marketing@sifted.eu

Listed Jobs

Company Name
Sifted
Job Title
Data Engineer
Job Description
**Job Title:** Data Engineer

**Role Summary:**
Design, develop, and maintain robust, automated data pipelines and data warehouse infrastructure that power subscription products and internal analytics. Use AI, web scraping, and API integration to enrich and structure diverse data sources, and collaborate cross-functionally to translate product requirements into technical solutions.

**Expectations:**
- Deliver high-quality, scalable ETL/ELT pipelines that support product features and user data contributions.
- Own end-to-end data projects, from requirement gathering through deployment and monitoring, with minimal oversight.
- Continuously improve processes through documentation, testing, and automation.

**Key Responsibilities:**
- Build and maintain accurate, well-structured ETL/ELT pipelines for subscription services.
- Design user-submitted data ingestion flows for startups, investors, and accelerators.
- Apply AI tools, APIs, scraping, and automation to extract, enrich, and normalize unstructured or semi-structured data.
- Collaborate with intelligence, editorial, and tech teams to define data needs and influence the product roadmap.
- Develop and maintain dashboards and internal tools for data exploration, QA, and monitoring.
- Set up and manage a data warehouse/lakehouse (BigQuery, PostgreSQL, Databricks), ensuring efficient modeling and performance.
- Implement CI/CD pipelines for data workflows, integrating GitHub, automated testing, and version control across environments.
- Document architecture, processes, and best practices, fostering a culture of continuous improvement.

**Required Skills:**
- Strong proficiency in Python and SQL.
- Experience with cloud platforms (AWS or Google Cloud).
- Version control using GitHub.
- Relational databases (PostgreSQL) and cloud data platforms (BigQuery, Databricks, Snowflake).
- Familiarity with AI tools/APIs for data enrichment.
- Hands-on experience with ETL/ELT tools: Airbyte, Fivetran, dbt.
- Ability to build data visualisations and dashboards.
- Product-oriented mindset, delivering projects from concept to production.
- Excellent written and verbal communication; ability to explain technical concepts to non-technical stakeholders.

**Required Education & Certifications:**
- Graduate degree in engineering, mathematics, computer science, or equivalent practical experience.
- Minimum of 2+ years of data engineering experience, building and optimizing pipelines and data warehouses/lakehouses.
London, United Kingdom
Hybrid
Junior
08-12-2025