Believe

www.believe.com

5 Jobs

3,013 Employees

About the Company

Believe is one of the world’s leading digital music companies. Believe’s mission is to develop independent artists and labels in the digital world by providing them with the solutions they need to grow their audience at each stage of their career and development. Believe’s passionate team of digital music experts around the world leverages the Group’s global technology platform to advise artists and labels and to distribute and promote their music. Its 1,651 employees in more than 50 countries aim to support independent artists and labels with unique digital expertise, respect, fairness and transparency. Believe offers its various solutions through a portfolio of brands including TuneCore, Nuclear Blast, Naïve, Groove Attack and AllPoints. Believe is listed on compartment A of the regulated market of Euronext Paris (Ticker: BLV, ISIN: FR0014003FE9). www.believe.com

Listed Jobs

Company Name
Believe
Job Title
Associate Engineer (Liverpool/Hybrid)
Job Description
**Job Title:** Associate Engineer

**Role Summary:** A graduate/associate-level software engineer responsible for designing, developing, and maintaining user-facing and back-end features in a production environment. The role involves hands-on coding with modern cloud technologies, participating in technical discussions, and growing through mentorship and code reviews.

**Expectations**
- Graduate or early-career software engineer.
- Ability to work on-site in a hybrid setting.
- Right to work in the UK; visa sponsorship not provided.
- Strong communication and teamwork skills.
- Eagerness to learn new technologies and contribute to real-world solutions.

**Key Responsibilities**
- Develop and implement new features for web applications.
- Build and maintain secure, scalable back-end services on production systems.
- Participate in design discussions and propose technical solutions.
- Deploy and manage applications using cloud platforms (e.g., AWS, Vercel).
- Perform code reviews, maintain test coverage, and refactor for performance.
- Collaborate with cross-functional teams to prioritize and deliver features.
- Continuously learn and apply best practices in software engineering, data engineering, and cloud infrastructure.

**Required Skills**
- Solid programming fundamentals and data structures.
- Experience with web development, REST APIs, and front-end frameworks.
- Familiarity with cloud services (AWS, Vercel, Snowflake).
- Basic knowledge of C#, TypeScript, SQL, or equivalent languages.
- Ability to write clean, maintainable code and use version control.
- Strong analytical and problem-solving abilities.
- Effective written and verbal communication.

**Required Education & Certifications**
- Bachelor’s degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience).
- No specific certifications required.
Liverpool, United Kingdom
On site
04-10-2025
Company Name
Believe
Job Title
Senior Cloud Engineer (Liverpool/Hybrid)
Job Description
Liverpool, United Kingdom
On site
Senior
09-10-2025
Company Name
Believe
Job Title
Data Engineer
Job Description
**Job Title:** Data Engineer

**Role Summary:** Data Engineer working within the Data Office’s Data Platform squad, responsible for building and operating end-to-end data pipelines on AWS and Snowflake. Design, develop, test, monitor, and document data ingestion, transformation, and modeling workflows, delivering data layers (Bronze, Silver, Gold) to support business analytics.

**Expectations**
- Deliver high-quality, performant data pipelines that meet volume and latency requirements.
- Apply best practices in data engineering, testing, and documentation.
- Collaborate in a Scrum environment, taking ownership of both Build and Run responsibilities.
- Communicate effectively in English (spoken and written).

**Key Responsibilities**
- Design and develop ingestion pipelines using AWS Step Functions, S3, and Python.
- Transform data with Python, Spark (Scala), Databricks, and SQL.
- Implement unit testing (Python, Spark Scala).
- Model data in a Snowflake data warehouse (Bronze, Silver, Gold layers).
- Monitor, troubleshoot, and optimize pipeline performance, cost, and security.
- Produce and maintain documentation in Data Galaxy and Confluence.

**Required Skills**
- AWS (Step Functions, S3, EC2, Lambda)
- Snowflake data warehouse
- Programming: Python, Spark/Scala, SQL
- Data modeling, ETL architecture, pipeline monitoring
- Testing (unit tests for Python and Spark)
- Continuous improvement mindset, craft-focused approach
- English fluency (oral and written)

**Required Education & Certifications**
- Bachelor’s to Master’s level (Bac+3 to Bac+5) in Engineering, Computer Science, or a related field (BTS/DUT/DESS).
- Minimum 3 years of professional experience with the above stack.
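The Bronze/Silver/Gold layering mentioned in this role can be sketched minimally in plain Python: raw Bronze records are deduplicated and type-cast into a curated Silver layer. The record schema (`track_id`, `streams`, `day`) and function names are illustrative assumptions, not part of the listing; in practice this step would run in Spark or Snowflake rather than in-memory Python.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical raw "Bronze" records as ingested: untyped strings, possible duplicates.
bronze = [
    {"track_id": "t1", "streams": "120", "day": "2025-10-01"},
    {"track_id": "t1", "streams": "120", "day": "2025-10-01"},  # duplicate row
    {"track_id": "t2", "streams": "45", "day": "2025-10-01"},
]

@dataclass(frozen=True)
class SilverRow:
    track_id: str
    streams: int
    day: date

def bronze_to_silver(rows):
    """Deduplicate on (track_id, day) and type-cast into the curated Silver layer."""
    seen, out = set(), []
    for r in rows:
        key = (r["track_id"], r["day"])
        if key in seen:
            continue
        seen.add(key)
        out.append(SilverRow(r["track_id"], int(r["streams"]),
                             date.fromisoformat(r["day"])))
    return out

silver = bronze_to_silver(bronze)
```

The same dedup-and-cast logic is what the listing's unit-testing requirement would exercise: a pure function like `bronze_to_silver` is easy to cover with assertions on small fixtures.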
Paris, France
On site
17-10-2025
Company Name
Believe
Job Title
Senior Data Engineer
Job Description
**Job Title:** Senior Data Engineer

**Role Summary:** Lead the design, development, and operation of end-to-end data pipelines within the Data Platform squad. Use an AWS-Snowflake stack to ingest, transform, and stage large volumes of data across Bronze, Silver, and Gold layers, ensuring performance, cost efficiency, and security while documenting best practices.

**Expectations**
- 6+ years of professional data engineering experience with AWS and Snowflake.
- Strong proficiency in Python, Spark (Scala or PySpark), SQL, and Databricks.
- Demonstrated experience designing automated ingestion pipelines using AWS Step Functions, S3, and related services.
- Proven ability to model data marts, write unit and integration tests, and implement monitoring and alerting.
- Familiarity with scaling, cost-control, and security practices in cloud data environments.
- Comfortable working in a Scrum/Agile team, contributing to both Build and Run responsibilities.
- Fluent in English (written and spoken).

**Key Responsibilities**
- Architect and develop scalable ingestion pipelines on AWS (Step Functions, EC2, Lambda, S3).
- Execute data transformations using Python, Spark/Scala, Databricks notebooks, and SQL.
- Implement data quality and unit tests (Python unittest/pytest, Spark test frameworks).
- Model and maintain Bronze, Silver, and Gold data layers in Snowflake, applying proper data modeling techniques.
- Monitor pipeline performance, error rates, and data freshness; troubleshoot and optimize.
- Track and report on cloud costs and enforce security best practices (IAM, encryption, network isolation).
- Produce and maintain technical documentation (Data Galaxy, Confluence).

**Required Skills**
- AWS services: Step Functions, S3, Lambda, IAM, CloudWatch.
- Snowflake data warehouse.
- Python (ETL, scripting).
- Apache Spark (Scala and/or PySpark) or equivalent.
- Databricks notebook development.
- SQL (Snowflake dialect).
- Data modeling (star/snowflake schemas).
- Test frameworks (Python unit tests, Spark tests).
- Monitoring/alerting (CloudWatch, Snowflake alerting).
- Cost optimization and security controls in cloud.
- Agile/Scrum collaboration.
- Strong written and verbal communication in English.

**Required Education & Certifications**
- Bachelor’s (Bac+3) to Master’s (Bac+5) degree in Computer Science, Engineering, Data Science, or a related field.
- Minimum 6 years of relevant industry experience.
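The monitoring responsibility above ("error rates and data freshness") can be illustrated with a minimal sketch: compare each pipeline's latest successful load against a freshness SLA and report breaches. The pipeline names and SLA values are invented for the example; a real implementation would read load timestamps from Snowflake metadata or CloudWatch rather than a dict.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-pipeline freshness SLAs (invented for illustration).
SLA = {"streams_daily": timedelta(hours=24), "royalties": timedelta(hours=6)}

def stale_pipelines(last_loaded: dict, now: datetime) -> list:
    """Return names of pipelines whose most recent load is older than their SLA."""
    return sorted(name for name, ts in last_loaded.items()
                  if now - ts > SLA[name])

now = datetime(2025, 11, 11, 12, 0, tzinfo=timezone.utc)
loads = {
    "streams_daily": now - timedelta(hours=30),  # breaches the 24h SLA
    "royalties": now - timedelta(hours=2),       # within the 6h SLA
}
stale = stale_pipelines(loads, now)
```

A check like this is typically wired to an alerting channel (e.g. CloudWatch alarms, as the listing suggests) so that a breached SLA pages the Run rotation.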
Paris, France
On site
Senior
11-11-2025