DNEG

Data Engineer (Brahma)

Remote

London, United Kingdom

Full Time

24-07-2025

Expired

Skills

Communication, Python, SQL, Big Data, Data Engineering, CI/CD Pipelines, Docker, Kubernetes, Training, Linux, Machine Learning, Programming, Azure, AWS, GCP, Software Development, Cloud Platforms, SDLC

Job Specifications

Brahma is a pioneering enterprise AI company developing Astras, AI-native products built to help enterprises and creators innovate at scale. Brahma enables teams to break creative bottlenecks, accelerate storytelling, and deliver standout content with speed and efficiency. Part of the DNEG Group, Brahma brings together Hollywood's leading creative technologists, innovators in AI and Generative AI, and thought leaders in the ethical creation of AI content.
Job Description
As a Data Engineer, you'll architect and maintain the pipelines that power our products and services. You'll work at the intersection of ML, media processing, and infrastructure; owning the data tooling and automation layer that enables scalable, high-quality training and inference. If you're a developer who loves solving tough problems and building efficient systems, we want you on our team.
Key Responsibilities
Design and maintain scalable pipelines for ingesting, processing, and validating datasets, with a main focus on visual and voice data.
Work with other teams to identify workflow optimisation opportunities, and design and develop automation tools using AI-driven tooling, custom model integrations, and scripts.
Write and maintain tests for pipeline reliability.
Build and maintain observability tooling in collaboration with other engineers to track data pipeline health and system performance.
Collaborate with data scientists, operators, and product teams to deliver data solutions.
Debug and resolve complex data issues to ensure system performance.
Optimise storage, retrieval, and caching strategies for large media assets across environments.
Deploy scalable data infrastructure across cloud platforms and on-premise environments, using containerisation.
Deepen your knowledge of machine learning workflows to support AI projects.
Stay current with industry trends and integrate modern tools into our stack.
Must Haves
3+ years in data engineering or related backend/infrastructure role.
Strong programming skills in Python or similar languages.
Experience with software development lifecycle (SDLC) and CI/CD pipelines.
Proven experience building and testing data pipelines in production.
Proficiency in Linux.
Solid SQL knowledge.
Experience with Docker or other containerisation technologies.
Proactive approach to solving complex technical challenges.
Passion for system optimisation and continuous learning.
Ability to adapt solutions for multimedia data workflows.
Nice to Have
Experience with Kubernetes (k8s).
Knowledge of machine learning or AI concepts.
Familiarity with ETL tools or big data frameworks.
Familiarity with cloud platforms (e.g., AWS, GCP, Azure).
About You
Innovative
Enjoys a challenge
Adaptable
Calm under pressure
Strong communicator

About the Company

We are DNEG - delivering award-winning visual effects, animation, and creative technologies for film, TV, and immersive content. We have over 25 years of industry experience and relationships, and we are honoured to have won seven Academy Awards® for 'Best VFX' since 2011. Visit dneg.com to find out how we're creating the future of storytelling.