AUDELA

audela.ca

1 Job

31 Employees

About the Company

Audela is an AI company providing products and services solutions to Communication Service Providers. We focus on developing solutions based on Artificial Intelligence to help organizations build unbundled, open-source, technology-agnostic, virtualized, software-defined networks.

Listed Jobs

Company Name
AUDELA
Job Title
Intermediate Software Engineer (MLOps / Data Science Focus)
Job Description
Job Title: Intermediate Software Engineer (MLOps / Data Science Focus)

Role Summary:
Design, develop, test, and deploy scalable machine-learning operations (MLOps) solutions in a cloud-native environment. Collaborate with a senior data science engineer to bring research prototypes to production, ensuring reliability, traceability, security, and cost control.

Expectations:
- Integrate quickly into the team and become a trusted contributor.
- Demonstrate ownership of components and workflows.
- Consistently apply best software-engineering practices and improve MLOps pipelines.

Key Responsibilities:
- Partner with a senior data scientist to architect and implement scalable software solutions.
- Write clean, maintainable code governed by version control (Git) and peer review.
- Develop and maintain comprehensive automated test suites (unit, integration, end-to-end).
- Support and enhance CI/CD pipelines, primarily on Databricks.
- Build and maintain MLOps workflows: model training, validation, deployment, and monitoring.
- Ensure traceability, security, and cost optimization for models and agentic systems.
- Contribute to automation and continuous delivery initiatives.
- Gradually assume responsibility for selected components and workflows via a knowledge-transfer plan.

Required Skills:
- Strong fundamentals in software engineering: version control, automated testing, code review, CI/CD.
- Experience with CI/CD tools (e.g., GitHub Actions, Azure Pipelines, Jenkins) and cloud environments.
- Familiarity with MLOps concepts; hands-on experience with Databricks, Delta Lake, MLflow, or similar is preferred.
- Basic knowledge of data science or machine-learning workflows.
- Ability to work effectively in a pair-programming and collaborative setting.
- Excellent communication and a continuous-learning mindset.
- Knowledge of containerization (Docker) and optional orchestration (Kubernetes) is a plus.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Software Engineering, Data Science, or a related field.
- Relevant cloud or MLOps certifications (e.g., Databricks Certified Associate Developer for Apache Spark, AWS Certified Machine Learning – Specialty) are advantageous but not mandatory.
Montreal, Canada
On-site
29-10-2025