Company Name: London Metal Exchange
Job Title: Senior Engineer
Job Description:
Role Summary: Full‑stack data and software engineer supporting the Sustainability and Physical Markets team. Responsible for designing and maintaining scalable data pipelines, enterprise data models, and integrations across platforms such as Salesforce, LMEpassport, and LMEsword. Works closely with business stakeholders and external vendors to deliver production‑grade solutions, drive data quality, and ensure regulatory compliance.
Expectations: Deliver reliable, high‑quality data and software solutions that meet strategic objectives. Exhibit strong problem‑solving skills, lead root‑cause analysis, and continuously improve data infrastructure and governance. Serve as a technical liaison between development teams and non‑technical stakeholders, providing clear status updates and risk communication.
Key Responsibilities:
- Design, implement, and maintain robust data pipelines and infrastructure for sustainability and financial data modelling.
- Develop and automate data validation, monitoring, and testing using Python and modern data engineering practices.
- Own and evolve data models and integrations for Salesforce, LMEpassport, LMEsword, and external vendor systems.
- Produce internal analysis and reporting to support business and technology initiatives.
- Lead incident analysis and root‑cause investigations, and implement improvements to system stability.
- Represent Sustainability and Physical Markets (SPM) data requirements to the enterprise data team, manage SPM components of the Enterprise Data Model, and contribute to data governance.
- Develop and maintain a forward‑looking data roadmap that incorporates AI, external data products, and evolving technologies.
- Act as liaison between technical and non‑technical stakeholders, ensuring clear communication of project status, risks, and requirements.
- Maintain accurate, up‑to‑date technical documentation.
Required Skills:
- 5+ years of data or software engineering experience, including production‑grade systems in regulated or financial sectors.
- Full‑stack development: Java (Spring Boot), React, Python.
- Proficiency with Apache Airflow, Spark, Kafka, dbt, Snowflake, or similar data engineering and analytics platforms.
- Expertise in data pipeline validation, governance, and quality assurance.
- Experience with containerisation (Docker, Kubernetes), CI/CD pipelines, and cloud platforms (AWS, Azure, GCP).
- Strong analytical, problem‑solving, and communication skills.
- Familiarity with regulatory frameworks (MiFID II, GDPR, SOX) and data governance best practices.
Required Education & Certifications:
- Bachelor’s degree in Computer Science, Software Engineering, Data Science, or a closely related field.
- Preferred: Certifications or substantial hands‑on experience with modern data pipeline tools (e.g., Airflow, Spark, Kafka, dbt).
- Desirable: Knowledge of financial services regulatory frameworks and best practices for data governance.