
Data Governance
Hybrid
London, United Kingdom
Full Time
31-01-2025
Job Specifications
Data Governance Analyst - Permanent - Lloyd's Market - Full time - £65,000
The Senior Data Governance Analyst will be responsible for implementing and improving the data governance framework, policies, and standards. They will work with various stakeholders to ensure data accuracy, completeness, and proper management. The role also involves creating and maintaining data dictionaries and metadata repositories, and contributing to the creation of KPIs and KRIs.
Responsibilities
Lead on aspects of the implementation and continuous improvement of the Data Governance framework, policies, procedures, and standards.
Partner with senior business stakeholders across various functions (e.g. Regional Underwriting, Actuarial, and Operations) to ensure data is accurate, complete, and properly managed.
Lead on aspects of embedding data stewardship in the business; assist in creating decks for Data Governance forums, facilitate the agenda, and contribute actively to these forums.
Help create content for data governance training programmes and assist in running these with business users to improve awareness.
Skills
Working knowledge of Data Governance methodologies and experience implementing these within the Lloyd's Insurance Market environment
Experience with data governance tools
Strong communication and collaboration skills
Attention to detail and accuracy
Ability to work independently and as part of a team
Knowledge of relevant data governance methodologies (e.g. DAMA CDMP, DCAM or similar)
Candidates will preferably have a Lloyd's Insurance background
About the Company
Founded in 2001, Cornwallis Elt is now a highly respected £40M-turnover recruitment business based in the City of London. Our focus is on recruiting technology, digital and change management professionals into the financial markets, digital & media industries, and legal and professional services sectors. Over the last twenty years, we have built an outstanding network of client relationships and an extensive, rigorously managed candidate database. Add to this a forensic delivery focus and you can see how we have created an...
Related Jobs


- Company Name
- ** ********** *****
- Job Title
- Senior Data Engineer
- Job Description
- Job Title: AWS Senior Data Engineer
  Location: Manchester (Trafford) – 2 days a week
  Salary: £60,000 - £80,000 DOE

  The client is a technology provider to the market research industry, using cutting-edge technology to help its clients gain a deeper insight into the digital lives of consumers. They are looking for a Data Engineer to join the Data Engineering team.

  The Role: You will be working in the Data Engineering team, whose main function is developing, maintaining and improving the end-to-end data pipeline, which includes real-time data processing; extract, transform, load (ETL) jobs; artificial intelligence; and data analytics on a large and complex dataset. Your role will primarily be to perform DevOps, backend and cloud development on the data infrastructure, developing innovative solutions to effectively scale and maintain the data platform. You will work on complex data problems in a challenging and fun environment, using some of the latest open-source Big Data technologies such as Apache Spark, as well as Amazon Web Services technologies including Elastic MapReduce, Athena and Lambda, to develop scalable data solutions.

  Key Responsibilities:
  - Adhering to company policies and procedures with respect to security, quality and health & safety.
  - Writing application code and tests that conform to standards.
  - Developing infrastructure automation and scheduling scripts for reliable data processing.
  - Continually evaluating and contributing towards cutting-edge tools and technologies to improve the design, architecture and performance of the data platform.
  - Supporting the production systems running the deployed data software.
  - Regularly reviewing colleagues' work and providing helpful feedback.
  - Working with stakeholders to fully understand requirements.
  - Being the subject matter expert for the data platform and its supporting processes, and presenting to others to share knowledge.

  Here's what we're looking for:
  - The ability to problem-solve.
  - Knowledge of AWS or equivalent cloud technologies.
  - Knowledge of serverless technologies, frameworks and best practices.
  - Experience using AWS CloudFormation or Terraform for infrastructure automation.
  - Knowledge of Scala or an OO language such as Java or C#.
  - SQL or Python development experience.
  - High-quality coding and testing practices.
  - Willingness to learn new technologies and methodologies.
  - Knowledge of agile software development practices, including continuous integration, automated testing and working with software engineering requirements and specifications.
  - Good interpersonal skills, a positive attitude, and willingness to help other members of the team.
  - Experience debugging and dealing with failures on business-critical systems.

  Preferred:
  - Exposure to Apache Spark, Apache Trino, or another big data processing system.
  - Knowledge of streaming data principles and best practices.
  - Understanding of database technologies and standards.
  - Experience working on large and complex datasets.
  - Exposure to Data Engineering practices used in Machine Learning training and inference.
  - Experience using Git, Jenkins and other CI/CD tools.

  Benefits: Work in a market-leading technology company that helps research and marketing professionals achieve unique insights into the mobile and digital lives of consumers. The client does everything it can to support its people so that they can be themselves and realise their potential, and loves people who are hungry for learning and achievement.
  - 25 days paid holiday plus bank holidays
  - Purchase/sale of up to 5 leave days p.a. – after 2 years' service
  - Life insurance
  - Workplace pension with employer contribution
  - Performance-based bonus scheme
  - Informal dress code
  - Cycle to work scheme
  - Branded company merchandise


- Company Name
- McCabe & Barton
- Job Title
- Analytics Engineering Manager
- Job Description
- A leading Financial Services client is currently going through an exciting data transformation and is seeking an experienced Analytics Engineering Manager to strategically lead the modernisation of its Analytics & Business Intelligence function. The ideal Analytics Engineering Manager will have a true passion for data, experience leading teams of 8 to 15 members, and will enjoy working at a strategic, enterprise level, with extensive experience modernising an Analytics / Business Intelligence function.

  To be considered for this role you will need:
  - A passion for data transformation, modernisation and automation
  - Experience managing teams of 8+ data specialists with a broad range of skills
  - Strong knowledge of front-end visualisation tools such as Tableau, Qlik and/or Power BI
  - Proven experience operating at a strategic level, with hands-on experience of modernising an Analytics / Business Intelligence function
  - A strong understanding of cloud-based data solutions
  - Extensive experience in one or more of the following languages: SQL, Python or Java
  - Proficiency with coding best practices, including version control via GitHub
  - A results-oriented approach, an eye for detail and a passion for data
  - Excellent communication and stakeholder management skills

  If you are an experienced Analytics Engineering Manager with the required background, please respond to this ad in the first instance with an up-to-date version of your CV for review.


- Company Name
- McCabe & Barton
- Job Title
- Senior Data Engineer
- Job Description
- A leading Financial Services client is currently going through a large data transformation and is looking to hire a number of Senior Data Engineers to join them on this exciting journey. The roles offer a base salary between £60,000 and £75,000, plus a strong benefits package and flexible working. Our client is looking for strategically minded Data Engineers who are naturally curious and understand a business's drive to implement new data sources, data flows, automated processes and database structures. The ideal candidate will have strong knowledge of SQL and good skills in Snowflake, Azure and Python; however, the client takes an agnostic approach to technology and is open to other skillsets too.

  Remit:
  - Building data sources, data flows, automated processes and database structures.
  - Building and managing a Snowflake ecosystem that supports streaming and batch workflows, allowing end-user teams to access data, build reports, and leverage automation tools.
  - Supporting the adoption of GenAI tools and techniques and leading platform innovation with Snowflake and Microsoft.

  Ideal experience & background:
  - Strong experience with Snowflake (SQL scripting, DDL and DML)
  - Good knowledge of Azure Data Factory and Azure DevOps
  - Experience with Python
  - Experience working in an Agile environment

  If you are an experienced Senior Data Engineer with the required skills, please respond with an up-to-date version of your CV for review.


- Company Name
- Harnham
- Job Title
- Principal Data Scientist
- Job Description
- Principal Data Scientist
  Up to £125,000
  London (Hybrid, 3 days onsite per week)

  Company: A leading marketing and analytics agency is seeking a Principal Data Scientist to develop and deploy end-to-end AI solutions. You'll work on machine learning models for retention, price optimisation, recommendation engines, NLP, and computer vision.

  Responsibilities:
  - Design, develop, and productionise machine learning models across various applications.
  - Work with Python (ideally production-level code) and other tools such as SQL, Spark, and Databricks.
  - Apply clustering, classification, regression, time series modelling, NLP, and deep learning.
  - Develop recommendation engines and leverage third-party data enhancements.
  - Implement MLOps/DevOps practices in cloud environments (AWS, Azure, GCP).
  - Ensure explainability, bias detection, and algorithmic fairness in models.
  - Collaborate with stakeholders to translate business challenges into data-driven solutions.

  Requirements:
  - MSc or PhD in Computer Science, Artificial Intelligence, Mathematics, Statistics or a related field
  - Strong Python skills (bonus: C++, SQL, Spark)
  - Experience with ML algorithms (XGBoost, clustering, regression)
  - Expertise in time series, NLP, computer vision, and MLOps
  - Knowledge of AWS/Azure/GCP, CI/CD, and Agile development
  - Ability to own solutions and manage stakeholders

  How to Apply: Please register your interest by sending your CV to Emily Burgess via the Apply link on this page.