
Data Analyst Associate
Hybrid
London, United Kingdom
Full Time
21-01-2025
Job Specifications
My leading Financial Services client is looking for a Data Analyst to join their CLM and KYC team, within their Data & MI space. You'll be responsible for the production and maintenance of the Data & MI function, including uploading to and oversight of the internal reporting repository.
This is a newly created role within a rapidly growing area of the organisation, and a brilliant opportunity to join a supportive, high-performing team!
The following skills / experience are required:
Strong Data Analyst background
Power BI
SQL, Excel and Python are desirable
KYC understanding is desirable
Previous experience in Financial Services or a similar industry
Excellent communication skills
Salary: Up to £60,000 + bonus + package
Grade: Associate
Location: London (good work from home options available)
If you are interested and meet the above requirements, please apply immediately.
About the Company
Hunter Bond is a global firm specialising in the finance and technology recruitment sectors, with the aim of providing a thorough, effective and transparent solution to its clients' and candidates' needs. Hunter Bond's directors have 20 years' experience specialising in financial and technology recruitment. With this experience comes a desire to provide the best recruitment service. Integrity is delivered to the utmost: clients and candidates alike will have transparency and dedication from start to finish. Founding Direct...
Related Jobs


- Company Name
- ** ********** *****
- Job Title
- Senior Data Engineer
- Job Description
- Job Title: AWS Senior Data Engineer
Location: Manchester (Trafford) – 2 days a week
Salary: £60,000 - £80,000 DOE
The client is a technology provider to the market research industry. They use cutting-edge technology to help their clients gain a deeper insight into the digital lives of consumers. They are looking for a Data Engineer to join the Data Engineering team, whose main function is developing, maintaining and improving the end-to-end data pipeline.
The Role
You will be working in the Data Engineering team on an end-to-end data pipeline that includes real-time data processing; extract, transform, load jobs; artificial intelligence; and data analytics on a complex and large dataset. Your role will primarily be to perform DevOps, backend and cloud development on the data infrastructure, developing innovative solutions to effectively scale and maintain the data platform. You will be working on complex data problems in a challenging and fun environment, using some of the latest Big Data open-source technologies such as Apache Spark, as well as Amazon Web Services technologies including Elastic MapReduce, Athena and Lambda, to develop scalable data solutions.
Key Responsibilities:
Adhering to Company Policies and Procedures with respect to Security, Quality and Health & Safety.
Writing application code and tests that conform to standards.
Developing infrastructure automation and scheduling scripts for reliable data processing.
Continually evaluating and contributing towards cutting-edge tools and technologies to improve the design, architecture and performance of the data platform.
Supporting the production systems running the deployed data software.
Regularly reviewing colleagues' work and providing helpful feedback.
Working with stakeholders to fully understand requirements.
Being the subject matter expert for the data platform and supporting processes, and presenting to others to share knowledge.
Here's what we're looking for:
The ability to problem solve.
Knowledge of AWS or equivalent cloud technologies.
Knowledge of serverless technologies, frameworks and best practices.
Experience using AWS CloudFormation or Terraform for infrastructure automation.
Knowledge of Scala or an OO language such as Java or C#.
SQL or Python development experience.
High-quality coding and testing practices.
Willingness to learn new technologies and methodologies.
Knowledge of agile software development practices, including continuous integration, automated testing and working with software engineering requirements and specifications.
Good interpersonal skills, a positive attitude and a willingness to help other members of the team.
Experience debugging and dealing with failures on business-critical systems.
Preferred:
Exposure to Apache Spark, Apache Trino, or another big data processing system.
Knowledge of streaming data principles and best practices.
Understanding of database technologies and standards.
Experience working on large and complex datasets.
Exposure to Data Engineering practices used in Machine Learning training and inference.
Experience using Git, Jenkins and other CI/CD tools.
Benefits
Work in a market-leading technology company that helps research and marketing professionals achieve unique insights into the mobile and digital lives of consumers. The client does everything it can to support its people so that they can be themselves and realise their potential.
We love people who are hungry for learning and achievement!
25 days' paid holiday plus bank holidays
Purchase/sale of up to 5 leave days pa – after 2 years' service
Life insurance
Workplace pension with employer contribution
Performance-based bonus scheme
Informal dress code
Cycle to work scheme
Branded company merchandise


- Company Name
- McCabe & Barton
- Job Title
- Analytics Engineering Manager
- Job Description
- Leading Financial Services client is currently going through an exciting Data Transformation and is now seeking an experienced Analytics Engineering Manager to strategically lead the modernisation of their Analytics & Business Intelligence function. The ideal Analytics Engineering Manager will have a true passion for data and experience leading teams ranging from 8 to 15 members. They will enjoy working at a strategic, enterprise level and will have extensive experience modernising an Analytics / Business Intelligence function.
To be considered for this role you will need:
A passion for data transformation, modernisation and automation
Experience managing teams of 8+ data specialists with a broad range of skills
Strong knowledge of front-end visualisation tools such as Tableau, Qlik and/or Power BI
Proven experience operating at a strategic level, with 'hands on' experience of modernising an Analytics / Business Intelligence function
Strong understanding of cloud-based data solutions
Extensive experience in one or more of the following languages: SQL, Python or Java
Proficiency with coding best practices, including version control via GitHub
A results-oriented approach, with an eye for detail and a passion for data
Excellent communication and stakeholder management skills
If you are an experienced Analytics Engineering Manager with the required background, please respond to this ad in the first instance with an up-to-date version of your CV for review.


- Company Name
- McCabe & Barton
- Job Title
- Senior Data Engineer
- Job Description
- Leading Financial Services client is currently going through a large Data transformation and is looking to hire a number of Senior Data Engineers to join them on this exciting journey. The roles offer a base salary between £60,000 and £75,000 plus a strong benefits package and flexible working. Our client is looking for strategically minded Data Engineers who are naturally curious and understand a business's drive to implement new data sources, data flows, automated processes and database structures. The ideal candidate will have strong knowledge of SQL and good skills in Snowflake, Azure and Python. However, this client takes an agnostic approach to technology and would be open to other skillsets too.
Remit
Building data sources, data flows, automated processes and database structures.
Building and managing a Snowflake ecosystem to support streaming and batch workflows, allowing end-user teams to access data, build reports and leverage automation tools.
Supporting adoption of GenAI tools and techniques and leading platform innovation with Snowflake and Microsoft.
Ideal Experience & Background
Strong experience with Snowflake (SQL scripting, DDL and DML)
Good knowledge of Azure Data Factory and Azure DevOps
Experience with Python
Experience working in an Agile environment
If you are an experienced Senior Data Engineer with the required skills, please respond with an up-to-date version of your CV for review.


- Company Name
- Harnham
- Job Title
- Principal Data Scientist
- Job Description
- Principal Data Scientist
Up to £125,000
London (Hybrid, 3 days onsite per week)
Company: A leading marketing and analytics agency is seeking a Principal Data Scientist to develop and deploy end-to-end AI solutions. You'll work on machine learning models for retention, price optimisation, recommendation engines, NLP and Computer Vision.
Responsibilities:
Design, develop and productionise machine learning models across various applications.
Work with Python (ideally production-level code) and other tools such as SQL, Spark and Databricks.
Apply clustering, classification, regression, time series modelling, NLP and deep learning.
Develop recommendation engines and leverage third-party data enhancements.
Implement MLOps/DevOps practices in cloud environments (AWS, Azure, GCP).
Ensure explainability, bias detection and algorithmic fairness in models.
Collaborate with stakeholders to translate business challenges into data-driven solutions.
Requirements:
MSc or PhD in Computer Science, Artificial Intelligence, Mathematics, Statistics or a related field
Strong Python skills (bonus: C++, SQL, Spark)
Experience with ML algorithms (XGBoost, clustering, regression)
Expertise in Time Series, NLP, Computer Vision, MLOps
Knowledge of AWS/Azure/GCP, CI/CD and Agile development
Ability to own solutions and manage stakeholders
How to Apply:
Please register your interest by sending your CV to Emily Burgess via the Apply link on this page.