PRIMUS Global Solutions (PRIMUS UK & Europe)

primusglobal.com

1 Job

155 Employees

About the Company

Founded in 2002, PRIMUS Global Solutions is a premier provider of enterprise-grade IT Solutions and strategic Talent Delivery services. Our European headquarters is in London, UK, and with global delivery hubs in the US, UK, Germany, Poland, Sweden, and India, we empower organisations across North America, Europe, the Middle East, and India.

With 23+ years of experience, we support 400+ global clients with a delivery force of 3,000+ skilled professionals, combining deep domain knowledge, emerging tech, and agile execution.

Core Expertise:
Custom IT Solutions:
Salesforce Development & Integration
AI, ML & Data Science (NLP, Deep Learning, TensorFlow, PyTorch)
Application Development (Java, .Net, Python, Node.js, Angular, React)
Cloud Platforms (AWS, Azure, Google Cloud, OpenStack)
DevOps & Automation (Kubernetes, Jenkins, Terraform, GitLab, Ansible)
Cybersecurity (IAM, SOC, Red/Blue Teaming)
Enterprise Systems (SAP HANA, S/4HANA, Oracle EBS, PeopleSoft)

Success Highlights:
Liberty Global: Cloud & mobile app development

UAE Bank: Customer Data Platform enabling 40% faster onboarding

EU Bank: ML-powered predictive credit risk engine

Adidas: Real-time supply chain dashboard using Power BI

Talent Delivery:
Our Custom IT Solutions & Talent Delivery services give clients access to pre-trained, deployment-ready talent across core technologies. With 25,000+ successful deployments for global leaders such as Infosys, TCS, Dell, HCL, and LTI, our AI-enabled screening engine shortens onboarding cycles by 60%.

Whether it's delivering advanced AI-driven platforms or scaling your engineering teams, PRIMUS brings global strength with local execution.

Connect with us | www.primusglobal.com

Listed Jobs

Company Name
PRIMUS Global Solutions (PRIMUS UK & Europe)
Job Title
Data Architect
Job Description
**Job title:** Data Architect

**Role Summary**

Design, implement, and maintain end-to-end streaming data pipelines and lakehouse architectures using Kafka (Confluent) and the AWS data stack. Lead schema governance, data quality, and event routing strategies to support real-time analytics and data-driven applications.

**Expectations**

- Minimum 10+ years of professional experience in data architecture, with a strong focus on streaming and event-driven design.
- Proven track record of delivering large-scale, highly available data solutions in cloud environments.
- Ability to mentor and influence cross-functional teams on data best practices.

**Key Responsibilities**

- Architect and embed Kafka (Confluent) and AWS Kinesis (including Kinesis Firehose) streams for high-throughput, ordered, exactly-once delivery (see the first sketch below).
- Design and enforce schema evolution policies using Avro or Protobuf, leveraging Confluent Schema Registry for compatibility checks (see the second sketch below).
- Build event routing and filtering pipelines with Amazon EventBridge.
- Create and manage AWS data lake and warehouse resources: S3, Glue, Athena, Redshift, and Step Functions.
- Implement dead-letter queue (DLQ) patterns for fault tolerance in streaming pipelines.
- Deploy Iceberg-ready lakehouse patterns to support analytic workloads.
- Optimize streaming flows (Kinesis → S3 → Glue), Glue Streaming jobs, and the corresponding data cataloging.
- Ensure end-to-end security, compliance, and monitoring of data streams and lakehouse assets.

**Required Skills**

- Expertise in Kafka (Confluent) and the AWS Kinesis ecosystem.
- Proficiency with AWS services: Kinesis Firehose, EventBridge, Glue, Athena, Redshift, S3, Step Functions, Lambda.
- Strong knowledge of schema formats (Avro, Protobuf) and Confluent Schema Registry best practices.
- Experience designing Iceberg-ready lakehouse architectures.
- Familiarity with DLQ patterns, event replay, and exactly-once semantics.
- Solid background in data modeling, ETL/ELT, and data governance.
- Strong writing and communication skills for technical documentation and stakeholder engagement.

**Required Education & Certifications**

- Bachelor's degree (or higher) in Computer Science, Information Systems, or a related field.
- AWS Certified Solutions Architect (Associate or Professional) preferred.
- Confluent Certified Developer for Apache Kafka (optional but advantageous).
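For readers unfamiliar with the exactly-once and DLQ patterns this role calls for, here is a minimal sketch using the confluent-kafka Python client. The broker address, topic names (`orders`, `orders.enriched`, `orders.dlq`), and the `process_record` helper are illustrative assumptions, not details from the posting.

```python
from confluent_kafka import Consumer, Producer

BROKER = "localhost:9092"  # assumption: a local Confluent/Kafka broker

# Idempotent, transactional producer: retries cannot introduce duplicates,
# and everything produced in one transaction commits atomically.
producer = Producer({
    "bootstrap.servers": BROKER,
    "enable.idempotence": True,
    "transactional.id": "orders-pipeline-1",  # hypothetical id
})
producer.init_transactions()

# Consumer reads only committed records and never auto-commits, so a crash
# mid-batch cannot acknowledge messages that were not fully processed.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "orders-processor",  # hypothetical group
    "isolation.level": "read_committed",
    "enable.auto.commit": False,
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

def process_record(value: bytes) -> bytes:
    """Placeholder business logic; raises on bad input."""
    if not value:
        raise ValueError("empty payload")
    return value

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue  # sketch: skip idle polls and broker-side errors
    producer.begin_transaction()
    try:
        producer.produce("orders.enriched", process_record(msg.value()))
    except Exception as exc:
        # DLQ pattern: route the poison message aside with error context,
        # in the same transaction so it is also delivered exactly once.
        producer.produce("orders.dlq", msg.value(),
                         headers={"error": str(exc)})
    # Tie the consumed offset to the transaction: the offset commit and the
    # produced records succeed or fail together (exactly-once semantics).
    producer.send_offsets_to_transaction(
        consumer.position(consumer.assignment()),
        consumer.consumer_group_metadata())
    producer.commit_transaction()
```

Replaying the `orders.dlq` topic after a fix is one way to provide the event-replay capability the skills list mentions.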
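The schema-governance responsibility can likewise be illustrated with the Confluent Schema Registry Python client. This is a sketch under assumptions: a registry at `localhost:8081`, a hypothetical subject `orders-value`, and an invented `Order` schema; `test_compatibility` is available in recent confluent-kafka releases.

```python
from confluent_kafka.schema_registry import Schema, SchemaRegistryClient

client = SchemaRegistryClient({"url": "http://localhost:8081"})  # assumed URL

# Version 1 of a hypothetical Order schema.
order_v1 = Schema("""{
  "type": "record", "name": "Order",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "amount",   "type": "double"}
  ]
}""", schema_type="AVRO")

# Registering under a subject makes the registry enforce that subject's
# compatibility mode (e.g. BACKWARD) on every later version.
schema_id = client.register_schema("orders-value", order_v1)
print(f"registered orders-value as schema id {schema_id}")

# A proposed evolution: a new field with a default, which BACKWARD
# compatibility allows (existing consumers can still read new records).
order_v2 = Schema("""{
  "type": "record", "name": "Order",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "amount",   "type": "double"},
    {"name": "currency", "type": "string", "default": "GBP"}
  ]
}""", schema_type="AVRO")

# Check the candidate against the latest registered version before
# registering it; incompatible changes are rejected up front.
if client.test_compatibility("orders-value", order_v2):
    client.register_schema("orders-value", order_v2)
else:
    print("orders-value v2 is incompatible; revise the schema")
```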
London, United Kingdom
Hybrid
Senior
09-12-2025