CommuniTech Recruitment Group

www.communitech.co.uk

5 Jobs

1 Employee

About the Company

CommuniTech are an exciting name in tech recruitment, seamlessly connecting the client and candidate communities to deliver exceptional technical talent to tech-driven companies, ensuring that together they thrive, exceed, and achieve.

By striving to intertwine these communities, we get to know our clients and candidates better than ever before, providing recruitment solutions that deliver an individual experience tailored to your needs.

Listed Jobs

Company Name
CommuniTech Recruitment Group
Job Title
C# Data-Centric Developer with experience of real-time analytical databases (non-relational). 6-month rolling contract. £700/day inside IR35. Hybrid, 2 days in Central London.
Job Description
**Job Title**

C# Data Centric Developer

**Role Summary**

Deliver backend solutions in .NET C# focused on real-time analytical databases. Build and maintain data ingestion pipelines, enable analytics workflows, and ensure high-quality, maintainable code on an agile contract basis.

**Expectations**

- 6-month rolling contract, day rate £700, inside IR35.
- Hybrid schedule: 2 days onsite in Central London, remainder remote.
- Active participation in Scrum ceremonies and continuous delivery pipelines.

**Key Responsibilities**

- Design, develop, and deploy C# backend services targeting real-time analytical databases (ClickHouse, SingleStore, Rockset, TimescaleDB).
- Implement data ingestion pipelines that load large batch and streaming datasets into analytical stores or open-standard data lakes (Iceberg, Delta, Parquet).
- Containerise applications (Docker) and manage deployments on cloud or on-premise platforms.
- Apply SOLID principles, dependency injection, unit/integration testing, and automated CI/CD frameworks.
- Configure and maintain monitoring, logging, and performance profiling for services.
- Collaborate with data engineers, analysts, and product owners to define data models and API contracts.

**Required Skills**

- 3+ years of commercial .NET C# backend development.
- Strong experience with SQL Server and at least one non-relational analytical database (ClickHouse, SingleStore, Rockset, TimescaleDB).
- Proficiency in containerised application development (Docker) and deployment pipelines.
- Familiarity with data lake formats (Parquet, Iceberg, Delta) and Spark or similar column-store processing.
- Solid understanding of test automation, IoC, SOLID, logging, and monitoring.
- Knowledge of CI/CD (Azure DevOps or TFS), Git, and automated testing.
- Experience with authentication protocols (OAuth2) is a plus.

**Required Education & Certifications**

- Bachelor's degree in Computer Science, Software Engineering, or a related technical field, or equivalent work experience.
- No specific certifications required; relevant Microsoft, Azure, or database vendor certifications are advantageous.
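The ingestion work described above usually comes down to micro-batching: analytical stores such as ClickHouse or TimescaleDB strongly prefer large bulk inserts over per-row writes, so pipelines buffer records and flush them in batches. A minimal, database-agnostic Python sketch of that pattern (the `flush` callback and batch size are illustrative assumptions, not details from the listing):

```python
from typing import Callable, Iterable, List

def batch_ingest(records: Iterable[dict],
                 flush: Callable[[List[dict]], None],
                 batch_size: int = 1000) -> int:
    """Buffer incoming records and flush them to the store in bulk.

    Real-time analytical databases perform far better with one bulk
    insert per batch than with a write per record, so ingestion
    pipelines typically accumulate a full batch before sending it.
    """
    buffer: List[dict] = []
    total = 0
    for record in records:
        buffer.append(record)
        if len(buffer) >= batch_size:
            flush(buffer)          # one bulk insert per full batch
            total += len(buffer)
            buffer = []
    if buffer:                     # flush the final partial batch
        flush(buffer)
        total += len(buffer)
    return total

if __name__ == "__main__":
    batches = []
    n = batch_ingest(({"id": i} for i in range(2500)),
                     flush=batches.append, batch_size=1000)
    print(n, [len(b) for b in batches])  # 2500 [1000, 1000, 500]
```

In a real service the `flush` callback would wrap the store's bulk-insert API (for example a ClickHouse batch `INSERT`), with retry and dead-letter handling around it.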
London, United Kingdom
Hybrid
07-11-2025
Company Name
CommuniTech Recruitment Group
Job Title
Data Engineer – Databricks Greenfield Implementation. £800/day inside IR35. 6-month rolling long-term contract. Hybrid, 2 days/week in Central London office.
Job Description
Job Title: Data Engineer – Databricks Greenfield Implementation

Role Summary: Lead the design, implementation, and optimisation of a Lakehouse architecture on Azure Databricks for a leading energy trading firm. Drive proof-of-concept development, hybrid data integration, streaming pipelines, and data governance to enable real-time analytics and AI workloads.

Expectations:

* Deliver end-to-end Databricks solutions that meet business objectives within a 6-month contract.
* Collaborate with cross-functional teams (data science, analytics, governance) and participate in agile ceremonies.
* Maintain high data quality, lineage, security, and cost-efficiency across all pipelines.

Key Responsibilities:

* Evaluate and showcase Databricks capabilities in proof-of-concept projects.
* Design and build a scalable Lakehouse using Delta Lake, Unity Catalog, and Azure Data Lake Storage.
* Integrate on-premise data sources with Azure Databricks, addressing connectivity, security, and performance.
* Develop structured streaming pipelines (Databricks Structured Streaming, Kafka/Event Hubs).
* Build ingestion and transformation workflows with notebooks and Spark SQL, and automate them with CI/CD.
* Implement data lineage, governance, and observability; ensure compliance with regulatory standards.
* Apply schema enforcement, validation, and error handling to maintain data quality.
* Collaborate on schema design for analytics and AI workloads.
* Participate in agile planning, stand-ups, and iterative delivery.

Required Skills:

* Proven experience with Azure Databricks (cluster configuration, workspace management, cost/performance optimisation).
* Expertise in lakehouse architecture (Delta Lake, partitioning, indexing, compaction).
* Hybrid integration knowledge (secure connectivity, data movement, performance tuning).
* Real-time pipeline development (Structured Streaming, Kafka, Event Hubs).
* Advanced transformation and lineage implementation (Databricks notebooks, Spark SQL, Unity Catalog).
* Deep understanding of Apache Spark optimisation for batch and streaming.
* Strong SQL skills; Delta Lake ACID, schema enforcement, time travel.
* Git-based version control and CI/CD in Azure DevOps or GitHub Actions.
* Azure Data Lake Storage, RBAC, Unity Catalog, and cloud security fundamentals.

Required Education & Certifications:

* Bachelor's degree in Computer Science, Engineering, or a related field.
* Relevant certifications such as Azure Data Engineer Associate, Databricks Certified Associate Developer for Apache Spark, and/or Certified Data Professional (CDP) are advantageous.
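"Schema enforcement, validation, and error handling" in a pipeline like this generally means quarantining malformed records rather than failing the whole batch. A minimal pure-Python sketch of that pattern (the schema and field names are invented for illustration; on Databricks itself this is typically done with Spark schemas and Delta constraints):

```python
from typing import Any, Dict, List, Tuple

# Illustrative schema: field name -> required Python type
SCHEMA: Dict[str, type] = {"trade_id": int, "symbol": str, "price": float}

def enforce_schema(records: List[Dict[str, Any]]
                   ) -> Tuple[List[Dict[str, Any]], List[Dict[str, Any]]]:
    """Split records into (valid, quarantined) against SCHEMA.

    Bad records are quarantined with a reason instead of aborting the
    batch, which is the usual error-handling pattern in ingestion
    pipelines that must keep flowing.
    """
    valid, quarantined = [], []
    for rec in records:
        missing = [f for f in SCHEMA if f not in rec]
        wrong = [f for f, t in SCHEMA.items()
                 if f in rec and not isinstance(rec[f], t)]
        if missing or wrong:
            quarantined.append({"record": rec,
                                "missing": missing,
                                "wrong_type": wrong})
        else:
            valid.append(rec)
    return valid, quarantined

if __name__ == "__main__":
    good, bad = enforce_schema([
        {"trade_id": 1, "symbol": "BRN", "price": 81.4},
        {"trade_id": "2", "symbol": "WTI", "price": 77.0},  # wrong type
        {"trade_id": 3, "symbol": "TTF"},                   # missing price
    ])
    print(len(good), len(bad))  # 1 2
```

The quarantined records would normally be written to a dead-letter table with their rejection reasons, preserving lineage for later replay.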
London, United Kingdom
Hybrid
25-01-2026
Company Name
CommuniTech Recruitment Group
Job Title
Databricks Technical Lead. Fintech. Up to £1,000/day inside IR35. Greenfield project. 6-month rolling contract. Hybrid, 3 days a week in Central London office.
Job Description
**Job Title**

Databricks Technical Lead

**Role Summary**

Lead the design, build, and delivery of a greenfield Lakehouse platform on Azure Databricks for a fintech client. Provide technical ownership of the Azure Databricks implementation, from proof-of-concept to production, ensuring scalability, cost efficiency, and compliance. Mentor the data engineering team, contribute to Agile delivery, and serve as the bridge between data operations, governance, and analytics teams.

**Expectations**

- Minimum 5 years of data engineering experience with strong Azure Databricks exposure.
- Proven track record delivering end-to-end cloud data platforms (Lakehouse, Delta Lake).
- Hands-on with Azure services (ADLS Gen2, Azure DevOps, Azure IAM).
- Ability to work within a 6-month rolling contract, supporting hybrid (remote + on-site) engagement.

**Key Responsibilities**

- Own the Azure Databricks implementation: evaluate capabilities, set success criteria, and recommend full-scale adoption.
- Architect and deploy a Lakehouse on Azure Databricks using Delta Lake, Unity Catalog, and MLflow.
- Design hybrid data integration strategies, integrating on-premises sources with Azure.
- Build real-time and batch pipelines using Structured Streaming, Spark SQL, and notebooks.
- Implement data lineage, governance, and quality controls with Unity Catalog or equivalent tools.
- Optimise cluster configurations, partitioning, indexing, and compaction for performance and cost.
- Mentor teammates, review code, and champion best practices.
- Participate in Agile ceremonies, deliver PoC prototypes, and iterate toward production.

**Required Skills**

- Azure Databricks administration (cluster, workspace, cost and performance tuning).
- Delta Lake, Unity Catalog, MLflow, and Lakehouse architecture design.
- Design of partitioning strategies, indexing, and compaction for large-scale workloads.
- Real-time streaming with Structured Streaming, Kafka, or Event Hubs.
- Spark SQL, Delta Lake ACID, schema enforcement, and time travel.
- Git and CI/CD pipelines (Azure DevOps, GitHub Actions) for Databricks notebooks.
- Azure Data Lake Storage Gen2, Azure RBAC, Azure AD, and data security compliance.
- Strong SQL, data modelling, and workflow design.

**Required Education & Certifications**

- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
- Azure Databricks Certified Developer or Azure Data Engineer Associate (preferred).
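Partitioning strategy appears twice in the skills list above. The underlying idea is simple: files land under key=value directory paths, so a query that filters on a partition column can skip whole directories instead of scanning the table. A short Python sketch of hive-style partition path construction, as used by Delta Lake and Spark (the table name and column choices are illustrative):

```python
import datetime
from typing import List

def partition_path(table_root: str, row: dict, keys: List[str]) -> str:
    """Build a hive-style partition path, e.g.
    /lake/trades/trade_date=2026-01-25/region=EU.

    Delta Lake and Spark lay files out this way so that a query filtering
    on a partition column reads only the matching directories (partition
    pruning). Choosing low-cardinality, frequently filtered columns as
    keys is the core of a partitioning strategy.
    """
    parts = [f"{k}={row[k]}" for k in keys]
    return "/".join([table_root.rstrip("/")] + parts)

if __name__ == "__main__":
    row = {"trade_date": datetime.date(2026, 1, 25),
           "region": "EU", "price": 81.4}
    print(partition_path("/lake/trades/", row, ["trade_date", "region"]))
    # /lake/trades/trade_date=2026-01-25/region=EU
```

Over-partitioning (many tiny files) is the usual failure mode, which is why the listing pairs partitioning with compaction.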
London, United Kingdom
Hybrid
Senior
25-01-2026
Company Name
CommuniTech Recruitment Group
Job Title
Full-Stack Angular/C#/SQL/Azure Developer. Commodities trading. 6-month rolling contract. £700/day inside IR35. Hybrid, 2 days a week in Central London office.
Job Description
Job Title

Full-Stack Angular/C#/SQL/Azure Developer

Role Summary

Deliver end-to-end solutions for a commodities trading platform on a 6-month rolling contract. Lead both front-end (Angular) and back-end (.NET/C#) development while integrating SQL and Azure services to support real-time trading operations.

Expectations

- Work 48-hour weeks, hybrid with 2 days onsite (Central London).
- Participate in daily stand-ups, sprint planning, and code reviews.
- Provide high-quality, maintainable code that meets business uptime requirements.
- Communicate status, blockers, and solutions to product owners and stakeholders.

Key Responsibilities

- Design, develop, and maintain Angular interfaces with dynamic data binding.
- Build scalable C# APIs, implement business logic, and expose secure endpoints.
- Manage SQL Server databases: schema design, query optimisation, and data integrity.
- Deploy and manage cloud resources (Azure App Services, Functions, Cosmos DB, Azure SQL).
- Implement CI/CD pipelines (Azure DevOps, GitHub Actions).
- Troubleshoot performance, security, and data consistency issues.
- Write unit, integration, and UI tests to ensure system reliability.

Required Skills

- 3+ years of full-stack development in Angular (v12+) and .NET Core/Framework.
- Strong knowledge of SQL Server database design, T-SQL, and performance tuning.
- Experience with Azure services: App Services, Functions, Logic Apps, Azure SQL, and monitoring (Application Insights).
- Proficiency in Git, CI/CD, and automated testing frameworks.
- Familiarity with DevOps principles and agile methodologies.

Required Education & Certifications

- Bachelor's degree or equivalent in Computer Science, Software Engineering, or a related field.
- Optional: Microsoft certifications (AZ-900, AZ-204, or equivalent) are a plus.
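The "query optimisation and performance tuning" requirement above largely means making sure selective filters are backed by indexes. SQLite's EXPLAIN QUERY PLAN shows the same principle in miniature, so here is a self-contained Python sketch (SQLite stands in for SQL Server purely for illustration; the table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, price REAL)")
conn.executemany(
    "INSERT INTO trades (symbol, price) VALUES (?, ?)",
    [(f"SYM{i % 50}", float(i)) for i in range(1000)])

def plan(sql: str) -> str:
    """Return the detail column of the first query-plan row."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

query = "SELECT * FROM trades WHERE symbol = 'SYM7'"
before = plan(query)   # without an index: a full table scan
conn.execute("CREATE INDEX idx_trades_symbol ON trades (symbol)")
after = plan(query)    # with the index: an index search instead
print(before)
print(after)
```

On SQL Server the equivalent check is the actual execution plan (seek vs. scan), but the tuning step is the same: index the columns your WHERE clauses filter on, then verify the plan actually uses them.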
London, United Kingdom
Hybrid
29-01-2026