Orcan Intelligence

orcanintelligence.com

1 Job

14 Employees

About the Company

Enabling Transformation. Igniting Growth. Orcan delivers end-to-end AWS projects that help businesses move forward with clarity and control. From cloud migration and optimisation to reimagining infrastructure and deploying hybrid architecture, we build scalable systems and provide expert consultancy to unlock real business value.

We're not here to overpromise or slow your progress. We deliver reliably, consistently, and with a focus on what matters most, whether that's reducing operational costs, improving performance at scale, or strengthening resilience across your environment. Every project is shaped around your goals, your teams, and your timelines.

Our support continues well beyond delivery. We stay close to review, refine, and provide targeted AWS Training to help your internal teams build lasting capability that reduces reliance on external partners. For wider transformation needs - across Data & AI, Cybersecurity, and Financial Optimisation - we provide trusted consultants through our Managed Staffing model, giving you fast, flexible access to the right expertise when you need it. When it matters, Orcan makes it work. Let's talk.

Listed Jobs

Company Name
Orcan Intelligence
Job Title
GCP Data Engineer
Job Description
**Job Title**: GCP Data Engineer

**Role Summary**: Freelance GCP Data Engineer to design, deploy, and optimise large-scale data pipelines for cloud infrastructure projects. Collaborate with a team of experienced GCP professionals to enhance data processing capabilities and support data migrations.

**Expectations**: 5+ years as a Data Engineer or equivalent, with 3+ years of hands-on GCP experience (production deployments required). Technical depth is prioritised over general experience.

**Key Responsibilities**:
- Design, build, and maintain ETL/ELT pipelines for data ingestion, transformation, and warehousing.
- Implement and manage high-performance, scalable data warehousing solutions (e.g., BigQuery).
- Deploy GCP infrastructure using Terraform and manage resources via IaC.
- Orchestrate workflows with Cloud Composer or Airflow for automation and scheduling.
- Process batch, micro-batch, and real-time data using Apache Beam/Dataflow or Spark/Dataproc.
- Author and optimise SQL queries for relational databases, including CDC integration.
- Lead data migrations with a focus on quality, testing, and DataOps practices (CI/CD, pipeline deployments).
- Recommend and integrate tools to enhance data engineering ecosystem efficiency.
- Communicate requirements and solutions to stakeholders and partners.

**Required Skills**:
- Python proficiency; Java/Scala optional.
- Advanced SQL query authoring and optimisation.
- Experience with BigQuery, Terraform, Cloud Composer/Airflow, Apache Beam/Dataflow, and Spark/Dataproc.
- Proven pipeline development in GCP for large datasets.
- Familiarity with DataOps, CI/CD, and IaC practices.
- Streaming architecture expertise (batch, micro-batch, real-time).
- Strong verbal and written English communication.

**Required Education & Certifications**: Not specified.
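For candidates unfamiliar with the batch vs. micro-batch distinction the responsibilities mention, a minimal plain-Python sketch may help; the function name `micro_batches` and the batch size are purely illustrative, and in a real GCP pipeline this grouping is what Dataflow/Beam windowing handles for you:

```python
from itertools import islice
from typing import Iterable, Iterator, List

def micro_batches(records: Iterable[dict], size: int) -> Iterator[List[dict]]:
    """Group a (possibly unbounded) record stream into fixed-size micro-batches."""
    it = iter(records)
    while True:
        batch = list(islice(it, size))  # pull up to `size` records from the stream
        if not batch:                   # stream exhausted
            return
        yield batch

# Example: 7 change events grouped into micro-batches of 3.
events = [{"id": i} for i in range(7)]
batches = list(micro_batches(events, 3))
# [len(b) for b in batches] -> [3, 3, 1]
```

Pure batch processing would treat all 7 events as one unit; real-time processing would handle each event as it arrives; micro-batching sits between the two.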
London, United Kingdom
On site
Mid level
16-10-2025