Convertr

www.convertr.io

68 Employees

About the Company

Convertr is the enterprise data integrity layer that ensures clean, compliant, and complete data flows across systems and teams to support AI-readiness.

Global leaders like Microsoft, Oracle, Amazon, Stripe, and HP use Convertr to improve data quality, accelerate operations, and gain full transparency across their lead and data supply chains. By automating complex workflows and standardizing data at scale, Convertr helps organisations drive better decisions, reduce risk, and unlock the full value of their data. The best convert with Convertr.

Listed Jobs

Company Name
Convertr
Job Title
Data Engineer
Job Description
Role Summary

Design and build scalable data pipelines to ingest, transform, and deliver high-quality data from operational systems to analytics and AI workloads. Serve as the owner of the data foundation, ensuring reliability, security, and compliance across cross-functional teams.

Expectations

- Deliver robust, production-grade pipelines that enable real-time and batch analytics.
- Maintain rigorous data quality and governance to support decision making.
- Collaborate closely with Engineering, DevOps, and Data Science teams to align data architecture with product and business goals.
- Continuously improve tooling, documentation, and process efficiency.

Key Responsibilities

- Build, maintain, and optimize ETL/ELT pipelines from operational sources to analytics platforms.
- Work with DevOps to configure data replication, ingestion, and storage (e.g., DMS, AWS services).
- Engineer analytics-ready datasets in AWS Redshift using dbt, managing model structure, tests, and documentation.
- Manage data storage in AWS S3, defining lake vs warehouse boundaries.
- Enhance accessibility of platform logs and event data for analytics and machine learning use cases.
- Implement data quality, monitoring, and alerting to reduce manual intervention.
- Support ad-hoc data requests and reporting needs.
- Ensure all data handling complies with security standards, GDPR, and internal policies.

Required Skills

- Strong SQL proficiency (AWS Redshift or equivalent).
- Python expertise for pipeline development, API integration, and semi-structured data handling.
- Hands-on experience with dbt for building and maintaining analytics models.
- Proficient with AWS services: Redshift, S3, RDS/Aurora, and data migration tools.
- Experience ingesting and processing event/log data (Elasticsearch, OpenSearch, or similar).
- Familiarity with version control (Git) and CI/CD practices for data projects.
- Ability to collaborate effectively with cross-functional teams (Engineering, DevOps).
- Knowledge of data security, privacy compliance, and GDPR principles.

Required Education & Certifications

- Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related field, or equivalent professional experience.
- Optional: AWS Certified Data Analytics – Specialty or similar cloud-data certification.
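To give candidates a concrete sense of the data-quality work described above, here is a minimal sketch of the kind of automated check the role involves. The table name (`leads`), columns, and the use of SQLite as a stand-in for a Redshift connection are all illustrative assumptions, not details from the posting; in practice such checks would typically live in dbt tests or a monitoring job.

```python
import sqlite3  # stand-in for a warehouse connection; Redshift would use a real driver


def check_leads_table(conn):
    """Return a dict of data-quality failures for a hypothetical 'leads' table."""
    failures = {}
    cur = conn.cursor()

    # Completeness: required fields must not be null.
    cur.execute("SELECT COUNT(*) FROM leads WHERE email IS NULL")
    null_emails = cur.fetchone()[0]
    if null_emails:
        failures["null_email"] = null_emails

    # Uniqueness: the primary key must not repeat.
    cur.execute("SELECT COUNT(*) - COUNT(DISTINCT lead_id) FROM leads")
    dup_ids = cur.fetchone()[0]
    if dup_ids:
        failures["duplicate_lead_id"] = dup_ids

    return failures


# Usage with an in-memory table standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (lead_id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO leads VALUES (?, ?)",
    [(1, "a@example.com"), (2, None), (2, "b@example.com")],
)
print(check_leads_table(conn))  # flags one null email and one duplicate lead_id
```

Checks like these, wired into alerting, are what "reduce manual intervention" means in the responsibilities above: failures surface automatically instead of being discovered downstream.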
London, United Kingdom
Remote
10-03-2026