Job Specifications
Akkodis is seeking a Data Engineer (DevOps – Tech Ops) for a contract position with a client in Toronto, ON (Hybrid). The ideal candidate has experience with programming languages such as Python, Java, or Scala; expertise in data integration and ETL tools (e.g., Talend, Informatica, Apache NiFi); and familiarity with CI/CD pipelines and application monitoring. Certifications in data engineering (e.g., Azure Data Engineer, AWS Certified Data Engineer) and hands-on experience with distributed computing frameworks and cloud-based data services are a plus. Preference will be given to candidates with the required skills and experience, particularly in large organizations.
Title: Data Engineer (DevOps – Tech Ops)
Location: 22 Adelaide Street W Ste 2720 Toronto ON M5H 4E3 Canada
JD:
The Data Engineer is a critical member of the engineering team, responsible for designing, building, and maintaining robust data systems and infrastructure. This role focuses on enabling efficient data processing, collection, storage, and analysis at scale to support data-driven decision-making and operational optimization.
Responsibilities
Build, orchestrate, and optimize data pipelines to extract, transform, and load (ETL) data from various sources into the organization's data warehouse or data lake.
Integrate data from diverse sources such as databases, APIs, streaming platforms, and file systems into cohesive data pipelines.
Implement data integration solutions that support real-time, batch, and incremental data processing.
Implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of data.
Develop and maintain high-performance data pipelines that seamlessly integrate data from various sources into data warehouses, data lakes, and other storage solutions.
Develop monitoring and alerting mechanisms to identify and address data quality issues proactively.
Optimize ETL (Extract, Transform, Load) processes for efficiency, reliability, and data quality.
Implement and manage data storage solutions, including relational databases, NoSQL databases, and distributed file systems.
Manage the infrastructure and resources required to support data engineering workflows, including compute clusters, storage systems, and data processing frameworks.
Implement security controls and data governance measures to protect sensitive data and ensure compliance with regulatory requirements such as GDPR, CCPA, HIPAA, and PCI-DSS. Implement encryption, access controls, and auditing mechanisms to safeguard data privacy and integrity.
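The pipeline and data-quality responsibilities above can be illustrated with a minimal ETL sketch. This is purely an example under assumed inputs, not the client's actual stack: the source records, field names, and the SQLite "warehouse" are all stand-ins for whatever databases, APIs, or cloud services the role actually touches.

```python
import sqlite3

# Illustrative source records, standing in for an API, file, or database extract.
RAW_RECORDS = [
    {"id": 1, "amount": "19.99", "region": "ON"},
    {"id": 2, "amount": "5.00", "region": "on"},
    {"id": 3, "amount": None, "region": "QC"},  # fails the completeness check
]

def extract():
    """Extract: a real pipeline would read from a database, API, or file system."""
    return RAW_RECORDS

def transform(records):
    """Transform: normalize types and casing; reject records that fail validation."""
    clean, rejected = [], []
    for r in records:
        if r["amount"] is None:  # data quality check: completeness
            rejected.append(r)
            continue
        clean.append({"id": r["id"],
                      "amount": float(r["amount"]),
                      "region": r["region"].upper()})
    return clean, rejected

def load(rows, conn):
    """Load: write validated rows into a warehouse table (SQLite for illustration)."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales "
                 "(id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO sales VALUES (:id, :amount, :region)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    clean, rejected = transform(extract())
    load(clean, conn)
    print(f"loaded {len(clean)} rows, rejected {len(rejected)}")
```

Rejected records would typically be routed to a quarantine table and surfaced through the monitoring and alerting mechanisms the role calls for, rather than silently dropped.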
Data Engineering Best Practices:
Write production-ready, testable code that adheres to engineering best practices and accounts for edge cases and error handling.
Ensure high-quality deliverables through code reviews, design reviews, and adherence to architectural guidelines.
Develop comprehensive unit tests and integration tests to validate data pipeline functionality and data integrity.
Stay up to date with the latest data engineering tools, technologies, and methodologies, and evaluate their applicability to the team's needs.
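The testing practices above can be sketched as a small, framework-agnostic example. The `normalize_region` function is hypothetical, introduced only to show the shape of a unit test covering both the happy path and an error case; in practice these tests would run under pytest in a CI/CD pipeline.

```python
def normalize_region(value):
    """Hypothetical transform under test: trim and upper-case a region code."""
    if value is None:
        raise ValueError("region is required")
    return value.strip().upper()

def test_normalize_region_handles_casing_and_whitespace():
    assert normalize_region(" on ") == "ON"

def test_normalize_region_rejects_missing_values():
    try:
        normalize_region(None)
    except ValueError:
        pass  # expected: missing values must be rejected, not passed through
    else:
        raise AssertionError("expected ValueError for missing region")

if __name__ == "__main__":
    test_normalize_region_handles_casing_and_whitespace()
    test_normalize_region_rejects_missing_values()
    print("all tests passed")
```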
Technical Leadership and Collaboration:
Provide technical guidance and mentorship to junior engineers on the team.
Collaborate with data scientists, business analysts, and other stakeholders to understand data requirements and translate them into robust engineering solutions.
Work closely with other engineering teams to integrate data solutions seamlessly into the overall technology ecosystem.
Participate actively in agile ceremonies, communicate progress, and manage dependencies effectively.
Education / Skills / Experience
Education: A relevant bachelor's or master's degree in Computer Science, Data Engineering, or a related technical field.
Technical Skills:
Proficiency in programming languages such as Python, Java, or Scala.
Extensive experience with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud-based data platforms (e.g., AWS, Azure).
Expertise in data integration and ETL tools (e.g., Talend, Informatica, Apache NiFi).
Strong understanding of data modeling, data warehousing, and data lake concepts and best practices.
Familiarity with CI/CD pipelines and application monitoring practices.
Certifications in data engineering (e.g., Azure Data Engineer, AWS Certified Data Engineer) are a plus.
Experience:
5+ years of experience in a data engineering role, with a focus on building scalable, reliable, and high-performance data systems.
Proven track record of designing and implementing data pipelines, data storage solutions, and data processing workflows.
Hands-on experience with distributed computing frameworks and cloud-based data services.
Demonstrated ability to collaborate with cross-functional teams and communicate technical solutions effectively.
Ability to travel to other offices or client locations as needed.
Strong communication skills.
About the Company
Akkodis is a global digital engineering company and Smart Industry leader. We enable clients to advance in their digital transformation with Talent, Academy, Consulting, and Solutions services. Our 50,000 experts combine best-in-class technologies, R&D, and deep sector know-how for purposeful innovation. We are passionate about Engineering a Smarter Future Together.
With a shared passion for technology and talent, 50,000 engineers and digital experts deliver deep cross-sector expertise in 30 countries across North America, ...