Job Specifications
Hi,
This is Prashant from Triunity Software Inc.
Follow me on LinkedIn: https://www.linkedin.com/in/usaprashantrathore/
Sr Data Engineer - AWS, Cloud Technologies, Snowflake, DBT
Location: Toronto/Markham, ON- Hybrid
Minimum Experience: 8 Years
Need Local Candidates Only
Job Description:
This role will be part of our Information Technology team. You will be responsible for leading the architecture, high-level and low-level solution engineering design, analysis, and implementation within a successful and experienced team. You'll be required to apply your depth of knowledge and expertise with both modern and legacy data platforms to develop data ecosystems that meet business requirements and align with the Client's enterprise architecture goals and standards. The Client has embarked on an exciting journey to modernize, craft, and build a next-generation data platform on Snowflake to support the growing data needs of the business and to enable AI and GenAI capabilities that drive business value.
We embrace a culture of challenging the status quo and constantly look for ways to simplify processes, technology, and workflows.
This role reports to the AVP, Data Engineering.
What you'll do
As a Senior Data Engineer Lead, you will be instrumental in shaping and delivering enterprise-scale data solutions. You’ll define the technical roadmap, drive data strategy, and lead the design and implementation of robust, scalable data pipelines. This role requires a strong blend of technical leadership, hands-on engineering, and cross-functional collaboration.
Must Have skills:
1. Snowflake
2. DBT
3. AWS
Key Responsibilities
• Technical Leadership: Define and drive the data engineering strategy, standards, and best practices across the organization.
• Solution Design: Develop high-level and low-level solution architectures, ensuring alignment with business and technical goals.
• Data Pipeline Development: Lead the design and implementation of high-performance data pipelines using tools like dbt Core/Cloud, ensuring scalability and maintainability.
• Data Modeling: Design and review conceptual, logical, and physical data models to support business needs.
• Code Ownership: Write and maintain clean, reusable code in SQL, Python, Shell, and Terraform.
• Quality & Governance: Champion data quality, governance, and cataloging practices; create and review test plans to validate data solutions.
• Issue Resolution: Perform root cause analysis and implement effective solutions for complex data issues.
• Agile Delivery: Lead agile ceremonies, foster a delivery-focused mindset, and ensure timely execution of data initiatives.
• Mentorship: Guide and mentor Data Engineers and project team members, elevating team capabilities and engineering excellence.
• Collaboration: Work closely with architects, designers, QA engineers, and delivery teams to ensure cohesive and customer-centric data products.
• Documentation: Produce and maintain comprehensive technical documentation to support implementation and knowledge sharing.
• Talent Development: Contribute to hiring by designing technical challenges, conducting interviews, and supporting onboarding.
What you'll bring
Extensive Experience:
• Professional experience delivering 10+ high-impact data projects from inception through warranty.
• 5+ years of experience with Snowflake, dbt Core/Cloud, and AWS cloud technologies.
• Experience coding in multiple programming languages such as Python and Java.
Technical Expertise: Deep knowledge of relational databases (Snowflake, PostgreSQL, Amazon Aurora), big data platforms (Hadoop), and NoSQL databases (e.g., MongoDB).
Data Visualization Proficiency: Skilled in tools such as Snowsight, Streamlit, Qlik, and SAP BusinessObjects to communicate insights effectively.
Advanced Coding Skills: Expert-level proficiency in SQL, Python, Shell, and Terraform, with a strong focus on performance, reusability, and maintainability.
Presentation & Communication: Strong technical and business presentation skills; able to identify and address gaps in data designs and processes with both internal and external stakeholders.
Pipeline & Orchestration Tools: Hands-on experience with orchestration tools such as Zena and Amazon Managed Workflows for Apache Airflow (MWAA).
Resilience & Adaptability: Proven ability to thrive in fast-paced, ambiguous, and high-pressure environments.
Mentorship & Leadership: A track record of mentoring Data Engineers at all levels, fostering a culture of engineering excellence and continuous improvement.
Customer-Centric Mindset: Passion for solving real-world problems using data-driven insights to deliver impactful business outcomes.
Collaborative Approach: Strong interpersonal and communication skills, with the ability to lead teams and influence cross-functional stakeholders.
Domain Knowledge: Familiarity with insurance industry processes and systems is a strong asset.
AI/ML & GenAI Exposure: Experience operationalizing AI/ML and GenAI solutions.
About the Company
Triunity is a Product Development, Staff Augmentation, and Consulting Services company providing solutions and services in North America. We provide IT services and technology solutions to various business verticals such as Healthcare, Pharma, Banking, and Finance. Our goal is to develop long-term partnerships with businesses and help them gain a competitive advantage by providing IT infrastructure and software platforms.
Led by experts in the IT industry with a proven record of delivering software solutions, consulting, and...