Lotlinx

www.lotlinx.com

1 Job

187 Employees

About the Company

The Lotlinx Platform provides automobile dealers and manufacturers with enhanced operational control over their retail business.

Leveraging state-of-the-art real-time data and machine learning technology, Lotlinx provides a Precision Retailing solution that enables dealers to automatically adapt to market dynamics, mitigating inventory risk through VIN-specific strategies.

Dealers benefit by optimizing their profitability per vehicle retailed with machine-enabled increases in volume, turnover, gross, and market share.

Listed Jobs

Company Name
Lotlinx
Job Title
Data Engineer
Job Description
**Job Title**

Data Engineer

**Role Summary**

Design, build, and scale cloud-first data pipelines to ingest, transform, and store massive automotive datasets. Serve as the primary architect for the data infrastructure that powers real-time analytics and advertising insights, ensuring reliability, performance, and security across AWS/GCP environments.

**Expectations**

- Deliver end-to-end data flows that support analytics, product, and design teams.
- Optimize pipelines for latency, throughput, and cost.
- Enforce data governance, privacy, and security standards.
- Demonstrate proactive problem-solving and continuous improvement.

**Key Responsibilities**

- Architect highly available, scalable ELT/ETL pipelines on cloud data platforms (AWS, GCP).
- Build ingestion from legacy and real-time sources using Spark/Hadoop/Beam and streaming (Kafka, Pub/Sub, Kinesis).
- Design data lake and cloud warehouse solutions (Snowflake, BigQuery, Redshift).
- Implement workflow orchestration with Airflow, Dataflow, and dbt, ensuring idempotency and monitoring.
- Collaborate with DevOps and Security teams on CI/CD, containerization (Docker), and Kubernetes deployment.
- Conduct performance tuning, cost optimization, and capacity planning.
- Maintain data quality, lineage, and documentation for core data sources.
- Enable cross-functional teams by translating business requirements into technical specifications.

**Required Skills**

- 3+ years of data engineering experience.
- Proficiency in Python, Scala, or Java.
- Advanced SQL and query optimization.
- Cloud expertise: AWS (Glue, Redshift, S3, Athena) and/or GCP (Dataflow, BigQuery).
- Big Data frameworks: Apache Spark, Hadoop, Beam.
- Streaming platforms: Kafka, Pub/Sub, Kinesis.
- Workflow orchestration: Airflow, Dataflow.
- Transformation tools: dbt.
- DevOps: CI/CD, Git, Docker, Kubernetes.
- Strong analytical, problem-solving, and communication skills.

**Required Education & Certifications**

- Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience).
- Relevant certifications (AWS Certified Data Analytics – Specialty, Google Cloud Data Engineer, etc.) preferred but not mandatory.
Oakville, Canada
Hybrid
Junior
23-02-2026