- Company Name
- Altak Group Inc.
- Job Title
- Sr. Ab Initio Developer
- Job Description
**Job Title**
Sr. Ab Initio Developer
**Role Summary**
Senior developer responsible for designing, building, and maintaining complex Ab Initio ETL pipelines. Leads migrations from on‑premises to cloud environments (AWS or Azure), ensures performance, reliability, security, and governance, and owns the DevOps lifecycle and production operations.
**Expectations**
- Deliver scalable, maintainable Ab Initio graphs that meet business SLAs.
- Execute end‑to‑end migration to cloud data platforms with minimal downtime.
- Maintain high data quality, lineage, and security standards.
- Collaborate with architects, platform engineers, and analysts to define best practices.
**Key Responsibilities**
- Design and develop Ab Initio GDE graphs and EME metadata.
- Migrate existing pipelines to AWS (S3, Redshift, Glue, Lambda, EMR, MSK, Kinesis) or Azure (ADLS Gen2, Synapse, Data Factory, Event Hubs).
- Build connectors and CDC jobs for Oracle, SQL Server, DB2, flat files, MQTT/Kafka, and other sources.
- Tune performance: memory usage, parallelism, partitioning, and file/database I/O.
- Implement robust error handling, monitoring, SLA tracking, and alerting.
- Apply data quality checks (Ab Initio DQE) and enforce lineage/metadata practices.
- Enforce enterprise security controls (IAM, KMS/Key Vault, tokenization).
- Manage version control (Git, EME), automated build/deploy, environment promotion, and infrastructure coordination.
- Troubleshoot production issues, conduct capacity planning, and author runbooks.
- Contribute to standards, guidelines, and continuous improvement initiatives.
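For candidates unfamiliar with the term, the CDC and incremental-load work listed above often reduces to watermark-based extraction. The sketch below is a minimal, hypothetical illustration using SQLite and an invented `orders` table with a `last_updated` column; real pipelines in this role would read from Oracle/SQL Server change tables or Kafka topics instead:

```python
import sqlite3

def incremental_load(src, tgt, watermark):
    """Copy only rows changed since the last watermark (a basic CDC pattern)."""
    rows = src.execute(
        "SELECT id, amount, last_updated FROM orders WHERE last_updated > ?",
        (watermark,),
    ).fetchall()
    for row in rows:
        # Upsert: replace the target row if the key already exists.
        tgt.execute("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", row)
    tgt.commit()
    # Advance the watermark to the newest change processed in this run.
    return max((r[2] for r in rows), default=watermark)

# Demo with in-memory source and target databases.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute(
        "CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, last_updated TEXT)"
    )
src.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, 10.0, "2024-01-01T00:00:00"),
    (2, 20.0, "2024-01-02T00:00:00"),
])
# Only order 2 changed after the watermark, so only it is copied.
new_watermark = incremental_load(src, tgt, "2024-01-01T12:00:00")
```

Persisting and advancing the watermark between runs is what keeps loads incremental rather than full reloads.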
**Required Skills**
- Proficient in Ab Initio (Arcturus, GDE, EME, Co>Operating System).
- Deep knowledge of cloud services: AWS (S3, Redshift, Glue, Lambda, EMR, MSK/Kinesis) or Azure (ADLS Gen2, Synapse, Data Factory, Event Hubs).
- Strong background with relational databases (Oracle, SQL Server, DB2) and message queues (MQ, Kafka).
- Experience with CDC patterns and incremental load implementation.
- Expertise in performance tuning, memory management, and partitioning strategies.
- Familiarity with Ab Initio DQE for data quality and data governance.
- Hands‑on experience with IAM, KMS, Key Vault, tokenization, and other security mechanisms.
- Knowledge of CI/CD pipelines, Git, automated build tools, and infrastructure as code.
- Strong troubleshooting, documentation, and runbook writing skills.
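As a rough analogy for the partitioning strategies mentioned above, a partition-by-key followed by an independent per-partition rollup can be sketched in plain Python. The record shape and the `cust`/`amt` fields are invented for illustration and are not part of any Ab Initio API:

```python
from collections import defaultdict

def partition_by_key(records, key, n_partitions):
    """Hash-partition records so every row sharing a key value lands in the
    same partition (analogous to a partition-by-key component)."""
    partitions = [[] for _ in range(n_partitions)]
    for rec in records:
        partitions[hash(rec[key]) % n_partitions].append(rec)
    return partitions

def rollup(partition, key, field):
    """Aggregate one partition independently; this is safe because keyed
    partitioning guarantees each key is wholly contained in one partition."""
    totals = defaultdict(float)
    for rec in partition:
        totals[rec[key]] += rec[field]
    return dict(totals)

records = [
    {"cust": "a", "amt": 1.0},
    {"cust": "b", "amt": 2.0},
    {"cust": "a", "amt": 3.0},
]
partitions = partition_by_key(records, "cust", 4)
# Each partition can be rolled up in parallel; merging is a plain dict union.
merged = {}
for part in partitions:
    merged.update(rollup(part, "cust", "amt"))
```

Keyed partitioning is what makes the per-partition aggregates correct without a final re-aggregation step, which is why it underpins parallel rollups and joins.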
**Required Education & Certifications**
- Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent hands‑on experience).
- Ab Initio Certified Developer (or equivalent demonstrated expertise).
- Cloud certifications (AWS Certified Developer, Azure Data Engineer Associate) are a plus.