- Company Name: Queen Square Recruitment
- Job Title: Mainframe Consultant
- Job Description:
**Job Title**
Mainframe Consultant
**Role Summary**
Lead critical mainframe data migration and modernization, designing CDC pipelines, translating Db2 schemas to Aurora PostgreSQL, and orchestrating a zero‑surprise cutover from mainframe to cloud data warehouses while ensuring data integrity, observability, and governance.
**Expectations**
- Deliver end‑to‑end migration from Db2 on z/OS to Aurora PostgreSQL with validated data transfer.
- Maintain high data quality and enforce validation, reconciliation, and rollback plans.
- Own infrastructure and deployment through Terraform and GitLab CI/CD.
- Mentor and coordinate cross‑functional teams working onsite and remotely.
**Key Responsibilities**
- Design & implement CDC pipelines (IBM CDC, Precisely, etc.) with subscription, bookmarks, and replay/backfill.
- Model logical/physical schemas, normalize/denormalize, and enforce referential integrity in Aurora PostgreSQL.
- Build reliable integration from Db2 to Aurora via Kafka/S3 with ordering guarantees and UPSERT/MERGE logic (illustrative sketch after this list).
- Convert EBCDIC and packed decimal data to UTF‑8, validating the output with robust test suites (decoding sketch after this list).
- Use AWS migration and analytics tooling (e.g., Glue, Athena, Redshift) for schema conversion and analytical workloads.
- Provision infrastructure-as-code (Terraform) and continuous integration pipelines (GitLab).
- Plan dual‑run cutovers, execute validation, and implement rollback and governance controls.
- Develop observability dashboards (CloudWatch, Grafana) for lag, throughput, error rates, and cost (metric‑publishing sketch after this list).
- Apply Domain‑Driven Design and event‑driven architecture patterns to CDC event streams.
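For illustration, a minimal sketch of the idempotent UPSERT pattern referenced above, assuming psycopg2 and hypothetical table/column names (customer, customer_id, src_commit_ts); the real apply logic would follow the chosen CDC tooling and target schema.

```python
# Minimal sketch: apply one CDC insert/update event to Aurora PostgreSQL with
# idempotent UPSERT semantics. Table and column names are hypothetical.
import psycopg2  # assumed driver; any DB-API-compliant PostgreSQL driver works

UPSERT_SQL = """
INSERT INTO customer (customer_id, name, balance, src_commit_ts)
VALUES (%(customer_id)s, %(name)s, %(balance)s, %(src_commit_ts)s)
ON CONFLICT (customer_id) DO UPDATE
SET name          = EXCLUDED.name,
    balance       = EXCLUDED.balance,
    src_commit_ts = EXCLUDED.src_commit_ts
WHERE customer.src_commit_ts < EXCLUDED.src_commit_ts;  -- skip out-of-order replays
"""

def apply_change(conn, event: dict) -> None:
    """Apply a single insert/update change event; deletes need a separate path."""
    with conn.cursor() as cur:
        cur.execute(UPSERT_SQL, event)
    conn.commit()

# Usage (connection details are placeholders):
# conn = psycopg2.connect("dbname=analytics host=aurora-endpoint user=migrator")
# apply_change(conn, {"customer_id": 42, "name": "A N Other", "balance": 10.0,
#                     "src_commit_ts": "2024-01-01T00:00:00Z"})
```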
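Similarly, a rough Python sketch of EBCDIC and packed‑decimal (COMP‑3) decoding; the code page (cp037), field layout, and scale are assumptions, since the real copybooks dictate offsets, scales, and sign handling.

```python
# Rough sketch: decode EBCDIC text and a COMP-3 (packed decimal) field taken
# from a mainframe record slice. Code page and scale are assumed values.
from decimal import Decimal

def ebcdic_to_str(raw: bytes, codepage: str = "cp037") -> str:
    """Decode an EBCDIC byte slice into a Python string (UTF-8 on write-out)."""
    return raw.decode(codepage).rstrip()

def unpack_comp3(raw: bytes, scale: int = 2) -> Decimal:
    """Unpack packed decimal: two digits per byte, low nibble of last byte is the sign."""
    digits, sign = [], ""
    for i, byte in enumerate(raw):
        high, low = byte >> 4, byte & 0x0F
        digits.append(str(high))
        if i < len(raw) - 1:
            digits.append(str(low))
        else:
            sign = "-" if low == 0x0D else ""  # 0xD = negative, 0xC/0xF = positive
    return Decimal(sign + "".join(digits)).scaleb(-scale)

# Example: unpack_comp3(b"\x12\x34\x5C") == Decimal("123.45")
```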
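Likewise, a hedged sketch of publishing a replication‑lag metric to CloudWatch for the dashboards above; the namespace, metric, and dimension names are illustrative assumptions.

```python
# Illustrative sketch: publish a CDC replication-lag sample to CloudWatch so a
# CloudWatch/Grafana dashboard can chart lag alongside throughput and errors.
from datetime import datetime, timezone

import boto3

cloudwatch = boto3.client("cloudwatch")

def publish_lag(subscription: str, lag_seconds: float) -> None:
    """Push one lag data point; namespace/metric/dimension names are assumptions."""
    cloudwatch.put_metric_data(
        Namespace="Migration/CDC",
        MetricData=[{
            "MetricName": "ReplicationLagSeconds",
            "Dimensions": [{"Name": "Subscription", "Value": subscription}],
            "Timestamp": datetime.now(timezone.utc),
            "Value": lag_seconds,
            "Unit": "Seconds",
        }],
    )
```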
**Required Skills**
- CDC knowledge (IBM, Precisely, or equivalent): subscription, bookmarks, replay, backfill.
- Db2 & z/OS fundamentals: catalog, batch windows, performance tuning.
- Relational modeling for PostgreSQL/Aurora: normalization, denormalization, indexing, partitioning.
- Integration patterns: Kafka, UPSERT/MERGE, Python/SQL troubleshooting.
- Data quality: validation tests, reconciliation against golden sources (reconciliation sketch after this list).
- Logical & physical data modeling (1NF–BCNF) and OLTP vs. analytics trade‑offs.
- Domain‑Driven Design, event‑driven architecture, CQRS, schema validation strategies.
- Optional: IBM zDIH patterns, zIIP tuning, COBOL copybook/VSAM ingestion.
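As a rough illustration of the reconciliation point above, a minimal sketch comparing row counts and a portable key aggregate between the Db2 golden source and Aurora; the connection objects, table, and column names are hypothetical, and real checks would add column‑level checksums and tolerance rules.

```python
# Rough sketch: reconcile a migrated table against its golden source by comparing
# row counts and a simple numeric aggregate on both sides. Names are hypothetical.

def fingerprint(conn, table: str, key_col: str) -> tuple:
    """Return (row_count, sum_of_key) using SQL that runs on both Db2 and PostgreSQL."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM({key_col}), 0) FROM {table}")
    row = tuple(cur.fetchone())
    cur.close()
    return row

def reconcile(db2_conn, aurora_conn, table: str = "customer",
              key_col: str = "customer_id") -> bool:
    source = fingerprint(db2_conn, table, key_col)
    target = fingerprint(aurora_conn, table, key_col)
    if source != target:
        print(f"MISMATCH {table}: source={source} target={target}")
    return source == target
```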
**Required Education & Certifications**
- Bachelor’s degree or higher in Computer Science, Information Systems, or related field.
- Relevant certifications (e.g., IBM Certified Database Administrator – Db2, AWS Certified Database – Specialty, HashiCorp Certified: Terraform Associate) are preferred.