Job Specifications
Title: AWS Engineer – Tokenization & Data Security
Location: McLean, VA (Onsite)
Type: Contract
Summary:
We are seeking a skilled AWS Engineer with strong experience in tokenization, data security, and containerized application management. The ideal candidate will have deep technical expertise in AWS cloud services, automation, scripting, and data protection frameworks. This role requires hands-on experience with EKS clusters, CloudFormation Template (CFT) updates, and secure data handling (PII/confidential data masking and tokenization).
________________________________________
Key Responsibilities
Design, build, and maintain containerized applications on AWS (EKS/ECS) using automation and custom scripts.
Implement and manage tokenization and detokenization processes to protect PII and confidential data.
Work with data security tools to ensure compliance with internal and external security standards (e.g., PCI DSS, GDPR).
Build and maintain infrastructure as code (IaC) using CloudFormation Templates (CFTs) and automation pipelines.
Develop and manage scripts (Python, Shell, Ansible, etc.) to automate application builds and deployments.
Collaborate with security and data engineering teams to implement data masking, token mapping, and encryption solutions.
Monitor, optimize, and troubleshoot EKS clusters, ensuring high performance and scalability.
Maintain documentation on infrastructure design, tokenization workflows, and data protection measures.
Participate in audits, reviews, and assessments of data security systems.
________________________________________
Required Skills & Experience
6–10 years of total IT experience with a strong focus on AWS Cloud Engineering and Data Security.
Hands-on experience with AWS services – EC2, EKS, Lambda, S3, IAM, CloudFormation, and KMS.
Proven experience in containerization and Kubernetes (EKS) management, including upgrades and patching.
Proficiency in Python scripting and automation for build/deployment processes.
Strong understanding of tokenization concepts, token mapping, and data masking techniques.
Experience with data security tools used for tokenization/detokenization and encryption key management (e.g., Protegrity, Thales CipherTrust, Voltage SecureData, or similar).
Deep knowledge of PII and confidential data protection standards.
Experience updating and maintaining CloudFormation Templates (CFTs) and other IaC frameworks.
Solid understanding of security compliance frameworks (PCI DSS, GDPR, HIPAA).
________________________________________
Nice-to-Have Skills
Exposure to ETL tools and data pipelines (Informatica, IICS).
Familiarity with DevSecOps and integrating security within CI/CD pipelines.
Knowledge of AWS KMS, encryption mechanisms, and key rotation policies.
________________________________________
Soft Skills
Strong analytical and problem-solving abilities.
Excellent communication and documentation skills.
Ability to collaborate with cross-functional teams (DevOps, Data, Security).
Self-driven with a proactive approach to automation and process improvement.
About the Company
Voto Consulting provides professional staffing, consulting, and solutions across industries including financial services, health care, logistics, retail, and energy.
Voto Consulting's niche-based approach has been very successful and has consistently met and exceeded the expectations of our clients in both the US and Canada.
Given this success, we expanded into other verticals - health care, logistics, retail, energy - using the same focused, client-centric model that has served us well in financial services.