Tata Consultancy Services

www.tcs.com

64 Jobs

627,980 Employees

About the Company

Tata Consultancy Services is an IT services, consulting, and business solutions organization that has been partnering with many of the world's largest businesses in their transformation journeys for over 56 years. Our consulting-led, cognitive-powered portfolio of business, technology, and engineering services and solutions is delivered through our unique Location Independent Agile™ delivery model, recognized as a benchmark of excellence in software development.
A part of the Tata group, India's largest multinational business group, TCS has over 601,000 of the world's best-trained consultants in 55 countries.

*Caution against fraudulent job offers*: TCS doesn't charge any fee throughout the recruitment process. Refer here: on.tcs.com/3i9X5BU

Listed Jobs

Company Name
Tata Consultancy Services
Job Title
GCP APIGEE Engineer
Job Description
**Job Title:** Senior GCP Apigee Engineer

**Role Summary:** Lead the design, development, deployment, and maintenance of Google Cloud Apigee API Management solutions. Drive CI/CD automation, API security, and integration with GCP services while ensuring compliance, performance, and scalability across enterprise APIs.

**Expectations**
* Execute the end-to-end Apigee API lifecycle from architecture to production.
* Build and maintain secure, scalable APIs that meet business and regulatory requirements.
* Mentor junior engineers and influence best practices within cross-disciplinary teams.
* Collaborate with DevOps, Cloud Architecture, and product stakeholders to deliver continuous value.

**Key Responsibilities**
* Develop and manage Apigee API proxies, policies, and configuration artifacts.
* Design and automate CI/CD pipelines (Jenkins, GitHub Actions, Harness) for API deployment.
* Implement API security (OAuth, JWT, mTLS), rate limiting, threat protection, and policy-as-code.
* Automate Apigee deployments using Infrastructure-as-Code (Terraform, Terraform Cloud).
* Integrate Apigee with GCP services (Cloud Functions, Pub/Sub, BigQuery, etc.) to build end-to-end solutions.
* Create Python scripts for automation, workflow orchestration, and API operations.
* Monitor and analyze API performance, logging, and analytics; apply optimizations (caching, quotas, routing).
* Enforce GCP organization policies and API governance standards.
* Support legacy API migration to Apigee and define reusable templates.
* Collaborate with container teams to deploy Apigee components on GKE.
* Lead knowledge sharing, training, and technical guidance within the team.

**Required Skills**
* Hands-on expertise with Google Apigee API Management.
* Proficiency in CI/CD pipelines and DevOps practices for APIs.
* Strong Python scripting for automation.
* Familiarity with IaC tools (Terraform/Terraform Cloud).
* Experience with Jenkins, GitHub, or Harness.
* Knowledge of Google Kubernetes Engine (GKE) and container orchestration.
* Deep understanding of API security standards and policy-as-code (OPA/Sentinel).
* Excellent communication, collaboration, and leadership abilities.

**Required Education & Certifications**
* Google Cloud Certified – Apigee Certified API Engineer (preferred).
* Additional Google Cloud certifications (e.g., Associate Cloud Engineer, Professional Cloud Architect) considered beneficial.
* Bachelor's degree in Computer Science, Engineering, or a related technical field.
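To illustrate the rate-limiting responsibility above: in Apigee this is configured declaratively (SpikeArrest/Quota policies), but the underlying mechanism can be sketched as a token bucket in plain Python. This is an illustrative stand-in, not Apigee's actual implementation; the class name and parameters are hypothetical.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter — the mechanism behind
    quota/spike-arrest style API policies."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A burst of 3 requests against a bucket allowing bursts of 2:
bucket = TokenBucket(rate=1.0, capacity=2)
results = [bucket.allow() for _ in range(3)]
```

The first two requests consume the burst capacity and the third is rejected until tokens refill at the configured rate.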
Leeds, United Kingdom
Hybrid
22-12-2025
Company Name
Tata Consultancy Services
Job Title
Solutions Architect
Job Description
Job Title: Solutions Architect

Role Summary: Architect and lead cloud-based data solutions on AWS, transforming and migrating on-premises systems to the AWS cloud while ensuring performance, cost efficiency, and security.

Expectations: Design scalable, secure, and cost-effective data architectures; provide end-to-end ownership from assessment through deployment and optimization; troubleshoot and optimize existing cloud data workloads.

Key Responsibilities
• Analyze and assess current on-premises data environments and identify migration opportunities to AWS.
• Design, implement, and operate data-centric solutions in AWS using Amazon Redshift, DynamoDB, AWS Glue, and related services.
• Conduct migration planning, data movement, transformation, and validation for large-scale data sets.
• Monitor, debug, and troubleshoot cloud-based data systems; recommend and implement performance, cost, and security improvements.
• Collaborate with data engineering, security, and DevOps teams to ensure alignment with best practices and organizational standards.
• Document architecture designs, processes, and operational guidelines for stakeholders and future maintenance.

Required Skills
• Proven experience as an AWS Solutions Architect with hands-on work on Amazon Redshift, DynamoDB, AWS Glue, and related data services.
• Strong understanding of AWS migration strategies and tools (e.g., AWS Database Migration Service, Snowball, DataSync) and execution of large-scale data migrations.
• Expertise in designing data pipelines, ETL/ELT processes, data lake architectures, and performance tuning on AWS.
• Ability to diagnose and resolve performance, cost, and security issues in cloud-based data systems.
• Familiarity with best practices for governance, compliance, and data protection in AWS environments.

Required Education & Certifications
• Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent practical experience).
• Active AWS certifications, preferably AWS Certified Solutions Architect – Professional; additional data-engineering certifications (e.g., AWS Certified Data Analytics – Specialty) are advantageous.
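The data-validation step mentioned under Key Responsibilities can be sketched in Python as a source/target reconciliation of row counts and per-row checksums. This is a toy illustration under assumed in-memory data; real large-scale migrations would use service-side validation (e.g., AWS DMS validation), and the function and field names here are hypothetical.

```python
import hashlib

def row_checksum(row: dict) -> str:
    """Order-independent checksum of one record, for source/target comparison."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()

def validate_migration(source_rows, target_rows):
    """Compare row counts and per-row checksums between source and target.
    Returns (count_match, mismatched_keys) — a toy stand-in for the
    validation phase of a data migration."""
    count_match = len(source_rows) == len(target_rows)
    src = {r["id"]: row_checksum(r) for r in source_rows}
    tgt = {r["id"]: row_checksum(r) for r in target_rows}
    mismatched = sorted(k for k in src if src.get(k) != tgt.get(k))
    return count_match, mismatched

# Hypothetical sample: one record drifted in case during transformation.
source = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]
target = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "grace"}]
ok, bad = validate_migration(source, target)
```

Counts match, but the checksum comparison surfaces record 2 as needing investigation.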
Toronto, Canada
On site
22-12-2025
Company Name
Tata Consultancy Services
Job Title
Generative AI Engineer
Job Description
**Job Title:** Generative AI Engineer

**Role Summary:** Design, develop, and deploy generative AI solutions that enhance business analytics and automate software development. Leverage Python, Java, and cloud platforms to build scalable, reusable AI components and integrate them into enterprise workflows.

**Expectations:**
- Build end-to-end generative AI systems using Python, Java, and cloud services (AWS, Azure, GCP).
- Apply machine learning, deep learning, and NLP techniques to real-world business problems.
- Write clean, efficient, and testable code; create reusable libraries.
- Deploy and monitor models in cloud environments.
- Collaborate with cross-functional teams to translate business requirements into AI solutions.

**Key Responsibilities:**
1. **Model Development** – Build and fine-tune large language models, text generation systems, and code-generation models using frameworks such as TensorFlow or PyTorch.
2. **Integration & Deployment** – Package models into APIs, microservices, or cloud functions; manage CI/CD pipelines and monitor production performance.
3. **APIs & Prompt Engineering** – Design effective prompts for LLMs (OpenAI, Anthropic, etc.) and integrate LLM APIs into applications.
4. **Data Handling** – Execute preprocessing, feature engineering, and data pipeline creation for training and inference datasets.
5. **Testing & Evaluation** – Establish evaluation metrics (precision, recall, BLEU, etc.), conduct A/B tests, and iterate on model improvements.
6. **Technical Collaboration** – Work with software engineers, DevOps, and product managers to align AI capabilities with business objectives.
7. **Documentation & Knowledge Sharing** – Produce clear technical documentation and best-practice guides, and hold knowledge-sharing sessions.

**Required Skills:**
- Proficiency in Python programming and in Java for backend integrations.
- Strong foundation in machine learning, deep learning, and NLP concepts.
- Hands-on experience with generative AI tools (e.g., Microsoft Copilot) and AI-assisted development.
- Familiarity with cloud technologies (AWS, Azure, or GCP) and deployment of AI models in cloud environments.
- Ability to write reusable, efficient, and testable code.
- Solid analytical and problem-solving capabilities; ability to decompose complex challenges.

**Desired Skills (additional):**
- Experience with TensorFlow or PyTorch libraries.
- Prompt engineering and LLM API usage (OpenAI, Anthropic).
- Basic understanding of model evaluation metrics.
- Knowledge of data preprocessing and feature engineering.
- Exposure to Java-based AI integrations or backend development.

**Required Education & Certifications:**
- Bachelor's degree (or higher) in Computer Science, Data Science, Artificial Intelligence, or a related field.
- Professional certifications in cloud platforms (AWS, Azure, GCP) or machine learning are preferred.
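The evaluation metrics named under Testing & Evaluation (precision, recall) can be computed directly in Python. A minimal sketch, assuming set-valued predictions against a ground-truth set; the function name and sample data are hypothetical.

```python
def precision_recall(predicted, actual):
    """Precision and recall of a set of predicted positives
    against the ground-truth positives."""
    predicted, actual = set(predicted), set(actual)
    true_pos = len(predicted & actual)
    precision = true_pos / len(predicted) if predicted else 0.0
    recall = true_pos / len(actual) if actual else 0.0
    return precision, recall

# Toy example: the model flags items 1-4; the ground truth is items 2-6.
p, r = precision_recall({1, 2, 3, 4}, {2, 3, 4, 5, 6})
```

Here 3 of the 4 flagged items are correct (precision 0.75), and 3 of the 5 true positives were found (recall 0.6) — the kind of baseline comparison used in an A/B test between model iterations.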
Toronto, Canada
On site
23-12-2025
Company Name
Tata Consultancy Services
Job Title
DevOps Developer for Azure DevOps - Salesforce
Job Description
**Job Title:** DevOps Developer – Azure DevOps & Salesforce

**Role Summary:** Design, build, and maintain CI/CD pipelines for Salesforce deployments using Azure DevOps, Git/GitHub, and scripting (PowerShell/Python). Automate metadata, XML, and Sandbox processes to ensure rapid, reliable releases.

**Expectations**
- Deliver end-to-end automated deployment workflows.
- Maintain source control integrity and branching strategies.
- Ensure high code quality through SonarQube and automated testing.

**Key Responsibilities**
- Create and maintain Azure DevOps pipelines for Salesforce releases.
- Configure Git/GitHub Actions repositories for version control and release management.
- Develop PowerShell/Python scripts to automate Metadata API calls, XML deployments, and Sandbox provisioning.
- Integrate SonarQube for continuous code quality analysis.
- Manage the Salesforce Sandbox lifecycle and coordinate with QA on environment readiness.
- Monitor pipeline performance, troubleshoot failures, and implement rollback strategies.
- Collaborate with development, QA, and operations teams to refine release processes.
- Document pipeline configurations, scripting guidelines, and best-practice procedures.

**Required Skills**
- Proven experience with Salesforce DevOps (SFDX, Metadata API, XML deployments).
- Strong Azure DevOps expertise: pipeline authoring, artifacts, and release management.
- Proficiency in PowerShell and/or Python for deployment automation.
- Experience with Git, GitHub Actions, and source-code version control.
- Familiarity with SonarQube and code-quality metrics.
- Knowledge of Salesforce Sandbox administration and the release lifecycle.
- Ability to troubleshoot complex deployment issues and optimize build processes.
- Excellent communication and collaboration skills.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Salesforce certifications (Administrator, Platform Developer I/II) or equivalent.
- Azure DevOps Engineer Expert or related Azure DevOps certification preferred.
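As an example of the XML/metadata automation this role describes: a Salesforce deployment is driven by a `package.xml` manifest, which a pipeline script can generate from a list of changed components. A minimal sketch in Python's standard library; the function name and the component mapping are hypothetical, while the manifest structure and namespace follow the standard Salesforce Metadata API format.

```python
import xml.etree.ElementTree as ET

NS = "http://soap.sforce.com/2006/04/metadata"

def build_package_xml(components: dict, api_version: str = "58.0") -> str:
    """Generate a Salesforce package.xml manifest from a mapping of
    metadata type -> member names, e.g. {"ApexClass": ["InvoiceService"]}."""
    ET.register_namespace("", NS)  # emit the Salesforce default namespace
    pkg = ET.Element(f"{{{NS}}}Package")
    for mtype, members in sorted(components.items()):
        types_el = ET.SubElement(pkg, f"{{{NS}}}types")
        for member in members:
            ET.SubElement(types_el, f"{{{NS}}}members").text = member
        ET.SubElement(types_el, f"{{{NS}}}name").text = mtype
    ET.SubElement(pkg, f"{{{NS}}}version").text = api_version
    return ET.tostring(pkg, encoding="unicode")

manifest = build_package_xml({"ApexClass": ["InvoiceService"]})
```

A pipeline stage would write this manifest to the deployment artifact and hand it to the Salesforce CLI (SFDX) for the actual deploy.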
Toronto, Canada
On site
29-12-2025