Stefanini North America and APAC

www.stefanini.com

20 Jobs

1,697 Employees

About the Company

Stefanini Group is a multinational company with a broad portfolio of technological solutions. Our global presence spans 41 countries and 44 languages. Motivated by a shared entrepreneurial spirit, we help our clients transform their business through digital innovation, working in an agile manner and co-creating for a better future.
We believe technology can revolutionize a company and that innovation is essential to foster development and competitiveness. We also value new ideas and the power of an open mind, recognizing that every talent is essential to the quality of our projects and especially to our progress. We are big enough to invest in new technology, yet small enough to stay flexible and adapt to our clients’ needs.


We offer a broad portfolio of solutions, combining innovative consulting, marketing, mobility, personalized campaigns, and artificial intelligence services with traditional solutions such as service desk, field service, and outsourcing (BPO). We maintain our excellence by investing in technological innovations, the best partnerships, acquisitions of companies worldwide, and the hiring of highly trained professionals.


Listed Jobs

Company Name
Stefanini North America and APAC
Job Title
Solutions Architect
Job Description
**Job Title:** Solutions Architect

**Role Summary:** Design and lead data‑centric architecture across cloud environments, utilizing Databricks, Starburst, Immuta, Collibra, and AWS. Deliver reusable patterns, enforce governance, and collaborate with cross‑functional teams to translate business needs into scalable data & analytics solutions.

**Expectations:**
- Provide technical leadership to data engineering teams and stakeholders.
- Own end‑to‑end solution design from conception to production.
- Drive continuous improvement of data pipelines, quality, and performance.
- Champion change management and architectural governance across the organization.

**Key Responsibilities:**
- Architect data mesh, data lake, ingestion, validation, and orchestration solutions in AWS.
- Create, document, and review reusable data management and integration patterns.
- Enforce data quality standards, governance policies, and security controls.
- Optimize pipelines, queries, and infrastructure for performance.
- Identify, assess, and mitigate engineering risks; ensure compliance.
- Lead technical and business discussions on future architecture direction.
- Develop business cases and domain‑wide policies; facilitate architecture governance.
- Conduct research on emerging technologies and evaluate their applicability.
- Mentor and build high‑performing cross‑functional teams.

**Required Skills:**
- 7+ years of full‑cycle project experience (design → production).
- Advanced expertise with Databricks, Immuta, Starburst, Collibra, and AWS services.
- Deep knowledge of data engineering & analytics pipelines, data mesh principles, and lakehouse architecture.
- Strong problem‑solving, results‑oriented communication, and systems‑thinking abilities.
- Proven leadership, initiative, flexibility, and thoroughness.
- Business acumen: understanding of enterprise processes and the ability to articulate business value.
- Experience with emerging‑technology evaluation and research.

**Required Education & Certifications:**
- Bachelor’s degree in Computer Science, Information Systems, Computer Engineering, or a related field.
- Professional certifications in cloud (AWS), data platforms (Databricks, Snowflake, etc.), or data governance (Collibra, Immuta) are highly desirable.
San Francisco, United States
Hybrid
Senior
12-11-2025
Company Name
Stefanini North America and APAC
Job Title
3DX Developer
Job Description
**Job Title:** 3DX Developer

**Role Summary:** Design, develop, and maintain software solutions on the Dassault Systèmes 3DEXPERIENCE platform, primarily ENOVIA, to support automotive PLM, BOM, configuration, and change management. Lead the migration of legacy Java BOM functionality, create integrations with CATIA and AI/ML tools, and deliver robust APIs and customizations in an Agile environment.

**Expectations:**
- Deliver high‑quality ENOVIA/3DSpace customizations, scripts, and APIs that meet automotive PLM requirements.
- Translate business and engineering needs into functional 3DX solutions and detailed technical designs.
- Participate fully in Agile ceremonies, provide timely code reviews, and support continuous integration/continuous deployment.
- Ensure code quality through unit and integration testing, debugging, and production support.
- Collaborate with cross‑functional teams (design, engineering, manufacturing, supply chain) and external partners (AI/ML, SAP) for seamless data flow and analytics.

**Key Responsibilities:**
- Develop and extend ENOVIA modules for BOM, variant, configuration, and change management.
- Migrate existing Java BOM logic to 3DX, performing data transformation and API design.
- Write EKL, MQL, and Java server‑side code, JavaScript client‑side widgets, and REST/SOAP services.
- Integrate ENOVIA with CATIA V5/V6, AI/ML platforms (Python, Qlik Sense, Alteryx, LLMs), and external systems (SAP, DELMIA, SIMULIA).
- Create unit and integration test suites, troubleshoot production issues, and provide code fixes.
- Manage GitHub repositories, support Tekton/GCP Cloud Build CI/CD pipelines, and participate in backlog grooming and stand‑ups.
- Maintain and enhance system monitoring (Splunk, Dynatrace).

**Required Skills:**
- Expert in EKL, MQL, and Java for 3DSpace/ENOVIA server development.
- Strong JavaScript (3DX widgets) and web service/API development (REST/SOAP).
- Proficient in SQL, database modeling, and complex query writing for PLM data.
- Experience with CATIA integration and automotive PLM concepts.
- Version control with GitHub; Agile methodologies (Scrum/Kanban).
- Knowledge of GCP, Tekton, and CI/CD pipelines for 3DX deployments.
- Python scripting for data processing or AI/ML integration (preferred).

**Required Education & Certifications:**
- Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or a related technical field; an automotive/engineering degree is acceptable with strong programming skills.
- Dassault Systèmes 3DEXPERIENCE certifications (ENOVIA, 3DSpace Developer) strongly preferred.
Dearborn, United States
On site
14-11-2025
Company Name
Stefanini North America and APAC
Job Title
Infrastructure Engineer (OpenShift Virtualization)
Job Description
Job Title: Infrastructure Engineer (OpenShift Virtualization)

Role Summary: Design, implement, and maintain the OpenShift Virtualization platform, driving capacity planning, automation, SRE practices, and comprehensive observability to support developer productivity and platform stability.

Expectations: 6+ years of experience in infrastructure engineering; a strong background in OpenShift/Kubernetes, automation, cloud architecture, and root‑cause analysis; and a demonstrated ability to work across global environments and collaborate with application teams.

Key Responsibilities:
* Conduct capacity planning and forecasting for compute, memory, storage, and network resources; develop capacity models and reports for strategic scaling.
* Analyze resource utilization trends and recommend scaling, consolidation, or optimization strategies.
* Develop and maintain automation solutions (scripts, playbooks, CI/CD pipelines) for OSV tasks, including configuration changes, VM management, auditing, and ticketing integration.
* Apply Site Reliability Engineering principles to improve platform stability, performance, and operational efficiency (RBAC, namespaces/quotas).
* Implement end‑to‑end observability (monitoring, logging, tracing) with Dynatrace and Prometheus/Grafana, and explore event‑driven architecture for real‑time insights.
* Perform deep‑dive root‑cause analysis to rapidly identify and resolve platform incidents across global compute environments.
* Monitor VM health, resource usage, and performance; detect anomalous activity that may indicate security or configuration issues.
* Provide solution design and knowledge‑management consulting to application teams.

Required Skills:
* OpenShift Virtualization, Kubernetes, Docker, and cluster fundamentals
* Scripting and automation: Python, PowerShell, Bash, Ansible, GitHub Actions/Tekton
* Cloud platforms: Google Cloud Platform; experience with VMware/VMware ESXi
* Monitoring & observability: Dynatrace, Prometheus, Grafana, logging & tracing tools
* Site Reliability Engineering practices (alerting, incident response, SLIs/SLOs)
* Root‑cause analysis, performance tuning, capacity planning, and resource optimization
* Security fundamentals: RBAC, access control, monitoring for suspicious activity

Required Education & Certifications:
* Associate or Bachelor’s degree in Computer Science, IT, or a related field (Bachelor’s preferred)
* Relevant certifications (CKA/CKS, Red Hat Certified Engineer, GCP Associate Cloud Engineer, Ansible Automation Platform) are a plus.
Dearborn, United States
On site
Mid-level
18-11-2025
Company Name
Stefanini North America and APAC
Job Title
Data Analytics Software Engineer
Job Description
Job Title: Data Analytics Software Engineer

Role Summary: Design, develop, and maintain data‑transformation solutions for large‑scale analytics pipelines. Lead the architecture of data workflows using Python/Java, Spring Boot, and Big Data tools within an Agile environment, ensuring high‑quality, timely delivery of MVPs and scalable production services.

Expectations:
• 6+ years in data analytics or software engineering with a focus on data transformation.
• 4+ years of experience programming in Python and Spring Boot.
• Proven track record of managing complex data workflows and delivering production‑ready solutions.

Key Responsibilities:
• Build and optimize data‑transformation pipelines for ingestion, processing, and delivery of analytical data.
• Collaborate with cross‑functional teams to define requirements, prioritize tasks, and deliver incremental MVPs.
• Apply agile practices to manage competing priorities and stakeholder expectations.
• Mentor and educate business users on iterative development and data product value.
• Ensure code quality, performance, and scalability of analytics services.

Required Skills:
• Python, Java, Spring Boot (backend development).
• Big Data technologies: GCP Cloud Run, BigQuery, SQL.
• Agile/Scrum methodology, continuous integration (e.g., Tekton).
• Experience designing and troubleshooting large data pipelines.
• Strong communication and stakeholder management.

Required Education & Certifications:
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• Preferred: Master’s degree or a relevant certification program (e.g., data engineering or cloud computing).
Dearborn, United States
On site
Mid-level
20-11-2025