OSI Engineering

www.OSIengineering.com

3 Jobs

179 Employees

About the Company

OSI Engineering is a leading global Technology Workforce Solutions provider. We develop, implement, and manage workforce solutions through every stage of the product lifecycle, from early application development through final production, delivering the highest‑level technology outcomes. Our capabilities include managed service programs, contingent workforce services, and recruitment process outsourcing (RPO) solutions.

With a track record spanning more than 25 years, OSI has delivered workforce solutions to industry leaders, earning a reputation as the trusted go-to resource for swiftly scaling effective teams that drive technological advancement and corporate growth. We do so with the strength of our diversity, and with great pride.

At OSI, we foster strategic partnerships with leading global technology companies, delivering workforce solutions to make the future brighter for all of us.

Listed Jobs

Company Name
OSI Engineering
Job Title
AI Data & Python Tools Engineer for a well-known consumer device company in Austin, TX
Job Description
**Job Title:** AI Data & Python Tools Engineer

**Role Summary:**
Design, develop, and deploy production‑ready AI applications and data pipelines using Python, React, and modern ML frameworks. Lead rapid prototyping, performance tuning, and scalable cloud deployment. Integrate Model Context Protocol (MCP) systems and maintain end‑to‑end technical documentation.

**Expectations:**
- Deliver production‑grade AI tools on a fast schedule.
- Collaborate with product, engineering, and data science teams to solve complex problems.
- Maintain code quality through reviews, documentation, and solid testing practices.
- Operate within a cloud‑native, containerized environment.

**Key Responsibilities:**
1. Build full‑stack AI applications with Python (backend) and React (frontend).
2. Design and orchestrate end‑to‑end data pipelines; stream, process, and transform data.
3. Train, validate, and deploy ML models; perform rigorous evaluation and monitoring.
4. Implement MCP server components and integrate with AI assistants and vector databases.
5. Provision and manage cloud resources (AWS/GCP/Azure) and big‑data platforms (Spark, Kafka).
6. Create and maintain data warehouses (Snowflake, BigQuery, Redshift) and data lakes.
7. Apply containerization (Docker, Kubernetes) and infrastructure‑as‑code for scalable deployments.
8. Contribute to API design, UX/UI for internal tools, and MLOps pipelines (Airflow).

**Required Skills:**
- Strong Python fundamentals; experience building backend services and data processing pipelines.
- Familiarity with ML/DL frameworks: TensorFlow, PyTorch, scikit‑learn.
- Knowledge of LLMs, vector databases, retrieval systems, and MCP integration.
- Cloud expertise (AWS, GCP, or Azure) and big‑data technologies (Spark, Kafka).
- Data warehouse and lake skills: Snowflake, BigQuery, Redshift.
- Containerization (Docker, Kubernetes) and infrastructure‑as‑code.
- Proficiency in API development; basic front‑end experience with React (preferred).
- Understanding of MLOps practices (Airflow) and real‑time analytics.

**Required Education & Certifications:**
- Bachelor’s degree in Computer Science, Software Engineering, Data Science, or a related field.
- A relevant cloud certification (AWS/Azure/GCP) is a plus.
Austin, United States
Hybrid
08-12-2025
Company Name
OSI Engineering
Job Title
Senior Backend Software Engineer (Data and ML Innovation)
Job Description
**Job Title:** Senior Backend Software Engineer (Data and ML Innovation)

**Role Summary:**
Lead the design, development, and scaling of backend systems that ingest, process, and serve training and inference data for large‑scale AI/ML models. Drive data augmentation strategies, streamline data pipelines across GCP and AWS, and support real‑time inference workloads.

**Expectations:**
- Deliver high‑performance, production‑ready Python services within the contract period (3 months, extension possible).
- Modernize data infrastructure, ensuring reliability and scalability across cloud platforms.
- Collaborate cross‑functionally to align data quality and availability with ML training needs.

**Key Responsibilities:**
1. Design and implement data pipelines that transform raw datasets into formats suitable for downstream training jobs on GCP and AWS.
2. Build and optimize scalable backend services and REST APIs powering data inspection and access tools.
3. Develop large‑scale inference components using internal and open‑source stacks to run fine‑tuned LLMs on massive datasets.
4. Engineer efficient data filtering and processing workflows for noisy, high‑volume datasets.
5. Maintain and enhance storage solutions (Redis, VectorDB, or equivalent) for fast, scalable data retrieval.
6. Contribute to DevOps practices: CI/CD pipeline setup (Jenkins, Maven, Docker, Gradle) and monitoring.
7. Provide architectural guidance and code reviews for team members and stakeholders.

**Required Skills:**
- 10+ years of Python programming, including concurrency, parallelism, functional patterns, and decorators.
- Proficiency in REST API design and development.
- Experience with large‑scale data stores (Redis, VectorDB, or similar).
- Strong background in algorithms, data structures, OOP, and software design principles.
- Familiarity with additional languages such as Java, Go, Rust, or Swift.
- Cloud platform expertise: GCP and AWS data services.
- Knowledge of message streaming (Kafka) and build tools (Jenkins, Maven, Docker, Gradle).
- Excellent communication, critical thinking, and problem‑solving abilities.

**Required Education & Certifications:**
- Bachelor’s degree in Computer Science or equivalent experience.
- No additional certifications required.
Cupertino, United States
Hybrid
Senior
19-12-2025
Company Name
OSI Engineering
Job Title
Software Engineer
Job Description
**Job Title:** Software Engineer

**Role Summary:**
Design, develop, and maintain production‑grade tooling that powers sensing algorithm development. Build plugins, libraries, APIs, and dashboards on top of existing Python DAG and distributed computing frameworks to enable algorithm developers to prototype, validate, and deploy efficiently.

**Expectations:**
- 8‑month contract with potential extension.
- Deliver fully tested, documented, and CI/CD‑enabled components used daily by multiple developers.
- Maintain backward compatibility and version control of shared libraries.

**Key Responsibilities:**
- Develop plugins and extensions for core Python DAG and distributed computing frameworks tailored to sensing workflows.
- Create new Python tools and libraries when existing tooling is insufficient.
- Design intuitive APIs to abstract complexity and enhance developer experience.
- Build reusable repository structures, templates, and scaffolding for rapid project setup.
- Build and deploy dashboard plugins that extend core visualization libraries for algorithm metrics and monitoring.
- Write comprehensive unit tests and support CI/CD pipelines.
- Document tools and APIs for self‑service adoption.
- Manage library versioning and maintain backward compatibility.
- Rapidly learn and extend internal data processing frameworks and APIs.

**Required Skills:**
- Strong foundation in Python library development.
- Experience designing production‑grade tooling.
- Focus on user (developer) experience and usability.
- Familiarity with distributed processing systems (e.g., Airflow, Prefect, Dagster).
- Ability to write clean, maintainable, well‑documented code with comprehensive tests.
- Experience with CI/CD workflows.
- Knowledge of C++ is a strong plus.

**Required Education & Certifications:**
- Bachelor’s degree in Computer Science, Software Engineering, or a related field, or equivalent practical experience.
Culver City, United States
Hybrid
03-02-2026