Spice AI

spice.ai

1 Job

19 Employees

About the Company

Spice AI is an open-source data and AI platform that helps development teams build more responsive and intelligent applications.

Spice combines SQL query federation & acceleration, hybrid search & retrieval, and LLM inference in a high-performance, lightweight runtime—so you can query data in place, across operational and analytical data sources, without ETL or complex integrations. Deploy anywhere—edge, cloud, or on-premise—and ship faster applications with less infrastructure management and greater security.

Listed Jobs

Company Name
Spice AI
Job Title
Forward Deployed Engineer
Job Description
Role Summary: Embed with enterprise customers to deploy, optimize, and extend Spice.ai solutions, accelerating adoption of a unified query federation, search, and AI runtime. Lead end-to-end architecture, production rollout, and feature feedback loops while maintaining high performance, security, and scalability.

Expectations:
- 5+ years of engineering experience in data, search, and AI production systems.
- Logically tackle ambiguous, complex customer problems.
- Deliver measurable impact quickly and sustainably.
- Collaborate with product, engineering, and leadership to shape feature direction.
- Travel 20%+ as required.

Key Responsibilities:
- Deploy Spice.ai OSS/Enterprise or the Cloud Platform for customers.
- Configure, tune, and scale proof-of-concept and production deployments.
- Resolve technical blockers, debug issues, and expand Spice.ai capabilities on-site or remotely.
- Design full-stack architectures integrating Spice.ai for data querying, search, and AI inference.
- Capture customer feedback, influence core product improvements, and contribute to OSS updates.
- Present technical solutions to technical and executive stakeholders.
- Drive adoption of new features and identify expansion opportunities.

Required Skills:
- Deep understanding of distributed databases, data warehouses, data lakes, and search engines.
- Proficiency with ecosystems such as Databricks, Snowflake, Starburst, Dremio, Elasticsearch, or equivalent.
- Strong SQL, ETL/ELT, data integration, and performance tuning experience.
- Knowledge of HDFS, Amazon S3, relational databases, and cloud-native stacks (Apache, CNCF).
- Experience designing secure, scalable data pipelines and managing data security, privacy, and authentication.
- Excellent communication, presentation, and customer-relationship skills.
- Proven track record of architectural decisions in customer-facing contexts; open-source contributions preferred.

Required Education & Certifications:
- Bachelor's or higher degree in Computer Science, Engineering, or a related field.
- Relevant certifications in cloud platforms (e.g., AWS, GCP, Azure) or database technologies are a plus.
Seattle, United States
Remote
02-02-2026