- Company Name: Braze
- Job Title: Senior Forward-Deployed Data Scientist, AI Deployment
- Job Description:
**Role Summary**
Partner with customer analytics and BI teams to design, implement, and optimize AI solutions that drive business outcomes. Extend product capabilities through reusable data pipelines, APIs, and reinforcement learning modules, while providing expert guidance to ensure successful adoption and measurable impact.
**Expectations**
- 3–5+ years as a Data Scientist or Machine Learning Engineer in large‑scale, production environments, preferably in a customer‑facing or consulting role.
- Bachelor’s degree in Computer Science, Data Science, Mathematics, Engineering, or related field; Master’s or PhD preferred.
- Proven ability to work independently with accountability and adapt to changing priorities.
**Key Responsibilities**
- Collaborate with customer analytics/BI teams to define use cases, integrate data, set up pipelines, and configure ML models.
- Develop and maintain reusable data pipelines, APIs, and components that extend product functionality.
- Work closely with reinforcement learning pipeline engineers to refine and advance self‑learning algorithms.
- Provide technical leadership to shape product strategy through customer insights and technical expertise.
- Deliver ongoing support to ensure successful adoption, measurable outcomes, and long-term customer success.
**Required Skills**
- Proficiency in Python (Pandas) and core ML libraries (TensorFlow, Keras, scikit‑learn, CatBoost, XGBoost).
- Strong SQL skills for querying and manipulating large datasets.
- Experience building ML pipelines and deploying models to production.
- Solid engineering practices: modular, well‑documented code, Git, CI/CD, testing, type‑hinting, code reviews.
- Excellent communication: translate technical concepts to both technical and non‑technical stakeholders.
- Ability to troubleshoot, identify risks, and propose creative, scalable solutions.
**Nice-to-Have**
- DevOps tools (Airflow, Kubernetes, Terraform, GCP).
- Data integration/ETL and pipeline optimization expertise.
- Knowledge of reinforcement learning algorithms.
**Required Education & Certifications**
- Bachelor’s degree in Computer Science, Data Science, Mathematics, Engineering, or related discipline.
- Master’s or PhD in a relevant technical field (preferred).
---