Monks

monks.com

1 Job

6,835 Employees

About the Company

Monks combines an extraordinary range of global marketing and technology services to accelerate business possibilities and redefine how brands and businesses interact with the world. It delivers unfettered content production, scaled experiences, enterprise-grade technology and data science fueled by AI to help the world’s trailblazing companies outmaneuver and outpace their competition.

Listed Jobs

Company Name
Monks
Job Title
Data Engineer
Job Description
**Job Title:** Data Engineer

**Role Summary**

Design, develop, and maintain scalable data pipelines and architectures that deliver accurate, high‑quality data to enable rapid business insight. Focus on ingesting diverse data sources, building reusable libraries, and continuously optimizing system performance and data quality for global sales and finance teams.

**Expectations**

- 14+ years of professional data engineering experience.
- Deep expertise in Snowflake, including architecture, query tuning, and security.
- Proven experience handling large aggregate datasets (5–10 billion records, ~200 columns).
- Strong Python skills for orchestration and automation.
- Proficiency with S3/DataLake architectures and open data formats (Parquet, Iceberg).
- Hands‑on experience building data pipelines with DBT.
- Optional: experience with Dataiku and Dremio.

**Key Responsibilities**

1. Architect and implement scalable, efficient data ingestion and processing solutions from heterogeneous sources.
2. Build, maintain, and document reusable libraries and frameworks that increase team productivity.
3. Optimize and tune existing data pipelines for performance, cost, and data quality.
4. Monitor and troubleshoot data workflows to ensure operational excellence and data reliability.
5. Collaborate with data, analytics, and business teams to define data quality standards and governance practices.

**Required Skills**

- Data engineering fundamentals (ETL/ELT, data modeling, pipeline design).
- Advanced Snowflake administration and query optimization.
- Python programming for orchestration (e.g., Airflow, Prefect, or custom scripts).
- S3/DataLake management and expertise with Parquet/Iceberg formats.
- DBT pipeline creation, version control, and testing.
- Experience with large‑scale datasets and performance tuning.
- Optional: Dataiku and Dremio knowledge.

**Required Education & Certifications**

- Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or related field.
- Snowflake certifications (e.g., SnowPro Core) highly preferred.
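For candidates unfamiliar with "Python programming for orchestration via custom scripts" as named in the requirements, a minimal sketch of the pattern follows. All task names and data are hypothetical, not from the listing; real pipelines at this scale would use Airflow or Prefect and write to Snowflake or S3/Parquet rather than in-memory lists.

```python
from graphlib import TopologicalSorter

# Hypothetical extract/transform/load steps; purely illustrative.
def extract():
    return [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]

def transform(rows):
    # Derive a dollar amount from cents; a stand-in for real business logic.
    return [{**r, "amount_usd": r["amount"] / 100} for r in rows]

def load(rows):
    # A real loader would write to Snowflake or an S3 data lake.
    return len(rows)

def run_pipeline():
    # Declare dependencies: load needs transform, transform needs extract.
    dag = TopologicalSorter({"transform": {"extract"}, "load": {"transform"}})
    results = {}
    for task in dag.static_order():  # tasks emitted in dependency order
        if task == "extract":
            results["extract"] = extract()
        elif task == "transform":
            results["transform"] = transform(results["extract"])
        elif task == "load":
            results["load"] = load(results["transform"])
    return results

if __name__ == "__main__":
    print(run_pipeline()["load"])  # rows loaded
```

The dependency-graph approach here (`graphlib` from the standard library) is the same idea Airflow and Prefect formalize as DAGs of tasks.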
Austin, United States
Hybrid
Senior
24-02-2026