Job Specifications
Job Title: Solution Architect
Location: Remote
Employment Type: W2 Contract (no C2C)
About the Role
We are seeking a seasoned Solution Architect with a strong background in designing and implementing scalable data and cloud architectures. The ideal candidate will have hands-on experience with Databricks, modern data platforms, and enterprise-grade cloud solutions. You’ll collaborate with cross-functional teams to define technical roadmaps, architect high-performing solutions, and drive innovation across analytics and data-driven platforms.
Key Responsibilities
Design and lead implementation of end-to-end cloud-native data platforms leveraging tools such as Databricks, Delta Lake, and MLflow.
Define architecture for large-scale ETL/ELT pipelines, data lakes, and real-time/streaming data solutions.
Collaborate with data engineers, data scientists, and stakeholders to convert business goals into scalable technical solutions.
Integrate Databricks notebooks, Apache Spark, and cloud-native services (e.g., AWS Glue, Azure Data Factory) for batch and real-time data processing.
Implement data governance and security using tools such as Unity Catalog, IAM, and encryption at rest and in transit.
Define integration patterns using REST APIs, event-driven messaging (e.g., Kafka, Google Cloud Pub/Sub), and distributed systems design.
Participate in architectural reviews and performance tuning across distributed compute frameworks.
Stay current with emerging tools and technologies in data architecture, cloud infrastructure, and MLOps.
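As a minimal illustration of the event-driven integration pattern named in the responsibilities above, a publish/subscribe flow can be sketched in plain Python. This is a hypothetical in-process stand-in, not a Kafka or Pub/Sub implementation; the names (EventBus, publish, subscribe, the "orders.created" topic) are illustrative only.

```python
# Hypothetical sketch of an event-driven publish/subscribe pattern.
# A real broker such as Kafka or Google Cloud Pub/Sub would persist
# events and deliver them asynchronously across processes; here
# delivery is synchronous and in-memory for clarity.
from collections import defaultdict
from typing import Callable


class EventBus:
    """Minimal in-process stand-in for a message broker."""

    def __init__(self) -> None:
        # topic name -> list of subscriber callbacks
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Fan the event out to every handler registered on the topic.
        for handler in self._subscribers[topic]:
            handler(event)


# Usage: a downstream pipeline stage reacts to new-record events.
bus = EventBus()
received: list[dict] = []
bus.subscribe("orders.created", received.append)
bus.publish("orders.created", {"order_id": 42, "amount": 99.5})
print(received)  # → [{'order_id': 42, 'amount': 99.5}]
```

The decoupling shown here (producers never reference consumers directly) is the property that makes the pattern scale out when backed by a distributed broker.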
Required Qualifications
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
10+ years of experience in enterprise software or data architecture roles.
Strong hands-on experience with Databricks, Apache Spark, and Delta Lake.
Proficiency with multiple cloud platforms (AWS and GCP preferred), with experience in services like S3, ADLS, BigQuery, or Redshift.
Familiarity with streaming platforms such as Kafka or Kinesis.
Experience designing and deploying data lakehouses or analytics platforms.
Strong understanding of data modeling, data governance, and pipeline orchestration (Airflow, dbt, or similar).
Skilled in performance optimization, data security best practices, and cloud cost management.
Excellent communication skills for stakeholder management and cross-functional collaboration.
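The pipeline-orchestration qualification above boils down to declaring task dependencies and executing them in order, the core idea behind tools like Airflow. A minimal sketch using Python's standard-library graphlib, with illustrative stage names (not any orchestrator's actual API):

```python
# Hypothetical sketch of dependency-ordered pipeline execution.
# Each key maps to the stages that must finish before it may run,
# mirroring how an orchestrator resolves a DAG of tasks.
from graphlib import TopologicalSorter

pipeline = {
    "extract": [],
    "validate": ["extract"],
    "transform": ["validate"],
    "load": ["transform"],
    "report": ["load"],
}

# static_order() yields stages with all dependencies satisfied first.
order = list(TopologicalSorter(pipeline).static_order())
print(order)  # → ['extract', 'validate', 'transform', 'load', 'report']
```

Real orchestrators add scheduling, retries, and backfills on top of this ordering, but the DAG model is the same.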
Preferred Skills
Certifications in Databricks, AWS/Azure/GCP Solutions Architecture, or TOGAF.
Knowledge of ML/AI workflows, model versioning, and MLOps practices.
Experience with Unity Catalog, Great Expectations, or data quality frameworks.
Prior work in regulated environments (e.g., healthcare, finance, insurance, energy) is a plus.
About the Company
Comprehensive Staffing Solutions for Specialized Engineering Roles:
Software Development Engineering: We provide skilled software engineers proficient in a range of programming languages and technologies, equipped to develop top-tier, scalable, and efficient software solutions that align with your project needs.
Site Reliability Engineering (SRE): Our expert SRE staff specialize in ensuring the high availability and reliability of software systems. They excel in continuous integration and deployme...