Data Scientist with Snowflake (Offshore)
Posted on May 4, 2026
- UP, India
- 0 - 0 USD (yearly)
- Full Time
Job Description: We are seeking an experienced Snowflake Data Scientist with deep expertise in building, deploying, and optimizing Large Language Models (LLMs) and advanced AI workloads on the Snowflake Data Cloud.
Responsibilities:
- Design, build, and deploy LLM-based solutions on Snowflake, leveraging Cortex AI functions (e.g., Snowflake LLM inference, fine‑tuning, embeddings).
- Create custom LLM pipelines using Snowpark ML and external models integrated through Snowflake.
- Build and operationalize machine learning models, including NLP, predictive modeling, generative AI, and recommendation systems.
- Develop and apply prompt engineering, RAG pipelines, and semantic search patterns using Snowflake features and vector embeddings.
- Conduct exploratory data analysis (EDA), feature engineering, and statistical modeling using Snowflake and Snowpark Python.
- Deploy, monitor, and optimize model performance directly within the Snowflake environment.
- Integrate Snowflake ML solutions with cloud platforms (GCP) and downstream applications.
- Work with business stakeholders, product teams, and analytics leaders to identify opportunities for AI-driven transformation.
- Deliver high-quality, interpretable insights and communicate complex technical results to non-technical audiences.
- Partner with data engineers, architects, and analysts to build end‑to‑end Snowflake-native AI solutions.
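To illustrate the semantic search pattern mentioned above: the sketch below is a minimal, stdlib-only Python example of cosine-similarity retrieval over embedding vectors. The vectors are toy placeholders; in a Snowflake deployment, embeddings would come from an embedding model (e.g., via Cortex) and ranking would typically run natively over a vector column, so treat this purely as a conceptual sketch.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec: list[float],
                    doc_vecs: dict[str, list[float]],
                    top_k: int = 2) -> list[tuple[str, float]]:
    """Rank documents by cosine similarity to the query embedding."""
    scored = [(doc_id, cosine_similarity(query_vec, vec))
              for doc_id, vec in doc_vecs.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_k]

# Toy embeddings; real ones would come from an embedding model.
docs = {
    "doc_a": [0.9, 0.1, 0.0],
    "doc_b": [0.1, 0.9, 0.1],
    "doc_c": [0.8, 0.2, 0.1],
}
query = [1.0, 0.0, 0.0]
print(semantic_search(query, docs))  # doc_a and doc_c rank highest
```

In a RAG pipeline, the top-ranked documents returned here would be injected into the LLM prompt as grounding context before inference.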
Qualifications:
- 2-4 years of experience in Data Science, Machine Learning, or Applied AI roles.
- Hands-on expertise with Snowflake, including Snowpark Python, UDFs, Warehouses, Tasks, Streams, and performance optimization.
- Strong experience with LLMs, including fine‑tuning, embeddings, prompt engineering, evaluation, and inference.
- Proven experience using Snowflake Cortex AI or similar cloud LLM platforms.
- Proficiency in Python, SQL, and ML frameworks (scikit‑learn, transformers, LangChain, etc.).
- Familiarity with cloud platforms such as GCP and experience integrating them with Snowflake.
- Experience with NLP techniques, vector databases, and semantic search.
- Familiarity with MLOps workflows: model deployment, monitoring, CI/CD pipelines.
- Strong understanding of data governance, security, and ML best practices.