
🤖 DeepSeek-R1 LangChain-Ollama Chat | Gradio UI

Open In Colab

Click above to open the fully implemented notebook in Colab ☝️

Interactive environment to explore DeepSeek-R1 models on Google Colab, featuring a Gradio UI and LangChain pipelines, with Ollama as the model runtime.

Available models:

  • deepseek-r1:1.5b (default; distilled from Qwen2.5-Math-1.5B)
  • deepseek-r1:7b (distilled from Qwen2.5-Math-7B)
  • deepseek-r1:8b (distilled from Llama-3.1-8B)
  • deepseek-r1:14b (distilled from Qwen2.5-14B)

All models fit within the 16 GB of VRAM on Google Colab's free T4 GPU.

✨ Features

  • Multiple DeepSeek-R1 models (1.5B to 14B variants for free T4 GPU)
  • Chat interface with Gradio
  • LangChain conversation pipeline
  • XML-tag parsing to separate the model's <think> reasoning from its final answer
  • Automated Ollama setup
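
DeepSeek-R1 emits its chain-of-thought inside <think>…</think> tags, so the reasoning-separation step above can be sketched as a small tag parser (a minimal sketch; the notebook's actual parser may differ in detail):

```python
import re

THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def split_reasoning(raw: str) -> tuple[str, str]:
    """Split a DeepSeek-R1 completion into (reasoning, answer).

    The model wraps its chain-of-thought in <think>...</think>;
    everything outside the tags is the user-facing answer.
    """
    match = THINK_RE.search(raw)
    reasoning = match.group(1).strip() if match else ""
    answer = THINK_RE.sub("", raw).strip()
    return reasoning, answer

reasoning, answer = split_reasoning(
    "<think>2 + 2 is basic arithmetic.</think>\nThe answer is 4."
)
# reasoning -> "2 + 2 is basic arithmetic."
# answer    -> "The answer is 4."
```

Keeping the two parts separate lets the UI show the final answer by default and the reasoning on demand.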

🚀 Usage

  1. Open notebook in Colab
  2. Set T4 GPU runtime
  3. Run cells in order

You can also find the Jupyter notebook in this repo: deepseek_r1_gradio_env.ipynb
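
The pieces above come together in a chat callback of the shape Gradio's ChatInterface expects: a function of (message, history) that returns a string. A minimal sketch, where llm is a hypothetical stand-in for the notebook's LangChain/Ollama chain, not its actual implementation:

```python
import re

def llm(prompt: str) -> str:
    """Stand-in for the LangChain/Ollama call; returns a canned R1-style reply."""
    return f"<think>Considering: {prompt}</think>Here is my answer."

def chat_fn(message: str, history: list) -> str:
    """Callback with the (message, history) signature gr.ChatInterface expects."""
    raw = llm(message)
    # Strip the <think> block so only the final answer reaches the chat UI.
    return re.sub(r"<think>.*?</think>", "", raw, flags=re.DOTALL).strip()

# In the notebook this is wired up roughly as:
#   import gradio as gr
#   gr.ChatInterface(chat_fn).launch()
print(chat_fn("What is 2 + 2?", []))
# -> Here is my answer.
```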

📋 Requirements

  • Google Colab (T4 GPU)
  • GPU RAM: 16GB

🔗 Resources


Created by Edoardo Avenia
