An interactive environment for exploring DeepSeek-R1 models on Google Colab, featuring a Gradio chat UI and LangChain pipelines, with Ollama as the model runtime.
Available models:
- deepseek-r1:1.5b (default; distilled from Qwen-2.5-Math-1.5B)
- deepseek-r1:7b (distilled from Qwen-2.5-Math-7B)
- deepseek-r1:8b (distilled from Llama-3.1-8B)
- deepseek-r1:14b (distilled from Qwen-2.5-14B)
All models are compatible with Google Colab's free T4 GPU (16GB).
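The model list above can be captured in a small lookup table for use elsewhere in the notebook; a minimal sketch (the `DEEPSEEK_R1_MODELS` and `describe` names are illustrative, the tag-to-base-model mapping comes from the list above):

```python
# Map each Ollama model tag to the base model it was distilled from.
DEEPSEEK_R1_MODELS = {
    "deepseek-r1:1.5b": "Qwen-2.5-Math-1.5B",
    "deepseek-r1:7b": "Qwen-2.5-Math-7B",
    "deepseek-r1:8b": "Llama-3.1-8B",
    "deepseek-r1:14b": "Qwen-2.5-14B",
}

DEFAULT_MODEL = "deepseek-r1:1.5b"  # smallest variant, fastest on a T4

def describe(tag: str) -> str:
    """Return a human-readable description of a model tag."""
    base = DEEPSEEK_R1_MODELS.get(tag)
    if base is None:
        raise ValueError(f"Unknown model tag: {tag}")
    return f"{tag} (distilled from {base})"
```

For example, `describe("deepseek-r1:8b")` returns `"deepseek-r1:8b (distilled from Llama-3.1-8B)"`.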
Features:
- Multiple DeepSeek-R1 models (1.5B to 14B variants that run on the free T4 GPU)
- Chat interface with Gradio
- LangChain conversation pipeline
- XML-tag parsing to separate the model's reasoning from its final answer
- Automated Ollama setup
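The reasoning-separation step can be sketched with a small parser. This assumes the model wraps its chain of thought in `<think>…</think>` tags, as DeepSeek-R1 does; the `split_reasoning` name is illustrative:

```python
import re

# DeepSeek-R1 emits its chain of thought inside <think>...</think> tags.
THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def split_reasoning(text: str) -> tuple[str, str]:
    """Split a DeepSeek-R1 response into (reasoning, answer).

    Everything inside <think>...</think> is the model's reasoning;
    everything outside the tags is the user-facing answer.
    """
    reasoning = "\n".join(m.strip() for m in THINK_RE.findall(text))
    answer = THINK_RE.sub("", text).strip()
    return reasoning, answer
```

For example, `split_reasoning("<think>2+2=4</think>The answer is 4.")` returns `("2+2=4", "The answer is 4.")`, which lets the UI show the answer and the reasoning separately.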
Quick start:
- Open the notebook in Colab
- Set the runtime type to T4 GPU
- Run the cells in order
The Jupyter notebook is also available in this repo: `deepseek_r1_gradio_env.ipynb`
Requirements:
- Google Colab (T4 GPU)
- GPU VRAM: 16GB
Created by Edoardo Avenia