Run Open LLMs with the OpenAI API

This demo explores how to run local and serverless LLMs using the OpenAI API, so that you can easily swap between models.
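
To give a flavour of the idea, here is a minimal sketch (not taken from the demo notebooks) that points the same OpenAI Python client at a local Ollama server and at Together.ai; the model name and the placeholder key are assumptions you should adapt to your own setup.

```python
# Minimal sketch: one client class, two OpenAI-compatible backends.
from openai import OpenAI

# Local model served through Ollama's OpenAI-compatible endpoint.
local_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Serverless model hosted on Together.ai (key would normally come from .env).
remote_client = OpenAI(
    base_url="https://api.together.xyz/v1",
    api_key="YOUR_TOGETHER_API_KEY",
)

# The call site is identical, so swapping models means swapping the client
# and the model name (both are illustrative here).
response = local_client.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```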

Part 1: Concept Overview: Using the OpenAI API with Ollama and Together.ai
Part 2: Code Examples: Tool Calling, Streaming, LangChain Compatibility (a small streaming sketch follows this list)
Part 3: Code Examples: Powering Agents
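
As a taste of Part 2, the sketch below streams tokens from an OpenAI-compatible endpoint. It is an illustration under assumed endpoint and model names, not code copied from the notebooks.

```python
# Minimal streaming sketch against an OpenAI-compatible endpoint (Ollama here).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

stream = client.chat.completions.create(
    model="llama3.1",  # illustrative model name
    messages=[{"role": "user", "content": "Write a haiku about open models."}],
    stream=True,
)

# Print tokens as they arrive instead of waiting for the full response.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```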

You can watch a walkthrough on YouTube.

Project Setup

You will need to create a .env file with your own keys. I have provided an example at .env.example.
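
The notebooks can then read those keys with python-dotenv, along these lines (the variable names below are assumptions; use the names given in .env.example):

```python
# Minimal sketch of loading API keys from a .env file with python-dotenv.
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current working directory

# Hypothetical variable names; match them to .env.example.
together_key = os.getenv("TOGETHER_API_KEY")
openai_key = os.getenv("OPENAI_API_KEY")
```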

The demo files can all be run as Jupyter notebooks.