Phidata is a toolkit for building AI-powered software. It enables you to build:

- **RAG Apps**: Connect LLMs to your knowledge base and build context-aware applications.
- **Autonomous Apps**: Give LLMs the ability to call functions and build autonomous applications.
- **Multi-Modal Apps**: Build apps that can process text, images, audio and video.
- **Workflow Specific AI**: Build AI for specific workflows like data engineering, customer support, sales, marketing, etc.

It achieves this by providing:
- Building blocks: `Conversations`, `Tools`, `Agents`, `KnowledgeBase`, `Storage`
- Tools for serving AI Apps: `FastApi`, `Django`, `Streamlit`, `PgVector`
- Infrastructure for running AI Apps: `Docker`, `AWS`

To simplify development further, phidata provides pre-built templates for common use cases that you can clone and run with 1 command.

## 🎯 Goal: Provide a paved path to production-ready AI

## ✨ Motivation
Most AI Apps are built as a house of cards because engineers have to build the Software, Application and Infrastructure layer separately and then glue them together.
This leads to brittle systems that are hard to productionize and maintain.

Phidata bridges the 3 layers of software development and provides a paved path to production-ready AI.
## 🚀 How it works
- Run your app on AWS: `phi ws up prd:aws`
## ⭐ Features:

- **Powerful:** Get a production-ready AI App with 1 command.
- **Simple:** Built using a human-like `Conversation` interface to language models.
- **Production Ready:** Your app can be deployed to AWS with 1 command.
- Chat with us on <a href="https://discord.gg/4MtYHHrgA8" target="_blank" rel="noopener noreferrer">Discord</a>
- Email us at <a href="mailto:help@phidata.com" target="_blank" rel="noopener noreferrer">help@phidata.com</a>

## 💻 Quickstart

### Create a virtual environment
Open the `Terminal` and create an `ai` directory with a Python virtual environment.

```bash
mkdir ai
cd ai
python3 -m venv aienv
source aienv/bin/activate
```

### Install

Install phidata:

```bash
pip install phidata
```

### Create a conversation

Conversations are a human-like interface to language models and the starting point for every AI App.
We send the LLM a message and get a model-generated output as a response.

Conversations come with built-in Memory, Knowledge, Storage and access to Tools.
Giving LLMs the ability to have long-term, knowledge-based Conversations is the first step in our journey to AGI.

Copy the following code to a file `conversation.py`:

```python
from phi.conversation import Conversation

conversation = Conversation()
conversation.print_response('Share a quick healthy breakfast recipe.')
```

### Run your conversation

```bash
python conversation.py
```
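Conceptually, a Conversation is a stateful wrapper: it keeps message history ("Memory") and sends it to the LLM on every turn. Here is a minimal, self-contained sketch of that pattern — a toy stand-in for illustration only, not phidata's actual implementation (`ToyConversation` and `echo_llm` are hypothetical names):

```python
# Toy sketch of the Conversation pattern: keep message history,
# pass it to an LLM callable, remember the reply, return it.
# NOT phidata's implementation -- illustration only.
from typing import Callable, Dict, List

class ToyConversation:
    def __init__(self, llm: Callable[[List[Dict[str, str]]], str]) -> None:
        self.llm = llm  # any callable: full message history -> reply text
        self.memory: List[Dict[str, str]] = []  # built-in "Memory"

    def send(self, message: str) -> str:
        # Append the user turn, ask the LLM for a reply given the whole
        # history, remember the assistant turn, and return the reply.
        self.memory.append({"role": "user", "content": message})
        reply = self.llm(self.memory)
        self.memory.append({"role": "assistant", "content": reply})
        return reply

# A stub "LLM" so the sketch runs without an API key: it just echoes.
def echo_llm(messages: List[Dict[str, str]]) -> str:
    return "reply to: " + messages[-1]["content"]

chat = ToyConversation(echo_llm)
print(chat.send("Share a quick healthy breakfast recipe."))
print(len(chat.memory))  # 2: the user message plus the assistant reply
```

Because the full history is re-sent each turn, the "LLM" can see earlier messages — that is what makes the interface conversational rather than one-shot.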

## 🤖 Example: Build a RAG LLM App

Let's build a **RAG LLM App** with GPT-4. We'll use PgVector for Knowledge Base and Storage and serve the app using Streamlit and FastApi. Read the full tutorial <a href="https://docs.phidata.com/examples/rag-llm-app" target="_blank" rel="noopener noreferrer">here</a>.

> Install <a href="https://docs.docker.com/desktop/install/mac-install/" target="_blank" rel="noopener noreferrer">docker desktop</a> to run this app locally.
### Create your codebase
Create your codebase using the `llm-app` template pre-configured with FastApi, Streamlit and PgVector. Use this codebase as a starting point for your LLM product.
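The `phi` CLI (installed with `pip install phidata`) can clone the template for you. A sketch — the exact flags (`-t` for template, `-n` for workspace name) are an assumption, so verify with `phi ws create --help`:

```shell
# Clone the llm-app template into a new workspace directory.
# Flag names are assumptions -- check `phi ws create --help`.
phi ws create -t llm-app -n llm-app

# Enter the workspace; `phi ws up` (shown for AWS earlier as
# `phi ws up prd:aws`) is assumed to start the app locally via docker.
cd llm-app
phi ws up
```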