
Commit a350f8f
Committed Dec 14, 2023

README

1 parent ee4292d · commit a350f8f

2 files changed: +53 -16 lines

2 files changed

+53
-16
lines changed
 

‎LICENSE

+1 -1
@@ -1,4 +1,4 @@
-Copyright (c) 2022 Phidata, Inc.
+Copyright (c) Phidata, Inc.
 
 Mozilla Public License Version 2.0
 ==================================

‎README.md

+52 -15
@@ -2,7 +2,7 @@
 phidata
 </h1>
 <h3 align="center">
-AI Toolkit for Engineers
+Full Stack AI Toolkit
 </h3>
 <p align="center">
 ⭐️ for when you need to spin up an AI project quickly.
@@ -22,18 +22,27 @@
 </a>
 </p>
 
-Phidata is a toolkit for building AI powered software. It provides:
-- Pre-built templates for AI Apps.
-- Building blocks: `Conversations`, `Agents`, `KnowledgeBase`, `Storage`
-- Tools: `FastApi`, `Django`, `Streamlit`, `PgVector`
-- Infrastructure: `Docker`, `AWS`
+Phidata is a toolkit for building AI powered software. It enables you to build:
+- **RAG Apps**: Connect LLMs to your knowledge base and build context-aware applications.
+- **Autonomous Apps**: Give LLMs the ability to call functions and build autonomous applications.
+- **Multi-Modal Apps**: Build apps that can process text, images, audio and video.
+- **Workflow Specific AI**: Build AI for specific workflows like data engineering, customer support, sales, marketing, etc.
 
-## 🎯 Motivation
+It achieves this by providing:
+- Building blocks: `Conversations`, `Tools`, `Agents`, `KnowledgeBase`, `Storage`
+- Tools for serving AI Apps: `FastApi`, `Django`, `Streamlit`, `PgVector`
+- Infrastructure for running AI Apps: `Docker`, `AWS`
+
+To simplify development further, phidata provides pre-built templates for common use-cases that you can clone and run with 1 command.
+
+## 🎯 Goal: Provide a paved path to production-ready AI
+
+## ✨ Motivation
 
 Most AI Apps are built as a house of cards because engineers have to build the Software, Application and Infrastructure layer separately and then glue them together.
-This leads to technical debt and makes it hard to build production-grade AI Apps.
+This leads to brittle systems that are hard to productionize and maintain.
 
-Phidata bridges the 3 layers of software development and provides production-ready templates for AI Apps that you can run with 1 command.
+Phidata bridges the 3 layers of software development and provides a paved path to production-ready AI.
 
 ## 🚀 How it works
 
@@ -42,6 +51,7 @@ Phidata bridges the 3 layers of software development and provides production-rea
 - Run your app on AWS: `phi ws up prd:aws`
 
 ## ⭐ Features:
+
 - **Powerful:** Get a production-ready AI App with 1 command.
 - **Simple**: Built using a human-like `Conversation` interface to language models.
 - **Production Ready:** Your app can be deployed to aws with 1 command.
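An aside for readers of this diff: the Features list above leans on the `Conversation` interface, and the new intro lists the building blocks it composes with (`Tools`, `Agents`, `KnowledgeBase`, `Storage`). Below is a minimal, hedged Python sketch of that composition. Only `Conversation()` and `print_response` are confirmed by the quickstart added later in this diff; the commented-out imports and keyword arguments are assumptions about the phidata API, shown for illustration only.

```python
from phi.conversation import Conversation  # confirmed by the quickstart in this diff

# The building blocks named in the README roughly map onto Conversation arguments.
# The commented imports and keyword arguments below are assumptions made for
# illustration; they are not taken from this diff. Check the phidata docs for
# the actual module paths.
# from phi.knowledge.base import KnowledgeBase                          # hypothetical
# from phi.storage.conversation.postgres import PgConversationStorage   # hypothetical

conversation = Conversation(
    # knowledge_base=KnowledgeBase(...),   # ground answers in your own documents (RAG)
    # storage=PgConversationStorage(...),  # persist chat history between runs
)

# Send the LLM a message and print the model-generated response.
conversation.print_response("What can I build with phidata?")
```

As written, the sketch only exercises the confirmed API; the commented lines mark where knowledge and storage would plug in once the real module paths are filled in from the docs.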
@@ -52,13 +62,9 @@ Phidata bridges the 3 layers of software development and provides production-rea
 - Chat with us on <a href="https://discord.gg/4MtYHHrgA8" target="_blank" rel="noopener noreferrer">Discord</a>
 - Email us at <a href="mailto:help@phidata.com" target="_blank" rel="noopener noreferrer">help@phidata.com</a>
 
-## 💻 Example: Build a RAG LLM App
+## 💻 Quickstart
 
-Let's build a **RAG LLM App** with GPT-4. We'll use PgVector for Knowledge Base and Storage and serve the app using Streamlit and FastApi. Read the full tutorial <a href="https://docs.phidata.com/examples/rag-llm-app" target="_blank" rel="noopener noreferrer">here</a>.
-
-> Install <a href="https://docs.docker.com/desktop/install/mac-install/" target="_blank" rel="noopener noreferrer">docker desktop</a> to run this app locally.
-
-### Installation
+### Create a virtual environment
 
 Open the `Terminal` and create an `ai` directory with a python virtual environment.
 
@@ -69,12 +75,43 @@ python3 -m venv aienv
 source aienv/bin/activate
 ```
 
+### Install
+
 Install phidata
 
 ```bash
 pip install phidata
 ```
 
+### Create a conversation
+
+Conversations are a human-like interface to language models and the starting point for every AI App.
+We send the LLM a message and get a model-generated output as a response.
+
+Conversations come with built-in Memory, Knowledge, Storage and access to Tools.
+Giving LLMs the ability to have long-term, knowledge-based Conversations is the first step in our journey to AGI.
+
+Copy the following code to a file conversation.py
+
+```python
+from phi.conversation import Conversation
+
+conversation = Conversation()
+conversation.print_response('Share a quick healthy breakfast recipe.')
+```
+
+### Run your conversation
+
+```bash
+python conversation.py
+```
+
+## 🤖 Example: Build a RAG LLM App
+
+Let's build a **RAG LLM App** with GPT-4. We'll use PgVector for Knowledge Base and Storage and serve the app using Streamlit and FastApi. Read the full tutorial <a href="https://docs.phidata.com/examples/rag-llm-app" target="_blank" rel="noopener noreferrer">here</a>.
+
+> Install <a href="https://docs.docker.com/desktop/install/mac-install/" target="_blank" rel="noopener noreferrer">docker desktop</a> to run this app locally.
+
 ### Create your codebase
 
 Create your codebase using the `llm-app` template pre-configured with FastApi, Streamlit and PgVector. Use this codebase as a starting point for your LLM product.
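Since the diff cuts off at the start of the RAG walkthrough, here is a hedged sketch of the retrieval flow that section describes: documents are embedded into PgVector and handed to a `Conversation` as its knowledge base, so relevant chunks are pulled into the prompt before the LLM answers. Apart from `Conversation` and `print_response`, every import, class and parameter below is an assumption for illustration and may differ from the `llm-app` template; the tutorial linked above at docs.phidata.com is the authoritative version.

```python
from phi.conversation import Conversation  # confirmed by the quickstart above

# Hypothetical imports, for illustration only; the module paths are assumptions
# and are not taken from this diff.
# from phi.knowledge.pdf import PDFUrlKnowledgeBase  # hypothetical
# from phi.vectordb.pgvector import PgVector         # hypothetical

# 1. Build a knowledge base: documents are chunked, embedded and stored in PgVector.
# knowledge_base = PDFUrlKnowledgeBase(
#     urls=["https://example.com/some-document.pdf"],  # hypothetical document URL
#     vector_db=PgVector(collection="documents", db_url="postgresql://ai:ai@localhost:5432/ai"),
# )
# knowledge_base.load()

# 2. Hand the knowledge base to the Conversation so relevant chunks are
#    retrieved and added to the prompt before the LLM answers (RAG).
conversation = Conversation(
    # knowledge_base=knowledge_base,
    # add_references_to_prompt=True,  # hypothetical flag: inject retrieved references
)
conversation.print_response("Answer using only the documents in the knowledge base.")
```

Per the README text above, the `llm-app` template wires these same pieces into FastApi and Streamlit apps backed by PgVector, so the sketch is only a conceptual outline of what that codebase sets up for you.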
