Commit 6eb9b03: "Turning into public repo" (0 parents)

104 files changed, +19689 / -0 lines changed

.eslintrc.json (+3)

```json
{
  "extends": "next/core-web-vitals"
}
```

.example.env (+4)

```
TOGETHER_API_KEY=

# Optional, if you want observability
HELICONE_API_KEY=
```
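As a hedged illustration of how these two variables are consumed, the sketch below mirrors the option-building logic in `app/(main)/actions.ts`: requests are routed through Helicone only when `HELICONE_API_KEY` is set, otherwise the Together client keeps its default base URL (`buildTogetherOptions` is a hypothetical name, not code from the repo):

```typescript
// Standalone sketch of the env-gated Helicone setup in actions.ts.
type TogetherClientOptions = {
  baseURL?: string;
  defaultHeaders?: Record<string, string>;
};

type Env = Record<string, string | undefined>;

function buildTogetherOptions(env: Env, chatId: string): TogetherClientOptions {
  const options: TogetherClientOptions = {};
  if (env.HELICONE_API_KEY) {
    // Route inference through Helicone's Together-compatible proxy
    // and tag the request with app + session metadata.
    options.baseURL = "https://together.helicone.ai/v1";
    options.defaultHeaders = {
      "Helicone-Auth": `Bearer ${env.HELICONE_API_KEY}`,
      "Helicone-Property-appname": "LlamaCoder",
      "Helicone-Session-Id": chatId,
      "Helicone-Session-Name": "LlamaCoder Chat",
    };
  }
  return options;
}
```

With no key set, the returned object is empty, so observability stays strictly opt-in.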

.gitignore (+37)

```
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# dependencies
/node_modules
/.pnp
.pnp.js
.yarn/install-state.gz

# testing
/coverage

# next.js
/.next/
/out/

# production
/build

# misc
.DS_Store
*.pem

# debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# local env files
.env*.local
.env

# vercel
.vercel

# typescript
*.tsbuildinfo
next-env.d.ts
```

.npmrc (+1)

```
force=true
```

.prettierrc (+3)

```json
{
  "plugins": ["prettier-plugin-tailwindcss"]
}
```

CONTRIBUTING.md (+25)

```markdown
# Contributing Guide

Thank you for your interest in contributing to this project! We accept contributions via bug reports, feature requests, and pull requests. We also have a roadmap outlined below.

For simple fixes or small items on the roadmap below, feel free to submit a pull request. For anything more complex, please open an issue first to discuss the changes you want to make.

## Running the repo

To run the repo locally, run `npm install` to install dependencies, then `npm run dev` to start the app.

## Roadmap

- [ ] Add self-correction to the app so it can fix its own errors
- [ ] Compress the prompt: use a small model like Llama 3.1 70B to retain what happened in the past; good memory management is key
- [ ] Add evals with Braintrust to measure how good the system is over time and when making changes
- [ ] Add more good examples to the shadcn-examples.ts file (single components that span a whole app and use shadcn)
- [ ] Add dynamic OG images to the specific generations & include the prompt + a screenshot in the image
- [ ] Show a "featured apps" section on /gallery (or have some at the bottom of the homepage as templates). Have a /id/${prompt} dynamic route that can display a bunch of nice example apps in the sandbox, ready to go
- [ ] Try fine-tuning a smaller model on good prompts from deepseek-v2 or o1/Claude
- [ ] Add dark mode to the site overall; a nice design change
- [ ] Support more languages, starting with Python (like Streamlit), and see if I can run them on the CSB SDK

## License

By contributing, you agree that your contributions will be licensed under the project's license.
```

LICENSE (+21)

```
MIT License

Copyright (c) 2024 Hassan El Mghari

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```

README.md (+27)

```markdown
<a href="https://www.llamacoder.io">
  <img alt="Llama Coder" src="./public/og-image.png">
  <h1 align="center">Llama Coder</h1>
</a>

<p align="center">
  An open-source Claude Artifacts – generate small apps with one prompt. Powered by Llama 3 on Together.ai.
</p>

## Tech stack

- [Llama 3.1 405B](https://ai.meta.com/blog/meta-llama-3-1/) from Meta for the LLM
- [Together AI](https://togetherai.link/?utm_source=example-app&utm_medium=llamacoder&utm_campaign=llamacoder-app-signup) for LLM inference
- [Sandpack](https://sandpack.codesandbox.io/) for the code sandbox
- Next.js app router with Tailwind
- Helicone for observability
- Plausible for website analytics

## Cloning & running

1. Clone the repo: `git clone https://github.com/Nutlope/llamacoder`
2. Create a `.env` file and add your [Together AI API key](https://togetherai.link/?utm_source=example-app&utm_medium=llamacoder&utm_campaign=llamacoder-app-signup): `TOGETHER_API_KEY=`
3. Run `npm install` and `npm run dev` to install dependencies and run locally

## Contributing

To contribute to the repo, please see the [contributing guide](./CONTRIBUTING.md).
```
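The three setup steps in the README can be sketched as one shell session (a hedged sketch, not a script from the repo; the `cp .example.env .env` step is an assumption based on the `.example.env` file shipped in this commit):

```shell
# Sketch of the README setup steps; fill in your own Together AI key.
git clone https://github.com/Nutlope/llamacoder
cd llamacoder
cp .example.env .env   # then set TOGETHER_API_KEY= (HELICONE_API_KEY= is optional)
npm install
npm run dev
```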

app/(main)/actions.ts (+206)

```typescript
"use server";

import { getPrisma } from "@/lib/prisma";
import {
  getMainCodingPrompt,
  screenshotToCodePrompt,
  softwareArchitectPrompt,
} from "@/lib/prompts";
import { notFound } from "next/navigation";
import Together from "together-ai";

export async function createChat(
  prompt: string,
  model: string,
  quality: "high" | "low",
  screenshotUrl: string | undefined,
) {
  const prisma = getPrisma();
  const chat = await prisma.chat.create({
    data: {
      model,
      quality,
      prompt,
      title: "",
      shadcn: true,
    },
  });

  let options: ConstructorParameters<typeof Together>[0] = {};
  if (process.env.HELICONE_API_KEY) {
    options.baseURL = "https://together.helicone.ai/v1";
    options.defaultHeaders = {
      "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
      "Helicone-Property-appname": "LlamaCoder",
      "Helicone-Session-Id": chat.id,
      "Helicone-Session-Name": "LlamaCoder Chat",
    };
  }

  const together = new Together(options);

  async function fetchTitle() {
    const responseForChatTitle = await together.chat.completions.create({
      model: "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
      messages: [
        {
          role: "system",
          content:
            "You are a chatbot helping the user create a simple app or script, and your current job is to create a succinct title, maximum 3-5 words, for the chat given their initial prompt. Please return only the title.",
        },
        {
          role: "user",
          content: prompt,
        },
      ],
    });
    const title = responseForChatTitle.choices[0].message?.content || prompt;
    return title;
  }

  async function fetchTopExample() {
    const findSimilarExamples = await together.chat.completions.create({
      model: "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
      messages: [
        {
          role: "system",
          content: `You are a helpful bot. Given a request for building an app, you match it to the most similar example provided. If the request is NOT similar to any of the provided examples, return "none". Here is the list of examples, ONLY reply with one of them OR "none":

          - landing page
          - blog app
          - quiz app
          - pomodoro timer
          `,
        },
        {
          role: "user",
          content: prompt,
        },
      ],
    });

    const mostSimilarExample =
      findSimilarExamples.choices[0].message?.content || "none";
    return mostSimilarExample;
  }

  const [title, mostSimilarExample] = await Promise.all([
    fetchTitle(),
    fetchTopExample(),
  ]);

  let fullScreenshotDescription;
  if (screenshotUrl) {
    const screenshotResponse = await together.chat.completions.create({
      model: "meta-llama/Llama-3.2-90B-Vision-Instruct-Turbo",
      temperature: 0.2,
      max_tokens: 1000,
      messages: [
        {
          role: "user",
          content: [
            { type: "text", text: screenshotToCodePrompt },
            {
              type: "image_url",
              image_url: {
                url: screenshotUrl,
              },
            },
          ],
        },
      ],
    });

    fullScreenshotDescription = screenshotResponse.choices[0].message?.content;
  }

  let userMessage: string;
  if (quality === "high") {
    let initialRes = await together.chat.completions.create({
      model: "Qwen/Qwen2.5-Coder-32B-Instruct",
      messages: [
        {
          role: "system",
          content: softwareArchitectPrompt,
        },
        {
          role: "user",
          content: fullScreenshotDescription
            ? fullScreenshotDescription + prompt
            : prompt,
        },
      ],
      temperature: 0.2,
      max_tokens: 3000,
    });

    userMessage = initialRes.choices[0].message?.content ?? prompt;
  } else if (fullScreenshotDescription) {
    userMessage =
      prompt +
      "RECREATE THIS APP AS CLOSELY AS POSSIBLE: " +
      fullScreenshotDescription;
  } else {
    userMessage = prompt;
  }

  let newChat = await prisma.chat.update({
    where: {
      id: chat.id,
    },
    data: {
      title,
      messages: {
        createMany: {
          data: [
            {
              role: "system",
              content: getMainCodingPrompt(mostSimilarExample),
              position: 0,
            },
            { role: "user", content: userMessage, position: 1 },
          ],
        },
      },
    },
    include: {
      messages: true,
    },
  });

  const lastMessage = newChat.messages
    .sort((a, b) => a.position - b.position)
    .at(-1);
  if (!lastMessage) throw new Error("No new message");

  return {
    chatId: chat.id,
    lastMessageId: lastMessage.id,
  };
}

export async function createMessage(
  chatId: string,
  text: string,
  role: "assistant" | "user",
) {
  const prisma = getPrisma();
  const chat = await prisma.chat.findUnique({
    where: { id: chatId },
    include: { messages: true },
  });
  if (!chat) notFound();

  const maxPosition = Math.max(...chat.messages.map((m) => m.position));

  const newMessage = await prisma.message.create({
    data: {
      role,
      content: text,
      position: maxPosition + 1,
      chatId,
    },
  });

  return newMessage;
}
```
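The ordering convention running through both server actions above (messages carry an integer `position`, a new message takes `max(position) + 1`, and the "last" message is the one with the highest position) can be isolated as two pure helpers. This is a standalone sketch with illustrative names, not code from the repo; note the `-1` seed guards the empty-chat case, which `createChat` itself avoids by always inserting two messages:

```typescript
// Hypothetical helpers mirroring the position logic in actions.ts.
type ChatMessage = { id: string; position: number };

function nextPosition(messages: ChatMessage[]): number {
  // Seed with -1 so an empty chat yields position 0
  // (a bare spread into Math.max would give -Infinity).
  return Math.max(-1, ...messages.map((m) => m.position)) + 1;
}

function lastMessage(messages: ChatMessage[]): ChatMessage | undefined {
  // Copy before sorting so the caller's array is not mutated.
  return [...messages].sort((a, b) => a.position - b.position).at(-1);
}
```

Because positions are assigned monotonically per chat, sorting by `position` and taking the final element always yields the newest message, regardless of the order rows come back from the database.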
