More in-depth instructions for dumb idiots like me #1

Open
SkiddyDiddy opened this issue Feb 21, 2024 · 3 comments


@SkiddyDiddy

Mostly just the title. I really wanna try this because it seems really cool, but I can't figure out the uvicorn part. I've cloned the repo to a folder with git, but uvicorn eludes me. Python, cmd, and Miniconda3 all give a generic "not recognized" error when I type uvicorn. I know it's installed (I can see it in my Python folders), I just don't know how to use it :c

Also, would this work with local models? Like running my own LLM through Text Generation WebUI and using that API link instead of OpenAI's. You can edit system prompts, temperature, CFG, all that stuff through Text Gen WebUI.

@ghost

ghost commented Feb 24, 2024

pip install uvicorn

@uwoneko
Owner

uwoneko commented Mar 2, 2024

If the webui supports the OpenAI spec, sure; if not, then no.
Try calling uvicorn through Python, like python -m uvicorn server:app
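In case it helps, this is a minimal sequence that usually sorts out the "not recognized" error, assuming the project's ASGI app really is exposed as server:app (as above) and that you run it from the cloned repo folder:

```
# install uvicorn into the same Python environment you run the project with
pip install uvicorn

# if the uvicorn command still isn't on PATH, invoke it as a module instead
python -m uvicorn server:app
```

Running it with python -m uses whichever interpreter is first on your PATH, which sidesteps the problem of the uvicorn script living in a Scripts folder that Windows doesn't know about.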

@ririinsta

Using your own LLM "works". It'll generate stuff most of the time, but it's kind of screwy. I am using LM Studio with Gemma.
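For anyone trying the same thing: if the local backend exposes an OpenAI-compatible endpoint, the request looks the same as one against api.openai.com, only the base URL changes. A rough sketch below; the port is LM Studio's usual default and the model name is just whatever you have loaded, so both will differ for Text Generation WebUI or other backends:

```
# same request format as the OpenAI API, pointed at a local server
# (LM Studio's local server typically listens on http://localhost:1234/v1)
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemma",
    "messages": [{"role": "user", "content": "hello"}]
  }'
```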
