[Feature Request] Replace LLM instruction with an MCP server #1824

Open · mozharovsky opened this issue Mar 20, 2025 · 2 comments

@mozharovsky

Hey, thanks for the great framework — love building APIs with Encore!

Just wondering if you've considered providing an MCP server with Encore docs and examples. This way, we could avoid including large LLM instructions (~11K tokens) in the system prompt. Agents within MCP-compatible clients such as Cursor could dynamically fetch the necessary docs, significantly minimizing (or even completely eliminating) lengthy LLM instructions, as MCP effectively handles most use cases.

For reference, I highly recommend checking out Mastra's MCP docs server configuration, which offers a super smooth experience!
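Roughly, the kind of server I have in mind (just a sketch built on the official TypeScript MCP SDK; the server name, the `search_encore_docs` tool, and the docs lookup are placeholders I made up for illustration, not anything Encore ships today):

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical docs server; "encore-docs" is a placeholder name.
const server = new McpServer({ name: "encore-docs", version: "0.1.0" });

// A single tool the agent can call to pull only the docs it needs,
// instead of carrying ~11K tokens of instructions in the system prompt.
server.tool(
  "search_encore_docs",
  { query: z.string().describe("Topic to look up, e.g. 'streaming APIs'") },
  async ({ query }) => {
    // Placeholder: a real server would search the published docs and examples here.
    const text = `No docs index wired up yet; received query: ${query}`;
    return { content: [{ type: "text", text }] };
  },
);

// Expose the server over stdio so MCP-compatible clients like Cursor can launch it.
await server.connect(new StdioServerTransport());
```

A client would then register this in its MCP config and fetch docs on demand, rather than shipping the full instructions with every prompt.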

@marcuskohlberg
Member

Agreed, this would be great. We're going to do it as soon as we can.

@encoredev deleted a comment Apr 12, 2025
@MuhtasimM

@mozharovsky I recommend using the Context7 MCP server. It has the docs for Encore.
