Feature: Token reporting usage #46
I'm glad you like the project. Token counting is possible and will be helpful. There are two ways we can report the results back:
I like the second approach better, as it doesn't hinder the user's conversation flow with the model. What is your opinion? I'm not sure if by "Claude Desktop conversation prompt" you mean the tool output or a resource?
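For illustration, here is a minimal sketch of what the two reporting routes could look like if the server is built on the Python MCP SDK's `FastMCP` class. The tool, the running total, and the `tokens://usage` URI are hypothetical placeholders, not this project's actual names:

```python
# Minimal sketch of the two reporting routes discussed above, assuming the
# Python MCP SDK's FastMCP class. The tool, the running total, and the
# tokens://usage URI are hypothetical placeholders, not this project's names.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("token-usage-demo")

session_tokens = 0  # running total for the current session


def count_tokens(text: str) -> int:
    """Stand-in tokenizer: rough 4-characters-per-token heuristic."""
    return max(1, len(text) // 4)


# Approach 1: append the count to the tool output, so Claude (and the user) sees it.
@mcp.tool()
def echo(text: str) -> str:
    global session_tokens
    session_tokens += count_tokens(text)
    return f"{text}\n\n[token usage so far: {session_tokens}]"


# Approach 2: expose the count as a resource, outside the conversation flow.
@mcp.resource("tokens://usage")
def token_usage() -> str:
    return f"{session_tokens} tokens used this session"
```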
It's an interesting idea. Also, I've enabled Discussions.
I updated the OP with additional details and findings. I'm also inclined to select the second approach. Regarding your question about the conversation prompt: what I like in the Zed editor is that you can see the token usage in the upper right corner, so it does not hinder the conversation. We could have a switch for it.

Edit: @rusiaaman Note Claude's suggestion for the implementation: add a token usage display in the Claude Desktop interface. That means it is possible to display the token usage in the Claude Desktop interface, like Zed does. See below what Claude suggests. We might need to use the first approach (tool output that Claude can also read) in order to display it in the Claude Desktop interface. Second conversation ask:
Which option would you pick?
About the suggested format
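As a rough illustration of the switch idea, here is a sketch of gating the appended counter behind an opt-in flag. The environment variable name, the footer format, and the 200k limit are assumptions for illustration only, not a decided design:

```python
# Hypothetical sketch of the "switch" idea: only append the counter to tool
# output when the user opts in. The variable name, footer format, and 200k
# limit are made up for illustration.
import os

SHOW_TOKENS = os.environ.get("MCP_SHOW_TOKEN_USAGE", "0") == "1"


def with_token_footer(output: str, used: int, limit: int = 200_000) -> str:
    """Append a Zed-style 'used/limit' counter to tool output when enabled."""
    if not SHOW_TOKENS:
        return output
    return f"{output}\n\n[tokens: {used:,}/{limit // 1000}k]"
```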
@rusiaaman I'm almost done with a PR; I will finish it tomorrow. This is what the implementation will look like (Claude does not render its Desktop well, so the implementation will look much better live):

IMO this is the best location for the counter (next to the Send button), as it is non-invasive for the end user.

I also added into
The MCP server stops sending the

For dependencies, we fall back to tiktoken if anthropic_tokenizer is not available:

```python
# Fallback to tiktoken, if anthropic_tokenizer is not available
try:
    from anthropic_tokenizer import count_tokens
except ImportError:
    import tiktoken

    def count_tokens(text: str) -> int:
        """Count tokens using the tiktoken fallback."""
        enc = tiktoken.get_encoding("cl100k_base")
        return len(enc.encode(text))
```
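For context, a small usage example building on the `count_tokens` fallback above, formatting a Zed-style used/limit string; the 200k limit and the exact format are illustrative, not the final design:

```python
# Assumed usage of the count_tokens() fallback defined above: build a
# Zed-style "used/200k" string for a single prompt. Limit and format are
# illustrative only.
def usage_string(prompt: str, context_limit: int = 200_000) -> str:
    used = count_tokens(prompt)
    return f"{used}/{context_limit // 1000}k tokens"


print(usage_string("List the files in the current directory and summarize them."))
# e.g. "13/200k tokens" (exact count depends on the tokenizer in use)
```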
First, thank you for the great product; it works wonders. I was wondering, is it possible to implement token usage reporting on each prompt? Something like Zed does; see the 13/200k screenshot.
Claude's Suggested Implementation
My ask:
As a side note, it might be beneficial to enable Discussions in your repository for general questions.