fix: better anthropic cost logging #321
Conversation
👍 Looks good to me! Reviewed everything up to d0c1c8b in 13 seconds

More details:
- Looked at 48 lines of code in 1 file
- Skipped 0 files when reviewing
- Skipped posting 2 drafted comments based on config settings

1. `gptme/llm/llm_anthropic.py:97`
   - Draft comment: Combine logging of `chunk.type` and `chunk.usage` into a single log entry for better readability and performance: `logger.info(f"Chunk type: {chunk.type}, Usage: {chunk.usage}")`
   - Reason this comment was not posted: Confidence changes required: 50%. The logging of `chunk.type` and `chunk.usage` should be combined into a single log entry for better readability and performance.
2. `gptme/llm/llm_anthropic.py:75`
   - Draft comment: The import statement for `anthropic.types.beta.prompt_caching` is unnecessary here since it's already imported at the module level under `TYPE_CHECKING`.
   - Reason this comment was not posted: Confidence changes required: 50%. The import statement for `anthropic.types.beta.prompt_caching` is unnecessary in the `stream` function since it is already imported at the module level under `TYPE_CHECKING`.

Workflow ID: wflow_Qev6fcPd4ztv9D4m

You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.
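The drafted suggestion above amounts to merging two consecutive log calls into one record. A minimal sketch of the difference, using a stand-in dataclass rather than the actual anthropic SDK event type:

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


@dataclass
class Chunk:
    # stand-in for an anthropic streaming event; real chunks carry more fields
    type: str
    usage: dict


chunk = Chunk(type="message_delta", usage={"output_tokens": 42})

# two separate entries (what the reviewed code did)
logger.info(chunk.type)
logger.info(chunk.usage)

# combined into one entry, as the draft comment suggests
logger.info(f"Chunk type: {chunk.type}, Usage: {chunk.usage}")
```

One record per event also keeps the type and its usage adjacent when logs from concurrent requests interleave.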
Codecov Report

All modified and coverable lines are covered by tests ✅
✅ All tests successful. No failed tests found.

Additional details and impacted files:

@@            Coverage Diff             @@
##           master     #321      +/-   ##
==========================================
+ Coverage   72.49%   72.57%   +0.07%
==========================================
  Files          67       67
  Lines        4912     4922      +10
==========================================
+ Hits         3561     3572      +11
+ Misses       1351     1350       -1

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.
now `-v` output is actually readable (d0c1c8b to f074706)
👍 Looks good to me! Incremental review on f074706 in 15 seconds

More details:
- Looked at 64 lines of code in 2 files
- Skipped 0 files when reviewing
- Skipped posting 2 drafted comments based on config settings

1. `gptme/llm/llm_anthropic.py:68`
   - Draft comment: The logging level for `response.usage` should be `info` instead of `debug` to match the PR description.
   - Reason this comment was not posted: Comment did not seem useful.
2. `gptme/llm/llm_anthropic.py:127`
   - Draft comment: The logging level for `chunk.message.usage` should be `info` instead of `debug` to match the PR description. This also applies to `chunk.usage` on line 130.
   - Reason this comment was not posted: Marked as duplicate.

Workflow ID: wflow_Cr8cOkrTD24ehuxo

You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.
Also set a bunch of loglevels for libs higher, so now `-v` output is actually readable.

Important

Enhance logging in `llm_anthropic.py` to capture usage data and chunk types, and adjust logging levels in `__init__.py`.

- `llm_anthropic.py`:
  - `logger.debug(response.usage)` in `chat()` to log usage data.
  - `logger.debug(chunk.message.usage)` in `stream()` for the `message_start` case.
  - `logger.debug(chunk.usage)` in `stream()` for the `message_delta` case.
- `__init__.py`:
  - `anthropic` logger level to `INFO` to reduce debug log spam.
  - `httpcore` logger level to `WARNING`.

This description was created by Ellipsis for f074706. It will automatically update as commits are pushed.