
o3-mini - "Unsupported parameter: 'max_tokens'" #423

Closed
ledilson opened this issue Mar 10, 2025 · 2 comments · Fixed by #510

Comments

@ledilson

Can't use OpenAI o3-mini.

config:

[llm]
model = "o3-mini"
base_url = "https://api.openai.com/v1"
api_key = "sk-proj-valid_key"
#max_completion_tokens = 4096  # also tried this
#max_tokens = 4096  # also tried commenting this line out
temperature = 0.0

| ERROR | app.llm:ask_tool:260 - API error: Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}

I confirm that it works with the gpt-4o model.

@the0807
Contributor

the0807 commented Mar 11, 2025

Use the code from this PR: #411
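The underlying issue is that OpenAI's reasoning models (o1, o3-mini) reject the legacy `max_tokens` parameter and require `max_completion_tokens` instead, while older chat models still expect `max_tokens`. Below is a minimal, hypothetical sketch (not the actual code from the linked PR) of how a caller could pick the right keyword argument based on the model name; the `REASONING_MODELS` list is an assumption for illustration:

```python
# Hypothetical helper: reasoning models (e.g. o3-mini) reject `max_tokens`
# with a 400 "unsupported_parameter" error and require `max_completion_tokens`.
# Model prefixes below are assumed, not taken from the project's source.
REASONING_MODELS = ("o1", "o1-mini", "o3-mini")

def token_limit_kwargs(model: str, limit: int) -> dict:
    """Return the token-limit keyword appropriate for the given model."""
    if model.startswith(REASONING_MODELS):
        return {"max_completion_tokens": limit}
    return {"max_tokens": limit}

# Usage with the OpenAI client (sketch):
# client.chat.completions.create(
#     model="o3-mini",
#     messages=messages,
#     **token_limit_kwargs("o3-mini", 4096),
# )
```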

@aixiaoxin123

I reproduced this locally using the deepseek model; Google Search requires a proxy to be enabled.

Global LLM configuration

[llm]
model = "deepseek-chat"
base_url = "https://api.deepseek.com"
#api_key = "sk-cpstrqpeumbxgdojvibrgtmmkrhsgmqafvwywfflzwchopat"
api_key = "sk-123"
max_tokens = 4096
temperature = 0.0

Optional configuration for specific LLM models

[llm.vision]
model = "deepseek-chat"
base_url = "https://api.deepseek.com"
api_key = "sk-123"

I wrote a complete tutorial; you can refer to it here:
https://mp.weixin.qq.com/s/G1wbK_7SmjMDC_zQ1xx7dA


3 participants