I'm using the uie_nano pretrained model, but inference is still very slow: with 31 schema keys, a test run over 80-odd samples takes 190 seconds. The model was exported following the documentation and is run with:
python deploy/python/infer_cpu.py --model_path_prefix export/inference
How can I improve inference performance?
Depending on your machine's CPU configuration, you can try increasing the batch_size parameter.
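A minimal sketch of that suggestion, assuming your copy of deploy/python/infer_cpu.py exposes a --batch_size option as the UIE deployment docs describe (run the script with --help to confirm the exact flags in your version):

```bash
# Illustration only: the batch_size value is a guess, tune it to your CPU cores and memory.
python deploy/python/infer_cpu.py \
    --model_path_prefix export/inference \
    --batch_size 32
```

Since UIE builds a separate prompt for each schema key, 31 keys multiply the number of forward passes per sample, so batching them together tends to matter a lot on CPU.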
#3496 has some work on INT8 deployment with Lite; you can check whether your CPU supports INT8 deployment.
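Before trying the INT8 path, it is worth checking whether the CPU actually has the instruction-set support (ideally VNNI, otherwise AVX-512/AVX2) that quantized kernels need in order to run faster; a quick Linux-only check, purely as an illustration:

```bash
# List int8-friendly ISA extensions reported by the CPU (Linux).
# avx512_vnni gives the best int8 speedup; plain avx512bw/avx2 is weaker.
grep -m1 '^flags' /proc/cpuinfo | grep -oE 'avx512_vnni|avx512bw|avx2' | sort -u
```

If none of these show up, INT8 deployment is unlikely to bring much speedup on that machine.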
That didn't help.
This issue is stale because it has been open for 60 days with no activity.
This issue was closed because it has been inactive for 14 days since being marked as stale.