Paddle2ONNX does not support exporting half-precision (FP16) ONNX models #1148

Closed
tiandou-tangdou opened this issue Sep 11, 2023 · 4 comments

Comments


tiandou-tangdou commented Sep 11, 2023

Describe the bug

The paddle.onnx.export API provided by the Paddle framework does not support exporting half-precision ONNX models; it fails immediately with [ERROR] Float16 is not supported.
Likewise, paddle2onnx cannot export half-precision ONNX models. Please provide a way to support this use case.

related issue: PaddlePaddle/Paddle#57194

Minimal reproduction example:

import paddle
from paddlenlp.transformers import UIEX # import the model class
model = UIEX.from_pretrained("uie-x-base") # instantiate the model from pretrained weights
model.to(dtype="float16") # cast the parameters to float16
model.eval() # set the model to evaluation mode

input_spec = [
paddle.static.InputSpec(shape=[None, None], dtype="int64", name="input_ids"),
paddle.static.InputSpec(shape=[None, None], dtype="int64", name="token_type_ids"),
paddle.static.InputSpec(shape=[None, None], dtype="int64", name="position_ids"),
paddle.static.InputSpec(shape=[None, None], dtype="int64", name="attention_mask"),
paddle.static.InputSpec(shape=[None, None, 4], dtype="int64", name="bbox"),
paddle.static.InputSpec(shape=[None, 3, 224, 224], dtype="float16", name="image"),
] # define the input specs

print("Exporting ONNX model to %s" % "./uiex_fp16.onnx")
paddle.onnx.export(model, "./uiex_fp16", input_spec=input_spec) # export the ONNX model
print("ONNX model exported.")

Also, using the tool at https://zenn.dev/pinto0309/scraps/588ed8342e2182 to convert the FP32 ONNX model to FP16 produces a broken FP16 ONNX model that cannot be run with onnxruntime.
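
For comparison, a minimal sketch of the common offline FP32 -> FP16 conversion using onnxconverter-common (this is not the tool linked above, the input path ./uiex_fp32.onnx is a hypothetical FP32 export of the same model, and whether keep_io_types avoids the onnxruntime failure here is an assumption, not something verified):

import onnx
from onnxconverter_common import float16

model_fp32 = onnx.load("./uiex_fp32.onnx")  # hypothetical FP32 export of the same model
# keep_io_types=True leaves the graph inputs/outputs in float32, which often avoids
# dtype mismatches when the converted model is fed from onnxruntime.
model_fp16 = float16.convert_float_to_float16(model_fp32, keep_io_types=True)
onnx.save(model_fp16, "./uiex_fp16_converted.onnx")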

Information (please complete the following information):
[screenshots of the environment information were attached]


hl-gl commented Sep 21, 2023

Has this been resolved?

Zheng-Bicheng (Collaborator) commented

The bug that prevents Paddle2ONNX from natively exporting FP16 models will be fixed in the PR below.


This issue is stale because it has been open for 30 days with no activity.

github-actions bot added the stale label Jan 14, 2025

This issue was closed because it has been inactive for 14 days since being marked as stale.
