Hi, I have a model defined in torch that I export to ONNX with torch.onnx.export, but loading it with Taro.createInferenceSession fails with the following error: createInferenceSession:fail create session fail : xnet error:6: Failed to convert ONNX model to XNet modelFailed to convet onnx to xnet. I have narrowed it down: once the model passes through the layer x = torch.cat((a, b), dim=1), loading fails; if that layer is skipped, the model loads normally. Here a.shape == (1, 96, 10, 10) and b.shape == (1, 160, 10, 10). What could be causing this? PS: both the torch model and the ONNX model predict correctly, and every operator in the model is listed as supported in the official documentation.
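To isolate whether the Concat node itself is what the XNet converter rejects, one approach is to export a minimal model that contains only the torch.cat operation with the shapes described above and try loading just that file. The sketch below is a hypothetical reproduction, not the original model; the module name, file name, and opset_version are assumptions.

```python
import torch
import torch.nn as nn

# Minimal repro sketch: two inputs with the shapes from the post,
# concatenated along the channel dimension (dim=1).
class CatRepro(nn.Module):
    def forward(self, a, b):
        # a: (1, 96, 10, 10), b: (1, 160, 10, 10) -> x: (1, 256, 10, 10)
        return torch.cat((a, b), dim=1)

model = CatRepro().eval()
a = torch.randn(1, 96, 10, 10)
b = torch.randn(1, 160, 10, 10)

# opset_version=11 is an assumption; trying a different opset is one way
# to check whether the converter is sensitive to the exported Concat form.
torch.onnx.export(
    model, (a, b), "cat_repro.onnx",
    input_names=["a", "b"], output_names=["x"],
    opset_version=11,
)
```

If this stripped-down file also fails to load, the problem is with how the converter handles the exported Concat node rather than with the rest of the model.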
Android wx.createInferenceSession error [image] When loading a face-recognition model, real-device debugging works on iOS but fails on Android. Error message:
SystemError (appServiceSDKScriptError) Cannot read properties of undefined (reading 'CreateAsync') TypeError: Cannot read properties of undefined (reading 'CreateAsync')
at new ov (WAServiceMainContext.js:1:771212)
at Object.Jb (WAServiceMainContext.js:1:810165)
at I (WAServiceMainContext.js:1:951470)
at Object.p (WAServiceMainContext.js:1:953587)
at I.forEach.v.<computed> (WAServiceMainContext.js:1:951107)
at p (WAServiceMainContext.js:1:167370)
at Object.success (WAServiceMainContext.js:1:167813)
at a (WAServiceMainContext.js:1:668807)
at Yu (WAServiceMainContext.js:1:668990)
at rd (WAServiceMainContext.js:1:670124)
createInferenceSession failed: ONNX model conversion failed? createInferenceSession:fail create session fail : xnet error:6: Failed to convert ONNX model to XNet modelFailed to convet onnx to xnet. Background: the test model is torchvision's ssdlite, exported via torch.onnx.export, and it runs successfully on onnxruntime. How can I get a more detailed reason for the conversion failure? And what is an XNet model?
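Since the model already runs under onnxruntime, one way to gather more information before handing it to the XNet converter is to validate the exported graph and list the operators it actually contains, then compare that list against the documented supported ops. The following is a minimal sketch; the file name "ssdlite.onnx" and the 1x3x320x320 dummy input are assumptions based on torchvision's ssdlite default input size.

```python
import onnx
import onnxruntime as ort
import numpy as np

# Structural validation of the exported file (hypothetical path).
model = onnx.load("ssdlite.onnx")
onnx.checker.check_model(model)

# Print the graph so every op type the converter must handle is visible.
print(onnx.helper.printable_graph(model.graph))

# Confirm the model still runs end-to-end under onnxruntime.
sess = ort.InferenceSession("ssdlite.onnx")
input_name = sess.get_inputs()[0].name
dummy = np.random.rand(1, 3, 320, 320).astype(np.float32)
outputs = sess.run(None, {input_name: dummy})
print([o.shape for o in outputs])
```

The printed graph makes it easier to spot nodes (or attribute combinations) that onnxruntime accepts but the XNet conversion step may not.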