createInferenceSession(modelPath) {
  return new Promise((resolve, reject) => {
    this.session = wx.createInferenceSession({
      model: modelPath,
      /* 0: Lowest precision  e.g., LS16 + A16 + Winograd A16 + approx. math
         1: Lower precision   e.g., LS16 + A16 + Winograd off + approx. math
         2: Modest precision  e.g., LS16 + A32 + Winograd A32 + approx. math
         3: Higher precision  e.g., LS32 + A32 + Winograd A32 + approx. math
         4: Highest precision e.g., LS32 + A32 + Winograd A32 + precise math
         Higher precision always requires a longer session run time */
      precisionLevel: 0,
      allowNPU: false,      // whether to use the NPU for inference; only effective on iOS
      allowQuantize: false, // whether to generate a quantized model
    });

    // Listen for the error event
    this.session.onError((error) => {
      console.log("onError");
      console.error(error);
      reject(error);
    });

    this.session.onLoad(() => {
      console.log("onload");
      this.ready = true;
      resolve();
    });

    this.session.offError(() => {
      console.log("offErr");
    });
    this.session.offLoad(() => {
      console.log("offLoad");
    });

    // wx.getFileSystemManager().access({
    //   path: modelPath,
    //   success: (res) => {
    //     console.log("test succ");
    //   },
    //   fail: (res) => {
    //     console.log("test err");
    //   }
    // });
  });
}
Unlike in the trial version, under real-device debugging (https://developers.weixin.qq.com/miniprogram/dev/devtools/remote-debug-2.html) wx.createInferenceSession correctly triggers the session.onLoad callback, but in the trial version session.onLoad never runs. Which of the two is the expected behavior?
2024-07-15  Same createInferenceSession code as above: real-device debugging reaches onLoad, while the trial version and the released version never do.
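One way to make the silent trial-version case visible is to put a timeout around the same flow so the promise rejects instead of hanging forever when onLoad never fires. A minimal sketch, reusing the session options from the demo above; the 10-second limit and the loadModelWithTimeout name are only illustrative assumptions:

// A hedged sketch: reject after a timeout instead of hanging when onLoad never fires.
// The 10 s limit and the wrapper name are assumptions for illustration only.
loadModelWithTimeout(modelPath, timeoutMs = 10000) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => {
      reject(new Error(`onLoad not fired within ${timeoutMs} ms for ${modelPath}`));
    }, timeoutMs);

    this.session = wx.createInferenceSession({
      model: modelPath,
      precisionLevel: 0,
      allowNPU: false,
      allowQuantize: false,
    });

    this.session.onError((error) => {
      clearTimeout(timer);
      reject(error);
    });

    this.session.onLoad(() => {
      clearTimeout(timer);
      this.ready = true;
      resolve();
    });
  });
}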
This onLoad method never executes: https://developers.weixin.qq.com/minigame/dev/api/ai/inference/InferenceSession.onLoad.html
2024-07-15  Bumping my own question.
wx.createInferenceSession onLoad does not fire in an AI inference project? In the official ai/mobilenet/index demo, inside the createInferenceSession inference function in classify.js, the callback this.session.onLoad(() => { this.ready = true; resolve(); }); is only invoked under real-device debugging; in development mode, trial mode, and the released version it never loads (and no error is reported).
2024-07-13  Real-device debugging works; in the trial version the callback neither executes nor reports an error. this.createInferenceSession(modelPath).then(() => { console.log("张小龙"); resolve(); })
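Another thing worth ruling out is a model file that simply cannot be read in the trial or released build, which is exactly what the commented-out access call in the snippet at the top checks for. A minimal sketch, assuming the model sits at a local path; checkModelFile is an illustrative name, not an official API:

// A hedged sketch: verify the model file is readable before creating the session,
// so a path problem surfaces as a clear error rather than a silent onLoad.
function checkModelFile(modelPath) {
  return new Promise((resolve, reject) => {
    wx.getFileSystemManager().access({
      path: modelPath,
      success: () => resolve(modelPath),
      fail: (err) => reject(new Error(`model file not accessible: ${modelPath} (${err.errMsg})`)),
    });
  });
}

// Assumed usage: check the file first, then create the inference session.
// checkModelFile(modelPath)
//   .then(() => this.createInferenceSession(modelPath))
//   .catch((err) => console.error(err));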
Is wx.createInferenceSession still usable in mini programs? Is InferenceSession wx.createInferenceSession(Object object), the ONNX model inference API, still available? Why can't I jump to its definition, and why does compilation report wx.createInferenceSession not found?
2024-07-12  this.createInferenceSession(modelPath).then
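wx.createInferenceSession only exists on new enough base library versions, so a "not found" is worth guarding against at runtime rather than letting the call crash. A minimal sketch using wx.canIUse and an SDKVersion comparison; the MIN_SDK value is an assumption, check the official documentation for the real minimum base library:

// A hedged sketch: guard the call so an unsupported environment produces a clear
// message instead of "wx.createInferenceSession not found".
const MIN_SDK = '2.30.0'; // assumed minimum, verify against the documentation

function canUseInference() {
  if (!wx.canIUse('createInferenceSession')) {
    return false;
  }
  const { SDKVersion } = wx.getSystemInfoSync();
  return compareVersion(SDKVersion, MIN_SDK) >= 0;
}

// Simple version comparison of "x.y.z" strings.
function compareVersion(v1, v2) {
  const a = v1.split('.').map(Number);
  const b = v2.split('.').map(Number);
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const x = a[i] || 0;
    const y = b[i] || 0;
    if (x !== y) return x > y ? 1 : -1;
  }
  return 0;
}

// if (!canUseInference()) {
//   console.error('createInferenceSession is not available; upgrade WeChat / the base library');
// }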
wx.createInferenceSession cannot be used on a real device? Calling wx.createInferenceSession keeps failing on a real device with: SystemError (appServiceSDKScriptError) Cannot read properties of undefined (reading 'CreateAsync') TypeError: Cannot read properties of undefined (reading 'CreateAsync'). It loads fine in the developer tools but errors on the device. I have swapped in many different ONNX models, and also uploaded the model to the cloud and downloaded it locally before creating the session, but it still fails. What is the problem? WeChat version 8.0.49
2024-07-11
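For the cloud-upload / local-download flow mentioned in the question above, a minimal sketch is shown below. It assumes wx.cloud.init() has already been called elsewhere and that a valid fileID is available; all names are placeholders, and this is only an illustration of the flow, not a confirmed fix for the CreateAsync error:

// A hedged sketch: download the model from cloud storage, then create the session
// from the downloaded local temp path.
function downloadModelAndCreateSession(fileID) {
  return new Promise((resolve, reject) => {
    wx.cloud.downloadFile({
      fileID, // placeholder, e.g. the cloud file ID of the uploaded .onnx model
      success: (res) => {
        // res.tempFilePath is a local path and can be passed to the session directly.
        const session = wx.createInferenceSession({
          model: res.tempFilePath,
          precisionLevel: 0,
          allowNPU: false,
          allowQuantize: false,
        });
        session.onError(reject);
        session.onLoad(() => resolve(session));
      },
      fail: reject,
    });
  });
}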