Describe the issue
Cannot load larger models that generate multiple external data files, through either a URI or a binary buffer.
To reproduce
I have a repository with a simple Vite front-end application; you can take a look at it here. Basically, in `App.tsx` I have:
You need to download the SAM models that I exported to ONNX:

- `sam_l`: https://ohif-assets.s3.us-east-2.amazonaws.com/SAM/sam_l.zip
- `sam_h`: https://ohif-assets.s3.us-east-2.amazonaws.com/SAM/sam_h.zip

(I just exported the ONNX files using this repo.)
The code works great with `sam_l` (both URI and binary). I guess it's because it generates two files. However, since `sam_h` is a large model, it generates numerous files, and I can't make it load through either URI or binary.

You can download my repository and place the `sam_l` and `sam_h` ONNX formats in the `public` folder, then run `yarn start`. Uncomment the following lines in `src/App.tsx`:

You can see that the decoder still runs, but not the encoder, as it is probably referencing the smaller blocks.
Urgency
It is urgent, I guess?
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.17.3
Execution Provider
'webgpu' (WebGPU)