
Errors while removing background on deployed server #106

Open
Rolstenhouse opened this issue Feb 23, 2024 · 7 comments

Comments

@Rolstenhouse

When I attempt to remove the background from a file on my server, I get one of the following errors:

corrupted size vs. prev_size
OR
free(): invalid size
OR
munmap_chunk(): invalid pointer

I have not been able to identify what triggers which error, but it feels like it might be an issue with the ML model (I'm using the small one).

Docker host: 20.10.12 linux x86_64
Node version: Node.js v21.6.2
Package version: 1.4.4
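Worth noting: glibc aborts like `free(): invalid size` originate in native code, so they terminate the whole Node process and cannot be caught with a JavaScript try/catch. One way to keep a server alive despite them is to isolate the call in a child process; the sketch below is a hypothetical workaround, not part of this package, and the worker script name and message shape are assumptions:

```typescript
// Hypothetical sketch: contain a native abort (SIGABRT) in a forked child so
// the main server survives. "remove-bg-worker.js" is an assumed worker script
// that would call removeBackground(mediaUrl) and send the bytes back via
// process.send({ data: [...] }); that protocol is an assumption, not the
// package's API.
import { fork } from "node:child_process";

function removeBackgroundIsolated(mediaUrl: string): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const child = fork("./remove-bg-worker.js", [mediaUrl]);
    let result: Buffer | null = null;
    child.on("message", (msg: any) => {
      result = Buffer.from(msg.data);
    });
    child.on("exit", (code) => {
      if (code === 0 && result) resolve(result);
      // A native abort in the child lands here instead of crashing the server.
      else reject(new Error(`background-removal worker exited with code ${code}`));
    });
  });
}
```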

@Rolstenhouse Rolstenhouse changed the title Issue with background-removal-js-node Errors while removing background on deployed server Feb 23, 2024
@LevwTech

Same issue

@DanielHauschildt
Contributor

Would you be open to sharing a minimal example? I cannot reproduce it.

@Rolstenhouse
Author

Rolstenhouse commented Feb 27, 2024

Sure - here's some more context

Snippet

    mediaUrl = "https://api.twilio.com/2010-04-01/Accounts/ACfbfe2e1e70ce74b02a4151bf91b23693/Messages/MM3fa6329883117973ec3cd7b180c6caca/Media/ME76f45b7483238aac2516ab5429c5018a";
    try {
      ort.env.debug = true;
      ort.env.logLevel = "warning";

      logger.info("Removing background for image", { mediaUrl });
      const localPath = `file://${process.cwd()}/public/imgly/`;
      logger.info("localPath", { localPath });
      const blob: Blob = await removeBackground(mediaUrl, {
        publicPath:
          process.env.NODE_ENV === "production"
            ? "file:///myapp/public/imgly/"
            : localPath,
        // publicPath: "https://stickerfy.xyz/imgly/",
        debug: true,
        model: "small",
        progress: (key, current, total) => {
          logger.warn(`Downloading ${key}: ${current} of ${total}`);
        },
      });
      buffer = Buffer.from(await blob.arrayBuffer());
    } catch (error) {
      logger.error("Error while removing background for image", {
        mediaUrl,
        error,
        errorMessage: error.message,
        errorStack: error.stack,
        errorName: error.name,
      });
    }

    // Write the buffer to S3
    if (buffer) {
      // Upload to S3
      logger.info("Uploading image to S3", {
        info: {
          key: mediaSid!,
          contentType: "image/png",
          userId: user?.autoId || 0,
          buffer: buffer.length,
        },
      });
      backgroundRemovedImage = await uploadImageToS3({
        key: mediaSid!,
        buffer,
        contentType: "image/png",
        userId: user?.autoId || 0,
      });
    }

Here's a screenshot of the logs (and I've included a CSV with the log output):

[screenshot: log output]

Also note: this snippet includes the local file path, but I also ran into this issue when referencing the hosted model.

The deployed server is running on fly.io, by the way (not sure if that might be relevant).

extract-2024-02-27T00_49_29.879Z.csv

@LevwTech

LevwTech commented Feb 27, 2024

My server is running on digital ocean with the same issue.
Droplet info:
Ubuntu 23.10 x64
Node 20
No gpu

@n3m3s7s

n3m3s7s commented Mar 7, 2024

Hi,
I get the same error on WSL2 (Ubuntu).

Could this be related to "onnxruntime-node" or WASM, and to the fact that TensorFlow and the model need a GPU, which isn't available on a server or in a remote environment?

I noticed that in the source, the function:

async function createOnnxSession(model, config) {
  if (config.debug) {
    ort.env.debug = true;
    ort.env.logLevel = "verbose";
    console.debug("ort.env.wasm:", ort.env.wasm);
  }
}

on my WSL2 environment actually prints an empty object to the console:

fetch /models/medium 100%
ort.env.wasm: {}
free(): invalid size

Thanks!

@DanielHauschildt
Contributor

onnxruntime-node should work without a GPU.
ort.env.wasm may look wrong, but the Node version does not yet support the wasm backend, so it is expected to be empty.

I have no access to such a machine at the moment, so unfortunately I cannot reproduce the error.
Also, I have no idea what the cause is.
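To expand on the no-GPU point: the Node backend runs inference through native ONNX Runtime on the CPU execution provider. A hedged sketch of session options, assuming onnxruntime-node's documented SessionOptions shape (the model path is a placeholder); cranking the native log level up may help localize where the abort happens:

```typescript
// Sketch, assuming onnxruntime-node's documented SessionOptions. The Node
// backend defaults to the CPU execution provider, so no GPU (and no wasm)
// is involved. logSeverityLevel 0 enables verbose native logging, which may
// show which operator runs right before the abort.
const sessionOptions = {
  executionProviders: ["cpu"],
  logSeverityLevel: 0, // 0 = verbose
};
// const session = await ort.InferenceSession.create("/path/to/model.onnx", sessionOptions);
```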

@Rolstenhouse
Author

Thanks for looking into it. For other devs who might encounter this: I switched to a different package, rembg on Replicate, and just paid the small out-of-pocket cost.
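For anyone taking the same route, offloading to a hosted model avoids the native runtime entirely. A hedged sketch of calling Replicate's predictions endpoint; the model version ID is a placeholder, so look up the current rembg version on Replicate before using it:

```typescript
// Sketch of offloading background removal to Replicate's HTTP API instead of
// running onnxruntime locally. "REMBG_VERSION_ID" is a placeholder, not a
// real version hash.
function buildPredictionBody(imageUrl: string, version: string) {
  return { version, input: { image: imageUrl } };
}

async function removeBackgroundViaReplicate(imageUrl: string, token: string) {
  const res = await fetch("https://api.replicate.com/v1/predictions", {
    method: "POST",
    headers: {
      Authorization: `Token ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildPredictionBody(imageUrl, "REMBG_VERSION_ID")),
  });
  // Replicate predictions run asynchronously; the response includes a URL
  // to poll for the finished output.
  return res.json();
}
```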
