
fetch failed when running demo with Node #709

Open · 1 of 5 tasks
xiaobaichiliangpi opened this issue Apr 11, 2024 · 3 comments
Labels
bug Something isn't working

Comments

@xiaobaichiliangpi

System Info

transformers.js version: 2.16.1
System: macOS
Node version: v18.20.1

Environment/Platform

  ☐ Website/web-app
  ☐ Browser extension
  ☑ Server-side (e.g., Node.js, Deno, Bun)
  ☐ Desktop app (e.g., Electron)
  ☐ Other (e.g., VSCode extension)

Description

import { pipeline, env } from '@xenova/transformers';

env.allowLocalModels = false;

process.env.HTTP_PROXY = 'http://your.proxy.server:port';

// Create a feature extraction pipeline
const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
  quantized: false, // Comment out this line to use the quantized version
});

// Compute sentence embeddings
const texts = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?'];
const embeddings = await extractor(texts, { pooling: 'mean', normalize: true });
console.log(embeddings);

It returns the error below when I run 'node index.js'. It seems the model cannot be fetched because of network restrictions. How can I set an HTTP proxy?

node:internal/deps/undici/undici:12618
Error.captureStackTrace(err, this);
^

TypeError: fetch failed
at node:internal/deps/undici/undici:12618:11
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async getModelFile (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/utils/hub.js:471:24)
at async getModelJSON (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/utils/hub.js:575:18)
at async Promise.all (index 1)
at async loadTokenizer (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/tokenizers.js:61:18)
at async AutoTokenizer.from_pretrained (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/tokenizers.js:4398:50)
at async Promise.all (index 0)
at async loadItems (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/pipelines.js:3206:5)
at async pipeline (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/pipelines.js:3146:21) {
cause: Error: read ECONNRESET
at TLSWrap.onStreamRead (node:internal/stream_base_commons:217:20) {
errno: -54,
code: 'ECONNRESET',
syscall: 'read'
}
}

Node.js v18.20.1
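
Note: Node's built-in fetch (backed by undici) does not pick up process.env.HTTP_PROXY, so setting it at runtime has no effect on the download. One possible workaround, sketched below under the assumption that the undici package is installed (npm install undici) and that 'http://your.proxy.server:port' is replaced with a real proxy URL, is to route the global fetch through a ProxyAgent before creating the pipeline:

// Sketch: send Node's global fetch through an HTTP proxy.
// Assumes `npm install undici` and a reachable proxy URL.
import { ProxyAgent, setGlobalDispatcher } from 'undici';
import { pipeline } from '@xenova/transformers';

// Placeholder proxy address: replace with your actual proxy before running.
setGlobalDispatcher(new ProxyAgent('http://your.proxy.server:port'));

// Model files should now be downloaded through the proxy.
const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
  quantized: false,
});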

Reproduction

Run the code above with 'node index.js'.
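
If a proxy is not an option, another possible workaround (a sketch, assuming the files of nomic-ai/nomic-embed-text-v1 can be obtained separately, e.g. downloaded from the Hugging Face Hub on a machine with access, and copied into a local folder) is to point transformers.js at that local copy instead of fetching remotely:

// Sketch: load the model from a local directory instead of the Hugging Face Hub.
// Assumes the repo files were copied to ./models/nomic-ai/nomic-embed-text-v1 beforehand.
import { pipeline, env } from '@xenova/transformers';

env.allowRemoteModels = false;    // do not attempt any network fetch
env.allowLocalModels = true;
env.localModelPath = './models/'; // root folder containing nomic-ai/nomic-embed-text-v1

const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
  quantized: false,
});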

@xiaobaichiliangpi xiaobaichiliangpi added the bug Something isn't working label Apr 11, 2024
@JP-HoneyBadger

Following, similar issue.

@honorsuper

Following, similar issue.

@AngeloCarnevale

Following, similar issue.
