
error: unexpected argument '--max-input-tokens' found #1903

Open
2 of 4 tasks
moruga123 opened this issue May 15, 2024 · 1 comment

Comments


moruga123 commented May 15, 2024

System Info

Version: ghcr.io/huggingface/text-generation-inference:1.4

If I add --max-input-tokens 14000 to the docker arguments, it gives this error:

error: unexpected argument '--max-input-tokens' found

When the container starts, it logs the arguments it is using (including defaults); one of these is the prohibitively low default max_input_length: 1024.

Information

  • Docker
  • The CLI directly

Tasks

  • An officially supported command
  • My own modifications

Reproduction

docker run --gpus all \
    --shm-size 1g \
    -e HUGGING_FACE_HUB_TOKEN=$token \
    -p $myport:80 \
    -v $volume:/data ghcr.io/huggingface/text-generation-inference:1.4 \
    --model-id $model \
    --max-total-tokens 18000 \
    --max-input-tokens 14000

Expected behavior

It should run without errors, since other launcher flags work fine when passed as docker arguments.

moruga123 (Author)

Workaround: use --max-input-length instead. The newer --max-input-tokens flag apparently did not yet exist in ghcr.io/huggingface/text-generation-inference:1.4.
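For reference, the reproduction command above with only the flag name swapped should start on the 1.4 image (a sketch, not verified here; $token, $myport, $volume, and $model are the same placeholders used in the report):

```shell
# Same invocation as in the reproduction, but with --max-input-length,
# the flag name the 1.4 launcher recognizes.
docker run --gpus all \
    --shm-size 1g \
    -e HUGGING_FACE_HUB_TOKEN=$token \
    -p $myport:80 \
    -v $volume:/data ghcr.io/huggingface/text-generation-inference:1.4 \
    --model-id $model \
    --max-total-tokens 18000 \
    --max-input-length 14000
```

On images where both names exist, the launcher's --help output is the quickest way to confirm which spelling a given version accepts.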
