
feat(build): expose pip_preheat_packages #4723

Open · wants to merge 1 commit into base: main
Conversation

aarnphm
Member

@aarnphm aarnphm commented May 9, 2024

With the addition of #4690, this adds a `docker.pip_preheat_packages` field under `bentofile.yaml`, allowing users to specify a list of dependencies to be preheated within the cache layers to improve build time:

docker:
  pip_preheat_packages:
    - vllm==0.4.2
    - lmdeploy

This would be useful for openllm, since openllm locks to a specific vllm version.
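For illustration, here is a hypothetical sketch of the cache layers such a field could generate, reusing the `pip install {} || true` preheat pattern quoted later in this thread (layer ordering and exact commands are assumptions, not the actual generated Dockerfile):

```dockerfile
# Hypothetical sketch, not the actual generated Dockerfile.
# Each entry from docker.pip_preheat_packages gets its own cached layer
# before the full dependency install; `|| true` keeps a failed preheat
# from failing the build, since it is only a cache warmer.
RUN pip install vllm==0.4.2 || true
RUN pip install lmdeploy || true

# The real dependency install then hits warm caches.
COPY requirements.txt .
RUN pip install -r requirements.txt
```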


Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>
@aarnphm aarnphm requested a review from a team as a code owner May 9, 2024 22:30
@aarnphm aarnphm requested review from bojiang and frostming and removed request for a team May 9, 2024 22:30
@aarnphm
Member Author

aarnphm commented May 14, 2024

@bojiang any comments on this?

@eledhwen

eledhwen commented May 15, 2024

This would be a great addition for us. Another issue we have with pre-heating is that it does not play nicely with pip markers, e.g.:

torch==2.3.0 ; platform_machine!='x86_64'
torch==2.3.0 --index-url https://download.pytorch.org/whl/cpu ; platform_machine=='x86_64'

...or with a specific wheel for a given target platform; that syntax throws off the direct `pip install {} || true`.
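As a sketch of the problem (the function name and checks are hypothetical, not from any build tool), a conservative test for whether a requirements line is safe to interpolate verbatim into a plain `pip install {}` shell command might look like:

```python
def is_plain_requirement(line: str) -> bool:
    """Return True if a requirements line can be interpolated verbatim
    into a `pip install {}` shell command.

    Lines carrying an environment marker (`; platform_machine==...`) or
    per-line options (`--index-url ...`) use requirements-file syntax,
    so templating them unquoted into a shell command breaks.
    """
    stripped = line.strip()
    if not stripped or stripped.startswith("#"):
        return False  # blank line or comment: nothing to install
    if ";" in stripped:
        return False  # environment marker; unquoted ';' also splits the shell command
    if " --" in stripped:
        return False  # per-line option such as --index-url or --hash
    return True
```

With the examples from this comment, both `torch` lines would be rejected, while simple pins like `vllm==0.4.2` pass.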

Would it be possible to generate an intermediary requirements file with exactly the same definitions found in the original, and `pip install -r` that instead?

Alternatively, having the option to deactivate pre-heating entirely would alleviate the issue somewhat.
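The intermediary-file suggestion could be sketched as follows (function names are hypothetical; `check=False` mirrors the `|| true` of the existing per-package preheat commands):

```python
import subprocess
import tempfile

def write_preheat_requirements(lines):
    """Write preheat entries verbatim to an intermediary requirements
    file, preserving markers and per-line options such as --index-url,
    and return the file's path."""
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        f.write("\n".join(line.strip() for line in lines) + "\n")
        return f.name

def preheat(lines):
    """Install from the intermediary file instead of templating each
    line into `pip install {}`; failures are ignored (cache warming only)."""
    path = write_preheat_requirements(lines)
    subprocess.run(["pip", "install", "-r", path], check=False)
```

Because the lines are copied verbatim into a requirements file, marker and per-line-option syntax keeps its usual `pip install -r` semantics.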

@bojiang
Member

bojiang commented May 16, 2024

@eledhwen The issue should be fixed by #4737. Thanks for the contribution.

@bojiang
Member

bojiang commented May 16, 2024

Which packages would you want to cover if we supported a custom `pip_preheat_packages`?

@aarnphm
Member Author

aarnphm commented May 16, 2024

@bojiang this is different, right? #4737 doesn't support custom packages; it only detects torch or vllm from our predefined list, right?

I guess we do parse the Python packages for vllm and torch, but it will be pretty hard for us to manage additional packages in the future. Should we let users have control over this?
