Hello, I have some questions about KServe model storage.
If I have a model downloaded from Hugging Face and a Python file that loads it, how do I use the InferenceService CRD?
In a YAML like the following, where is the Python file that loads the model specified?
/kind feature
Describe the solution you'd like
I would like to use the storage initializer to pull Hugging Face models, which fits the KServe experience. The StorageUri would be in this format:
hf://<repo>/<model>:<hash(optional)>
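Under that scheme, the storage initializer would first have to split the URI into repository, model, and optional revision hash. A minimal parsing sketch (the `HfModelRef` type and `parse_hf_uri` function are hypothetical names for illustration, not KServe's actual code):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HfModelRef:
    repo: str
    model: str
    revision: Optional[str]  # None when no :<hash> suffix is given

def parse_hf_uri(uri: str) -> HfModelRef:
    """Parse hf://<repo>/<model>[:<hash>] into its components."""
    prefix = "hf://"
    if not uri.startswith(prefix):
        raise ValueError(f"not an hf:// URI: {uri}")
    path = uri[len(prefix):]
    # Split the optional revision hash off first, then repo from model.
    path, _, revision = path.partition(":")
    repo, _, model = path.partition("/")
    if not repo or not model:
        raise ValueError(f"expected hf://<repo>/<model>[:<hash>], got: {uri}")
    return HfModelRef(repo=repo, model=model, revision=revision or None)
```

For example, `parse_hf_uri("hf://myorg/mymodel:abc123")` would yield the repo, model, and pinned revision; omitting the hash leaves the revision unset so the initializer could default to the latest snapshot.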
Anything else you would like to add:
Another option is to support any Git LFS repository.
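The Git LFS route would let the initializer reuse plain git tooling instead of a Hugging Face-specific client. A sketch of the fetch plan such an initializer might run (the `lfs_fetch_plan` function and its layout are hypothetical; the underlying two-step pattern, a clone with `GIT_LFS_SKIP_SMUDGE=1` followed by `git lfs pull`, is the standard way to defer downloading large LFS blobs):

```python
from typing import List

def lfs_fetch_plan(repo_url: str, dest: str, revision: str = "main") -> List[List[str]]:
    """Commands a storage initializer could run to pull model weights from
    any Git LFS repository (illustrative sketch, not KServe's implementation).

    The clone would run with GIT_LFS_SKIP_SMUDGE=1 in its environment so only
    small pointer files are checked out; `git lfs pull` then fetches the blobs.
    """
    return [
        ["git", "clone", "--depth", "1", "--branch", revision, repo_url, dest],
        ["git", "-C", dest, "lfs", "pull"],
    ]
```

This keeps the download logic storage-agnostic: any host that serves models over git with LFS (including huggingface.co itself) would work with the same initializer.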