Absolute paths not being respected in model configs #2200

Open · Prowler1000 opened this issue Apr 30, 2024 · 2 comments
Labels: bug (Something isn't working), unconfirmed

@Prowler1000

LocalAI version:
v2.13.0

Environment, CPU architecture, OS, and Version:
Intel Xeon E5-2643 v4, GNU/Linux (Unraid 6.12.10)

Describe the bug
Absolute paths in model config YAML files are not respected; they are instead treated as relative to the models directory.

To Reproduce
Create a YAML config that references an existing model or template file via an absolute path. Instead of using the absolute path, LocalAI attempts to load it relative to the models directory.

Expected behavior
An absolute path should be used as-is, not reinterpreted as relative to the models directory.
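
For illustration, here is a minimal Go sketch of the observed behavior versus the expected one. The function names are hypothetical, and the "naive" variant is only a guess at logic that would produce this symptom, not LocalAI's actual code:

package main

import (
	"fmt"
	"path/filepath"
)

// resolveNaive mirrors the suspected buggy behavior: every configured
// path is joined onto the models directory, even absolute ones.
func resolveNaive(modelsPath, configured string) string {
	return filepath.Join(modelsPath, configured)
}

// resolveExpected only joins relative paths onto the models directory
// and leaves absolute paths untouched.
func resolveExpected(modelsPath, configured string) string {
	if filepath.IsAbs(configured) {
		return configured
	}
	return filepath.Join(modelsPath, configured)
}

func main() {
	// Observed: the absolute template path gets prefixed with the models path.
	fmt.Println(resolveNaive("/models/configs", "/models/templates/Llama3-HOAS/chat"))
	// -> /models/configs/models/templates/Llama3-HOAS/chat

	// Expected: the absolute path is respected as-is.
	fmt.Println(resolveExpected("/models/configs", "/models/templates/Llama3-HOAS/chat"))
	// -> /models/templates/Llama3-HOAS/chat
}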

Additional context
Prior to version 2.13.0, I had model files, their configs, and their templates stored in separate directories for better organization (I had a lot of models, including various quants for different tasks). I discovered this while diagnosing a separate template issue caused by an undocumented change.

Prowler1000 added the bug (Something isn't working) and unconfirmed labels on Apr 30, 2024
@localai-bot (Contributor)

Hello Prowler1000, thank you for bringing this issue to our attention in the LocalAI project. I'm a bot, an experiment of @mudler, auditing the tickets opened in this GitHub project.

I understand that you are experiencing a problem where absolute paths in model configuration YAML files are not respected, which causes incorrect load paths when models or templates are referenced by absolute system paths. This issue seems to have started in version 2.13.0.

In order to better assist you, please provide the following information:

  1. Can you provide a sample yaml config that demonstrates the issue? It would be helpful if we could reproduce the problem.
  2. Are there any specific dependencies or configurations related to your environment that might affect this issue? Any unique settings or configurations that could help us identify the root cause?

Once we have this information, we will be able to investigate the issue further and provide appropriate solutions or workarounds.

Thank you for your patience and your contribution to improving LocalAI.

Regards,
The GitHub Bot Auditor (Experiment of @mudler)

@Prowler1000 (Author)

Here's a sample config for a model I use with Home Assistant:

name: Llama3-HOAS
mmap: false
parameters:
  model: /models/models/Llama3/Meta-Llama-3-8B-Instruct.Q8_0.gguf

debug: true
prompt_cache_path: "../cache/Llama3-8b.cache"
prompt_cache_all: true
prompt_cache_ro: false
backend: llama-cpp

threads: 16
embeddings: true

mlock: true

template:
  chat_message: /models/templates/Llama3-HOAS/chat_message
  chat: /models/templates/Llama3-HOAS/chat
  function: /models/templates/Llama3-HOAS/function
  completion: /models/templates/Llama3-HOAS/completion
context_size: 8192
stopwords:
- <|eot_id|>

My file structure is as follows:

/models
    /cache
    /configs
    /models
    /templates

LOCALAI_MODELS_PATH is actually set to /models/configs.
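
If the loader is joining configured paths onto the models path, that layout would explain the symptom. For example, the chat template would resolve as

/models/configs + /models/templates/Llama3-HOAS/chat
    -> /models/configs/models/templates/Llama3-HOAS/chat

which does not exist, rather than being loaded from /models/templates/Llama3-HOAS/chat directly.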

Prior to v2.13.0 I had everything set up to use relative paths (prefixed with ../). However, an issue with the templates being outside the models directory popped up, and while fixing that I decided it would be better to switch everything to absolute paths.
