
LAISer

Local AI-Assisted Web Search

An experimental local search engine assistant frontend and CLI for ollama and llama.cpp with a focus on being extremely lightweight and easy to run. The goal is to provide something along the lines of a minimalist Perplexity. The web interface is responsive and works well with mobile devices.

Examples: Screenshot 01, Screenshot 02, Screenshot 03

It is written in Python 3 with Flask, which provides the web-based UI; alternatively, you can run it from the command line.

If you can run ollama or llama.cpp, then you can use this application.

You ask it a question or query just like you would any other search engine.
It provides citations for all articles it references.
It does not chat; it only returns an answer.

The functionality is rudimentary and naive; it serves as a basic proof of concept showing that feeding search results into a model can enhance the quality of the results without doing anything fancy.
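
To illustrate the technique (a minimal sketch, not LAISer's actual code): fetch search results, number them, pack them into the prompt, and ask a local model through ollama's /api/generate endpoint. The web_search() function is a hypothetical placeholder here, and the prompt wording is an assumption; only the ollama endpoint and request shape are standard.

import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # ollama's default endpoint

def web_search(query: str) -> list[dict]:
    # Placeholder: return [{"title": ..., "url": ..., "snippet": ...}, ...]
    # from whatever search backend you prefer.
    raise NotImplementedError

def answer(query: str, model: str = "stablelm-zephyr") -> str:
    results = web_search(query)
    # Number each source so the model can cite it as [1], [2], ...
    context = "\n\n".join(
        f"[{i}] {r['title']} ({r['url']})\n{r['snippet']}"
        for i, r in enumerate(results, start=1)
    )
    prompt = (
        "Answer the question using only the sources below, "
        "citing them by number.\n\n"
        f"Sources:\n{context}\n\nQuestion: {query}\nAnswer:"
    )
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]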

It will likely work with any halfway decent model; you don't need anything fancy here. These are the models I've tried, and they can be run on CPU (though they may be a bit slow):

For llama.cpp:

For ollama:

  • stablelm-zephyr: ollama pull stablelm-zephyr
  • solobsd/rocket-3b: ollama pull solobsd/rocket-3b (untested)
  • undi95/xwin-mlewd: ollama pull undi95/xwin-mlewd (untested)
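
If you want to confirm that ollama can actually serve one of these models before going further, a quick check (assuming ollama's default port, 11434) might look like:

import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "stablelm-zephyr", "prompt": "Say hello.", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the model's completion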

Known issues & limitations:

  • It currently only supports one web request at a time.
  • Doesn't allow refinement; it's "one and done".
  • Doesn't work with Firefox mobile or desktop. Use literally any other browser instead.
  • There are probably bugs.

Install:

git clone https://github.com/av1d/LAISer.git
cd LAISer
pip install -r requirements.txt

Run:

  • Make sure you have ollama or llama.cpp installed and running.
  • Edit your settings in the file settings.py
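
The exact contents of settings.py depend on the version you cloned, so check the file itself; purely as a hypothetical sketch of the kind of values you would expect to set there:

# Hypothetical settings.py sketch -- these names are illustrative, not the real keys
LLM_BACKEND = "ollama"              # "ollama" or "llama.cpp"
LLM_URL = "http://localhost:11434"  # wherever your backend is listening
MODEL_NAME = "stablelm-zephyr"      # one of the models listed above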

Now you can use it in one of two ways:
python3 search.py -s (or --server) for server mode, or
python3 search.py -q "What is the cost of a new TV?" (or --query) for command-line mode.

Found it useful? Tips appreciated but not required.
