Mirror of https://github.com/ParisNeo/lollms-webui.git (synced 2024-12-20 21:03:07 +00:00)
Merge pull request #86 from andzejsp/main
Added notes and links to README.md

Commit: 15a8e6d95b
Changed file: README.md
@@ -79,12 +79,13 @@ Now you're ready to work!
 # Supported models
 You can also refuse to download the model during the install procedure and download it manually.
 For now we support any ggml model such as :
 - [GPT4ALL 7B](https://huggingface.co/ParisNeo/GPT4All/resolve/main/gpt4all-lora-quantized-ggml.bin)
-- [Vicuna 7B](https://huggingface.co/eachadea/legacy-ggml-vicuna-7b-4bit/resolve/main/ggml-vicuna-7b-4bit.bin)
+- [Vicuna 7B](https://huggingface.co/eachadea/legacy-ggml-vicuna-7b-4bit/resolve/main/ggml-vicuna-7b-4bit.bin) NOTE: Does not work out of the box
 - [Vicuna 7B rev 1](https://huggingface.co/eachadea/legacy-ggml-vicuna-7b-4bit/resolve/main/ggml-vicuna-7b-4bit-rev1.bin)
-- [Vicuna 13B q4 v0](https://huggingface.co/eachadea/ggml-vicuna-13b-1.1/resolve/main/ggml-vicuna-13b-1.1-q4_0.bin)
+- [Vicuna 13B q4 v0](https://huggingface.co/eachadea/ggml-vicuna-13b-1.1/resolve/main/ggml-vicuna-13b-1.1-q4_0.bin) NOTE: Does not work out of the box
-- [Vicuna 13B q4 v1](https://huggingface.co/eachadea/ggml-vicuna-13b-1.1/resolve/main/ggml-vicuna-13b-1.1-q4_1.bin)
+- [Vicuna 13B q4 v1](https://huggingface.co/eachadea/ggml-vicuna-13b-1.1/resolve/main/ggml-vicuna-13b-1.1-q4_1.bin) NOTE: Does not work out of the box
-- [ALPACA 7B](https://huggingface.co/Sosaka/Alpaca-native-4bit-ggml/blob/main/ggml-alpaca-7b-q4.bin) NOTE: Needs conversion
+- [Vicuna 13B rev 1](https://huggingface.co/eachadea/ggml-vicuna-13b-4bit/resolve/main/ggml-vicuna-13b-4bit-rev1.bin)
+- [ALPACA 7B](https://huggingface.co/Sosaka/Alpaca-native-4bit-ggml/blob/main/ggml-alpaca-7b-q4.bin) NOTE: Does not work out of the box - Needs conversion
 
 Just download the model into the models folder and start using the tool.
 ## Usage
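The README text above tells users they can skip the installer's download step and drop a .bin file into the models folder themselves. Below is a minimal Python sketch of that manual step; the destination folder name and the chosen URL (the GPT4ALL 7B link from the list) are illustrative, and the exact layout may differ in your install.

```python
# Minimal sketch of the manual download step described in the README:
# fetch one of the listed ggml .bin files into the webui's models folder
# instead of letting the installer do it.
from pathlib import Path
from urllib.request import urlretrieve

MODEL_URL = (
    "https://huggingface.co/ParisNeo/GPT4All/resolve/main/"
    "gpt4all-lora-quantized-ggml.bin"
)
MODELS_DIR = Path("models")  # assumption: the models folder relative to the webui root


def download_model(url: str = MODEL_URL, dest_dir: Path = MODELS_DIR) -> Path:
    """Download a ggml model file into the models folder if it is not already there."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    target = dest_dir / url.rsplit("/", 1)[-1]
    if not target.exists():
        urlretrieve(url, target)  # streams the file to disk
    return target


if __name__ == "__main__":
    print(f"Model saved to {download_model()}")
```

Any of the other listed URLs can be passed instead, subject to the NOTE caveats added in this commit.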
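The "Does not work out of the box" and "Needs conversion" notes generally come down to the ggml container revision the backend of the time could load directly. The sketch below only inspects a downloaded file's magic number so you can tell which revision you have; it does not perform any conversion, and the magic constants are an assumption based on the ggml formats in circulation in spring 2023, not values taken from this repository.

```python
# Hedged sketch: identify the ggml container format of a downloaded .bin file
# by reading its 4-byte magic number. Constants are assumptions, not repo code.
import struct
from pathlib import Path

GGML_MAGICS = {
    0x67676D6C: "ggml (unversioned, oldest format)",
    0x67676D66: "ggmf (versioned format)",
    0x67676A74: "ggjt (mmap-able format)",
}


def ggml_format(path: Path) -> str:
    """Return a human-readable name for the file's ggml magic number."""
    with path.open("rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))  # little-endian uint32 at offset 0
    return GGML_MAGICS.get(magic, f"unknown magic 0x{magic:08x}")


# Example path only; point this at whichever model file you downloaded.
print(ggml_format(Path("models/ggml-alpaca-7b-q4.bin")))
```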
@@ -27,10 +27,10 @@ personality_conditionning: |
 welcome_message: "Welcome! I am GPT4All A free and open discussion AI. What can I do for you today?"
 
 # This prefix is added at the beginning of any message input by the user
-message_prefix: "\nuser:"
+message_prefix: "\nuser: "
 
 # This suffix is added at the end of any message input by the user
-message_suffix: "\ngpt4all:"
+message_suffix: "\ngpt4all: "
 
 # Here is the list of extensions this personality requires
 dependencies: []
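This second hunk only adds a trailing space to message_prefix and message_suffix in the example personality file. As an illustration (not the project's actual code) of why that space matters: the prefix and suffix are concatenated around the raw user input when a prompt turn is assembled, so "\nuser: " yields "user: hello" rather than "user:hello".

```python
# Illustrative sketch, assuming the prefix/suffix are simply concatenated
# around the user's text when the prompt is built.
MESSAGE_PREFIX = "\nuser: "      # values from the updated personality file
MESSAGE_SUFFIX = "\ngpt4all: "


def build_prompt(conditioning: str, user_text: str) -> str:
    """Assemble a single prompt turn from the personality fields (hypothetical helper)."""
    return f"{conditioning}{MESSAGE_PREFIX}{user_text}{MESSAGE_SUFFIX}"


print(build_prompt("Below is a conversation.", "hello"))
# Output:
# Below is a conversation.
# user: hello
# gpt4all: 
```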