Update lmstudio_tutorial.md

blasphemousjohn 2024-03-05 10:13:19 -08:00 · commit 11bfd0b596
3. Download [LoLLMs Web UI latest release](https://github.com/ParisNeo/lollms-webui/releases) for your OS. Move to a folder of your choosing and run the install file - follow prompts as needed.
<figure>
<img src="assets/releases.png" width="400" height="475" alt="Latest Releases">
</figure>
4. Run LoLLMs. Choose the Settings tab in the LoLLMs Web UI.
<figure>
<img src="assets/settings_tab.png" width="400" height="350" alt="Settings Tab">
</figure>
5. Choose the Bindings Zoo subsection, scroll down to Elf and select the Install button.
<figure>
<img src="assets/binding_zoo.png" width="600" height="400" alt="Bindings Zoo">
</figure>
6. After the install completes, restart LoLLMs by closing the terminal application (Windows) and relaunching it.
7. After the application launches, go back to the Settings tab and choose the Models Zoo subsection. Simply select the elf_remote_model option to activate it.
9. Select the model you downloaded, apply any settings necessary (context window, preset, etc.) in the Server Model Settings window, and check the Cross-Origin Resource Sharing (CORS) box so it's enabled.
<figure>
<img src="assets/lm_studio_server.png" width="600" height="300" alt="LM Studio Server">
</figure>
10. When done applying server settings, press the green Start Server button.
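With the server started, you can sanity-check that it is reachable before wiring up LoLLMs. The Python sketch below queries the OpenAI-compatible models endpoint LM Studio's server exposes; it assumes LM Studio's default address of http://localhost:1234 (yours may differ):

```python
import json
import urllib.error
import urllib.request

def list_models(base_url, timeout=5.0):
    """Query the OpenAI-compatible /v1/models endpoint.

    Returns the parsed JSON on success, or None if the server is
    unreachable (not started, wrong port, etc.).
    """
    try:
        with urllib.request.urlopen(base_url + "/v1/models", timeout=timeout) as resp:
            return json.loads(resp.read())
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    # http://localhost:1234 is LM Studio's default server address (assumption)
    models = list_models("http://localhost:1234")
    print("server reachable:", models is not None)
```

If this prints `server reachable: False`, check that the green Start Server button was pressed and that the port matches the one shown in LM Studio.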
11. In the LoLLMs WebUI, go to the Settings tab and choose the Bindings Zoo subsection. Scroll down to the Elf binding option (should be the active binding) and choose the Settings button.
- Copy the address shown here in LM Studio and paste it into the address text box (NOTE: this address will not work as-is, but entering it is a necessary step):
<figure>
<img src="assets/server_ip_init.png" width="600" height="400" alt="False IP Address">
</figure>
- Set completion format to openai chat.
- In the model text box, type local-model.
- Set the ctx_size text box to the maximum context size supported for your model. This can be found in the LM Studio Server Model Settings window:
<figure>
<img src="assets/ctx_size.png" width="200" height="425" alt="Context Size">
</figure>
- In the server_key text box, type not-needed.
- Leave the rest of the settings as-is. Your settings window should look similar to this (your address may be different):
<figure>
<img src="assets/elf_settings.png" width="200" height="400" alt="Elf Binding Settings">
</figure>
- Press Save Settings button.
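Taken together, the settings above map onto an OpenAI-style chat request against LM Studio's server. The sketch below is illustrative only — `build_chat_request` and `build_headers` are hypothetical helper names, and the exact payload LoLLMs sends may differ:

```python
def build_chat_request(prompt, model="local-model"):
    # "local-model" is the value typed into the model text box; LM Studio
    # serves whatever model is currently loaded regardless of this name.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def build_headers(server_key="not-needed"):
    # The OpenAI chat format expects a bearer token, but LM Studio ignores
    # it -- hence the "not-needed" placeholder in the server_key box.
    return {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + server_key,
    }

print(build_chat_request("test")["model"])   # -> local-model
print(build_headers()["Authorization"])      # -> Bearer not-needed
```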
12. Open a new discussion in LoLLMs and in the message box type test or something similar.
13. In the bottom right corner you will see a Could not connect to server error message. Ignore this message and open LM Studio.
14. In LM Studio, look at the server logs. Ensure that there is a red message that says something like [ERROR] Unexpected endpoint or method. This message is actually good: it confirms that LoLLMs can reach your LM Studio server.
15. Go back to the LoLLMs WebUI and choose the Settings tab. Go to the Bindings Zoo subsection and choose the Settings button for the Elf binding we enabled. Now, remove the path from the end of the address URL, keeping everything up to and including the four-digit port number, so it looks like this:
<figure>
<img src="assets/server_ip.png" width="400" height="120" alt="Trimmed Server Address">
</figure>
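The trimming in this step can be sketched in Python. This assumes the copied address has LM Studio's usual scheme://host:port/path shape (port 1234 is LM Studio's default; your host and port may differ):

```python
from urllib.parse import urlparse

def trim_to_base(address):
    """Drop the path from the address copied out of LM Studio, keeping
    only the scheme, host, and port -- the form the Elf binding needs."""
    parts = urlparse(address)
    return parts.scheme + "://" + parts.netloc

print(trim_to_base("http://localhost:1234/v1/chat/completions"))  # -> http://localhost:1234
```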
16. Save the changes you made to the binding. LoLLMs should automatically reload and return you to the discussion you made before. In the message box, type test and press Enter again. LM Studio should now be connected to LoLLMs, and the model should start generating text.