diff --git a/docs/tutorials/lmstudio_tutorial.md b/docs/tutorials/lmstudio_tutorial.md
index 353b83c6..98f41d32 100644
--- a/docs/tutorials/lmstudio_tutorial.md
+++ b/docs/tutorials/lmstudio_tutorial.md
@@ -6,12 +6,27 @@
3. Download the [LoLLMs Web UI latest release](https://github.com/ParisNeo/lollms-webui/releases) for your OS. Move it to a folder of your choosing and run the install file, following the prompts as needed.
-![Latest Releases](/assets/releases.png)
+
4. Run LoLLMs. Choose the ‘Settings’ tab in the LoLLMs Web UI.
+
+
+
5. Choose the ‘Bindings Zoo’ subsection, scroll down to ‘Elf’, and select the ‘Install’ button.
+
+
+
6. After install, restart LoLLMs by closing the terminal application (Windows) and relaunching LoLLMs.
7. After the application launches, go back to the ‘Settings’ tab and choose the ‘Models Zoo’ subsection. Simply select the ‘elf_remote_model’ option to activate it.
@@ -20,23 +35,49 @@
9. Select the model you downloaded, apply any settings necessary (context window, preset, etc.) in the Server Model Settings window, and check the Cross-Origin Resource Sharing (CORS) box so it’s enabled.
-10. When done applying settings, press the green ‘Start Server’ button.
+
+
+
+10. When done applying server settings, press the green ‘Start Server’ button.
11. In the LoLLMs WebUI, go to the ‘Settings’ tab and choose the ‘Bindings Zoo’ subsection. Scroll down to the Elf binding option (it should be the active binding) and choose the ‘Settings’ button.
- In the ‘address’ text box, paste the address shown in LM Studio (NOTE: this address will not work as-is, but copying it is a necessary step to get this working):
+
+
- Set ‘completion format’ to ‘openai chat’.
- In the ‘model’ text box, type ‘local-model’.
- Set the ‘ctx_size’ text box to the maximum context size supported for your model. This can be found in the LM Studio Server Model Settings window:
+
+
- In the ‘server_key’ text box, type ‘not-needed’.
- - Leave the rest of the settings as-is. Your settings window should look similar to this (your address may be different). Press ‘Save Settings’ button.
+ - Leave the rest of the settings as-is. Your settings window should look similar to this (your address may be different):
+
+
+ - Press the ‘Save Settings’ button.
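
The binding settings above amount to pointing LoLLMs at LM Studio's OpenAI-compatible chat endpoint. As a rough sketch (the address below assumes LM Studio's default port, which may differ on your machine; `local-model` and `not-needed` are the placeholder values entered above), the request LoLLMs ends up building from these settings looks roughly like this:

```python
# Hypothetical sketch of the request the Elf binding assembles from step 11's settings.
# The base address assumes LM Studio's default port; yours may differ.
base_address = "http://localhost:1234"

endpoint = base_address + "/v1/chat/completions"   # the 'openai chat' completion format
headers = {"Authorization": "Bearer not-needed"}   # the 'server_key' value
payload = {
    "model": "local-model",                        # the 'model' text box value
    "messages": [{"role": "user", "content": "test"}],
}

print(endpoint)
```

This is only an illustration of how the settings fit together, not code you need to run; LoLLMs sends the equivalent request for you once the binding is configured.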
12. Open a new discussion in LoLLMs and, in the message box, type ‘test’ or something similar:
13. In the bottom right corner you will see a ‘Could not connect to server’ error message. Ignore this message and open LM Studio.
-14. In LM Studio look at the server logs. Ensure that there is a red message that says something like ‘ERROR] Unexpected endpoint or method.’. This message is actually good as it ensures that LoLLMs is seeing your LM Studio server.
+14. In LM Studio, look at the server logs. Ensure that there is a red message that says something like ‘[ERROR] Unexpected endpoint or method.’. This message is actually a good sign: it confirms that LoLLMs is reaching your LM Studio server.
15. Go back to the LoLLMs WebUI and choose the ‘Settings’ tab. Go to the ‘Bindings Zoo’ subsection and choose the ‘Settings’ button for the ‘Elf’ binding we enabled. Now, in the ‘address’ field, remove everything after the port number (the four digits), so it should look like this:
+
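
The trimming in step 15 can be sketched in a few lines of Python (the address below is a hypothetical example using LM Studio's default port; yours may differ):

```python
from urllib.parse import urlsplit

# Example address as copied from LM Studio in step 11 (hypothetical; yours may differ)
copied = "http://localhost:1234/v1/chat/completions"

# Keep only the scheme, host, and port; drop the endpoint path
parts = urlsplit(copied)
trimmed = f"{parts.scheme}://{parts.netloc}"
print(trimmed)  # http://localhost:1234
```

The ‘address’ field should end up holding only that trimmed base URL.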
16. Save the changes you made to the binding. LoLLMs should automatically reload and return you to the discussion you created earlier. In the message box, type ‘test’ and press Enter again. LM Studio should now be connected to LoLLMs, and the model should start generating text.