New documentation

This commit is contained in:
Saifeddine ALOUI 2024-12-11 00:49:16 +01:00
parent 1c9f203757
commit 36ee4e69b3

ManualInstall.md (new file, 177 lines)

@@ -0,0 +1,177 @@
# LoLLMS WebUI Cross-Platform Installation Guide
## Introduction
LoLLMS (Lord of Large Language and Multimodal Systems) is a powerful tool for working with large language models. This guide will walk you through the installation process for the LoLLMS WebUI on Windows, macOS, and Linux.
## Prerequisites
Before you begin, ensure you have the following:
- Internet connection
- Administrator/sudo privileges on your system
## Installation Steps
### 1. Install Git
#### Windows
- Download and install Git from https://git-scm.com/download/win
#### macOS
- Install Homebrew if not already installed:
```
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
```
- Install Git:
```
brew install git
```
#### Linux
- Use your distribution's package manager. For example, on Ubuntu:
```
sudo apt-get update
sudo apt-get install git
```
### 2. Install Python
LoLLMS requires Python 3.11. You can use conda (or another environment manager) to install Python 3.11 in a dedicated environment; the exact version matters, so make sure whichever tool you use gives you 3.11.
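If you go the conda route, a minimal sketch looks like this (the environment name `lollms` is just an example):
```
conda create -n lollms python=3.11
conda activate lollms
```
If you use conda this way, you can keep this environment activated for the remaining steps instead of creating the venv in step 4 (adapt the activation lines of the launcher scripts in step 7 accordingly).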
#### Windows
- Download and install from https://www.python.org/downloads/windows/
#### macOS
- Install using Homebrew (then check that `python3 --version` reports 3.11; if it does not, install the versioned formula `python@3.11` and follow Homebrew's post-install notes to put its `python3.11` binary on your PATH):
```
brew install python
```
#### Linux
- Most distributions come with Python pre-installed, but not necessarily version 3.11. If you need to install it, use your package manager; on Ubuntu releases where the default `python3` is not 3.11, install a 3.11 interpreter separately (for example from the deadsnakes PPA) or use conda as described above. For Ubuntu:
```
sudo apt-get install python3 python3-pip
```
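Whichever platform you are on, it is worth confirming that the interpreter you plan to use really is 3.11 before continuing:
```
python3 --version
```
On Windows the command is typically `python --version`. If this does not report 3.11.x, revisit the installation above or use an environment manager as described earlier.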
### 3. Clone the LoLLMS WebUI Repository
Open a terminal (Command Prompt on Windows) and run:
```
git clone --depth 1 --recurse-submodules https://github.com/ParisNeo/lollms-webui.git
cd lollms-webui
git submodule update --init --recursive
```
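If you want to check that the submodules (including `lollms_core`) were fetched correctly, run:
```
git submodule status
```
Each submodule should be listed with a commit hash; a leading `-` indicates a submodule that has not been initialized yet.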
### 4. Create and Activate a Virtual Environment
#### Windows
```
python -m venv lollms_env
lollms_env\Scripts\activate
```
#### macOS and Linux
```
python3 -m venv lollms_env
source lollms_env/bin/activate
```
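On any platform, the activated environment should prefix your prompt with `(lollms_env)`, and `python` should now resolve to the environment's interpreter. A quick check:
```
python -c "import sys; print(sys.prefix)"
```
The printed path should point inside the `lollms_env` directory.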
### 5. Install Requirements
With the virtual environment activated, run:
```
pip install -r requirements.txt
pip install -e lollms_core
```
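As a quick sanity check that the editable install of `lollms_core` worked (assuming, as in current versions, that it provides a package importable as `lollms`):
```
python -c "import lollms; print(lollms.__file__)"
```
If this prints a path without errors, the core library is installed.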
### 6. Select and Install a Binding
Choose a binding based on your needs. Here are some options:
- Local bindings: ollama, python_llama_cpp, bs_exllamav2
- Remote bindings: groq, open_router, open_ai, mistral_ai, gemini, vllm, xAI, elf, remote_lollms
To install a binding, run:
```
python zoos/bindings_zoo/<binding_name>/__init__.py
```
Replace `<binding_name>` with your chosen binding.
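For example, to install the `ollama` binding listed above, run (from the `lollms-webui` directory, with the virtual environment activated):
```
python zoos/bindings_zoo/ollama/__init__.py
```
Remote bindings such as `open_ai` or `mistral_ai` are installed the same way; note that they require an API key from the corresponding provider.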
### 7. Create Launcher Scripts
#### Windows
Create `lollms.bat` inside the `lollms-webui` directory (the same directory that contains `lollms_env` and `app.py`):
```batch
@echo off
call lollms_env\Scripts\activate
python app.py %*
pause
```
#### macOS and Linux
Create `lollms.sh` inside the `lollms-webui` directory (alongside `lollms_env` and `app.py`):
```bash
#!/bin/bash
source lollms_env/bin/activate
python app.py "$@"
```
Make it executable:
```
chmod +x lollms.sh
```
## Optional Steps
### Install CUDA (for NVIDIA GPUs)
If you have an NVIDIA GPU and want to use it for local AI:
#### Windows
- Download and install CUDA from https://developer.nvidia.com/cuda-downloads
#### macOS
- CUDA is not supported on macOS: Apple no longer supports NVIDIA GPUs, and Apple Silicon Macs have no CUDA support, so skip this step on macOS.
#### Linux
- Follow the CUDA installation guide for your specific Linux distribution from https://developer.nvidia.com/cuda-downloads
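On Windows and Linux, you can verify that the driver and toolkit are visible after installation (these utilities ship with the NVIDIA driver and the CUDA toolkit respectively):
```
nvidia-smi
nvcc --version
```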
### Install Visual Studio Code
For local AI development, you may want to install Visual Studio Code:
- Download and install from: https://code.visualstudio.com/download
## Running LoLLMS WebUI
To start the LoLLMS WebUI:
#### Windows
Run the `lollms.bat` file (double-click it, or launch it from a Command Prompt opened in the `lollms-webui` directory).
#### macOS and Linux
From the `lollms-webui` directory, run the `lollms.sh` script:
```
./lollms.sh
```
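Once the server starts, the console output shows the local URL to open in your browser (with default settings this is usually http://localhost:9600; check the console output if you have changed the host or port).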
## Troubleshooting
If you encounter issues during installation or while running the WebUI, check the following:
1. Ensure all prerequisites are correctly installed.
2. Verify that your Python environment is activated before running any commands (a quick check is sketched after this list).
3. Check that all required packages are installed correctly.
4. Make sure you have selected and installed a compatible binding.
5. For platform-specific issues, consult the documentation for your operating system.
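A few quick sanity checks, run from the `lollms-webui` directory with the environment activated, catch the most common problems:
```
python --version   # should report Python 3.11.x
pip list           # the packages from requirements.txt should appear here
```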
For further assistance, please refer to the official LoLLMS documentation or seek help in the project's support channels.
## Conclusion
This guide covered installing LoLLMS WebUI on Windows, macOS, and Linux: installing the prerequisites, setting up a Python environment, cloning the repository, installing dependencies, selecting a binding, and creating launcher scripts, along with optional CUDA and Visual Studio Code setup and basic troubleshooting tips.