From 26d97219bb16ef8e1e4195a1193858f5483686f0 Mon Sep 17 00:00:00 2001
From: Saifeddine ALOUI
Date: Sun, 27 Aug 2023 19:21:44 +0200
Subject: [PATCH] updated readme

---
 README.md | 13 ++++++++++++-
 1 file changed, 12 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 2c2fbdce..5b4edc3c 100644
--- a/README.md
+++ b/README.md
@@ -126,11 +126,22 @@ cd lollms-webui
 ```
 Now create a new conda environment, activate it and install requirements
+With cuda support (GPU mode):
 ```bash
-conda create --prefix ./env python=3.10
+conda create --prefix ./env python=3.10 cuda-toolkit ninja git
 conda activate ./env
 pip install -r requirements.txt
 ```
+
+Without cuda support (CPU mode):
+```bash
+conda create --prefix ./env python=3.10 ninja git
+conda activate ./env
+pip install -r requirements.txt
+```
+You should create an empty file called `.no_gpu` in the folder in order to prevent lollms from trying to use GPU.
+
+
 
 #### Using Docker
 
 Alternatively, you can use Docker to set up the LoLLMS WebUI. Please refer to the Docker documentation for installation instructions specific to your operating system.
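
Note on the `.no_gpu` switch added by this patch: per the README text above, it is simply an empty marker file whose presence tells lollms not to attempt GPU use. A minimal sketch of creating it (assuming you run this from the root of the cloned lollms-webui folder; `touch` creates an empty file if none exists):

```shell
# Create the empty .no_gpu marker so lollms skips GPU initialization.
# Run from the root of the cloned lollms-webui folder.
touch .no_gpu

# Sanity check: the marker should exist and be empty (contents are ignored).
ls -l .no_gpu
```

To re-enable GPU mode later, deleting the marker (`rm .no_gpu`) should suffice, since only the file's presence is checked according to the patch.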