This commit is contained in:
Saifeddine ALOUI 2024-12-09 00:29:36 +01:00
parent 59377b0b0b
commit f73bd49cf9
5 changed files with 32 additions and 14 deletions


@@ -59,19 +59,25 @@ As an all-encompassing tool with access to over 500 AI expert conditionings across
Thanks to all users who tested this tool and helped make it more user-friendly.
## Installation
### Automatic installation (UI)
If you are using Windows, just visit the release page and download `lollms_installer.bat`.
### Automatic installation (Console)
Download the installation script from the `scripts` folder and run it.
The installation scripts are:
- `win_install.bat` for Windows.
- `linux_install.sh` for Linux.
- `mac_install.sh` for Mac.
- `lollms_installer.bat` for Windows.
- `lollms_installer.sh` for Linux.
- `lollms_installer_macos.sh` for Mac.
### Manual install:
Since v9.4, manual installation was not advised: many services require the creation of separate environments, and lollms needs complete control over those environments. If you installed it with your own conda setup, you could not install any services, which reduced lollms to the chat interface (no xtts, no comfyui, no fast generation through vllm, petals, or the like).
Since v10.14, manual installation is back:
1. Make sure Python 3.11 is installed, or have a Python 3.11 conda (or similar) environment available.
2. Clone the repo at: `https://github.com/ParisNeo/lollms-webui.git`
3. Preferably, create an environment for lollms and activate it.
4. In the repo folder, make sure you pull all submodules: `git submodule update --init --recursive`
5. Install lollms: go to `lollms_core`, then run `pip install -e .`
6. Go back to the root of the `lollms_webui` folder.
7. Install all requirements: `pip install -r requirements.txt`
8. You are now ready to run lollms: `python app.py`
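Taken together, the manual steps above can be sketched as a single shell session. This is a sketch, not part of the official scripts: the environment name `lollms` and the use of conda are illustrative, and any Python 3.11 environment works.

```shell
# Create and activate a Python 3.11 environment (conda shown; venv also works)
conda create -n lollms python=3.11 -y
conda activate lollms

# Clone the repo and pull all submodules
git clone https://github.com/ParisNeo/lollms-webui.git
cd lollms-webui
git submodule update --init --recursive

# Install the core package, then the webui requirements
cd lollms_core
pip install -e .
cd ..
pip install -r requirements.txt

# Run lollms
python app.py
```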
## Smart Routing: Optimizing for Money and Speed
Lollms' Smart Routing feature goes beyond just selecting the right model for accuracy. It empowers you to optimize your text generation process for two key factors: **money** and **speed**.

app.py

@@ -387,10 +387,17 @@ if __name__ == "__main__":
@app.exception_handler(ValidationError)
async def validation_exception_handler(request: Request, exc: ValidationError):
    print(f"Error: {exc.errors()}")  # Print the validation error details
-    return JSONResponse(
-        status_code=422,
-        content=jsonable_encoder({"detail": exc.errors(), "body": await exc.body}),  # Send the error details and the original request body
-    )
+    if hasattr(exc, "body"):
+        return JSONResponse(
+            status_code=422,
+            content=jsonable_encoder({"detail": exc.errors(), "body": exc.body}),  # Send the error details and the original request body
+        )
+    else:
+        return JSONResponse(
+            status_code=422,
+            content=jsonable_encoder({"detail": exc.errors(), "body": ""}),  # No body on the exception, send an empty one
+        )
app = ASGIApp(socketio_server=sio, other_asgi_app=app)
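The handler's new fallback can be exercised on its own. Below is a minimal sketch using a hypothetical stand-in class `FakeValidationError` and helper `error_payload`; the real handler receives pydantic's `ValidationError`, which only sometimes carries a `body` attribute.

```python
class FakeValidationError(Exception):
    """Hypothetical stand-in for a validation error that may or may not carry a body."""
    def __init__(self, detail, body=None):
        super().__init__(str(detail))
        self.detail = detail
        if body is not None:
            self.body = body  # only set when the original request body is available

def error_payload(exc):
    # Mirror the handler's logic: include the original body only when present
    body = exc.body if hasattr(exc, "body") else ""
    return {"detail": exc.detail, "body": body}

print(error_payload(FakeValidationError([{"msg": "bad"}], body='{"x": 1}')))
# {'detail': [{'msg': 'bad'}], 'body': '{"x": 1}'}
print(error_payload(FakeValidationError([])))
# {'detail': [], 'body': ''}
```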


@@ -1661,8 +1661,13 @@ class LOLLMSRAGClient {
},
body: body ? JSON.stringify(body) : null,
};
-    const response = await fetch(`${this.lc.host_address}${endpoint}`, options);
+    // Fall back to a relative URL when no host address is configured
+    let response;
+    if (this.lc.host_address != null) {
+        response = await fetch(`${this.lc.host_address}${endpoint}`, options);
+    } else {
+        response = await fetch(endpoint, options);
+    }
const data = await response.json();
if (!response.ok) {

@@ -1 +1 @@
-Subproject commit ae9f595bd0bf4df34cdf585438fb79be349dbf32
+Subproject commit 5c1ecc21dbdd7b823c00a34fa9198a1eeec44c5b

@@ -1 +1 @@
-Subproject commit 1f2e0d04d594ab40abac17b47a86ab6e45b5ec69
+Subproject commit 73b6226194830d875816b264919fceec12131c7c