Mirror of https://github.com/ParisNeo/lollms-webui.git (synced 2025-01-18 02:39:47 +00:00)

Commit 861e9f9f3b (parent 8ce861abda): fix

docs/BACK.md
@ -1,190 +0,0 @@
# LoLLMS Web UI

<div align="center">

<img src="https://github.com/ParisNeo/lollms/blob/main/lollms/assets/logo.png" alt="Logo" width="200" height="200">

</div>

![GitHub license](https://img.shields.io/github/license/ParisNeo/lollms-webui)
![GitHub issues](https://img.shields.io/github/issues/ParisNeo/lollms-webui)
![GitHub stars](https://img.shields.io/github/stars/ParisNeo/lollms-webui)
![GitHub forks](https://img.shields.io/github/forks/ParisNeo/lollms-webui)
[![Discord](https://img.shields.io/discord/1092918764925882418?color=7289da&label=Discord&logo=discord&logoColor=ffffff)](https://discord.gg/4rR282WJb6)
[![Follow me on Twitter](https://img.shields.io/twitter/follow/SpaceNerduino?style=social)](https://twitter.com/SpaceNerduino)
[![Follow Me on YouTube](https://img.shields.io/badge/Follow%20Me%20on-YouTube-red?style=flat&logo=youtube)](https://www.youtube.com/user/Parisneo)
[![pages-build-deployment](https://github.com/ParisNeo/lollms-webui/actions/workflows/pages/pages-build-deployment/badge.svg)](https://github.com/ParisNeo/lollms-webui/actions/workflows/pages/pages-build-deployment)

Welcome to LoLLMS WebUI (Lord of Large Language Models: one tool to rule them all), the hub for LLM (Large Language Model) models. This project aims to provide a user-friendly interface to access and use various LLM models for a wide range of tasks. Whether you need help with writing, coding, organizing data, generating images, or seeking answers to your questions, LoLLMS WebUI has got you covered.

[Click here for my YouTube video on how to use the tool](https://youtu.be/MxXNGv1zJ1A)
## Features

- Choose your preferred binding, model, and personality for your tasks
- Enhance your emails, essays, code debugging, thought organization, and more
- Explore a wide range of functionalities, such as searching, data organization, and image generation
- Easy-to-use UI with light and dark mode options
- Integration with GitHub repository for easy access
- Support for different personalities with predefined welcome messages
- Thumb up/down rating for generated answers
- Copy, edit, and remove messages
- Local database storage for your discussions
- Search, export, and delete multiple discussions
- Support for Docker, conda, and manual virtual environment setups
## Screenshots

Main page

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/9fd5ed82-cdff-467f-b159-9df61bc36b96)

Settings page

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/50b1f51f-a85f-4a23-ba5d-979f51c8c83b)

Hardware status

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/b10cecdf-d62f-4be8-b9af-59d6c6e7e43a)

Support for most known bindings

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/516fe855-5ed9-4677-8350-3ae63478b3d6)
![image](https://github.com/ParisNeo/lollms-webui/assets/827993/3e185079-e09b-4325-8ca0-fb66471eab68)

Huge and updated models zoo for each binding type

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/a86f543c-4c60-43e4-8501-60d8d29d6938)

Models search options

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/2441830d-0eca-4df7-8fa1-ffef4e16be8d)

Custom models installation

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/50286fdf-16be-48e8-8bfa-d4e47b2160ff)

Huge personalities library (about 300 personalities split across 36 categories)

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/188847e6-7c49-45e1-acf5-ca5a1f32ff53)

Personalities search option

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/3b88a665-edb9-4ede-922a-3f2df9e749f2)

Personalities bag, where you can activate multiple personalities simultaneously

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/0955adc2-5e3b-4a49-9f54-7340be942d05)

Multiple personalities discussions

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/32f630b8-712e-4d4c-8a69-fb932a3c856c)

Hot personality selection

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/fbc7f249-d94c-4525-99b3-b0195b5bd800)

Artbot

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/45b507b5-d9be-4111-8ad4-266e27e334d4)

Lollms personality maker

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/338a250b-1298-42a7-b4ec-a9f674353dea)

Chat with docs, using commands like send file and set database

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/9b9da237-2fa8-410c-a05a-28d0aa2dc494)

Python Specialist

![image](https://github.com/ParisNeo/lollms-webui/assets/827993/01eee298-00e1-4caa-97c1-97b74ba8956d)
## Installation

### Prerequisites

Before installing LoLLMS WebUI, make sure you have the following dependencies installed:

- [Python 3.10 or higher](https://www.python.org/downloads/release/python-3100/)
- Pip - installation depends on your OS, but make sure it is installed
- [Git (for cloning the repository)](https://git-scm.com/downloads)
- [Visual Studio Community](https://visualstudio.microsoft.com/vs/community/) with C++ build tools (for CUDA on Nvidia GPUs) - optional, Windows only
- Build essentials (for CUDA on Nvidia GPUs) - optional, Linux only
- [Nvidia CUDA toolkit 11.7 or higher](https://developer.nvidia.com/cuda-downloads) (for CUDA on Nvidia GPUs) - optional
- [Miniconda3](https://docs.conda.io/en/latest/miniconda.html) - optional (more stable than a plain Python install)

Ensure that the Python installation is in your system's PATH and that you can call it from the terminal. To verify your Python version, run the following command:

Windows:
```bash
python --version
```

Linux:
```bash
python3 --version
```

If you receive an error or the version is lower than 3.10, please install a newer version and try again.

### Installation steps

For detailed installation steps please refer to these documents:

- [Windows 10/11](./docs/usage/AdvancedInstallInstructions.md#windows-10-and-11)
- [Linux (tested on Ubuntu)](./docs/usage/AdvancedInstallInstructions.md#linux)

#### Easy install

- Download the appropriate application launcher based on your platform:
  - For Windows: `webui.bat`
  - For Linux: `webui.sh`
  - For Linux: `c_webui.sh` (uses Miniconda3)
- Place the downloaded launcher in a folder of your choice, for example:
  - Windows: `C:\ai\LoLLMS-webui`
  - Linux: `/home/user/ai/LoLLMS-webui`
- Run the launcher script. Note that you might encounter warnings from antivirus software or Windows Defender due to the tool's newness and limited usage. These are false positives caused by reputation checks in some antivirus software; you can safely proceed with running the script.

Once the installation is complete, the LoLLMS WebUI will launch automatically.

#### Using Conda

If you use conda, you can create a virtual environment and install the required packages using the provided `requirements.txt` file. Here's an example of how to set it up.

First, clone the project or download the zip file and unzip it:

```bash
git clone https://github.com/ParisNeo/lollms-webui.git
cd lollms-webui
```

Now create a new conda environment, activate it, and install the requirements.

With CUDA support (GPU mode):

```bash
conda create --prefix ./env python=3.10 cuda-toolkit ninja git
conda activate ./env
pip install -r requirements.txt
```

Without CUDA support (CPU mode):

```bash
conda create --prefix ./env python=3.10 ninja git
conda activate ./env
pip install -r requirements.txt
```

For CPU mode, you should create an empty file called `.no_gpu` in the folder to prevent lollms from trying to use the GPU.
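For example (a minimal sketch; run it from the root of the cloned lollms-webui folder on Linux, or create the empty file by hand on Windows):

```shell
# .no_gpu is just an empty marker file; its presence tells lollms to skip the GPU
touch .no_gpu
```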

#### Using Docker

Alternatively, you can use Docker to set up the LoLLMS WebUI. Please refer to the Docker documentation for installation instructions specific to your operating system.

## Usage

You can launch the app from the `webui.sh` or `webui.bat` launcher, which automatically applies any available updates. If you prefer not to use the launcher, you can also activate the virtual environment and launch the application with `python app.py` from the root of the project.

Once the app is running, open the front-end link displayed in the console (by default localhost:9600, though this can change if you modify the configuration).
### Selecting a Model and Binding

- Open the LoLLMS WebUI and navigate to the Settings page.
- In the Models Zoo tab, select a binding from the list (e.g., llama-cpp-official).
- Wait for the installation process to finish. You can monitor the progress in the console.
- Once the installation is complete, click the Install button next to the desired model.
- After the model installation finishes, select the model and press Apply changes.
- Remember to press the Save button to save the configuration.
### Starting a Discussion

- Go to the Discussions view.
- Click the + button to create a new discussion.
- You will see a predefined welcome message based on the selected personality (by default, LoLLMS).
- Ask a question or provide an initial prompt to start the discussion.
- You can stop the generation process at any time by pressing the Stop Generating button.

### Managing Discussions

- To edit a discussion title, simply type a new title or modify the existing one.
- To delete a discussion, click the Delete button.
- To search for specific discussions, use the search button and enter relevant keywords.
- To perform batch operations (exporting or deleting multiple discussions), enable Check Mode, select the discussions, and choose the desired action.
# Contributing

Contributions to LoLLMS WebUI are welcome! If you encounter any issues, have ideas for improvements, or want to contribute code, please open an issue or submit a pull request on the GitHub repository.

# License

This project is licensed under the Apache 2.0 License. You are free to use this software commercially, build upon it, and integrate it into your own projects. See the [LICENSE](https://github.com/ParisNeo/lollms-webui/blob/main/LICENSE) file for details.

# Contact

For any questions or inquiries, feel free to reach out via our Discord server: https://discord.gg/4rR282WJb6

Thank you for your interest and support!

If you find this tool useful, don't forget to give it a star on GitHub, share your experience, and help us spread the word. Your feedback and bug reports are valuable to us as we continue developing and improving LoLLMS WebUI.

If you enjoyed this tutorial, consider subscribing to our YouTube channel for more updates, tutorials, and exciting content.

Happy exploring with LoLLMS WebUI!
@ -1,11 +0,0 @@

# Documentation

Here are some useful documents:

For developers:
- [Database information](dev/db_infos.md)
- [Full endpoints list](dev/full_endpoints_list.md)

Tutorials:
- [Noobs](tutorials/noobs_tutorial.md)
- [Personalities](tutorials/personalities_tutorial.md)
docs/petals.md
@ -1,124 +0,0 @@
Thank you very much. I actually managed to make it run natively only on Linux.

On Windows, there is a dependency that makes this very, very difficult: uvloop. This dependency explicitly rejects any attempt to install it on Windows. There is active work to make it Windows-friendly, but the pull requests are not yet accepted and don't seem to be fully working yet. So we may expect a Windows version in the upcoming months, but not sooner.

This means that my best shot at doing this is to use WSL.

It works like a charm with WSL, with CUDA and everything:

![image](https://github.com/TheSCInitiative/bounties/assets/827993/aecc7e0e-2afa-4506-bbae-02bb12a355d2)
![image](https://github.com/TheSCInitiative/bounties/assets/827993/07f49df5-81e7-4391-9ddc-228b03a3c4d2)
![image](https://github.com/TheSCInitiative/bounties/assets/827993/6c30b8f8-b567-4311-a516-23416f2c3e47)
![image](https://github.com/TheSCInitiative/bounties/assets/827993/9bf4ff43-a4a7-4fc9-bc42-c49ec0bab2f5)
![image](https://github.com/TheSCInitiative/bounties/assets/827993/07f663da-6426-4fd1-b237-ee3d71903f01)
![image](https://github.com/TheSCInitiative/bounties/assets/827993/cb59498f-fde3-4e97-bcb0-187cb1422927)
![image](https://github.com/TheSCInitiative/bounties/assets/827993/ea9fb44a-f754-4346-8e26-edd30d9ef09f)

The node is visible from the [https://health.petals.dev/](https://health.petals.dev/) site, so everything is running fine.

To sum up, I've built a simple .bat file that installs an Ubuntu WSL system, installs Python and pip, then installs petals and runs the server.
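The .bat itself is not shown here, but the sequence it automates looks roughly like this (a sketch only; the distro name is an assumption, a reboot may be needed after WSL is first enabled, and `python -m petals.cli.run_server` is the server entry point documented by the petals project):

```shell
wsl --install -d Ubuntu
wsl -d Ubuntu -- sudo apt update
wsl -d Ubuntu -- sudo apt install -y python3 python3-pip
wsl -d Ubuntu -- python3 -m pip install petals
wsl -d Ubuntu -- python3 -m petals.cli.run_server petals-team/StableBeluga2
```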

But that won't be acceptable if I understand the rules of this challenge. So I am integrating the installation directly into the lollms binding installation procedure. Usually, if you are using Linux, I install the binding and run the node from Python with the right models. So for Windows I'll make a test and use WSL instead.

![image](https://github.com/TheSCInitiative/bounties/assets/827993/dd005337-ee66-46c6-ae0d-73881eb34676)

Now with this, when you run lollms it starts the node, but I need to code a bridge so that it is usable for text generation. I may go with a client that uses socketio to communicate with lollms.

The other solution is to literally install lollms in WSL, which would solve all bridging needs. I think I'll go with that solution; it will save me some time.

I'll make a version of lollms that runs on WSL and uses petals by default.

DONE!

Now lollms can be installed with WSL support.

It works!

![image](https://github.com/TheSCInitiative/bounties/assets/827993/a94e2a03-bc74-45d3-86f3-1f5f9eb2da09)

Now install petals:

![image](https://github.com/TheSCInitiative/bounties/assets/827993/a818eb40-387d-4fc5-8c88-5b8912648b20)

It automatically installs CUDA and the rest of the stack:

![image](https://github.com/TheSCInitiative/bounties/assets/827993/4e3c7f1a-a99a-4083-8df7-6ff0065f9cf5)

Now it is using petals:

![image](https://github.com/TheSCInitiative/bounties/assets/827993/6400deb6-255a-48b0-bfc3-1a26c01b15a2)

To finish, I created an exe installer using Inno Setup:

![image](https://github.com/TheSCInitiative/bounties/assets/827993/873fa7d2-5688-4f6f-a5dd-a3743eb9df92)

Once installed, you will have three new icons:

![image](https://github.com/TheSCInitiative/bounties/assets/827993/71932ff1-2c01-4155-96fd-e525c94b5a50)

- "lollms with petals" launches lollms with petals support
- "petals server" runs a petals-team/StableBeluga2 server
- "ubuntu" opens a terminal to interact with the WSL image that runs lollms, or to code using petals or any of the lollms library tools

OK, now I have finished making the installer. I'll make a video on how to do the install.

You can find all the scripts to build the installer in the lollms repository:

[https://github.com/ParisNeo/lollms-webui/tree/main/scripts/wsl](https://github.com/ParisNeo/lollms-webui/tree/main/scripts/wsl)

The installer is built using the Inno Setup tool (free to download from the internet).

Steps:
- Download the installer (make sure your antivirus doesn't block the download; because the installer is new, some antiviruses consider its reputation not high enough for it to be safe).
- Run the installer, accept the license, and press Next through the pages as with any install.

![image](https://github.com/TheSCInitiative/bounties/assets/827993/4aaab953-75ce-4f63-86e6-1f273c0796ae)

- After copying files, a console window will appear. If you don't have WSL, it will install it along with an Ubuntu distribution and will ask you for a user name and password for that distribution. Otherwise, it may open a terminal; just type exit to continue.
- After that, another script is executed. This script requires sudo privileges, so make sure you type the password you created when you installed the Ubuntu WSL. It updates all files, installs CUDA, adds it to the PATH, sets up the environment variables, configures the whole system, installs Miniconda, clones the lollms-webui repository, and installs all required files.
- Once the install has finished, you will be asked if you want to run lollms; you can accept.
- Notice that there will be three new shortcuts on the desktop, as stated before:

![image](https://github.com/TheSCInitiative/bounties/assets/827993/1250872c-a720-4656-a373-d4d43f125433)

- The first one is a simple Ubuntu terminal, useful for debugging and manual execution of petals
- The second one is for running lollms to do inference with petals or any other binding
- The third one is for running a petals server to give part of your PC to the community (you'll be prompted for a model's Hugging Face path; if you press Enter it will use petals-team/StableBeluga2)

You need to run lollms to install the petals binding. When it is loaded, it opens a browser. If it doesn't, open a browser and navigate to localhost:9600.
Go to Settings -> Bindings zoo -> petals and press Install. You can monitor the install by watching the console output.

Once ready, open the models zoo and select a model you want to use with petals. Wait for it to load. If no model shows up, just reload the localhost:9600 page, then go to Settings; the models zoo should now have models in it.

![image](https://github.com/TheSCInitiative/bounties/assets/827993/d1981e83-ea36-4df4-be99-ca21cb8ed168)

You can run the petals server by double-clicking the petals server icon on the desktop. This will use your machine as part of the hive mind:

![image](https://github.com/TheSCInitiative/bounties/assets/827993/1176c8f5-5e64-4df1-baf1-d8ada8d49b47)

And after all that, in the discussion view it works like a charm. We can see here that it is using bs_petals, which is the codename for the petals binding (I can't use the same name as the module, to avoid import issues):

![image](https://github.com/TheSCInitiative/bounties/assets/827993/8c453a88-240a-4836-9d69-8e9fd1273508)

Now this is all in my lollms Hugging Face repository.
You can find the code for the WSL install of everything here:
[https://github.com/ParisNeo/lollms-webui/tree/main/scripts/wsl](https://github.com/ParisNeo/lollms-webui/tree/main/scripts/wsl)

You can modify the code to adapt any aspect to your needs, then use Inno Setup to generate an installer, or even make an installer that is independent from lollms if you don't need it.

I also provide an executable installer on my lollms release page; just select the petals version:
https://github.com/ParisNeo/lollms-webui/releases/tag/v6.5.0

The one with WSL and petals support is [lollms-with-petals.exe](https://github.com/ParisNeo/lollms-webui/releases/download/v6.5.0/lollms-with-petals.exe)

I will probably make a video explaining exactly how to install and use this tool.

I hope you like this. Tell me if you have questions or notice a bug or something.

Here is my free Discord channel: https://discord.gg/vHRwSxb5
- My Twitter: https://twitter.com/SpaceNerduino
- My GitHub: https://github.com/ParisNeo
- My YouTube channel: https://www.youtube.com/@Parisneo
- Lollms community on Twitter: https://twitter.com/i/communities/1695793673017966985
- lollms-webui GitHub: https://github.com/ParisNeo/lollms-webui

Best regards
@ -1,38 +0,0 @@

# Project structure

The project folder has the following structure:

- `docs`: Documentation-related files.
  - `BACK.md`: Information about the backend of the project.
  - `dev`: Development-related documentation.
    - `db_infos.md`: Information about the database used in the project.
    - `full_endpoints_list.md`: A list of all the endpoints in the server.
    - `new_ui_dev.md`: Information about the development of a new user interface.
    - `scheme.png`: A scheme diagram related to the project.
    - `server_endpoints.md`: Information about the server endpoints.
  - `index.md`: The main documentation index.
  - `petals.md`: Information about the petals integration.
  - `project_structure.md`: Information about the overall project structure.
  - `tutorials`: Tutorial-related documentation.
    - `noobs_tutorial.md`: A tutorial for beginners.
    - `personalities_tutorial.md`: A tutorial about personalities in the project.
  - `usage`: Usage-related documentation.
    - `AdvancedInstallInstructions.md`: Advanced installation instructions.
    - `Build_extensions.md`: Information about building extensions for the project.
    - `Linux_Osx_Install.md`: Installation instructions for Linux and OSX.
    - `Linux_Osx_Usage.md`: Usage instructions for Linux and OSX.
  - `youtube`: YouTube-related documentation.
    - `lollms_collaborative_test.md`: A collaborative test on YouTube.
    - `lollms_lawyer.md`: A lawyer-related topic on YouTube.
    - `lollms_snake_game.md`: A snake game on YouTube.
    - `lollms_v6_windows_install.md`: Installation instructions for version 6 on Windows.
    - `playground_coding.md`: Coding in the project's playground.
    - `playground_translation.md`: Translation in the project's playground.
    - `scheme.png`: A scheme diagram related to the project.
    - `script_install.md`: Installation instructions for a script related to the project.
    - `script_lollms.md`: Information about a script related to the project.
    - `script_models.md`: Information about script models in the project.
    - `script_personalities.md`: Information about script personalities in the project.
    - `v3_installation.md`: Installation instructions for version 3 of the project.
@ -1 +1 @@

-Subproject commit 9c721ccb3e2594be1fb887fdf8c06a05d1ba1bfa
+Subproject commit 045568f106a355825ddde264b71283c68c48f5ee