fixed some vulnerabilities

This commit is contained in:
Saifeddine ALOUI 2024-03-28 23:58:51 +01:00
parent e231ed60ee
commit 41dbb1b3f2
18 changed files with 994 additions and 89 deletions


@@ -0,0 +1,68 @@
# Security Vulnerability Report for chat_bar.py
This report aims to identify potential security vulnerabilities in the provided code snippet from `chat_bar.py` and suggest fixes for them.
## 1. Unrestricted Access to Sensitive Functionality
The `/add_webpage` endpoint does not have any access restrictions, so any client that can reach the server may use it. A remote user could abuse this to make the server fetch arbitrary web pages and write their content to the server's disk.
**Vulnerable Code Snippet:**
```python
@router.post("/add_webpage")
async def add_webpage(request: AddWebPageRequest):
    # ...
```
**Suggested Fix:**
To restrict this functionality to localhost only, you can use the `forbid_remote_access` function from the `lollms.security` module.
```python
from lollms.security import forbid_remote_access

@router.post("/add_webpage")
async def add_webpage(request: AddWebPageRequest):
    forbid_remote_access(lollmsElfServer)
    # ...
```
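The report recommends `forbid_remote_access` without showing its implementation; as a rough illustration of the idea (the name, signature, and behavior below are assumptions, not the library's actual code), such a guard simply refuses to serve when the configured host is not a loopback address:

```python
# Hypothetical sketch of a localhost-only guard; the real
# lollms.security.forbid_remote_access may behave differently.
LOCAL_HOSTS = {"localhost", "127.0.0.1", "::1"}

def forbid_remote_access_sketch(host: str) -> None:
    """Raise if the server is bound to a non-loopback address."""
    if host not in LOCAL_HOSTS:
        raise PermissionError(
            f"This endpoint is restricted to localhost (got host={host!r})"
        )

forbid_remote_access_sketch("127.0.0.1")  # passes silently
```

Calling such a guard at the top of a handler turns a misconfigured public deployment into an explicit error rather than a silent exposure.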
## 2. Potential Path Traversal Vulnerability
Although the `sanitize_path` function is used to prevent path traversal attacks, it's important to ensure that it's used correctly and consistently. In the `do_scraping` function, the `sanitize_path` function is used with `allow_absolute_path=True`, which might expose a potential path traversal vulnerability if the `lollmsElfServer.lollms_paths.personal_uploads_path` is not properly set.
**Vulnerable Code Snippet:**
```python
file_path = sanitize_path(lollmsElfServer.lollms_paths.personal_uploads_path / f"web_{index}.txt", True)
```
**Suggested Fix:**
Ensure that `lollmsElfServer.lollms_paths.personal_uploads_path` is a safe path and does not allow path traversal. If there's any doubt, it's better to disallow absolute paths.
```python
file_path = sanitize_path(lollmsElfServer.lollms_paths.personal_uploads_path / f"web_{index}.txt")
```
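Independent of `sanitize_path` (whose internals are not shown here), a common way to verify that a final path cannot escape a base directory is to resolve both paths and check containment. A minimal stdlib sketch (requires Python 3.9+ for `is_relative_to`):

```python
from pathlib import Path

def is_within(base: Path, target: Path) -> bool:
    """True if target, fully resolved, stays inside base."""
    return target.resolve().is_relative_to(base.resolve())

uploads = Path("/tmp/uploads")
print(is_within(uploads, uploads / "web_1.txt"))             # True
print(is_within(uploads, uploads / ".." / ".." / "passwd"))  # False
```

Resolving before comparing is what defeats `..` components and symlink tricks; string prefix checks alone do not.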
## 3. Unhandled Exceptions
The `execute_command` function in the commented-out code does not seem to handle exceptions. If an error occurs during command execution, it could lead to unexpected behavior or server crashes.
**Vulnerable Code Snippet:**
```python
lollmsElfServer.personality.processor.execute_command(command, parameters)
```
**Suggested Fix:**
Handle exceptions properly to prevent server crashes and unexpected behavior.
```python
try:
    lollmsElfServer.personality.processor.execute_command(command, parameters)
except Exception as e:
    lollmsElfServer.error(f"Error executing command: {str(e)}", client_id=client_id)
    return {'status': False, 'error': str(e)}
```


@@ -0,0 +1,83 @@
# Security Vulnerability Report for lollms-webui
This report analyzes the provided Python code in the file `lollms_advanced.py` and identifies potential security vulnerabilities. Each vulnerability is explained, and a fix is proposed using code examples.
## 1. Unsafe File Path Validation
The code uses a regular expression to validate file paths, which might not be secure enough. The current regular expression `FILE_PATH_REGEX` only checks for alphanumeric characters, underscores, hyphens, and forward/backward slashes. This might allow path traversal attacks, where an attacker can access files outside the intended directory.
**Vulnerable Code:**
```python
FILE_PATH_REGEX = r'^[a-zA-Z0-9_\-\\\/]+$'

def validate_file_path(path):
    return re.match(FILE_PATH_REGEX, path)
```
**Proposed Fix:**
Use the `sanitize_path` function from the `lollms.security` module to ensure that the file path is safe and does not allow path traversal attacks.
```python
from lollms.security import sanitize_path

def validate_file_path(path):
    try:
        sanitized_path = sanitize_path(path, allow_absolute_path=False)
        return sanitized_path is not None
    except Exception as e:
        print(f"Path validation error: {str(e)}")
        return False
```
## 2. Lack of Remote Access Restriction
The provided code does not restrict sensitive functionalities to localhost access only. This can potentially expose sensitive endpoints to remote attacks.
**Vulnerable Code:**
```python
# No remote access restriction is observed in the code
```
**Proposed Fix:**
Use the `forbid_remote_access` function from the `lollms.security` module to restrict sensitive functionalities to localhost access only.
```python
from lollms.security import forbid_remote_access
from fastapi import Depends

def check_remote_access():
    forbid_remote_access(lollmsElfServer)

# Use `check_remote_access` as a dependency for sensitive endpoints
@router.post("/sensitive_endpoint")
def sensitive_endpoint(_=Depends(check_remote_access)):
    # Endpoint logic
    ...
```
## 3. Unsafe Code Execution
The code imports multiple execution engines (Python, LaTeX, Bash, JavaScript, and HTML), which might be used for executing user-provided code. Executing user-provided code without proper sanitization and restrictions can lead to remote code execution (RCE) vulnerabilities.
**Vulnerable Code:**
```python
from utilities.execution_engines.python_execution_engine import execute_python
from utilities.execution_engines.latex_execution_engine import execute_latex
from utilities.execution_engines.shell_execution_engine import execute_bash
from utilities.execution_engines.javascript_execution_engine import execute_javascript
from utilities.execution_engines.html_execution_engine import execute_html
```
**Proposed Fix:**
Without the actual code implementing the execution engines, it's hard to provide a fix. However, it's recommended to:
1. Sanitize user inputs before passing them to the execution engines.
2. Limit the functionality of the execution engines to prevent RCE.
3. Use sandboxing techniques to isolate the execution environment.
4. Restrict access to sensitive system resources.
5. Consider using a separate, less privileged user account to run the execution engines.
Please review the implementation of the execution engines and apply the necessary security measures accordingly.
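Points 2 through 4 above can be illustrated with a minimal sketch (plain `subprocess`, not the project's actual execution engines): run untrusted code in a separate interpreter process with isolated mode, a stripped environment, and a hard timeout.

```python
import subprocess
import sys

def run_untrusted_python(code: str, timeout: float = 5.0) -> str:
    """Run code in a separate, isolated interpreter and return its stdout.

    Defense in depth only, not a full sandbox: real isolation needs
    OS-level controls (containers, seccomp, dedicated low-privilege users).
    Raises subprocess.TimeoutExpired if the code runs too long.
    """
    result = subprocess.run(
        [sys.executable, "-I", "-c", code],  # -I: isolated mode
        capture_output=True,
        text=True,
        timeout=timeout,  # kill runaway code
        env={},           # strip inherited environment variables
    )
    return result.stdout

print(run_untrusted_python("print(2 + 2)"))  # prints 4
```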


@@ -0,0 +1,214 @@
# Security Vulnerability Report for lollms_message.py
This report aims to identify potential security vulnerabilities in the provided code snippet from `lollms_message.py`. The analysis focuses on the FastAPI routes and their implementations, and suggests possible fixes for any detected issues.
## Table of Contents
1. [Unrestricted Access to Sensitive Endpoints](#unrestricted-access-to-sensitive-endpoints)
2. [Potential Exception Handling Improvements](#potential-exception-handling-improvements)
3. [Unsanitized Inputs](#unsanitized-inputs)
---
## Unrestricted Access to Sensitive Endpoints
The provided code snippet does not restrict access to sensitive endpoints for remote users. This means that anyone with access to the server can perform actions such as editing, ranking, and deleting messages.
### Affected Code Snippets
```python
@router.post("/edit_message")
async def edit_message(edit_params: EditMessageParameters):
    ...

@router.post("/message_rank_up")
async def message_rank_up(rank_params: MessageRankParameters):
    ...

@router.post("/message_rank_down")
def message_rank_down(rank_params: MessageRankParameters):
    ...

@router.post("/delete_message")
async def delete_message(delete_params: MessageDeleteParameters):
    ...
```
### Potential Impact
Unauthorized users may gain access to sensitive functionalities, leading to potential data manipulation and unauthorized deletion.
### Proposed Fix
To mitigate this issue, use the `forbid_remote_access` function from the `lollms.security` library to restrict access to these endpoints for remote users.
```python
from lollms.security import forbid_remote_access

@router.post("/edit_message")
async def edit_message(edit_params: EditMessageParameters):
    forbid_remote_access(lollmsElfServer)
    ...

@router.post("/message_rank_up")
async def message_rank_up(rank_params: MessageRankParameters):
    forbid_remote_access(lollmsElfServer)
    ...

@router.post("/message_rank_down")
def message_rank_down(rank_params: MessageRankParameters):
    forbid_remote_access(lollmsElfServer)
    ...

@router.post("/delete_message")
async def delete_message(delete_params: MessageDeleteParameters):
    forbid_remote_access(lollmsElfServer)
    ...
```
---
## Potential Exception Handling Improvements
The current exception handling in the code returns generic error messages to the user, which might not provide enough information for debugging purposes.
### Affected Code Snippets
```python
@router.post("/edit_message")
async def edit_message(edit_params: EditMessageParameters):
    ...
    except Exception as ex:
        trace_exception(ex)
        return {"status": False, "error": "There was an error editing the message"}

@router.post("/message_rank_up")
async def message_rank_up(rank_params: MessageRankParameters):
    ...
    except Exception as ex:
        trace_exception(ex)
        return {"status": False, "error": "There was an error ranking up the message"}

@router.post("/message_rank_down")
def message_rank_down(rank_params: MessageRankParameters):
    ...
    except Exception as ex:
        return {"status": False, "error": str(ex)}

@router.post("/delete_message")
async def delete_message(delete_params: MessageDeleteParameters):
    ...
    except Exception as ex:
        trace_exception(ex)
        return {"status": False, "error": "There was an error deleting the message"}
```
### Potential Impact
Inadequate error information might make debugging more difficult and time-consuming.
### Proposed Fix
Instead of returning generic error messages, return more descriptive messages or stable error codes to help identify the issue, taking care not to leak sensitive internals (stack traces, file paths) to untrusted clients.
```python
@router.post("/edit_message")
async def edit_message(edit_params: EditMessageParameters):
    ...
    except Exception as ex:
        trace_exception(ex)
        return {"status": False, "error_code": 1001, "error": str(ex)}

@router.post("/message_rank_up")
async def message_rank_up(rank_params: MessageRankParameters):
    ...
    except Exception as ex:
        trace_exception(ex)
        return {"status": False, "error_code": 1002, "error": str(ex)}

@router.post("/message_rank_down")
def message_rank_down(rank_params: MessageRankParameters):
    ...
    except Exception as ex:
        return {"status": False, "error_code": 1003, "error": str(ex)}

@router.post("/delete_message")
async def delete_message(delete_params: MessageDeleteParameters):
    ...
    except Exception as ex:
        trace_exception(ex)
        return {"status": False, "error_code": 1004, "error": str(ex)}
```
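Keeping the numeric codes next to their meanings prevents magic numbers from drifting between handlers. A small illustrative sketch (the enum and helper are assumptions, not project code):

```python
from enum import IntEnum

class MessageErrorCode(IntEnum):
    """Illustrative error codes for the message endpoints."""
    EDIT_FAILED = 1001
    RANK_UP_FAILED = 1002
    RANK_DOWN_FAILED = 1003
    DELETE_FAILED = 1004

def error_response(code: MessageErrorCode, ex: Exception) -> dict:
    """Build the error payload returned by the handlers."""
    return {"status": False, "error_code": int(code), "error": str(ex)}

print(error_response(MessageErrorCode.EDIT_FAILED, ValueError("bad id")))
```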
---
## Unsanitized Inputs
The current code does not sanitize inputs before passing them to the functions. This might lead to potential security issues, such as path traversal attacks.
### Affected Code Snippets
```python
@router.post("/edit_message")
async def edit_message(edit_params: EditMessageParameters):
    client_id = edit_params.client_id
    message_id = edit_params.id
    new_message = edit_params.message
    ...

@router.post("/message_rank_up")
async def message_rank_up(rank_params: MessageRankParameters):
    client_id = rank_params.client_id
    message_id = rank_params.id
    ...

@router.post("/message_rank_down")
def message_rank_down(rank_params: MessageRankParameters):
    client_id = rank_params.client_id
    message_id = rank_params.id
    ...

@router.post("/delete_message")
async def delete_message(delete_params: MessageDeleteParameters):
    client_id = delete_params.client_id
    message_id = delete_params.id
    ...
```
### Potential Impact
Unsanitized inputs that reach file-system or database operations can enable attacks such as path traversal or injection.
### Proposed Fix
Sanitize or validate each input according to its type before passing it on: values that may end up in file paths (such as `client_id`, if it is used to build per-client folders) can go through the `sanitize_path` function from the `lollms.security` library, numeric IDs should be validated as integers, and message text should be escaped before rendering.
```python
from lollms.security import sanitize_path

@router.post("/edit_message")
async def edit_message(edit_params: EditMessageParameters):
    client_id = sanitize_path(edit_params.client_id)
    message_id = edit_params.id  # numeric id: validate as an integer, not as a path
    new_message = edit_params.message  # free text: escape on output, do not path-sanitize
    ...

@router.post("/message_rank_up")
async def message_rank_up(rank_params: MessageRankParameters):
    client_id = sanitize_path(rank_params.client_id)
    message_id = rank_params.id  # validate as an integer
    ...

@router.post("/message_rank_down")
def message_rank_down(rank_params: MessageRankParameters):
    client_id = sanitize_path(rank_params.client_id)
    message_id = rank_params.id  # validate as an integer
    ...

@router.post("/delete_message")
async def delete_message(delete_params: MessageDeleteParameters):
    client_id = sanitize_path(delete_params.client_id)
    message_id = delete_params.id  # validate as an integer
    ...
```
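Since message IDs are numeric and client IDs are opaque tokens rather than file paths, validating type and shape is often a better fit for these fields than path sanitization. A stdlib sketch (the token format and helper name are assumptions):

```python
import re

CLIENT_ID_RE = re.compile(r"^[A-Za-z0-9_-]{1,64}$")  # assumed token format

def validate_message_params(client_id: str, message_id) -> tuple:
    """Validate inputs by type and shape instead of path-sanitizing them."""
    if not CLIENT_ID_RE.fullmatch(client_id):
        raise ValueError(f"invalid client_id: {client_id!r}")
    message_id = int(message_id)  # raises ValueError if not numeric
    if message_id < 0:
        raise ValueError("message_id must be non-negative")
    return client_id, message_id

print(validate_message_params("abc123", "42"))  # ('abc123', 42)
```

Rejecting malformed values early, with a clear error, is both safer and easier to debug than silently rewriting them.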
---


@@ -0,0 +1,83 @@
# Security Vulnerability Report for lollms_playground.py
This report provides an analysis of the potential security vulnerabilities found in the `lollms_playground.py` file and suggests fixes for the identified issues.
## 1. Insecure File Operations
The `get_presets`, `add_preset`, and `del_preset` functions are vulnerable to path traversal attacks due to insecure file operations. An attacker can manipulate the input to traverse the file system and access or modify unauthorized files.
**Vulnerable Code Snippets:**
```python
# In get_presets function
presets_folder = Path("__file__").parent / "presets"
for filename in presets_folder.glob('*.yaml'):
    with open(filename, 'r', encoding='utf-8') as file:
        preset = yaml.safe_load(file)

# In add_preset function
filename = presets_folder / f"{fn}.yaml"
with open(filename, 'w', encoding='utf-8') as file:
    yaml.dump(preset_data, file)

# In del_preset function
presets_file = lollmsElfServer.lollms_paths.personal_discussions_path / "lollms_playground_presets" / preset_data.name
presets_file.unlink()
```
### Recommended Fixes:
1. Use the `sanitize_path_from_endpoint` function from the `lollms.security` module to sanitize the input path before performing file operations.
2. Use `os.path.join` to safely join path components instead of directly concatenating them.
**Fixed Code Snippets:**
```python
# In get_presets function
presets_folder = sanitize_path_from_endpoint(str(Path("__file__").parent / "presets"), allow_absolute_path=False)
for filename in glob.glob(os.path.join(presets_folder, '*.yaml')):
    with open(filename, 'r', encoding='utf-8') as file:
        preset = yaml.safe_load(file)

# In add_preset function
fn = sanitize_path_from_endpoint(preset_data.name, allow_absolute_path=False)
filename = os.path.join(presets_folder, f"{fn}.yaml")
with open(filename, 'w', encoding='utf-8') as file:
    yaml.dump(preset_data, file)

# In del_preset function
preset_name = sanitize_path_from_endpoint(preset_data.name, allow_absolute_path=False)
presets_file = os.path.join(lollmsElfServer.lollms_paths.personal_discussions_path, "lollms_playground_presets", preset_name)
os.remove(presets_file)  # str paths have no .unlink(); use os.remove
```
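`sanitize_path_from_endpoint` belongs to `lollms.security` and its exact behavior is not shown here; as a rough stand-in for the kind of normalization such a helper performs (an assumption, not the library's code), a preset name can be reduced to a single safe filename component:

```python
import re
from pathlib import Path

def safe_preset_filename(name: str) -> str:
    """Reduce a user-supplied preset name to one safe path component."""
    cleaned = re.sub(r"[^A-Za-z0-9_\- ]", "", name).strip()
    # Reject anything that is empty or still not a single path component
    if not cleaned or Path(cleaned).name != cleaned:
        raise ValueError(f"invalid preset name: {name!r}")
    return cleaned

print(safe_preset_filename("My Preset"))  # My Preset
```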
## 2. Lack of Access Control for Sensitive Endpoints
The `lollms_playground.py` file does not implement access control for sensitive endpoints. This allows remote users to access and manipulate data, which should be restricted to localhost.
### Recommended Fix:
Use the `forbid_remote_access` function from the `lollms.security` module to restrict access to sensitive endpoints.
**Fixed Code Snippets:**
```python
from lollms.security import forbid_remote_access

@router.get("/get_presets")
def get_presets():
    forbid_remote_access(lollmsElfServer)
    # ... rest of the function

@router.post("/add_preset")
async def add_preset(preset_data: PresetData):
    forbid_remote_access(lollmsElfServer)
    # ... rest of the function

@router.post("/del_preset")
async def del_preset(preset_data: PresetData):
    forbid_remote_access(lollmsElfServer)
    # ... rest of the function
```
By implementing these fixes, the security vulnerabilities in the `lollms_playground.py` file can be significantly reduced.


@@ -0,0 +1,107 @@
# Security Vulnerability Report for lollms_webui_infos.py
This report analyzes the code in `lollms_webui_infos.py` and identifies potential security vulnerabilities. It also suggests fixes for the detected issues.
## 1. Lack of Input Validation and Sanitization
The code does not seem to validate or sanitize inputs, which can lead to security vulnerabilities like Path Traversal attacks. Although the provided context mentions the `sanitize_path` function from the `lollms.security` library, it is not used within the code.
**Vulnerable Code Snippet:**
```python
# No input validation or sanitization is performed
```
**Proposed Fix:**
Before using any user-provided input, especially file paths or database paths, validate and sanitize them using the `sanitize_path` function.
```python
from lollms.security import sanitize_path
# Assuming 'path' is user-provided input
sanitized_path = sanitize_path(path, allow_absolute_path=False)
```
## 2. Insecure Restart and Update Operations
The `restart_program` and `update_software` endpoints can be triggered by any client able to reach the server. Unless the server runs in headless mode or is bound strictly to localhost, this allows unauthorized restart or update operations.
**Vulnerable Code Snippet:**
```python
@router.get("/restart_program")
async def restart_program():
    # ...

@router.get("/update_software")
async def update_software():
    # ...
```
**Proposed Fix:**
Use the `forbid_remote_access` function from the `lollms.security` library to restrict these sensitive operations to localhost.
```python
from lollms.security import forbid_remote_access

@router.get("/restart_program")
async def restart_program():
    forbid_remote_access(lollmsElfServer)
    # ...

@router.get("/update_software")
async def update_software():
    forbid_remote_access(lollmsElfServer)
    # ...
```
## 3. Duplicate Endpoints
Two routes (`/get_versionID` and `/get_lollms_webui_version`) are bound to handler functions that share the same name, `get_lollms_webui_version`. One of them is redundant, and the duplicated name means the second definition shadows the first in the module namespace.
**Vulnerable Code Snippet:**
```python
@router.get("/get_versionID")
async def get_lollms_webui_version():
    # ...

@router.get("/get_lollms_webui_version")
async def get_lollms_webui_version():
    # ...
```
**Proposed Fix:**
Remove the redundant endpoint.
```python
@router.get("/get_lollms_webui_version")
async def get_lollms_webui_version():
    # ...
```
## 4. Lack of Error Handling
The code does not handle potential errors or exceptions gracefully. This can lead to application crashes or exposure of sensitive information.
**Vulnerable Code Snippet:**
```python
# No error handling is performed
```
**Proposed Fix:**
Implement try-except blocks to handle potential errors and exceptions.
```python
try:
    ...  # potentially error-prone code
except Exception as e:
    trace_exception(e)  # log the full traceback for debugging
    return {"status": False, "error": str(e)}
```
Please consider these recommendations to improve the security and robustness of your application.


@@ -0,0 +1,99 @@
# Security Vulnerability Report for lollms_chatbox_events.py
This report presents an analysis of the provided Python code in the `lollms_chatbox_events.py` file and identifies potential security vulnerabilities. The vulnerabilities are presented with corresponding code snippets, explanations of potential flaws, and suggested fixes.
## Table of Contents
1. [Uncontrolled Resource Consumption (CWE-400)](#cwe-400)
2. [Path Traversal (CWE-22)](#cwe-22)
3. [Missing Access Control for Sensitive Functionality](#missing-access-control)
---
<a name="cwe-400"></a>
## 1. Uncontrolled Resource Consumption (CWE-400)
**Vulnerable Code Snippet:**
```python
@sio.on('take_picture')
def take_picture(sid):
    try:
        client = lollmsElfServer.session.get_client(sid)
        lollmsElfServer.info("Loading camera")
        if not PackageManager.check_package_installed("cv2"):
            PackageManager.install_package("opencv-python")
        import cv2
        cap = cv2.VideoCapture(0)
        # ...
```
**Explanation:**
The `take_picture` function captures an image using the default camera device (`cv2.VideoCapture(0)`). This functionality can lead to uncontrolled resource consumption, as an attacker could potentially trigger this event multiple times, causing the application to consume significant resources.
**Recommended Fix:**
Implement a rate-limiting mechanism to restrict the number of times the `take_picture` event can be triggered within a certain timeframe.
```python
import time

_last_capture = {}  # per-client timestamp of the last capture

@sio.on('take_picture')
def take_picture(sid):
    now = time.time()
    if now - _last_capture.get(sid, 0) < 60:  # at most one capture per minute
        lollmsElfServer.error("Rate limit exceeded", client_id=sid)
        return
    _last_capture[sid] = now
    # ...
```
---
<a name="cwe-22"></a>
## 2. Path Traversal (CWE-22)
**Vulnerable Code Snippet:**
```python
def add_webpage(sid, data):
    # ...
    url = data['url']
    index = find_first_available_file_index(lollmsElfServer.lollms_paths.personal_uploads_path, "web_", ".txt")
    file_path = lollmsElfServer.lollms_paths.personal_uploads_path / f"web_{index}.txt"
    scrape_and_save(url=url, file_path=file_path)
    # ...
```
**Explanation:**
The `add_webpage` function saves the scraped webpage content to a file whose path is built from `lollmsElfServer.lollms_paths.personal_uploads_path` and a server-generated index. The filename itself is not attacker-controlled, but if the uploads path is misconfigured or the index logic changes, the resulting path could escape the intended directory, so sanitizing the final path adds defense in depth.
**Recommended Fix:**
Use the provided `sanitize_path` function from the `lollms.security` module to ensure that the generated file path is safe and does not allow path traversal attacks.
```python
from lollms.security import sanitize_path

def add_webpage(sid, data):
    # ...
    url = data['url']
    index = find_first_available_file_index(lollmsElfServer.lollms_paths.personal_uploads_path, "web_", ".txt")
    file_path = sanitize_path(lollmsElfServer.lollms_paths.personal_uploads_path / f"web_{index}.txt")
    scrape_and_save(url=url, file_path=file_path)
    # ...
```
---
<a name="missing-access-control"></a>
## 3. Missing Access Control for Sensitive Functionality
**Explanation:**
The provided code does not have any access control checks for sensitive functionality, such as taking pictures or adding web pages. If the server is exposed to the internet, an attacker could potentially trigger these events and consume resources or access sensitive data.
**Recommended Fix:**
Use the provided `forbid_remote_access` function from the `lollms.security` module to ensure that sensitive functionality is restricted to localhost.
```python
from lollms.security import forbid_remote_access

def add_events(sio: socketio):
    forbid_remote_access(lollmsElfServer)
    # ...
```
Add the `forbid_remote_access` call at the beginning of the `add_events` function to restrict sensitive functionality to localhost.


@@ -0,0 +1,82 @@
# Security Vulnerability Report for lollms_discussion_events.py
This report aims to identify potential security vulnerabilities in the provided Python code from the `lollms_discussion_events.py` file. The analysis focuses on common security issues and suggests possible fixes.
## 1. Lack of Input Validation and Sanitization
### Vulnerability
The `new_discussion` and `load_discussion` functions do not perform input validation or sanitization on the data received from the client. This may expose the application to security risks such as SQL injection, Cross-Site Scripting (XSS), or path traversal attacks.
#### Vulnerable Code Snippet
```python
@sio.on('new_discussion')
async def new_discussion(sid, data):
    ...
    title = data["title"]
    ...

@sio.on('load_discussion')
async def load_discussion(sid, data):
    ...
    if "id" in data:
        discussion_id = data["id"]
    ...
```
### Potential Flaws
- Unvalidated user input may lead to SQL injection or XSS attacks.
- Lack of input sanitization may allow path traversal attacks.
### Proposed Fix
Implement input validation and sanitization using appropriate libraries or functions. For example, you can use the `sanitize_path` function provided by the `lollms.security` library to prevent path traversal attacks.
#### Fixed Code Snippet
```python
from lollms.security import sanitize_input, sanitize_path

@sio.on('new_discussion')
async def new_discussion(sid, data):
    ...
    title = sanitize_input(data["title"])
    ...

@sio.on('load_discussion')
async def load_discussion(sid, data):
    ...
    if "id" in data:
        discussion_id = sanitize_input(data["id"])
    ...
```
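`sanitize_input` also comes from `lollms.security`; its implementation is not shown in the snippet, but an input cleaner of this kind typically escapes HTML-significant characters and strips control characters. A stdlib sketch of that assumed behavior:

```python
import html
import re

def sanitize_input_sketch(text: str) -> str:
    """Escape HTML-significant characters and drop control characters.

    Keeps tabs and newlines; strips NUL, other C0 controls, and DEL.
    """
    text = html.escape(text)  # neutralize <, >, &, and quotes
    return re.sub(r"[\x00-\x08\x0b-\x1f\x7f]", "", text)

print(sanitize_input_sketch("<script>alert(1)</script>"))
# &lt;script&gt;alert(1)&lt;/script&gt;
```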
## 2. Exposure of Sensitive Functionality to Remote Access
### Vulnerability
The provided code does not restrict sensitive functionalities to localhost access only. This may allow remote users to access and exploit these functionalities if the server is exposed.
### Potential Flaws
- Remote users may access sensitive functionalities if the server is exposed.
- This may lead to unauthorized access, data leaks, or other security issues.
### Proposed Fix
Implement access restrictions for sensitive functionalities using the `forbid_remote_access` function provided by the `lollms.security` library.
#### Fixed Code Snippet
```python
from lollms.security import forbid_remote_access

def add_events(sio: socketio):
    forbid_remote_access(lollmsElfServer)
    ...
```
By implementing these fixes, you can significantly improve the security of the `lollms_discussion_events.py` module and better protect the application against potential attacks.


@@ -0,0 +1,70 @@
# Security Vulnerability Report for lollms_generation_events.py
This report aims to identify potential security vulnerabilities in the provided code snippet from `lollms_generation_events.py` and suggest possible fixes.
## 1. Lack of Input Validation and Sanitization
The code does not seem to validate or sanitize the input data received from the client in the `handle_generate_msg`, `generate_msg_with_internet`, `handle_generate_msg_from`, and `handle_continue_generate_msg_from` functions. This could potentially lead to security vulnerabilities like Cross-Site Scripting (XSS) attacks or SQL Injection attacks.
**Vulnerable Code Snippets:**
```python
prompt = data["prompt"]
```
```python
id_ = data['id']
generation_type = data.get('msg_type',None)
```
**Proposed Fix:**
Implement input validation and sanitization for all data received from the client. For instance, you can use a library like `marshmallow` for input validation and `bleach` for input sanitization.
```python
from marshmallow import Schema, fields, validate
from bleach import clean

class GenerateMsgSchema(Schema):
    prompt = fields.Str(required=True, validate=validate.Length(min=1))

class GenerateMsgFromSchema(Schema):
    id = fields.Int(required=True, validate=validate.Range(min=0))
    msg_type = fields.Str(allow_none=True)

# In your handler functions
data = GenerateMsgSchema().load(data)
prompt = clean(data["prompt"])
```
## 2. Potential Path Traversal Vulnerability
Although the provided code snippet does not directly handle file paths, it is important to note that the application might be vulnerable to path traversal attacks if it uses unsanitized user inputs to construct file paths elsewhere in the codebase.
**Proposed Fix:**
Use the `sanitize_path` function from the `lollms.security` module to sanitize any file paths constructed using user inputs.
```python
from lollms.security import sanitize_path
sanitized_path = sanitize_path(user_input_path)
```
## 3. Lack of Access Control for Remote Users
The code does not seem to restrict access to sensitive functionalities for remote users. This could potentially expose sensitive functionalities to unauthorized users if the server is not running on localhost.
**Proposed Fix:**
Use the `forbid_remote_access` function from the `lollms.security` module to restrict access to sensitive functionalities for remote users.
```python
from lollms.security import forbid_remote_access

try:
    forbid_remote_access(lollmsElfServer)
except Exception as e:
    ASCIIColors.error(str(e))
    return
```


@@ -0,0 +1,68 @@
# Security Vulnerability Report for lollms_interactive_events.py
This report aims to identify potential security vulnerabilities in the provided code snippet from `lollms_interactive_events.py` and suggest fixes for them.
## Potential Vulnerabilities
### 1. Unrestricted Access to Sensitive Functionality
The current code does not seem to implement any access restrictions for sensitive functionalities such as starting and stopping video and audio streams. This could potentially allow remote users to access these functionalities if the server is not running on localhost.
**Vulnerable Code Snippet:**
```python
@sio.on('start_webcam_video_stream')
def start_webcam_video_stream(sid):
    lollmsElfServer.info("Starting video capture")
    try:
        from lollms.media import WebcamImageSender
        lollmsElfServer.webcam = WebcamImageSender(sio, lollmsCom=lollmsElfServer)
        lollmsElfServer.webcam.start_capture()
    except:
        lollmsElfServer.InfoMessage("Couldn't load media library.\nYou will not be able to perform any of the media linked operations. please verify the logs and install any required installations")
```
### 2. Lack of Exception Specificity
The code uses a generic `except` clause without specifying the exception type. This could lead to unexpected behavior as the code will suppress all types of exceptions, making debugging more difficult.
**Vulnerable Code Snippet:**
```python
except:
    lollmsElfServer.InfoMessage("Couldn't load media library.\nYou will not be able to perform any of the media linked operations. please verify the logs and install any required installations")
```
## Proposed Fixes
### 1. Restrict Access to Sensitive Functionality
To restrict access to sensitive functionalities, you can use the `forbid_remote_access` function from the `lollms.security` module. This function raises an exception if the server is not running on localhost.
**Fixed Code Snippet:**
```python
from lollms.security import forbid_remote_access

@sio.on('start_webcam_video_stream')
def start_webcam_video_stream(sid):
    forbid_remote_access(lollmsElfServer)
    lollmsElfServer.info("Starting video capture")
    try:
        from lollms.media import WebcamImageSender
        lollmsElfServer.webcam = WebcamImageSender(sio, lollmsCom=lollmsElfServer)
        lollmsElfServer.webcam.start_capture()
    except Exception as e:
        lollmsElfServer.InfoMessage("Couldn't load media library.\nYou will not be able to perform any of the media linked operations. please verify the logs and install any required installations")
```
### 2. Specify Exception Type
To improve error handling and make debugging easier, specify the exception type in the `except` clause.
**Fixed Code Snippet:**
```python
except ImportError as e:
    lollmsElfServer.InfoMessage("Couldn't load media library.\nYou will not be able to perform any of the media linked operations. please verify the logs and install any required installations")
```
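Catching `ImportError` specifically lets genuine runtime failures surface while still handling the missing-dependency case gracefully. A self-contained sketch (the module names are illustrative):

```python
import importlib

def load_optional(module_name: str):
    """Return the module if importable, else None; other errors propagate."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        # Covers ModuleNotFoundError too (it subclasses ImportError)
        return None

print(load_optional("definitely_not_installed_xyz"))  # None
print(load_optional("json") is not None)              # True
```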


@@ -27,13 +27,15 @@ from fastapi import FastAPI, UploadFile, File
import shutil
import os
import platform
from urllib.parse import urlparse
from functools import partial
from datetime import datetime
from utilities.execution_engines.python_execution_engine import execute_python
from utilities.execution_engines.latex_execution_engine import execute_latex
from utilities.execution_engines.shell_execution_engine import execute_bash
from lollms.security import sanitize_path, forbid_remote_access
from lollms.internet import scrape_and_save
from urllib.parse import urlparse
import threading
# ----------------------- Defining router and main class ------------------------------
@@ -47,7 +49,7 @@ class AddWebPageRequest(BaseModel):
class CmdExecutionRequest(BaseModel):
client_id: str = Field(...)
command: str = Field(..., description="Url to be used")
parameters: List
parameters: List[str] = Field(..., description="Command parameters")
@@ -107,10 +109,11 @@ async def execute_personality_command(request: CmdExecutionRequest):
lollmsElfServer.busy=False
return {'status':True,}
"""
MAX_PAGE_SIZE = 10000000
@router.post("/add_webpage")
async def add_webpage(request: AddWebPageRequest):
forbid_remote_access(lollmsElfServer)
client = lollmsElfServer.session.get_client(request.client_id)
if client is None:
raise HTTPException(status_code=400, detail="Unknown client. This service only accepts lollms webui requests")
@@ -121,8 +124,17 @@ async def add_webpage(request: AddWebPageRequest):
client = lollmsElfServer.session.get_client(request.client_id)
url = request.url
index = find_first_available_file_index(lollmsElfServer.lollms_paths.personal_uploads_path,"web_",".txt")
file_path=lollmsElfServer.lollms_paths.personal_uploads_path/f"web_{index}.txt"
scrape_and_save(url=url, file_path=file_path)
file_path=sanitize_path(lollmsElfServer.lollms_paths.personal_uploads_path/f"web_{index}.txt",True)
try:
result = urlparse(url)
if all([result.scheme, result.netloc]): # valid URL
if scrape_and_save(url=url, file_path=file_path,max_size=MAX_PAGE_SIZE):
raise HTTPException(status_code=400, detail="Web page too large")
else:
raise HTTPException(status_code=400, detail="Invalid URL")
except Exception as e:
raise HTTPException(status_code=400, detail=f"Exception : {e}")
try:
if not lollmsElfServer.personality.processor is None:
lollmsElfServer.personality.processor.add_file(file_path, client, partial(lollmsElfServer.process_chunk, client_id = request.client_id))
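The URL check introduced above can be factored into a small helper. The sketch below is an assumption about intent rather than code from the commit (it additionally restricts schemes to http/https, which the commit's `all([result.scheme, result.netloc])` check does not):

```python
from urllib.parse import urlparse

def is_valid_web_url(url: str) -> bool:
    """Accept only well-formed http(s) URLs with a network location."""
    try:
        result = urlparse(url)
    except ValueError:
        return False
    # Restricting the scheme also rejects javascript: and file: URLs,
    # which would pass a bare "has a scheme" check.
    return result.scheme in ("http", "https") and bool(result.netloc)
```

Scheme allow-listing is the usual hardening step here: `javascript:alert(1)` has a scheme, and `file:///etc/passwd` would let a scraper read local files.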


@@ -29,12 +29,15 @@ import re
import subprocess
from typing import Optional
# Regular expression pattern to validate file paths
FILE_PATH_REGEX = r'^[a-zA-Z0-9_\-\\\/]+$'
from lollms.security import sanitize_path
# Function to validate file paths using the regex pattern
def validate_file_path(path):
return re.match(FILE_PATH_REGEX, path)
try:
sanitized_path = sanitize_path(path, allow_absolute_path=False)
return sanitized_path is not None
except Exception as e:
print(f"Path validation error: {str(e)}")
return False
from utilities.execution_engines.python_execution_engine import execute_python
from utilities.execution_engines.latex_execution_engine import execute_latex
@@ -72,9 +75,7 @@ async def execute_code(request: CodeRequest):
if lollmsElfServer.config.headless_server_mode:
return {"status":False,"error":"Code execution is blocked when in headless mode for obvious security reasons!"}
if lollmsElfServer.config.host!="localhost" and lollmsElfServer.config.host!="127.0.0.1":
return {"status":False,"error":"Code execution is blocked when the server is exposed outside for very obvious reasons!"}
forbid_remote_access(lollmsElfServer, "Code execution is blocked when the server is exposed outside for very obvious reasons!")
if not lollmsElfServer.config.turn_on_code_execution:
return {"status":False,"error":"Code execution is blocked by the configuration!"}
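The exact behavior of `lollms.security.sanitize_path` is not shown in this diff. A minimal equivalent (an assumption, requiring Python 3.9+ for `Path.is_relative_to`) would resolve the candidate path and verify it stays inside a base directory:

```python
from pathlib import Path

def sanitize_relative_path(path_str: str, base_dir: Path) -> Path:
    """Resolve path_str under base_dir and reject anything that escapes it."""
    base = base_dir.resolve()
    candidate = (base / path_str).resolve()
    if not candidate.is_relative_to(base):
        raise ValueError(f"Path traversal attempt detected: {path_str!r}")
    return candidate
```

Resolving both sides before the containment check means `..` segments and a symlinked base directory are normalized away, which a regex filter like the removed `FILE_PATH_REGEX` cannot guarantee.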


@@ -16,7 +16,7 @@ from lollms.types import MSG_TYPE
from lollms.utilities import detect_antiprompt, remove_text_from_string, trace_exception
from ascii_colors import ASCIIColors
from lollms.databases.discussions_database import DiscussionsDB
from lollms.security import forbid_remote_access
from safe_store.text_vectorizer import TextVectorizer, VectorizationMethod, VisualizationMethod
import tqdm
from typing import Any, Optional
@@ -36,6 +36,7 @@ class EditMessageParameters(BaseModel):
@router.post("/edit_message")
async def edit_message(edit_params: EditMessageParameters):
forbid_remote_access(lollmsElfServer)
client_id = edit_params.client_id
message_id = edit_params.id
new_message = edit_params.message
@@ -55,6 +56,7 @@ class MessageRankParameters(BaseModel):
@router.post("/message_rank_up")
async def message_rank_up(rank_params: MessageRankParameters):
forbid_remote_access(lollmsElfServer)
client_id = rank_params.client_id
message_id = rank_params.id
@@ -68,6 +70,7 @@ async def message_rank_up(rank_params: MessageRankParameters):
@router.post("/message_rank_down")
def message_rank_down(rank_params: MessageRankParameters):
forbid_remote_access(lollmsElfServer)
client_id = rank_params.client_id
message_id = rank_params.id
try:
@@ -82,6 +85,7 @@ class MessageDeleteParameters(BaseModel):
@router.post("/delete_message")
async def delete_message(delete_params: MessageDeleteParameters):
forbid_remote_access(lollmsElfServer)
client_id = delete_params.client_id
message_id = delete_params.id
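`forbid_remote_access` itself is not shown in this commit, but the inline check it replaces at the code-execution endpoint (host must be `localhost` or `127.0.0.1`, with an optional custom message) suggests its shape. A plausible sketch, using a plain exception instead of FastAPI's `HTTPException` for illustration:

```python
class RemoteAccessForbidden(PermissionError):
    """Raised when a localhost-only endpoint is reached on an exposed host."""

def forbid_remote_access(server, error_message: str = "This service is only accessible from localhost."):
    """Refuse to proceed when the server listens on a non-local interface."""
    if server.config.host not in ("localhost", "127.0.0.1"):
        raise RemoteAccessForbidden(error_message)
```

Centralizing the check in one function is the point of this commit: every sensitive endpoint now calls the same guard instead of repeating the host comparison.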


@@ -15,7 +15,7 @@ from starlette.responses import StreamingResponse
from lollms.types import MSG_TYPE
from lollms.main_config import BaseConfig
from lollms.utilities import detect_antiprompt, remove_text_from_string, trace_exception, find_first_available_file_index, add_period, PackageManager
from lollms.security import sanitize_path_from_endpoint, validate_path
from lollms.security import sanitize_path_from_endpoint, validate_path, forbid_remote_access
from pathlib import Path
from ascii_colors import ASCIIColors
import os
@@ -57,6 +57,7 @@ async def add_preset(preset_data: PresetData):
:param request: The HTTP request object.
:return: A JSON response with the status of the operation.
"""
forbid_remote_access(lollmsElfServer)
try:
presets_folder = lollmsElfServer.lollms_paths.personal_discussions_path/"lollms_playground_presets"
@@ -83,6 +84,7 @@ async def del_preset(preset_data: PresetData):
:param preset_data: The data of the preset.
:return: A JSON response with the status of the operation.
"""
forbid_remote_access(lollmsElfServer)
# Get the JSON data from the POST request.
if preset_data.name is None:
raise HTTPException(status_code=400, detail="Preset name is missing in the request")
@@ -110,6 +112,7 @@ async def save_presets(preset_data: PresetDataWithValue):
:param preset_data: The data of the preset.
:return: A JSON response with the status of the operation.
"""
forbid_remote_access(lollmsElfServer)
# Get the JSON data from the POST request.
if preset_data.preset is None:
raise HTTPException(status_code=400, detail="Preset data is missing in the request")


@@ -14,6 +14,7 @@ import pkg_resources
from lollms_webui import LOLLMSWebUI
from ascii_colors import ASCIIColors
from lollms.utilities import load_config, run_async
from lollms.security import sanitize_path, forbid_remote_access
from pathlib import Path
from typing import List
import sys
@@ -42,6 +43,7 @@ async def get_lollms_webui_version():
@router.get("/restart_program")
async def restart_program():
"""Restart the program."""
forbid_remote_access(lollmsElfServer)
if lollmsElfServer.config.headless_server_mode:
return {"status":False,"error":"Restarting app is blocked when in headless mode for obvious security reasons!"}
@@ -69,6 +71,7 @@ async def restart_program():
@router.get("/update_software")
async def update_software():
"""Update the software."""
forbid_remote_access(lollmsElfServer)
if lollmsElfServer.config.headless_server_mode:
return {"status":False,"error":"Updating app is blocked when in headless mode for obvious security reasons!"}
@@ -99,6 +102,7 @@ async def update_software():
@router.get("/check_update")
def check_update():
"""Checks if an update is available"""
forbid_remote_access(lollmsElfServer)
if lollmsElfServer.config.headless_server_mode:
return {"status":False,"error":"Checking updates is blocked when in headless mode for obvious security reasons!"}


@@ -30,6 +30,7 @@ import time
from lollms.internet import scrape_and_save
from lollms.databases.discussions_database import Discussion
from lollms.security import forbid_remote_access
from datetime import datetime
router = APIRouter()
@@ -38,6 +39,7 @@ lollmsElfServer:LOLLMSWebUI = LOLLMSWebUI.get_instance()
# ----------------------------------- events -----------------------------------------
def add_events(sio:socketio):
forbid_remote_access(lollmsElfServer)
@sio.on('create_empty_message')
def create_empty_message(sid, data):
client_id = sid
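Note that `forbid_remote_access` here runs once, when `add_events` registers the handlers, not on every socket event. If per-event enforcement were desired, a wrapper along these lines could be used (a sketch, not code from this commit):

```python
def localhost_only(server, handler):
    """Wrap an event handler so the host check runs on each invocation."""
    def wrapped(*args, **kwargs):
        # Re-read the configured host on every call, so a config change
        # after registration is still enforced.
        if server.config.host not in ("localhost", "127.0.0.1"):
            raise PermissionError("Remote access to this event is forbidden.")
        return handler(*args, **kwargs)
    return wrapped
```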


@@ -27,14 +27,15 @@ import threading
import os
from lollms.databases.discussions_database import Discussion
from lollms.security import forbid_remote_access
from datetime import datetime
router = APIRouter()
lollmsElfServer = LOLLMSWebUI.get_instance()
# ----------------------------------- events -----------------------------------------
def add_events(sio:socketio):
forbid_remote_access(lollmsElfServer)
@sio.on('new_discussion')
async def new_discussion(sid, data):
if lollmsElfServer.personality is None:


@@ -19,6 +19,7 @@ from lollms.personality import MSG_TYPE, AIPersonality
from lollms.types import MSG_TYPE, SENDER_TYPES
from lollms.utilities import load_config, trace_exception, gc
from lollms.utilities import find_first_available_file_index, convert_language_name
from lollms.security import forbid_remote_access
from lollms_webui import LOLLMSWebUI
from pathlib import Path
from typing import List
@@ -32,6 +33,7 @@ lollmsElfServer = LOLLMSWebUI.get_instance()
# ----------------------------------- events -----------------------------------------
def add_events(sio:socketio):
forbid_remote_access(lollmsElfServer)
@sio.on('generate_msg')
def handle_generate_msg(sid, data):
client_id = sid


@@ -19,6 +19,7 @@ from lollms.personality import MSG_TYPE, AIPersonality
from lollms.types import MSG_TYPE, SENDER_TYPES
from lollms.utilities import load_config, trace_exception, gc
from lollms.utilities import find_first_available_file_index, convert_language_name, PackageManager, run_async
from lollms.security import forbid_remote_access
from lollms_webui import LOLLMSWebUI
from pathlib import Path
from typing import List
@@ -37,6 +38,7 @@ lollmsElfServer:LOLLMSWebUI = LOLLMSWebUI.get_instance()
# ----------------------------------- events -----------------------------------------
def add_events(sio:socketio):
forbid_remote_access(lollmsElfServer)
@sio.on('start_webcam_video_stream')
def start_webcam_video_stream(sid):
lollmsElfServer.info("Starting video capture")