# Open WebUI Troubleshooting Guide
## Understanding the Open WebUI Architecture
The Open WebUI system is designed to streamline interactions between the client (your browser) and the Ollama API. At the heart of this design is a backend reverse proxy, enhancing security and resolving CORS issues.
- **How it Works**: The Open WebUI is designed to interact with the Ollama API through a specific route. When a request is made from the WebUI to Ollama, it is not directly sent to the Ollama API. Initially, the request is sent to the Open WebUI backend via `/ollama` route. From there, the backend is responsible for forwarding the request to the Ollama API. This forwarding is accomplished by using the route specified in the `OLLAMA_BASE_URL` environment variable. Therefore, a request made to `/ollama` in the WebUI is effectively the same as making a request to `OLLAMA_BASE_URL` in the backend. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_BASE_URL/api/tags` in the backend.
- **Security Benefits**: This design prevents direct exposure of the Ollama API to the frontend, safeguarding against potential CORS (Cross-Origin Resource Sharing) issues and unauthorized access. Requiring authentication to access the Ollama API further enhances this security layer.
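
The forwarding rule described above can be sketched as a small function. This is an illustrative sketch of the mapping, not the actual Open WebUI backend code; the function name is made up for this example:

```python
def map_webui_to_ollama(path: str, ollama_base_url: str) -> str:
    """Translate a WebUI request path under /ollama to the backend target URL.

    Illustrates the rule: a request to /ollama/<suffix> in the WebUI is
    forwarded to OLLAMA_BASE_URL/<suffix> by the backend.
    """
    prefix = "/ollama"
    if not path.startswith(prefix):
        raise ValueError(f"expected a path under {prefix}, got {path!r}")
    suffix = path[len(prefix):]  # e.g. "/api/tags"
    return ollama_base_url.rstrip("/") + suffix

# A WebUI request to /ollama/api/tags targets OLLAMA_BASE_URL/api/tags:
print(map_webui_to_ollama("/ollama/api/tags", "http://127.0.0.1:11434"))
# → http://127.0.0.1:11434/api/tags
```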
## Open WebUI: Server Connection Error
If you're experiencing connection issues, it's often because the WebUI Docker container cannot reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) from inside the container. Use the `--network=host` flag in your Docker command to resolve this. Note that the port changes from 3000 to 8080, so the WebUI becomes available at `http://localhost:8080`.
**Example Docker Command**:
```bash
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
### Error on Slow Responses for Ollama
Open WebUI has a default timeout of 5 minutes for Ollama to finish generating a response. If needed, this can be adjusted via the `AIOHTTP_CLIENT_TIMEOUT` environment variable, which sets the timeout in seconds.
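
For example, to raise the timeout to 10 minutes (600 seconds), pass the variable with `-e`. This mirrors the earlier example command; the value 600 is just an illustration, pick what suits your hardware:

```bash
docker run -d --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  -e AIOHTTP_CLIENT_TIMEOUT=600 \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```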
### General Connection Errors
**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit [Ollama's official site](https://ollama.com/) for the latest updates.
**Troubleshooting Steps**:
1. **Verify Ollama URL Format**:
   - When running the Web UI container, ensure `OLLAMA_BASE_URL` is set correctly (e.g., `http://192.168.1.1:11434` when Ollama runs on a different host).
- In the Open WebUI, navigate to "Settings" > "General".
- Confirm that the Ollama Server URL is correctly set to `[OLLAMA URL]` (e.g., `http://localhost:11434`).
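
Before changing any settings, it can help to confirm that Ollama is actually reachable. A quick check against Ollama's `/api/version` endpoint (assuming the default port; adjust the host to match your `OLLAMA_BASE_URL`):

```bash
# Should print a small JSON object with Ollama's version if the server is listening
curl http://localhost:11434/api/version
```

If this fails from the machine running the WebUI container, the problem is network reachability (firewall, bind address, Docker networking) rather than the WebUI configuration.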
By following these troubleshooting steps, you should be able to resolve most connection issues. For further assistance or queries, feel free to reach out to us on our community Discord.