rename to open-webui

commit 90bcd1644a
parent 509d2a61eb
Author: Timothy J. Baek
Date: 2024-02-16 23:30:38 -08:00
9 changed files with 60 additions and 58 deletions

View File

@@ -126,19 +126,19 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open
 #### Installing with Docker 🐳
 
-- **Important:** When using Docker to install Open WebUI, make sure to include the `-v ollama-webui:/app/backend/data` in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data.
+- **Important:** When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data.
 
 - **If Ollama is on your computer**, use this command:
 
   ```bash
-  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
   ```
 
 - **To build the container yourself**, follow these steps:
 
   ```bash
-  docker build -t ollama-webui .
-  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ollama-webui
+  docker build -t open-webui .
+  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always open-webui
   ```
 
 - After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000).
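A quick way to confirm the renamed container came up and the `open-webui` volume is actually mounted — a minimal sketch, assuming the container and volume names from the commands above:

```bash
# The container should be listed as running.
docker ps --filter name=open-webui

# The volume should exist and show its mountpoint on the host.
docker volume inspect open-webui

# Tail the logs if the UI does not come up on http://localhost:3000.
docker logs -f open-webui
```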
@@ -148,14 +148,14 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open
 - To connect to Ollama on another server, change the `OLLAMA_API_BASE_URL` to the server's URL:
 
   ```bash
-  docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+  docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
   ```
 
   Or for a self-built container:
 
   ```bash
-  docker build -t ollama-webui .
-  docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend/data --name ollama-webui --restart always ollama-webui
+  docker build -t open-webui .
+  docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always open-webui
   ```
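Before pointing the UI at a remote server, it's worth probing the `OLLAMA_API_BASE_URL` directly; a minimal check, assuming the example URL above (`/tags` is Ollama's model-listing endpoint):

```bash
# A JSON "models" array in the response means the base URL is reachable and correct.
curl -fsS https://example.com/api/tags
```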
 ### Installing Ollama and Open WebUI Together
@@ -215,8 +215,8 @@ For other ways to install, like using Kustomize or Helm, check out [INSTALLATION
 In case you want to update your local Docker installation to the latest version, you can do it by performing the following actions:
 
 ```bash
-docker rm -f ollama-webui
-docker pull ghcr.io/ollama-webui/ollama-webui:main
+docker rm -f open-webui
+docker pull ghcr.io/open-webui/open-webui:main
 [insert command you used to install]
 ```
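Put together with the image variant from earlier in the README, the full update sequence would look like this sketch; the last command stands in for whatever was used at install time, here the earlier `docker run` example:

```bash
docker rm -f open-webui
docker pull ghcr.io/open-webui/open-webui:main
# Re-create the container; the open-webui volume keeps the database intact.
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```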
@@ -243,8 +243,8 @@ The Open WebUI consists of two primary components: the frontend and the backend
 Run the following commands to install:
 
 ```sh
-git clone https://github.com/ollama-webui/ollama-webui.git
-cd ollama-webui/
+git clone https://github.com/open-webui/open-webui.git
+cd open-webui/
 
 # Copying required .env file
 cp -RPp example.env .env

View File

@@ -15,7 +15,7 @@ If you're experiencing connection issues, it's often due to the WebUI docker c
 **Example Docker Command**:
 
 ```bash
-docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```
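One caveat with `--network=host`: port mappings are ignored, so the UI listens on the backend's own port rather than 3000. A quick sanity check, assuming the default backend port 8080:

```bash
# With host networking there is no -p mapping; the UI should answer here:
curl -fsS http://127.0.0.1:8080

# Ollama itself should answer on its default port:
curl -fsS http://127.0.0.1:11434   # prints "Ollama is running"
```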
 ### General Connection Errors

View File

@@ -40,9 +40,7 @@ class UrlUpdateForm(BaseModel):
 @app.post("/url/update")
-async def update_ollama_api_url(
-    form_data: UrlUpdateForm, user=Depends(get_admin_user)
-):
+async def update_ollama_api_url(form_data: UrlUpdateForm, user=Depends(get_admin_user)):
     app.state.OLLAMA_API_BASE_URL = form_data.url
     return {"OLLAMA_API_BASE_URL": app.state.OLLAMA_API_BASE_URL}
@@ -68,10 +66,14 @@ async def proxy(path: str, request: Request, user=Depends(get_current_user)):
         if path in ["pull", "delete", "push", "copy", "create"]:
             if user.role != "admin":
                 raise HTTPException(
-                    status_code=status.HTTP_401_UNAUTHORIZED, detail=ERROR_MESSAGES.ACCESS_PROHIBITED
+                    status_code=status.HTTP_401_UNAUTHORIZED,
+                    detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
                 )
     else:
-        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail=ERROR_MESSAGES.ACCESS_PROHIBITED)
+        raise HTTPException(
+            status_code=status.HTTP_401_UNAUTHORIZED,
+            detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
+        )
 
     headers.pop("host", None)
     headers.pop("authorization", None)
@@ -126,7 +128,7 @@ async def proxy(path: str, request: Request, user=Depends(get_current_user)):
     try:
         return await run_in_threadpool(get_request)
     except Exception as e:
-        error_detail = "Ollama WebUI: Server Connection Error"
+        error_detail = "Open WebUI: Server Connection Error"
         if r is not None:
             try:
                 res = r.json()

View File

@@ -61,7 +61,7 @@ async def update_ollama_api_url(
 #                 yield line
 #         except Exception as e:
 #             print(e)
-#             error_detail = "Ollama WebUI: Server Connection Error"
+#             error_detail = "Open WebUI: Server Connection Error"
 #             yield json.dumps({"error": error_detail, "message": str(e)}).encode()
@@ -110,7 +110,7 @@ async def proxy(path: str, request: Request, user=Depends(get_current_user)):
     except Exception as e:
         print(e)
-        error_detail = "Ollama WebUI: Server Connection Error"
+        error_detail = "Open WebUI: Server Connection Error"
         if response is not None:
             try:

View File

@@ -9,7 +9,12 @@ from pydantic import BaseModel
 from apps.web.models.users import Users
 from constants import ERROR_MESSAGES
-from utils.utils import decode_token, get_current_user, get_verified_user, get_admin_user
+from utils.utils import (
+    decode_token,
+    get_current_user,
+    get_verified_user,
+    get_admin_user,
+)
 from config import OPENAI_API_BASE_URL, OPENAI_API_KEY, CACHE_DIR
 
 import hashlib
@@ -47,7 +52,6 @@ async def update_openai_url(form_data: UrlUpdateForm, user=Depends(get_admin_use
     return {"OPENAI_API_BASE_URL": app.state.OPENAI_API_BASE_URL}
 
 
 @app.get("/key")
 async def get_openai_key(user=Depends(get_admin_user)):
     return {"OPENAI_API_KEY": app.state.OPENAI_API_KEY}
@@ -107,7 +111,7 @@ async def speech(request: Request, user=Depends(get_verified_user)):
     except Exception as e:
         print(e)
-        error_detail = "Ollama WebUI: Server Connection Error"
+        error_detail = "Open WebUI: Server Connection Error"
         if r is not None:
             try:
                 res = r.json()
@@ -188,7 +192,7 @@ async def proxy(path: str, request: Request, user=Depends(get_verified_user)):
         return response_data
     except Exception as e:
         print(e)
-        error_detail = "Ollama WebUI: Server Connection Error"
+        error_detail = "Open WebUI: Server Connection Error"
         if r is not None:
             try:
                 res = r.json()

View File

@@ -10,16 +10,16 @@ services:
     restart: unless-stopped
     image: ollama/ollama:latest
 
-  ollama-webui:
+  open-webui:
     build:
       context: .
       args:
         OLLAMA_API_BASE_URL: '/ollama/api'
       dockerfile: Dockerfile
-    image: ghcr.io/ollama-webui/ollama-webui:main
-    container_name: ollama-webui
+    image: ghcr.io/open-webui/open-webui:main
+    container_name: open-webui
     volumes:
-      - ollama-webui:/app/backend/data
+      - open-webui:/app/backend/data
     depends_on:
       - ollama
     ports:
@@ -33,4 +33,4 @@ services:
 volumes:
   ollama: {}
-  ollama-webui: {}
+  open-webui: {}
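Day-to-day usage of the compose file is unchanged by the rename apart from the service and volume names; a minimal round trip, assuming Docker Compose v2:

```bash
docker compose up -d               # start the ollama and open-webui services
docker compose logs -f open-webui  # watch the renamed service come up
docker compose down                # stop both; the open-webui volume persists
```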

View File

@@ -1,36 +1,37 @@
-# Contributing to Ollama WebUI
+# Contributing to Open WebUI
 
 🚀 **Welcome, Contributors!** 🚀
 
-Your interest in contributing to Ollama WebUI is greatly appreciated. This document is here to guide you through the process, ensuring your contributions enhance the project effectively. Let's make Ollama WebUI even better, together!
+Your interest in contributing to Open WebUI is greatly appreciated. This document is here to guide you through the process, ensuring your contributions enhance the project effectively. Let's make Open WebUI even better, together!
 
 ## 📌 Key Points
 
-### 🦙 Ollama vs. Ollama WebUI
+### 🦙 Ollama vs. Open WebUI
 
-It's crucial to distinguish between Ollama and Ollama WebUI:
+It's crucial to distinguish between Ollama and Open WebUI:
 
-- **Ollama WebUI** focuses on providing an intuitive and responsive web interface for chat interactions.
+- **Open WebUI** focuses on providing an intuitive and responsive web interface for chat interactions.
 - **Ollama** is the underlying technology that powers these interactions.
 
-If your issue or contribution pertains directly to the core Ollama technology, please direct it to the appropriate [Ollama project repository](https://ollama.com/). Ollama WebUI's repository is dedicated to the web interface aspect only.
+If your issue or contribution pertains directly to the core Ollama technology, please direct it to the appropriate [Ollama project repository](https://ollama.com/). Open WebUI's repository is dedicated to the web interface aspect only.
 
 ### 🚨 Reporting Issues
 
-Noticed something off? Have an idea? Check our [Issues tab](https://github.com/ollama-webui/ollama-webui/issues) to see if it's already been reported or suggested. If not, feel free to open a new issue. When reporting an issue, please follow our issue templates. These templates are designed to ensure that all necessary details are provided from the start, enabling us to address your concerns more efficiently.
+Noticed something off? Have an idea? Check our [Issues tab](https://github.com/open-webui/open-webui/issues) to see if it's already been reported or suggested. If not, feel free to open a new issue. When reporting an issue, please follow our issue templates. These templates are designed to ensure that all necessary details are provided from the start, enabling us to address your concerns more efficiently.
 
 > [!IMPORTANT]
 >
 > - **Template Compliance:** Please be aware that failure to follow the provided issue template, or not providing the requested information at all, will likely result in your issue being closed without further consideration. This approach is critical for maintaining the manageability and integrity of issue tracking.
 >
 > - **Detail is Key:** To ensure your issue is understood and can be effectively addressed, it's imperative to include comprehensive details. Descriptions should be clear, including steps to reproduce, expected outcomes, and actual results. Lack of sufficient detail may hinder our ability to resolve your issue.
 
 ### 🧭 Scope of Support
 
-We've noticed an uptick in issues not directly related to Ollama WebUI but rather to the environment it's run in, especially Docker setups. While we strive to support Docker deployment, understanding Docker fundamentals is crucial for a smooth experience.
+We've noticed an uptick in issues not directly related to Open WebUI but rather to the environment it's run in, especially Docker setups. While we strive to support Docker deployment, understanding Docker fundamentals is crucial for a smooth experience.
 
-- **Docker Deployment Support**: Ollama WebUI supports Docker deployment. Familiarity with Docker is assumed. For Docker basics, please refer to the [official Docker documentation](https://docs.docker.com/get-started/overview/).
+- **Docker Deployment Support**: Open WebUI supports Docker deployment. Familiarity with Docker is assumed. For Docker basics, please refer to the [official Docker documentation](https://docs.docker.com/get-started/overview/).
 
-- **Advanced Configurations**: Setting up reverse proxies for HTTPS and managing Docker deployments requires foundational knowledge. There are numerous online resources available to learn these skills. Ensuring you have this knowledge will greatly enhance your experience with Ollama WebUI and similar projects.
+- **Advanced Configurations**: Setting up reverse proxies for HTTPS and managing Docker deployments requires foundational knowledge. There are numerous online resources available to learn these skills. Ensuring you have this knowledge will greatly enhance your experience with Open WebUI and similar projects.
 
 ## 💡 Contributing
@@ -40,14 +41,14 @@ Looking to contribute? Great! Here's how you can help:
 We welcome pull requests. Before submitting one, please:
 
-1. Discuss your idea or issue in the [issues section](https://github.com/ollama-webui/ollama-webui/issues).
+1. Discuss your idea or issue in the [issues section](https://github.com/open-webui/open-webui/issues).
 2. Follow the project's coding standards and include tests for new features.
 3. Update documentation as necessary.
 4. Write clear, descriptive commit messages.
 
 ### 📚 Documentation & Tutorials
 
-Help us make Ollama WebUI more accessible by improving documentation, writing tutorials, or creating guides on setting up and optimizing the web UI.
+Help us make Open WebUI more accessible by improving documentation, writing tutorials, or creating guides on setting up and optimizing the web UI.
 
 ### 🤔 Questions & Feedback
@@ -55,6 +56,6 @@ Got questions or feedback? Join our [Discord community](https://discord.gg/5rJgQ
 ## 🙏 Thank You!
 
-Your contributions, big or small, make a significant impact on Ollama WebUI. We're excited to see what you bring to the project!
+Your contributions, big or small, make a significant impact on Open WebUI. We're excited to see what you bring to the project!
 
 Together, let's create an even more powerful tool for the community. 🌟

View File

@@ -1,20 +1,20 @@
 # Security Policy
 
-Our primary goal is to ensure the protection and confidentiality of sensitive data stored by users on ollama-webui.
-
-## Supported Versions
+Our primary goal is to ensure the protection and confidentiality of sensitive data stored by users on open-webui.
+
+## Supported Versions
 
 | Version | Supported          |
 | ------- | ------------------ |
 | main    | :white_check_mark: |
 | others  | :x:                |
 
 ## Reporting a Vulnerability
 
 If you discover a security issue within our system, please notify us immediately via a pull request or contact us on discord.
 
 ## Product Security
 
 We regularly audit our internal processes and system's architecture for vulnerabilities using a combination of automated and manual testing techniques.
 
 We are planning on implementing SAST and SCA scans in our project soon.
View File

@@ -2,7 +2,7 @@
 Sometimes, it's beneficial to host Ollama, separate from the UI, but retain the RAG and RBAC support features shared across users:
 
-# Ollama WebUI Configuration
+# Open WebUI Configuration
 
 ## UI Configuration
@@ -24,7 +24,6 @@ Enable the site first before you can request SSL:
 `a2ensite server.com.conf` # this will enable the site. a2ensite is short for "Apache 2 Enable Site"
 ```
 
 # For SSL
 <VirtualHost 192.168.1.100:443>
@@ -62,14 +61,12 @@ Create server.com.conf if it is not yet already created, containing the above `<
 Once it's created, run `certbot --apache -d server.com`; this will request and create the SSL keys for you, as well as create the server.com.le-ssl.conf
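Once certbot has issued the certificate, a renewal dry run is a cheap way to confirm the generated server.com.le-ssl.conf will keep renewing cleanly; a sketch assuming the domain above:

```bash
sudo certbot --apache -d server.com
# Simulate renewal against the live Apache config without touching the certs.
sudo certbot renew --dry-run
```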
 # Configuring Ollama Server
 
 On your latest installation of Ollama, make sure that you have set up your API server from the official Ollama reference:
 
 [Ollama FAQ](https://github.com/jmorganca/ollama/blob/main/docs/faq.md)
 
 ### TL;DR
 
 The guide doesn't seem to match the current updated service file on Linux, so we will address it here:
@@ -81,6 +78,7 @@ sudo nano /etc/systemd/system/ollama.service
 ```
 
 Add the following lines:
 
 ```
 Environment="OLLAMA_HOST=0.0.0.0:11434" # this line is mandatory. You can also specify
 ```
@@ -106,15 +104,13 @@ Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/
 WantedBy=default.target
 ```
 
 Save the file by pressing CTRL+S, then press CTRL+X.
 
 When your computer restarts, the Ollama server will now be listening on the IP:PORT you specified, in this case 0.0.0.0:11434, or 192.168.254.106:11434 (whatever your local IP address is). Make sure that your router is correctly configured to serve pages from that local IP by forwarding 11434 to your local IP server.
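A reboot isn't strictly required; the edited unit can be reloaded in place. A sketch, assuming the override above and the 192.168.254.106 LAN address used in the text:

```bash
sudo systemctl daemon-reload    # pick up the edited ollama.service
sudo systemctl restart ollama
# Ollama should now answer on the LAN address, not just loopback.
curl -fsS http://192.168.254.106:11434   # prints "Ollama is running"
```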
 # Ollama Model Configuration
 
-## For the Ollama model configuration, use the following Apache VirtualHost setup:
+## For the Ollama model configuration, use the following Apache VirtualHost setup:
 
 Navigate to the Apache sites-available directory:
@@ -198,7 +194,6 @@ If you encounter any misconfiguration or errors, please file an issue or engage
 Let's make this UI much more user-friendly for everyone!
 
-Thanks for making ollama-webui your UI Choice for AI!
+Thanks for making open-webui your UI Choice for AI!
 
-This doc is made by **Bob Reyes**, your **Ollama-Web-UI** fan from the Philippines.
+This doc is made by **Bob Reyes**, your **Open-WebUI** fan from the Philippines.