
Unable to use open-webui without setting up "Memory" #2408

Closed
4 tasks done
Kisaragi-ng opened this issue May 20, 2024 · 5 comments

Comments

@Kisaragi-ng

Bug Report

Description

Bug Summary:
In v0.1.125, open-webui does not pass the chat to ollama, so the chat simply hangs. However, enabling Memory and creating one personality entry makes the chat work properly.

Steps to Reproduce:

  1. Using docker-compose.yml, run `docker compose up`
  2. Install phi3:latest from the ollama library
  3. Begin a chat using a template
  4. ollama does not respond to the chat
  5. Enable Memory, create one personality entry, and restart the chat
  6. The chat works as intended
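
For reference, the stack in step 1 can be sketched with a minimal docker-compose.yml along these lines (the image tags, port mapping, and `OLLAMA_BASE_URL` value are assumptions for illustration, not copied from the reporter's actual file):

```yaml
# Minimal sketch of an open-webui + ollama stack (assumed images/ports).
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama:
```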

Expected Behavior:
open-webui should still work without setting up Memory, since this feature is marked Experimental (and, as I understand it, is not mandatory).

Actual Behavior:
open-webui doesn't respond to chat.

Environment

  • Open WebUI Version: v0.1.125

  • Ollama (if applicable): 0.1.38

  • Operating System: docker

  • Browser (if applicable): Firefox 115.11.0esr, Edge 125.0.2535.51 (Official build) (64-bit)

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:

firefox:

Backend config: 
Object { status: true, name: "Open WebUI", version: "0.1.125", auth: true, default_locale: "en-US", images: false, default_models: null, default_prompt_suggestions: (6) […], trusted_header_auth: false, admin_export_enabled: true }
0.3690807d.js:1:37714
submitPrompt <empty string> 4.dc01e3b1.js:5:2577
modelId phi3:latest 4.dc01e3b1.js:5:3857
Uncaught (in promise) TypeError: l.userContext is null
    Immutable 38
    <anonymous> https://mydomain.com/:79
    promise callback* https://mydomain.com/:78
4.dc01e3b1.js:8:3
    we Immutable
    AsyncFunctionThrow self-hosted:857
    Immutable 17
    InterpretGeneratorResume self-hosted:1461
    AsyncFunctionNext self-hosted:852
    (Async: async)
    F Immutable
    map self-hosted:221
    Immutable 6
    InterpretGeneratorResume self-hosted:1461
    AsyncFunctionNext self-hosted:852
    (Async: async)
    F Immutable
    map self-hosted:221
    Immutable 6
    InterpretGeneratorResume self-hosted:1461
    AsyncFunctionNext self-hosted:852
    Immutable 3
    <anonymous> https://mydomain.com/:79
    (Async: promise callback)
    <anonymous> https://mydomain.com/:78
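
The `l.userContext is null` TypeError suggests the frontend reads a user-memory context that is never populated when Memory is disabled. A guard along these lines would avoid the crash by falling back to an empty context (a hypothetical sketch — `buildPrompt`, `userContext`, and the other names are illustrative, not taken from the open-webui source):

```javascript
// Hypothetical sketch: tolerate a missing/null userContext instead of
// crashing. None of these names are from the actual open-webui codebase.
function buildPrompt(message, options) {
  // Nullish coalescing keeps the chat working when Memory is disabled
  // and no personality entries exist.
  const userContext = options?.userContext ?? "";
  const prefix = userContext ? `${userContext}\n\n` : "";
  return `${prefix}${message}`;
}

console.log(buildPrompt("Hello", { userContext: null })); // "Hello"
console.log(buildPrompt("Hello", { userContext: "Likes cats" }));
```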

edge:

Backend config: 
Object { status: true, name: "Open WebUI", version: "0.1.125", auth: true, default_locale: "en-US", images: false, default_models: "phi3:latest", default_prompt_suggestions: (6) […], trusted_header_auth: false, admin_export_enabled: true }
0.3690807d.js:1:37714
setting default models globally index.4df4e131.js:93:23328
submitPrompt <empty string> 4.dc01e3b1.js:5:2577
modelId phi3:latest 4.dc01e3b1.js:5:3857
Uncaught (in promise) TypeError: l.userContext is null
    Immutable 5
4.dc01e3b1.js:8:3

Docker Container Logs:
Unfortunately, nothing of interest:

open-webui  | INFO:     127.0.0.1:57778 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:55764 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:58104 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:59696 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:38172 - "GET /health HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET / HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/config HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/auths/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/changelog HTTP/1.1" 200 OK
open-webui  | INFO:apps.ollama.main:get_all_models()
open-webui  | INFO:apps.openai.main:get_all_models()
ollama      | [GIN] 2024/05/20 - 04:58:30 | 200 |     416.606µs |      172.21.0.3 | GET      "/api/tags"
open-webui  | INFO:     ****:0 - "GET /ollama/api/tags HTTP/1.1" 200 OK
open-webui  | INFO:apps.openai.main:get_all_models()
open-webui  | INFO:     ****:0 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /openai/api/models HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
open-webui  | INFO:apps.ollama.main:get_all_models()
ollama      | [GIN] 2024/05/20 - 04:58:31 | 200 |     393.705µs |      172.21.0.3 | GET      "/api/tags"
open-webui  | INFO:     ****:0 - "GET /ollama/api/tags HTTP/1.1" 200 OK
open-webui  | INFO:apps.openai.main:get_all_models()
open-webui  | INFO:apps.openai.main:get_all_models()
open-webui  | INFO:     ****:0 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /openai/api/models HTTP/1.1" 200 OK
ollama      | [GIN] 2024/05/20 - 04:58:31 | 200 |        23.7µs |      172.21.0.3 | GET      "/api/version"
open-webui  | INFO:     ****:0 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /ollama/api/version HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "POST /api/v1/chats/new HTTP/1.1" 200 OK
open-webui  | INFO:     ****:0 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
open-webui  | INFO:     127.0.0.1:39706 - "GET /health HTTP/1.1" 200 OK

Screenshots (if applicable):
Memory disabled:
ollamai1

Memory enabled:
ollamai2

Memory entry:
ollamai3

Installation Method

Docker version 26.1.2, build 211e74b

Additional Information

I'm willing to provide additional information if required. During testing I made sure no ad blocker was active, and I also tested in a private window with all add-ons disabled, with the same result. I first saw this error after updating my docker image from adb86c02cf4b to b0ef2a0e3744 (the current main docker image).


mptyl commented May 20, 2024

Same problem, plus an error connecting to OpenAI: a pop-up window appears with a join error. An OpenAI model can only be used by deleting the system prompt.

If you enable Memory, you can use the system prompt with OpenAI models like gpt-3.5.

To reproduce:

  1. Remove the memory
  2. Write something in the system prompt, whatever you want
  3. Use an OpenAI model (an exception is thrown) or a local model (it hangs)

tjbck (Contributor) commented May 20, 2024

Please pull the current latest main, and let me know if the issue persists!

(screenshot attached)

@tjbck tjbck closed this as completed May 20, 2024
@CultusMechanicus

It's still not working properly: disabling the Memory toggle now also disables the System Prompt.


tjbck commented May 20, 2024

@CultusMechanicus Added a fix with #2427, let me know if the issue persists!

@JamesBedwell

This seems related to my issue #2441. Enabling Memory lets me send messages again, but the system prompt is ignored. I have not yet tried the latest build with fix #2427.
