
bug: Uncaught (in promise) TypeErrors in Open WebUI, causes page freeze & browser console errors until refresh #2208

Open
silentoplayz opened this issue May 12, 2024 · 5 comments
Labels
bug (Something isn't working) · core (core feature) · help wanted (Extra attention is needed)

Comments

@silentoplayz
Collaborator

silentoplayz commented May 12, 2024

Bug Report

Description

Bug Summary:

  • I've found several Uncaught (in promise) TypeErrors in Open WebUI that cause freeze-like behavior on the page; the browser console errors below should help debug the issues.

Expected Behavior:

  • The page should not freeze due to Uncaught (in promise) TypeErrors; the user should not need a manual refresh to unfreeze the page and clear the browser console errors.

Actual Behavior:

  • Several different Uncaught (in promise) TypeErrors occur, followed by a series of Immutable-related error messages; the page freezes until the user manually refreshes it.

Environment

  • Open WebUI Version: v0.3.2 (latest)
  • Ollama: v0.1.35–v0.1.42 (the bug is present across all of these versions in my testing).
  • Operating System: Windows 11 Pro Insider Preview (Edition) - Version: 24H2 - Installed on: May 19, 2024 - OS build: 26120.670 - Experience: Windows Feature Experience Pack 1000.26100.6.0
  • Browser: Firefox v126.0.1 (64-bit)

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs + Reproduction Details:

1st error (browser console log):

Uncaught (in promise) TypeError: t is null
    $ dom.js:252
    m Tooltip.svelte:39
    _t Component.js:44
    m ResponseMessage.svelte:978
    p ResponseMessage.svelte:945
    p ResponseMessage.svelte:642
    p ResponseMessage.svelte:584
    p ResponseMessage.svelte:423
    p ResponseMessage.svelte:383
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    ee Chat.svelte:1236
    q Chat.svelte:1219
    ctx Component.js:138
    K Messages.svelte:215
    K Messages.svelte:212
    j Messages.svelte:295
    n Messages.svelte:311
    t lifecycle.js:105
    t lifecycle.js:104
    D UserMessage.svelte:55
    te UserMessage.svelte:345
    c rocket-loader.min.js:1
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m UserMessage.svelte:362
    m Tooltip.svelte:39
    _t Component.js:44
    m UserMessage.svelte:341
    m UserMessage.svelte:419
    m UserMessage.svelte:422
    _t Component.js:44
    m Messages.svelte:308
    m Messages.svelte:370
    p Messages.svelte:286
    p Messages.svelte:285
    p Messages.svelte:380
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    Nt Chat.svelte:300
    xe MessageInput.svelte:706
    c rocket-loader.min.js:1
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m MessageInput.svelte:987
    m MessageInput.svelte:445
    _t Component.js:44
    m Chat.svelte:1271
    m Chat.svelte:1204
    _t Component.js:44
    m Help.svelte:40
    _t Component.js:44
    m root.svelte:54
    m root.svelte:49
    m +layout.svelte:196
    p +layout.svelte:193
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    si +layout.svelte:179
    F utils.js:41
dom.js:252:62

Steps to Reproduce 1st Error:

  1. Log into Open WebUI on either one of my personal TLDs or directly via localhost.
  2. Choose a local or external large language model in the model selector dropdown to begin chatting with.
  3. Send a query and await the finished response.
  4. Send another query and await the finished response.
  5. Delete your most recent message (the 2nd query) sent to the LLM.
  6. Press F12 on your keyboard to open up your browser console and switch to the Console tab to observe error.
  • The bugs reported above are reproducible with both local and external models.
  • Following every step above exactly is crucial to replicating these same (or related) errors in your browser console.
  7. Press the Edit button of the first message you sent in the chat (the one before the LLM's response).
  8. Observe the new error:
Uncaught (in promise) TypeError: C is undefined
    I UserMessage.svelte:36
    ee UserMessage.svelte:295
    c rocket-loader.min.js:1
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m UserMessage.svelte:312
    m Tooltip.svelte:39
    _t Component.js:44
    m UserMessage.svelte:291
    m UserMessage.svelte:419
    m UserMessage.svelte:422
    _t Component.js:44
    m Messages.svelte:308
    m Messages.svelte:370
    m Messages.svelte:374
    p Messages.svelte:285
    p Messages.svelte:380
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    Zo Sidebar.svelte:39
    o index.js:56
    Ut Chat.svelte:431
    Nt Chat.svelte:362
    Rt MessageInput.svelte:475
    jt dom.js:371
    c rocket-loader.min.js:1
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m MessageInput.svelte:987
    m MessageInput.svelte:445
    _t Component.js:44
    m Chat.svelte:1271
    m Chat.svelte:1204
    _t Component.js:44
    m Help.svelte:40
    _t Component.js:44
    m root.svelte:54
    m root.svelte:49
    m +layout.svelte:196
    p +layout.svelte:193
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    si +layout.svelte:179
    F utils.js:41
    _t Component.js:47
    ut scheduler.js:99
    promise callback*lt scheduler.js:20
UserMessage.svelte:36:2
  9. Press the Edit button of the LLM's response to your message.
  10. Observe the new error:
Uncaught (in promise) TypeError: B is undefined
    ze ResponseMessage.svelte:316
    Ne ResponseMessage.svelte:650
    c rocket-loader.min.js:1
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m ResponseMessage.svelte:665
    m Tooltip.svelte:39
    _t Component.js:44
    m ResponseMessage.svelte:644
    m ResponseMessage.svelte:945
    m ResponseMessage.svelte:1007
    p ResponseMessage.svelte:584
    p ResponseMessage.svelte:423
    p ResponseMessage.svelte:383
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    W Chat.svelte:974
    Ut Chat.svelte:495
    Ut Chat.svelte:441
    Nt Chat.svelte:362
    Rt MessageInput.svelte:475
    jt dom.js:371
    c rocket-loader.min.js:1
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m MessageInput.svelte:987
    m MessageInput.svelte:445
    _t Component.js:44
    m Chat.svelte:1271
    m Chat.svelte:1204
    _t Component.js:44
    m Help.svelte:40
    _t Component.js:44
    m root.svelte:54
    m root.svelte:49
    m +layout.svelte:196
    p +layout.svelte:193
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    si +layout.svelte:179
    F utils.js:41
    _t Component.js:47
    ut scheduler.js:99
    promise callback*lt scheduler.js:20
ResponseMessage.svelte:316:2
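
The three traces above share a shape: an async UI handler dereferences a message object that no longer exists (deleted, or mid-update), the resulting TypeError rejects a promise that nothing awaits, and it surfaces as "Uncaught (in promise) TypeError" while the UI is left half-updated. A minimal sketch of this failure class and a defensive guard (illustrative only — `history`, `editMessageUnsafe`, and `editMessageSafe` are hypothetical names, not Open WebUI's actual code):

```javascript
// Hypothetical message store, mirroring a chat history keyed by message id.
const history = { messages: { a1: { id: 'a1', content: 'hello' } } };

// Unsafe handler: assumes the message still exists. If the message was
// deleted, history.messages[id] is undefined and .content throws a
// TypeError inside the async function, rejecting a promise nobody awaits.
async function editMessageUnsafe(id) {
  return history.messages[id].content.toUpperCase();
}

// Defensive handler: check existence (or use optional chaining) before
// dereferencing, so a stale id degrades gracefully instead of rejecting.
async function editMessageSafe(id) {
  const message = history.messages[id];
  if (!message) return null; // message was deleted; nothing to edit
  return message.content.toUpperCase();
}

async function demo() {
  delete history.messages.a1; // simulate "Delete your most recent message"
  const unsafe = await editMessageUnsafe('a1').catch((e) => e.name);
  const safe = await editMessageSafe('a1');
  return { unsafe, safe }; // unsafe: 'TypeError', safe: null
}
```

The guard does not fix the underlying state desynchronization, but it keeps a stale reference from rejecting an unobserved promise and freezing the page.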

2nd error (browser console log) — likely minor: it doesn't appear to cause any visible problems and seems unrelated to the main bug reported here:

This error occurred upon signing out of my Open WebUI account:

Error: Promised response from onMessage listener went out of scope 6 background.js:505:27

This error occurred upon signing into my Open WebUI account:

Error: Promised response from onMessage listener went out of scope

Docker Container Logs:

  • Docker Desktop/container logs are irrelevant to the issue at hand; nothing in them indicates an error the way the browser logs do.

Installation Method

Docker Desktop via a custom-built docker-compose.yml file for my Open WebUI instance and domains. Ollama is running natively on Windows via the Windows (Preview) version.
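
For reference, a minimal docker-compose.yml for this kind of setup (an illustrative sketch, not the reporter's actual file; it assumes the standard ghcr.io image and that the natively running Ollama is reachable from the container via host.docker.internal):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                  # the WebUI listens on 8080 inside the container
    environment:
      # Point the container at Ollama running natively on the Windows host
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    volumes:
      - open-webui:/app/backend/data # persist chats, settings, and accounts
    restart: unless-stopped

volumes:
  open-webui:
```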

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

Screen Recording:

Bug

Please let me know if I've missed anything or if there's any additional information needed!

@silentoplayz silentoplayz changed the title Bug: Uncaught TypeError when (not) annotating language model responses in Open WebUI, causing page freeze until refresh bug: Uncaught TypeError when (not) annotating language model responses in Open WebUI, causing page freeze until refresh May 12, 2024
@silentoplayz silentoplayz changed the title bug: Uncaught TypeError when (not) annotating language model responses in Open WebUI, causing page freeze until refresh bug: Uncaught TypeError when (not) annotating LLM responses in Open WebUI, causes page freeze until refresh May 12, 2024
@tjbck
Contributor

tjbck commented May 13, 2024

1st issue should be fixed on dev!

tjbck added a commit that referenced this issue May 13, 2024
Haunui pushed a commit to Haunui/open-webui that referenced this issue May 17, 2024
@Yanyutin753
Contributor

Every time I delete a message, the page throws an error and cannot be used normally; it only works again after refreshing the page.

Console error

(screenshot attached)

@silentoplayz silentoplayz changed the title bug: Uncaught TypeError when (not) annotating LLM responses in Open WebUI, causes page freeze until refresh bug: Uncaught TypeErrors in Open WebUI, causes page freeze & browser console errors until refresh May 22, 2024
@silentoplayz silentoplayz changed the title bug: Uncaught TypeErrors in Open WebUI, causes page freeze & browser console errors until refresh bug: Uncaught (in promise) TypeErrors in Open WebUI, causes page freeze & browser console errors until refresh May 22, 2024
@tjbck tjbck added bug Something isn't working help wanted Extra attention is needed core core feature labels May 26, 2024
@kojdj0811

I noticed a similar error. I'm using a 70B model on a slow GPU (P40); I get an error at exactly 5 minutes and the text stops updating.

Even after this happened, the model was still generating its answer on the server. Once generation finished, VRAM was cleared after an additional 5 minutes, matching Ollama's default OLLAMA_KEEP_ALIVE setting.

(screenshot attached)
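
For anyone hitting the 5-minute unload described above: Ollama's keep-alive window is configurable via the OLLAMA_KEEP_ALIVE environment variable (the duration below is an arbitrary example, not a recommended value):

```shell
# OLLAMA_KEEP_ALIVE controls how long Ollama keeps a model loaded in VRAM
# after the last request; the default is 5 minutes, matching the unload
# observed above. Set it before starting the server (on native Windows,
# use `setx OLLAMA_KEEP_ALIVE 30m` instead and restart Ollama).
export OLLAMA_KEEP_ALIVE=30m
```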

@silentoplayz
Collaborator Author

silentoplayz commented Jun 11, 2024

I noticed a similar error. I'm using a 70B model on a slow GPU (P40); I get an error at exactly 5 minutes and the text stops updating.

Even after this happened, the model was still generating its answer on the server. Once generation finished, VRAM was cleared after an additional 5 minutes, matching Ollama's default OLLAMA_KEEP_ALIVE setting.

If you're still experiencing this problem, I recommend opening up a separate new bug report with more details that may help the repo maintainer (or contributors) to debug and solve the issue within the codebase. Logs could be handy here.

@silentoplayz
Collaborator Author

1st issue should be fixed on dev!

I've since updated the bug report, so this comment no longer reflects its current state (just a heads up!).

On a lighter note, I'm glad I could help by reporting bugs I've found that you then have fixed to make Open WebUI better for everyone to enjoy. Thanks for your hard work. 🍻

4 participants