
bug: hanging connection causing blank screen #2337

Closed · 3 of 4 tasks
Zambito1 opened this issue May 17, 2024 · 13 comments
Comments

Zambito1 commented May 17, 2024

Bug Report

Description

Bug Summary:
If the Open WebUI backend hangs indefinitely, the UI will show a blank screen with just the keybinding help button in the bottom right.

[screenshot: blank screen with only the keybinding help button in the bottom right]

Steps to Reproduce:

I noticed this because I run Open WebUI on my desktop, and Ollama on another machine. If I connected to a VPN on my desktop, LAN connections from my desktop would hang indefinitely. When I tried to boot up Open WebUI, I would just see the screen above.

The easiest way to reproduce this is to run nc -lp 11434 and try to use that as the Ollama server. You will see something like:

$ nc -lp 11434
GET /api/tags HTTP/1.1
Host: host.docker.internal:11434
Accept: */*
Accept-Encoding: gzip, deflate
User-Agent: Python/3.11 aiohttp/3.9.5

The Open WebUI backend tries to connect to Ollama, but nc never responds to the request, nor does it close the connection. You will see the above screen when you open the web interface.

I was able to work around my VPN issue by splitting my connection so that only the traffic I want to access over the VPN is routed through the virtual interface.

Expected Behavior:

You can see the web interface, and ideally some sort of timeout surfaces a connection error.

Actual Behavior:

You are hit with a wall of nothing.
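
For illustration only, a minimal sketch of the kind of client-side timeout that would turn this silent hang into a visible connection error. This is not Open WebUI's actual code; the aiohttp client is inferred from the User-Agent in the nc output above, and the URL and 5-second budget are assumptions:

import asyncio
import aiohttp

# Assumed value for illustration, taken from the request headers above
OLLAMA_BASE_URL = "http://host.docker.internal:11434"

async def fetch_models():
    # Cap the whole request at 5 seconds instead of waiting forever on a silent peer
    timeout = aiohttp.ClientTimeout(total=5)
    try:
        async with aiohttp.ClientSession(timeout=timeout) as session:
            async with session.get(f"{OLLAMA_BASE_URL}/api/tags") as resp:
                return await resp.json()
    except (asyncio.TimeoutError, aiohttp.ClientError) as err:
        # Surface the failure so the UI can render an error instead of a blank page
        print(f"Connection error: {err}")
        return None

if __name__ == "__main__":
    print(asyncio.run(fetch_models()))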

Environment

  • Open WebUI Version: v0.1.124

  • Ollama (if applicable): NA

  • Operating System: NA

  • Browser (if applicable): NA

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
NA; the connection issue is on the backend, so nothing fails in the UI logs.

Docker Container Logs:
This is what happens when you load the index page:

INFO:     172.17.0.1:58940 - "GET /api/config HTTP/1.1" 200 OK
INFO:     172.17.0.1:58940 - "GET /api/v1/auths/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:58940 - "GET /api/changelog HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()

If you kill nc, the UI will actually show up:

ERROR:apps.ollama.main:Connection error: Server disconnected
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO:     172.17.0.1:40308 - "GET /ollama/api/tags HTTP/1.1" 200 OK
INFO:apps.openai.main:get_all_models()
INFO:apps.openai.main:get_all_models()
INFO:     172.17.0.1:40308 - "GET /openai/api/models HTTP/1.1" 200 OK
INFO:     172.17.0.1:40308 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
INFO:     172.17.0.1:40308 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:40308 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:40308 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:40308 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO:     172.17.0.1:40308 - "GET /ollama/api/tags HTTP/1.1" 200 OK
INFO:     172.17.0.1:40308 - "GET /static/favicon.png HTTP/1.1" 304 Not Modified
INFO:apps.ollama.main:get_all_models()
INFO:     172.17.0.1:40308 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO:     172.17.0.1:49174 - "GET /ollama/api/version HTTP/1.1" 500 Internal Server Error
INFO:apps.openai.main:get_all_models()
INFO:apps.openai.main:get_all_models()
INFO:     172.17.0.1:49174 - "GET /openai/api/models HTTP/1.1" 200 OK
INFO:     172.17.0.1:49174 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK

Screenshots (if applicable):
See above.

Installation Method

Docker, but probably does not matter.

tjbck (Contributor) commented May 17, 2024

PR welcome!

tjbck changed the title from "Blank screen when the Ollama connection is hanging" to "bug: hanging connection causing blank screen" on May 17, 2024
tjbck (Contributor) commented May 17, 2024

Updated some code on our dev branch, let us know if that did anything for you.

Zambito1 (Author) commented

Hey @tjbck, I got around to testing the dev branch today. The problem still seems to happen. I should have said to use nc -lk 11434 or ncat -lk 11434, which will keep listening for connections after the first one. If you do not use -k, refreshing the page will kill netcat and the UI will appear.
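
If -k is not available, a hypothetical Python stand-in behaves the same way for reproduction purposes: it keeps accepting connections, prints each request, and never responds or closes.

import socket
import threading

def hold(conn):
    # Print the incoming request, then hang forever without replying or closing
    print(conn.recv(4096).decode(errors="replace"))
    threading.Event().wait()

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("0.0.0.0", 11434))
srv.listen()
while True:
    conn, _ = srv.accept()
    threading.Thread(target=hold, args=(conn,), daemon=True).start()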

Hiradpi commented May 22, 2024

Having the exact same issue here, but I'm not using Docker.

❯ bash start.sh
Loading WEBUI_SECRET_KEY from file, not provided as an environment variable.
Loading WEBUI_SECRET_KEY from .webui_secret_key

  ___                    __        __   _     _   _ ___ 
 / _ \ _ __   ___ _ __   \ \      / /__| |__ | | | |_ _|
| | | | '_ \ / _ \ '_ \   \ \ /\ / / _ \ '_ \| | | || | 
| |_| | |_) |  __/ | | |   \ V  V /  __/ |_) | |_| || | 
 \___/| .__/ \___|_| |_|    \_/\_/ \___|_.__/ \___/|___|
      |_|                                               

      
v0.2.0.dev1 - building the best open-source AI user interface.      
https://github.com/open-webui/open-webui

INFO:     Started server process [72181]
INFO:     Waiting for application startup.
INFO:apps.litellm.main:start_litellm_background
INFO:apps.litellm.main:run_background_process
INFO:apps.litellm.main:Executing command: ['litellm', '--port', '14365', '--host', '127.0.0.1', '--telemetry', 'False', '--config', '/home/hirad/tool/WEBUI/open-webui/backend/data/litellm/config.yaml']
ERROR:apps.litellm.main:Failed to start subprocess: [Errno 2] No such file or directory
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-3' coro=<start_litellm_background() done, defined at /home/hirad/tool/WEBUI/open-webui/backend/apps/litellm/main.py:130> exception=FileNotFoundError(2, 'No such file or directory')>
Traceback (most recent call last):
  File "/home/hirad/tool/WEBUI/open-webui/backend/apps/litellm/main.py", line 145, in start_litellm_background
    await run_background_process(command)
  File "/home/hirad/tool/WEBUI/open-webui/backend/apps/litellm/main.py", line 106, in run_background_process
    process = await asyncio.create_subprocess_exec(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/asyncio/subprocess.py", line 223, in create_subprocess_exec
    transport, protocol = await loop.subprocess_exec(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "uvloop/loop.pyx", line 2820, in subprocess_exec
  File "uvloop/loop.pyx", line 2778, in __subprocess_run
  File "uvloop/handles/process.pyx", line 611, in uvloop.loop.UVProcessTransport.new
  File "uvloop/handles/process.pyx", line 112, in uvloop.loop.UVProcess._init
FileNotFoundError: [Errno 2] No such file or directory
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO:     127.0.0.1:43710 - "GET / HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/entry/start.b373985c.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43724 - "GET /_app/immutable/chunks/scheduler.bcadb9af.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43744 - "GET /_app/immutable/chunks/index.33abb791.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43764 - "GET /_app/immutable/chunks/preload-helper.a4192956.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /_app/immutable/chunks/singletons.7b5aa448.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43754 - "GET /_app/immutable/entry/app.5440acdc.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/chunks/index.bbb895f1.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /logo.svg HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/nodes/0.f688e5ad.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/chunks/globals.7f7f1b26.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43754 - "GET /_app/immutable/chunks/index.9487f70b.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /_app/immutable/chunks/navigation.7ceac460.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43764 - "GET /_app/immutable/chunks/spread.8a54911c.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43754 - "GET /_app/immutable/assets/Toaster.ebb080d6.css HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43744 - "GET /_app/immutable/chunks/Toaster.svelte_svelte_type_style_lang.a4036649.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43724 - "GET /_app/immutable/assets/0.95c63d39.css HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/chunks/each.f3e982ad.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43724 - "GET /_app/immutable/assets/dayjs.beb6c5d8.css HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /_app/immutable/chunks/index.dffa361e.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43764 - "GET /_app/immutable/chunks/index.2f7d24b6.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/assets/Messages.e8d876d7.css HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43754 - "GET /_app/immutable/assets/2.f41ff901.css HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43744 - "GET /_app/immutable/assets/Chat.2c02f5b5.css HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43724 - "GET /_app/immutable/chunks/index.e5c8c553.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43764 - "GET /_app/immutable/nodes/1.a362bd41.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /_app/immutable/chunks/stores.94806990.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/nodes/2.c36cbd74.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43754 - "GET /_app/immutable/chunks/FileSaver.min.898eb36f.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43724 - "GET /_app/immutable/chunks/_commonjsHelpers.de833af9.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43744 - "GET /_app/immutable/chunks/index.b538cffa.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/chunks/index.c7f29288.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /_app/immutable/chunks/index.ad993ff1.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43764 - "GET /_app/immutable/chunks/index.8f7d3b63.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43754 - "GET /_app/immutable/chunks/index.0c08649e.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43744 - "GET /_app/immutable/chunks/dayjs.min.e4172223.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43724 - "GET /_app/immutable/chunks/index.3d76b714.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/chunks/AdvancedParams.600b48aa.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43754 - "GET /_app/immutable/chunks/index.07ab8f60.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43764 - "GET /_app/immutable/chunks/index.1df8947f.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /_app/immutable/chunks/create.324e2fd2.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43744 - "GET /_app/immutable/chunks/UserMenu.40971939.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/chunks/menu-trigger.9c7a531b.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43724 - "GET /_app/immutable/chunks/updater.a83b9f1f.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43754 - "GET /_app/immutable/chunks/Tags.3118ecf3.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43764 - "GET /_app/immutable/chunks/index.4a35aec8.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /_app/immutable/chunks/index.c4ca6160.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43744 - "GET /_app/immutable/nodes/4.613b5850.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43710 - "GET /_app/immutable/chunks/Chat.67cb31ce.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43754 - "GET /_app/immutable/chunks/Messages.6443db80.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43724 - "GET /_app/immutable/chunks/AddFilesPlaceholder.f82566d6.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43764 - "GET /_app/immutable/chunks/Selector.a20a77c5.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /_app/immutable/chunks/MenuLines.ccad0b33.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /favicon.png HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /manifest.json HTTP/1.1" 200 OK
INFO:     127.0.0.1:43736 - "GET /api/config HTTP/1.1" 200 OK
INFO:     127.0.0.1:43736 - "GET /static/favicon.png HTTP/1.1" 200 OK
INFO:     127.0.0.1:43764 - "GET /_app/immutable/chunks/translation.fb9f8d12.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43736 - "GET /api/v1/auths/ HTTP/1.1" 200 OK
INFO:     127.0.0.1:43736 - "GET /api/changelog HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
INFO:     127.0.0.1:43736 - "GET /assets/fonts/Arimo-Variable.ttf HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:43724 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK
INFO:     127.0.0.1:43754 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
INFO:     127.0.0.1:43710 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
INFO:     127.0.0.1:43744 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
INFO:     127.0.0.1:43764 - "GET /ollama/api/tags HTTP/1.1" 200 OK
INFO:apps.openai.main:get_all_models()
INFO:apps.openai.main:get_all_models()
INFO:     127.0.0.1:43764 - "GET /openai/api/models HTTP/1.1" 200 OK

I also have this variable set:

export OLLAMA_MODELS=/run/media/hirad/HSSD/AI

I ran Ollama with the ollama serve command (again, no Docker). I have llama3 and codellama, and Ollama is running in GPU mode.

The OS is Arch Linux with the LTS kernel.

justinh-rahb (Collaborator) commented

@Hiradpi I'm going to guess this is on Windows. You'll need to set ENABLE_LITELLM=False as an environment variable, since LiteLLM is not supported when running directly on Windows with the method we're using, and it will soon be deprecated and removed from the project anyway.
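
For example, in the backend's .env file (or exported in the shell before running start.sh):

ENABLE_LITELLM=False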

Hiradpi commented May 22, 2024

This worked! I added ENABLE_LITELLM=False to the .env file and it solved the problem. Thanks a lot <3

Zambito1 (Author) commented

For clarity, this is unrelated to the issue originally described. I just tested again on the latest dev branch, and the UI is still blank when the Ollama service is reachable but unresponsive (such as when using netcat instead of Ollama to listen on the port).

tjbck (Contributor) commented May 26, 2024

Just pushed a massive refactor/update to our dev branch, please try again and let us know how it went! Much thanks!

Zambito1 (Author) commented

Now the UI will be blank for 5 seconds, and then it will appear after the backend connection to Ollama times out.

[screenshot]

The UI appears only after the models request finishes. It is definitely much better than indefinitely showing a blank screen (enough that I would consider the original issue resolved), but I do think the UI should be able to render before that request completes. Feel free to close this issue, or use it to keep tracking the UI not appearing until this request finishes, at your discretion.

tjbck (Contributor) commented May 26, 2024

@Zambito1 could you try disabling the Ollama connection from Settings > Connections?

Zambito1 (Author) commented

Hm, that menu actually has some weird behavior when I try to do that. When I navigate there while listening with netcat instead of Ollama, the UI shows Ollama and OpenAI as disabled.

[screenshot: Ollama and OpenAI connections shown as disabled]

When the connection attempt to Ollama times out, the UI will change automatically, switching both to be enabled.

[screenshot: both connections switched to enabled]

I did not interact with anything manually between the above screenshots.

If I have nothing listening on the port that Open WebUI expects Ollama to be on (neither netcat nor Ollama), both connections are immediately populated and enabled when I navigate to the menu.

[screenshot: connections immediately populated and enabled]

Note that I did not manually enter the URL between the above two screenshots. I just closed netcat, and then reopened the settings window.

tjbck (Contributor) commented May 26, 2024

Just added a fix to include a loading screen, let me know if you encounter the same set of issues!

Zambito1 (Author) commented

On commit b7fc37d, the settings will load for 12 seconds before timing out (two sequential 6-second timeouts).

[screenshots: settings stuck in the loading state before the timeout]

After the connection times out, the settings will show with the URLs populated and the connections enabled.

[screenshot: settings with URLs populated and connections enabled]

If I disable the Ollama connection now, I am able to refresh the page without waiting on a blank screen like my earlier tests today.

tjbck closed this as completed Jun 2, 2024