Open WebUI quietly releases 0.5.11, adding one of the best dev-focused features ever: Jupyter notebook support

If you’ve been wanting to run Python programs directly in Open WebUI but found the libraries available in the Pyodide sandbox too limiting, good news: Open WebUI just added support for Jupyter Notebook.

Why is this so cool? The big deal (for me at least) is that connecting Open WebUI to Jupyter lets you load whatever Python libraries you want into your local Python environment, so the code your LLM writes in response to your prompt will actually execute (provided the “code interpreter” feature in Open WebUI is turned on and pointed at your Jupyter instance). Of course, this is also hugely dangerous: it bypasses the Pyodide sandbox and executes via the Jupyter instance you point it to in the configuration settings. So be careful what you ask it to write.

Anyways, don’t sleep on this release. I got it running and had it one-shot the creation of a synthetic dataset using the Python Faker library, writing the records to the console and also saving a .txt file to the current working directory on my local computer. As with most new Open WebUI features, there is pretty much no documentation yet on how to set it up.

Here are the basics of how I got it running:

  1. Make sure you have Anaconda and Jupyter set up, with Jupyter running on your host computer.
  2. In Open WebUI, go to Admin Settings > Code Interpreter and change the engine from “Pyodide” to “Jupyter”.
  3. For the host, if you’re running Open WebUI via Docker, it’s probably going to be:

http://host.docker.internal:8888
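One caveat worth knowing: host.docker.internal resolves out of the box on Docker Desktop (macOS/Windows), but on plain Linux Docker you typically have to map it yourself when starting the container. A sketch, using the project’s standard image name and port mapping:

```shell
# Linux only: host.docker.internal isn't defined by default, so map it
# to the host gateway when launching the Open WebUI container.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  ghcr.io/open-webui/open-webui:main
```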

Note: By default, Jupyter uses token-based authentication.

  4. Choose “token” for authentication and copy your token from the running Jupyter terminal window. (This token changes every time you restart Jupyter, btw, unless you set it otherwise.)
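If you’d rather not fish the token out of the terminal (or re-copy it after every restart), two commands are handy here; the token value below is just an example:

```shell
# List running Jupyter servers along with their access tokens
jupyter notebook list

# Or start Jupyter with a fixed token so it survives restarts
jupyter notebook --NotebookApp.token='my-example-token'
```

Pinning the token means the value you paste into Open WebUI keeps working across Jupyter restarts.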

If you are using Docker to host Open WebUI, you’ll probably also need to make the change below to get it to work. Note: there are obvious security risks in changing this setting.

  1. From an Anaconda terminal, run:

jupyter notebook --generate-config

  2. Open the jupyter_notebook_config.py file that was just created (typically in ~/.jupyter/) and edit it.

  3. Look for the

c.NotebookApp.allow_remote_access

setting, remove the “#” to uncomment it, and change its value to True.
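For reference, the edited line in jupyter_notebook_config.py ends up looking something like this (assuming a classic Notebook server; newer Jupyter Server installs use c.ServerApp instead of c.NotebookApp):

```python
# ~/.jupyter/jupyter_notebook_config.py
# Uncommented (leading "#" removed) and flipped from False to True:
c.NotebookApp.allow_remote_access = True
```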

That’s it. Now you can install whatever Python libraries you want in your host environment, and the code the LLM writes in the Open WebUI chat can call and run them. Again, this could be very dangerous since the code executes in the context of wherever Jupyter is running, but it’s still pretty badass to watch an LLM one-shot code and run it instantly in the chat.

https://github.com/open-webui/open-webui/releases