
feat: function calling #2175

Closed
wants to merge 4 commits

Conversation

@not-nullptr (Contributor) commented May 11, 2024

Pull Request Checklist

  • Target branch: Pull requests should target the dev branch.
  • Description: Briefly describe the changes in this pull request.
  • Changelog: Ensure a changelog entry following the format of Keep a Changelog is added at the bottom of the PR description.
  • Documentation: Have you updated relevant documentation (Open WebUI Docs, or other documentation sources)?
  • Dependencies: Are there any new dependencies? Have you updated the dependency versions in the documentation?
  • Testing: Have you written and run sufficient tests for the changes?
  • Code Review: Have you self-reviewed your code and addressed any coding standard issues?

Description

i've implemented function calling, since this is one of the last "big" LLM-related features that open-webui has yet to implement.


Changelog Entry

Added

  • function calling (incl. settings menu)
  • in-browser monaco editor for typescript function editing

Fixed

(n/a)

Changed

(n/a)

Removed

(n/a)

Security

(n/a)

Breaking Changes

(n/a)


Additional Information

this PR also adds monaco as a dependency, since functions are written in the browser with first-class typescript support. i looked into writing my own editor, but that's not an option (for me, at least).
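
For reference, a minimal sketch of how a Monaco editor instance can be created for TypeScript in the browser (the container element, id, and initial value are illustrative, not the PR's actual wiring):

```ts
import * as monaco from 'monaco-editor';

// Create an editor bound to an existing container element (id is illustrative).
const editor = monaco.editor.create(document.getElementById('function-editor')!, {
  value: 'export function myFunction(param1: string) {\n  return param1;\n}\n',
  language: 'typescript',
  automaticLayout: true,
});

// Read the current source back out, e.g. before saving.
const source = editor.getValue();
```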

functions are written to localStorage; this is because i don't know python and i'd rather not write poor backend code.
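
A minimal sketch of what client-side persistence along these lines might look like (the storage key and the stored-function shape are assumptions, not the PR's actual schema):

```ts
// Hypothetical shape of a stored function; the real PR may store different fields.
interface StoredFunction {
  name: string;
  description: string;
  source: string; // TypeScript source authored in the editor
}

const STORAGE_KEY = 'functions'; // illustrative key

function saveFunctions(fns: StoredFunction[]): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(fns));
}

function loadFunctions(): StoredFunction[] {
  return JSON.parse(localStorage.getItem(STORAGE_KEY) ?? '[]');
}
```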

there's still some work to be done (such as adding validation for parameter names, which should only contain characters that are valid in a javascript identifier), but it's functional.
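
As a rough illustration of that missing validation, a simple ASCII-only check for identifier-safe parameter names could look like this (a sketch, not code from the PR):

```ts
// ASCII approximation of a valid JavaScript identifier; the full spec also allows
// certain Unicode characters, which this deliberately ignores for simplicity.
const IDENTIFIER_RE = /^[A-Za-z_$][A-Za-z0-9_$]*$/;

function isValidParameterName(name: string): boolean {
  return IDENTIFIER_RE.test(name);
}

// isValidParameterName('city')  -> true
// isValidParameterName('2fast') -> false
```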

demo:
https://github.com/open-webui/open-webui/assets/62841684/70bcdd8c-d887-43f7-8f7b-54b5ac1dab31

@sebdg commented May 12, 2024

Hi there, thanks for the nice work. I can indeed make a function, and using phi3 it was rather easy to make it call it, but it seems the code-editing feature doesn't save my changes. Through the development console I was able to retrieve, modify, and set back my modified code.
I was then able to get my function called, as you can see here.
[screenshot]

My function should have returned a dummy value to the LLM, like:
[screenshot]
But it seems that isn't sent back to the LLM, and it doesn't take the response into account; I'm probably doing it wrong on that part.

But again, great work, the settings page etc. are cool. The storage should be server-side, but hey, that's the next step 👍 good work. I can easily see how you could integrate that with other tools/APIs. Maybe a sort of template could be proposed, like an 'http' request that sends the request 'as-is' somewhere else, à la webhooks or whatever other API the user has set up with streamlit, fastapi... whatnot.
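
A rough sketch of what such a pass-through "webhook" template might look like (the endpoint URL and payload shape are purely illustrative):

```ts
// Hypothetical template: forward the model's arguments to a user-configured endpoint as-is.
async function forwardToWebhook(args: Record<string, unknown>): Promise<string> {
  const endpoint = 'https://example.com/my-webhook'; // illustrative URL
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(args),
  });
  // Return the raw response body so the LLM can incorporate it into its reply.
  return await res.text();
}
```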

@not-nullptr (Contributor, Author)

@sebdg thanks for the response! you need to press "save and go back" or ctrl+s in order for your code to actually be saved. this is a remnant from my old project and i'm sure i could make it autosave on key press :)

also it seems your code has an error - at runtime it would not give the LLM a response because you have not defined params1 and params2 (you need to add them to the function's parameters - this is just typescript with all the bells and whistles)

if you add those, it'll work (but you may get weird results logged since "param1" and "param2" are nondescript names)
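
For illustration, this is the TypeScript requirement being described (the function and parameter names here are hypothetical, not sebdg's actual code):

```ts
// Parameters must be declared in the function signature before the body can use them.
function getDummyWeather(param1: string, param2: string): string {
  // Return a dummy value that gets handed back to the LLM.
  return `Weather for ${param1} on ${param2}: sunny, 21°C.`;
}
```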

@tjbck (Contributor) commented May 13, 2024

Amazing stuff, I'll review in a bit 🙌

@not-nullptr (Contributor, Author)

> Amazing stuff, I'll review in a bit 🙌

not sure how ready this thing is, but i super appreciate it. if you reckon it's ready enough then merging is up to you :)

@silentoplayz (Collaborator) commented May 16, 2024

There might be room for fixes based on this build log of mine:
PRBuildLog.txt

@not-nullptr (Contributor, Author)

> There might be room for fixes based on this build log of mine: PRBuildLog.txt

i'm unsure what linting (i assume?) you guys are using, but it seems to be detecting problems in the monaco module, which it shouldn't. this isn't an issue with my code

@audy commented May 16, 2024

@not-nullptr this is excellent! My feedback after using this is that I wish there was a way to render the response from the API in open-webui directly, without passing it through the model first. For example, I have a function that fetches some JSON and formats a response as markdown.

@not-nullptr (Contributor, Author)

thanks! though at that point, wouldn't you just want to use a separate program or something? seems a bit unnecessary to run it through open webui

@audy commented May 16, 2024

> thanks! though at that point, wouldn't you just want to use a separate program or something? seems a bit unnecessary to run it through open webui

Having the model is nice for figuring out which endpoint to call and mapping the params, but I found that passing the response back through the model resulted in weird behavior (like Phi3 telling me that "a chance of Llamas" is not a real type of weather). Maybe this can be fixed with prompt engineering, though, so it's not really in scope?

@not-nullptr (Contributor, Author)

i feel that's more of a prompt engineering issue, like you said. wrangling smaller models to output what you want can be tricky though, i get what you mean...

@silentoplayz (Collaborator) commented May 17, 2024

> There might be room for fixes based on this build log of mine: PRBuildLog.txt

> i'm unsure what linting (i assume?) you guys are using, but it seems to be detecting problems in the monaco module, which it shouldn't. this isn't an issue with my code

Building your PR took 30-35 seconds longer than building the dev branch itself with the latest commits, and the build also throws a few warnings that the use of eval in specific file paths is strongly discouraged, as it poses security risks and may cause issues with minification. Neither of these is an issue outside of this PR, even if the latter are simply linter warnings. I mean no harsh feelings; this is just my two cents. My intention is not to discourage but to provide constructive feedback, and I'm personally hoping for function calling to make its way into Open WebUI. I simply hope any issues with the PR can be addressed before it is reviewed and possibly merged.

@not-nullptr (Contributor, Author)

eval is used in order to actually execute the function that gets called. all executed code is written by the user, so it's not unsafe. the higher build times are probably due to monaco being added as a dependency, as it isn't very light. this current approach is the only way to get typescript intellisense in the browser afaik
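
Roughly, the eval-based execution path being described could look like the sketch below (illustrative only; the PR's actual code may differ, and the names are hypothetical):

```ts
// Hypothetical sketch: run a user-authored function with eval and capture its return value.
// The source is assumed to have already been transpiled from TypeScript to plain JavaScript.
function runUserFunction(jsSource: string, args: Record<string, unknown>): unknown {
  // Wrap the user code so it evaluates to a callable, then invoke it with the model's arguments.
  const fn = eval(`(${jsSource})`) as (args: Record<string, unknown>) => unknown;
  return fn(args);
}
```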

@Stargate256

Does it not work with the OpenAI API, or am I doing something wrong?

@not-nullptr (Contributor, Author)

> Does it not work with the OpenAI API, or am I doing something wrong?

nope, custom method. apologies (though this is a pretty good idea)

@hchasens commented May 20, 2024

As much as I want to see function calling added quickly, everyone else uses the OpenAI API standard, so it makes sense to implement it that way here. If it's added later it'll break a bunch of things, and the project might be forced to support a deprecated API for a while to come. This way you also gain compatibility with all the programs already out there.
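
For context, OpenAI-style function calling describes tools to the model with JSON Schema definitions along these lines (a simplified illustration with a made-up function, not code from this PR):

```ts
// A simplified OpenAI-style tool definition (illustrative function name and parameters).
const tools = [
  {
    type: 'function',
    function: {
      name: 'get_current_weather',
      description: 'Get the current weather for a given location',
      parameters: {
        type: 'object',
        properties: {
          location: { type: 'string', description: 'City name, e.g. London' },
        },
        required: ['location'],
      },
    },
  },
];
```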

@not-nullptr (Contributor, Author)

> As much as I want to see function calling added quickly, everyone else uses the OpenAI API standard, so it makes sense to implement it that way here. If it's added later it'll break a bunch of things, and the project might be forced to support a deprecated API for a while to come. This way you also gain compatibility with all the programs already out there.

all function calling in this PR occurs on the client.

@tjbck (Contributor) commented May 20, 2024

I agree with @not-nullptr; I'll be taking a look in a bit to try to have this merged for our next release.

We can always add an OpenAI API compatible function calling feature later and have the best of both worlds.

@not-nullptr (Contributor, Author)

mega appreciate it

@hchasens

> all function calling in this PR occurs on the client.

Apologies, I didn't notice. In that case I'd love to see this added!

@tjbck added this to the v0.2.0 milestone on May 26, 2024
@tjbck (Contributor) commented Jun 2, 2024

I'll be closing this in favour of Pipelines Plugin function calling support, but I might cherry-pick some changes here to support browser-side JS function calling later down the line. Thank you for all your hard work, @not-nullptr! I've added you as a co-author for v0.2.0 in recognition of your inspiring contributions :)

#798 (comment)

https://github.com/open-webui/pipelines/blob/main/examples/function_calling/function_calling_filter_pipeline.py
[screenshot]

@hchasens commented Jun 6, 2024

Pipelines is a more robust system, but I love the feature of having a web-based editor. In the future, could we perhaps get a frontend for pipelines (with an editor), either as part of pipeline deployment or as part of the admin settings in webui? It'd streamline testing and deployment!

@not-nullptr (Contributor, Author)

> Pipelines is a more robust system, but I love the feature of having a web-based editor. In the future, could we perhaps get a frontend for pipelines (with an editor), either as part of pipeline deployment or as part of the admin settings in webui? It'd streamline testing and deployment!

i am more than happy to implement everything else if someone can get a code editor with completion working in the browser for python.

@tjbck (Contributor) commented Jun 6, 2024

Coming soon! #2825

I'll let you know where we could use some help once I set up the scaffolding! (perhaps the reintroduction of js function calling!)
