
Choosing a model should be exclusively done through the UI. #1894

Closed
alimsyed opened this issue May 18, 2024 · 6 comments
Labels
enhancement New feature or request

Comments

@alimsyed

What problem or use case are you trying to solve?
The settings for choosing which LLM to use are not clear. If LLM selection were done entirely in the UI, you could switch the LLM behind the scenes (which you do want to do often) without restarting the setup. The frontend form would allow you to choose the LLM the way it does now.

Describe the UX of the solution you'd like
Allowing the LLM to be chosen exclusively through the frontend would remove a lot of ambiguity and make life much easier for the average user, driving up interest.

Do you have thoughts on the technical implementation?
The implementation shouldn't be difficult. The form responsible for setting up the LLM should also allow saving API keys if the user chooses to do so.

Describe alternatives you've considered

Additional context
Making the project more user-friendly is going to drive up usage.

alimsyed added the enhancement (New feature or request) label on May 18, 2024
@Shimada666
Contributor

I'm not quite sure I understand what you mean. The current frontend already allows for selecting the model and API key. What additional support would you like to see added?

@alimsyed
Author

For instance, the documentation regarding Groq integration requires the API key to be provided at the time of starting the container:
https://github.com/xuhe2/OpenDevin/blob/1c2e276e4a0358ef06ebf50526f902c6e222f129/docs/modules/usage/llms/groqLLMs.md

I am suggesting that no LLM-related information should ever be required when starting the container. All LLM-related settings should be done through the existing UI form.

@Shimada666
Contributor

Perhaps we can support inputting the Groq API key when using the Groq model, the Gemini API key when using the Gemini model, and so on...
Another way is to expose all configuration options for user input, but that might be too many.
How do you think we should implement this? @amanape

@alimsyed
Author

I would suggest having a smarter form. We can make the API key a mandatory field if you choose an online LLM provider like Groq or Gemini. If a local provider (Ollama) is being used, the API key field would not appear, or at least would become non-mandatory.
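
For illustration, a minimal TypeScript sketch of that kind of conditional validation could look like this (the provider ids, field names, and helper are hypothetical, not OpenDevin's actual frontend code):

```typescript
// Hypothetical sketch of the "smarter form" validation; not OpenDevin's real frontend code.
type ProviderId = "groq" | "gemini" | "openai" | "ollama";

interface LLMSettings {
  provider: ProviderId;
  model: string;
  apiKey?: string;
}

// Local providers don't need a real key, so the field can be hidden or left optional.
const LOCAL_PROVIDERS = new Set<ProviderId>(["ollama"]);

function isApiKeyRequired(provider: ProviderId): boolean {
  return !LOCAL_PROVIDERS.has(provider);
}

function validateSettings(settings: LLMSettings): string[] {
  const errors: string[] = [];
  if (!settings.model) {
    errors.push("Model is required.");
  }
  if (isApiKeyRequired(settings.provider) && !settings.apiKey?.trim()) {
    errors.push(`An API key is required for the "${settings.provider}" provider.`);
  }
  return errors;
}

// Online provider without a key fails validation; local Ollama passes.
console.log(validateSettings({ provider: "groq", model: "example-model" }));
console.log(validateSettings({ provider: "ollama", model: "example-model" }));
```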

@enyst
Collaborator

enyst commented May 19, 2024

I think we could fix that documentation to clarify that:

  • with the UI, the user should input LLM_API_KEY in the UI API key field, and that's all they need.
  • llm_api_key in config / UI also works for Groq, as it should.
  • llm_api_key in config, or the Groq API key in env, is used when running without the UI, so it's optional or not used with the UI.

Please do tell if I'm missing something.

@alimsyed just to note:

  • afaik with Ollama, using "ollama" as the API key works fine.
  • there are online providers which do not work with an API key; for example, Azure has an Azure Vault feature which does not use API keys at all and needs different environment variables instead. So I don't think the API key should be mandatory, because we may break those use cases (a rough per-provider sketch follows below).
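
To avoid breaking those setups, the form could describe what each provider actually needs instead of treating the API key as universally mandatory. A rough TypeScript sketch of that idea follows; the provider names, field lists, and the Azure entry are illustrative assumptions, not the project's real configuration:

```typescript
// Hypothetical per-provider requirements; names and fields are illustrative only.
interface ProviderRequirements {
  requiresApiKey: boolean;
  // Settings the provider needs instead of (or in addition to) an API key.
  extraFields: string[];
}

const PROVIDER_REQUIREMENTS: Record<string, ProviderRequirements> = {
  groq:   { requiresApiKey: true,  extraFields: [] },
  gemini: { requiresApiKey: true,  extraFields: [] },
  ollama: { requiresApiKey: false, extraFields: [] },
  // Placeholder: a vault-style Azure setup may need other settings rather than a key.
  azure:  { requiresApiKey: false, extraFields: ["endpoint", "deployment"] },
};

function missingSettings(provider: string, given: Record<string, string>): string[] {
  const req = PROVIDER_REQUIREMENTS[provider] ?? { requiresApiKey: false, extraFields: [] };
  const missing: string[] = [];
  if (req.requiresApiKey && !given["apiKey"]) missing.push("apiKey");
  for (const field of req.extraFields) {
    if (!given[field]) missing.push(field);
  }
  return missing;
}

// An Azure-style provider is valid without an API key but still needs its own fields.
console.log(missingSettings("azure", { endpoint: "https://example" })); // ["deployment"]
```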

@enyst
Collaborator

enyst commented May 20, 2024

@alimsyed I just checked for that document, and it doesn't exist in OpenDevin; it's only on a branch, because the PR proposing the Groq document has not been merged:
#1542

And I don't find that page on the documentation site:
https://opendevin.github.io/OpenDevin/modules/usage/llms

I think that's okay, precisely because there does not seem to be anything particular about Groq in this respect: it works like other providers, already the way you propose, from the UI.

I will close this issue; please do let us know, though, if there are any problems.

enyst closed this as completed on May 20, 2024