
Prompt format for SeaLLM model? #64

Closed
duyluandethuong opened this issue May 19, 2024 · 2 comments

Comments


duyluandethuong commented May 19, 2024

I'm trying to use this model: https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5-GGUF

The SeaLLM-7B-v2.5 has the following preset for LM studio: https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5-GGUF/blob/main/seallm-v2.5.preset.json

I don't know how to convert this into the prompt format for LLMFarm. Please help.

When I try to run it in Xcode, it prints the following lines:

[Screenshot 2024-05-19 at 10:52:14 AM]
guinmoon (Owner) commented May 19, 2024

Try this: BOS = false, EOS = false

[system](<|im_start|>system
You are a helpful assistant.)
<eos><|im_start|>user
{prompt}<eos><|im_start|>assistant

stop words = <|im_start|>,<eos>
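For anyone wondering how this template looks once it is filled in, here is a minimal sketch in plain Swift. It is not LLMFarm's actual template engine; it simply assumes `{prompt}` is replaced verbatim with the user's message, and the example message itself is hypothetical.

```swift
import Foundation

// Minimal sketch: expand the template above by substituting {prompt}.
// The [system](...) wrapper and the <eos>/<|im_start|> tokens are taken
// verbatim from guinmoon's template; only the user message is made up.
let template = """
[system](<|im_start|>system
You are a helpful assistant.)
<eos><|im_start|>user
{prompt}<eos><|im_start|>assistant
"""

let userMessage = "Xin chào, bạn có khỏe không?" // hypothetical example input
let assembled = template.replacingOccurrences(of: "{prompt}", with: userMessage)
print(assembled)
// Generation would then stop when one of the configured stop words
// (<|im_start|> or <eos>) appears in the output.
```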

guinmoon (Owner) commented

Did this template help you?
