
Support for Bunny VLM (SigLip + Phi-3) #7404

Closed
criminact opened this issue May 20, 2024 · 1 comment


criminact commented May 20, 2024

Can we add support for this Bunny model? Local SigLip + Phi-3 would be great.

Bunny (SigLip + Llama-3) is already supported, I think? - https://github.com/BAAI-DCAI/Bunny/blob/main/README.md

criminact changed the title from "Support for Bunny VLM" to "Support for Bunny VLM (SigLip + Phi-3)" on May 20, 2024

criminact (author) commented

Bunny (SigLip + Phi-3) is already supported - https://huggingface.co/BAAI/Bunny-v1_0-4B-gguf
Bunny (SigLip + Llama-3) is already supported - https://huggingface.co/BAAI/Bunny-Llama-3-8B-V-gguf
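For anyone landing here: a minimal sketch of how a GGUF build like the ones above can be run locally with llama.cpp's llava example. The exact GGUF and mmproj file names below are assumptions; check the Hugging Face model card for the real file names in the repo.

```shell
# Sketch only: running a Bunny GGUF with llama.cpp's llava-cli.
# File names are illustrative (assumptions); consult the model card at
# https://huggingface.co/BAAI/Bunny-v1_0-4B-gguf for the actual artifacts.
# -m        quantized language-model GGUF (Phi-3 here)
# --mmproj  multimodal projector GGUF (the SigLip vision side)
./llava-cli \
  -m Bunny-v1_0-4B-Q4_K_M.gguf \
  --mmproj mmproj-model-f16.gguf \
  --image ./example.png \
  -p "Describe this image." \
  --temp 0.1
```

The key point is that a VLM in llama.cpp ships as two GGUF files: the language model passed with `-m` and the vision projector passed with `--mmproj`; both must come from the same conversion for the embeddings to line up.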
