
.Net: Test local model with Semantic Kernel (i.e., Llama via Ollama) #3990

Description

madsbolaris (Member)

Deploy a Llama model locally with Ollama and validate that it works with Semantic Kernel and the existing IChatCompletionService interface.
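For context, a minimal sketch of the kind of wiring this validation implies, assuming Ollama is serving a Llama model on its default local port and that Semantic Kernel's OpenAI connector is pointed at it through a custom HttpClient. The model tag ("llama2"), the endpoint, and the placeholder API key are all assumptions rather than details from this issue; as the comments below explain, this did not yet work out of the box.

```csharp
// Hypothetical test wiring, assuming Ollama is running locally (default port 11434)
// and a Llama model has already been pulled. Model tag, endpoint, and placeholder
// API key are assumptions for illustration only.
using System;
using System.Net.Http;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(
    modelId: "llama2",                 // assumed local model tag
    apiKey: "unused",                  // Ollama does not check the API key
    httpClient: new HttpClient { BaseAddress = new Uri("http://localhost:11434") });

var kernel = builder.Build();
var chat = kernel.GetRequiredService<IChatCompletionService>();

var history = new ChatHistory();
history.AddUserMessage("Say hello from a local Llama model.");

var reply = await chat.GetChatMessageContentAsync(history);
Console.WriteLine(reply.Content);
```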

Activity

converted this from a draft issue on Dec 5, 2023
added the .NET label (Issue or Pull requests regarding .NET code) on Dec 5, 2023
changed the title from "Test local modal with Semantic Kernel (i.e., Llama via Ollama)" to ".Net: Test local modal with Semantic Kernel (i.e., Llama via Ollama)" on Dec 5, 2023
self-assigned this on Dec 6, 2023
madsbolaris (Member, Author) commented on Dec 12, 2023

Doesn't need to test function calling.

alliscode (Member) commented on Dec 15, 2023

I tested our IChatCompletionService with Ollama using mistral. My conclusion is that our abstractions are good enough to allow this to work, but it does not currently work due to some implementation details in the Azure OpenAI SDK:

  • The Ollama chat API uses the same interface as OpenAI, but Ollama responses stream by default, whereas OpenAI responses do not. This is a problem because the only way to disable streaming on Ollama is to set stream to false in the request body, which the Azure OpenAI SDK never does; it sends either true or null (missing). A sketch of the required request shape appears after this comment.
  • Ollama streaming uses Server-Sent Events just like OpenAI, but Ollama uses named events (also known as multi-line events, see this), which the Azure OpenAI SDK does not support; it throws an exception.

Once these issues are fixed in the Azure OpenAI SDK, our IChatCompletionService will support Ollama.
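For reference, a minimal sketch (not from this issue) of a direct call to Ollama's /api/chat endpoint with streaming disabled, illustrating the request shape discussed above. The model tag and prompt are assumptions; the relevant detail is the explicit stream = false, which the Azure OpenAI SDK never sends.

```csharp
// Minimal sketch: calling Ollama's /api/chat directly with streaming disabled.
// The model tag is an assumption; the explicit "stream": false is the field the
// Azure OpenAI SDK does not set, which is why the integration fails as described.
using System;
using System.Net.Http;
using System.Net.Http.Json;

using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

var response = await http.PostAsJsonAsync("/api/chat", new
{
    model = "mistral",
    messages = new[] { new { role = "user", content = "Hello from a local model!" } },
    stream = false   // explicitly disable streaming to get a single JSON response
});

Console.WriteLine(await response.Content.ReadAsStringAsync());
```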

stephentoub (Member) commented on Jan 11, 2024

> Once these issues are fixed in the Azure OpenAI SDK, our IChatCompletionService will support Ollama.

Are there issues open on that for Azure.AI.OpenAI? Is anyone working on it? Timeframe?

clement128 commented on Jan 17, 2024

Hello, any update on this?


Metadata

Labels: .NET (Issue or Pull requests regarding .NET code), v1.0.1 (Required for the Semantic Kernel v1.0.1 release)

Project status: Sprint: Done

Participants: @madsbolaris, @stephentoub, @alliscode, @shawncal, @clement128
