
AI access


Preface

The models were applied for and tested earlier, so they are already accessible. This article still connects to Ollama, which we could previously only talk to from the command-line terminal; now we wire the AI into code.

Preparation

As a .NET developer, the first step is to create a project. A WebApi project is used here; a console project would work just as well.
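
If you are starting from scratch, the project can be created from the command line; the project name below is only an example:

dotnet new webapi -n AIChatDemo
cd AIChatDemo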

We use Semantic Kernel to access the AI. Semantic Kernel is a toolkit that helps programs connect to AI models. The official introduction follows:

Semantic Kernel is a lightweight, open-source development kit that lets you easily build AI agents and integrate the latest AI models into your C#, Python, or Java codebase. It serves as an efficient middleware that enables rapid delivery of enterprise-grade solutions. 
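
In practice that means you build a Kernel, plug a chat-completion connector into it, and then talk to the model through the Kernel's abstractions. A minimal sketch of the shape of that code (the connector call is a placeholder, it is filled in later in this article):

using Microsoft.SemanticKernel;

// Build a kernel; the chat-completion connector for your model goes on the builder
var kernel = Kernel.CreateBuilder()
    // .AddOpenAIChatCompletion(...) or .AddOllamaChatCompletion(...) goes here
    .Build();

// Once a connector is registered, prompts go through the same API regardless of the model
var answer = await kernel.InvokePromptAsync("Say hello in one sentence.");
Console.WriteLine(answer);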

Installing the Semantic Kernel packages

dotnet add package Microsoft.SemanticKernel
dotnet add package Microsoft.SemanticKernel.Connectors.Ollama --prerelease

The Ollama connector is currently an alpha release, so you need to tick "Include prerelease" when searching on NuGet (or pass --prerelease on the command line).

Ollama access example

Register the service

using OllamaSharp;

var builder = WebApplication.CreateBuilder(args);

var endpoint = new Uri("http://localhost:11434");
var modelId = "llama3:latest";

// Register the OllamaApiClient so it can be injected into the controllers
builder.Services.AddSingleton(new OllamaApiClient(endpoint, modelId));
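
Registering the OllamaApiClient directly, as above, means the controller talks to OllamaSharp itself. If you would rather go through Semantic Kernel's abstractions, the Ollama connector also offers an AddOllamaChatCompletion registration. This is only a sketch based on the alpha connector; the exact method name and the SKEXP warning id may differ between prerelease versions:

#pragma warning disable SKEXP0070 // the Ollama connector is experimental in current builds
builder.Services.AddKernel()
    .AddOllamaChatCompletion(modelId: "llama3:latest", endpoint: new Uri("http://localhost:11434"));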

Create the API endpoint

using System.Text;
using Microsoft.AspNetCore.Mvc;
using OllamaSharp;
using OllamaSharp.Models.Chat;

[Route("api/[controller]")]
[ApiController]
public class AIChatController : ControllerBase
{
    private readonly OllamaApiClient _ollamaApiClient;

    public AIChatController(OllamaApiClient ollamaApiClient)
    {
        _ollamaApiClient = ollamaApiClient;
    }
    
    [HttpGet("Chat")]
    public async Task Chat()
    {
    #pragma warning disable SKEXP0001
        var history = new List<Message>();
        history.Add(new Message()
        {
            Role = ChatRole.System,
            Content = "you are a useful assistant",
        });
        history.Add(new Message()
        {
            Role = ChatRole.User,
            Content = "hello",
        });

        var req = new ChatRequest()
        {
            Model = "llama3:latest",
            Messages = history,
            Stream = true
        };

        var sb = new StringBuilder();
        var content = _ollamaApiClient.ChatAsync(req);

        // Forward each streamed chunk to the HTTP response as it arrives
        await foreach (var chatMessageContent in content)
        {
            var msg = chatMessageContent?.Message?.Content;
            Console.WriteLine(msg);
            sb.Append(msg);
            await Response.WriteAsync($"data: {msg}\n\n");
            await Response.Body.FlushAsync();
        }
    }
}

Response:

Hello! It's nice to meet you. I'm here to assist you with any questions, tasks, or just about anything you'd like to chat about. What's on your mind today?
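
Because the action writes "data: ..." lines and flushes after each chunk, a caller can read the reply incrementally. A minimal console-side sketch, assuming the WebApi project listens on port 5000 (adjust to your launch settings):

using var http = new HttpClient();
using var stream = await http.GetStreamAsync("http://localhost:5000/api/AIChat/Chat");
using var reader = new StreamReader(stream);

// Print each streamed chunk as it arrives
while (await reader.ReadLineAsync() is { } line)
{
    if (line.StartsWith("data: "))
        Console.Write(line["data: ".Length..]);
}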

Moonshot access example

Register the service

var MoonshotAIKey = "sk-2xyIeQ49Xl714yquKkMrIdvsuI4aZmnvgNHHKxEaXkk384Os";
var endpoint = new Uri("https://api.moonshot.cn/v1");
var modelId = "moonshot-v1-8k";

#pragma warning disable SKEXP0010 // the custom-endpoint OpenAI overload is experimental
var kernelBuilder = builder.Services.AddKernel()
      .AddOpenAIChatCompletion(modelId: modelId!, apiKey: MoonshotAIKey, endpoint: endpoint, httpClient: new HttpClient());
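
Hard-coding the key works for a quick test, but it is safer to pull it from configuration; the configuration key name below is just an example:

// e.g. set with: dotnet user-secrets set "MoonshotAI:ApiKey" "sk-..."
var MoonshotAIKey = builder.Configuration["MoonshotAI:ApiKey"]
                    ?? throw new InvalidOperationException("MoonshotAI:ApiKey is not configured");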

using Microsoft.AspNetCore.Mvc;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

[Route("api/[controller]")]
[ApiController]
public class AIChatController : ControllerBase
{
    private readonly Kernel _kernel;
    public AIChatController(Kernel kernel) 
    {
        _kernel = kernel;
    }
    
    /// <summary>
    /// MoonShot
    /// </summary>
    /// <returns></returns>
    [HttpGet("MoonShotChat")]
    public async Task MoonShotChat()
    {
        var settings = new OpenAIPromptExecutionSettings()
        {
            Temperature = 0,
            // Assumption: let the kernel invoke registered functions automatically
            ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
        };

        var history = new ChatHistory();
        history.AddSystemMessage("you are a useful assistant");
        history.AddUserMessage("hello");
        
        var chatCompletionService = _kernel.GetRequiredService<IChatCompletionService>();
        var result = await chatCompletionService.GetChatMessageContentAsync(history, settings, _kernel);

        Console.WriteLine(result.ToString());
        //Hello! How can I help you today?
    }
}
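
GetChatMessageContentAsync waits for the full reply. If you want the same streaming behaviour as the Ollama example, IChatCompletionService also exposes a streaming API; a minimal sketch of another action added to the same controller:

[HttpGet("MoonShotChatStream")]
public async Task MoonShotChatStream()
{
    var history = new ChatHistory();
    history.AddSystemMessage("you are a useful assistant");
    history.AddUserMessage("hello");

    var chatCompletionService = _kernel.GetRequiredService<IChatCompletionService>();

    // Stream the reply chunk by chunk instead of waiting for the whole message
    await foreach (var chunk in chatCompletionService.GetStreamingChatMessageContentsAsync(history, kernel: _kernel))
    {
        await Response.WriteAsync($"data: {chunk.Content}\n\n");
        await Response.Body.FlushAsync();
    }
}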