In the previous article, we introduced the basic features and usage of GraphRag in detail. If you are not yet familiar with it, we recommend reading that article first.
From the first two posts, you will have learned that GraphRag currently supports only the OpenAI-style API. However, many community members have asked us to add support for local models (e.g., Ollama). So this time, we will explore how to use custom and local models in GraphRag.
Why Semantic Kernel?
GraphRag adopts Semantic Kernel as its foundation, which allows us to abstract the chat and vector interfaces very concisely. As a result, users can implement their own customized solutions very easily. Next, we will show, through a concrete example, how to integrate local and custom models into GraphRag.
Default Configuration Method
First, let's look at how to do the default configuration:
// OpenAI configuration ("OpenAI").Get<OpenAIOption>(); // Document Slicing Configuration ("TextChunker").Get<TextChunkerOption>(); // Configure the database connection ("GraphDBConnection").Get<GraphDBConnectionOption>(); // Note that the configuration file needs to be injected first, and then the ();
Here, the default setup injects the OpenAI configuration, the text-chunking configuration, and the database connection configuration, and then registers them with GraphRag's services.
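For reference, the three sections above would typically live in an `appsettings.json` file shaped something like the following. The section names ("OpenAI", "TextChunker", "GraphDBConnection") come from the code above, but the individual keys and values inside each section are illustrative assumptions, not the library's documented schema:

```json
{
  "OpenAI": {
    "Key": "sk-xxx",
    "EndPoint": "https://api.openai.com/",
    "ChatModel": "gpt-4o-mini",
    "EmbeddingModel": "text-embedding-ada-002"
  },
  "TextChunker": {
    "LinesToken": 100,
    "ParagraphsToken": 1000
  },
  "GraphDBConnection": {
    "DbType": "Sqlite",
    "DBConnection": "Data Source=graph.db"
  }
}
```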
Customized Configuration Methods
If a custom or local model is required, some additional service interfaces may need to be implemented. Below is an example of a custom configuration:
```csharp
var kernelBuild = Kernel.CreateBuilder();
// Register the mock text-generation and chat services under keyed service IDs
kernelBuild.Services.AddKeyedSingleton<ITextGenerationService>("mock-text", new MockTextCompletion());
kernelBuild.Services.AddKeyedSingleton<IChatCompletionService>("mock-chat", new MockChatCompletion());
// Register the embedding service both as the default and under a key
kernelBuild.Services.AddSingleton<ITextEmbeddingGenerationService>(new MockTextEmbeddingGeneratorService());
kernelBuild.Services.AddKeyedSingleton("mock-embedding", new MockTextEmbeddingGeneratorService());
// Build the kernel and hand it to GraphRag's service registration
var kernel = kernelBuild.Build();
```
In this custom configuration example, we register three custom service interfaces: ITextGenerationService, IChatCompletionService, and ITextEmbeddingGenerationService.
Implementing custom service interfaces
Next, we need to provide a concrete implementation for each service interface. Here are the implementations of the three interfaces:
Implementing IChatCompletionService
```csharp
using System.Runtime.CompilerServices;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

public class MockChatCompletion : IChatCompletionService
{
    private readonly Dictionary<string, object?> _attributes = new();

    public IReadOnlyDictionary<string, object?> Attributes => _attributes;

    public MockChatCompletion() { }

    public async Task<IReadOnlyList<ChatMessageContent>> GetChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        // Echo the last chat message back, which makes chat behavior easy to test
        string result = $"This is a mock model, for easy chat testing. Your message is: {chatHistory.LastOrDefault()}";
        return [new ChatMessageContent(AuthorRole.Assistant, result)];
    }

    public async IAsyncEnumerable<StreamingChatMessageContent> GetStreamingChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        string result = $"This is a mock model, for easy chat testing. Your message is: {chatHistory.LastOrDefault()}";
        // Stream the reply back one character at a time
        foreach (var c in result)
        {
            yield return new StreamingChatMessageContent(AuthorRole.Assistant, c.ToString());
        }
    }
}
```
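The streaming method above yields the reply one character at a time through `IAsyncEnumerable`. The same pattern can be sketched without any Semantic Kernel types, which makes it easier to see in isolation (`StreamingDemo` and its method names are illustrative, not part of any library):

```csharp
using System;
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;

public static class StreamingDemo
{
    // Yield each character of a canned reply as its own chunk,
    // mirroring how the mock services stream their output.
    public static async IAsyncEnumerable<string> StreamReplyAsync(string message)
    {
        string result = $"This is a mock model; your message is: {message}";
        foreach (var c in result)
        {
            await Task.Yield(); // simulate tokens arriving asynchronously
            yield return c.ToString();
        }
    }

    // Consume the stream and reassemble the full reply.
    public static async Task<string> CollectAsync(string message)
    {
        var sb = new StringBuilder();
        await foreach (var chunk in StreamReplyAsync(message))
        {
            sb.Append(chunk);
        }
        return sb.ToString();
    }
}
```

Concatenating all streamed chunks reproduces exactly the string the non-streaming method would have returned, which is the invariant a streaming implementation should preserve.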
Implementing ITextGenerationService
```csharp
using System.Runtime.CompilerServices;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.TextGeneration;

public class MockTextCompletion : ITextGenerationService
{
    private readonly Dictionary<string, object?> _attributes = new();

    public IReadOnlyDictionary<string, object?> Attributes => _attributes;

    public MockTextCompletion() { }

    public async Task<IReadOnlyList<TextContent>> GetTextContentsAsync(
        string prompt,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        // Echo the prompt back for easy testing
        string result = $"This is a mock model, for easy chat testing. Your message is: {prompt}";
        return [new TextContent(result)];
    }

    public async IAsyncEnumerable<StreamingTextContent> GetStreamingTextContentsAsync(
        string prompt,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        string result = $"This is a mock model, for easy chat testing. Your message is: {prompt}";
        // Stream the reply back one character at a time
        foreach (var c in result)
        {
            yield return new StreamingTextContent(c.ToString(), modelId: "mock");
        }
    }
}
```
Implementing ITextEmbeddingGenerationService
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Embeddings;

public sealed class MockTextEmbeddingGeneratorService : ITextEmbeddingGenerationService
{
    private Dictionary<string, object?> AttributesInternal { get; } = [];

    public IReadOnlyDictionary<string, object?> Attributes => AttributesInternal;

    public MockTextEmbeddingGeneratorService() { }

    public async Task<IList<ReadOnlyMemory<float>>> GenerateEmbeddingsAsync(
        IList<string> data,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        IList<ReadOnlyMemory<float>> results = new List<ReadOnlyMemory<float>>();

        // Fixed vectors stand in for real embeddings in this mock
        float[] array1 = { 1.0f, 2.0f, 3.0f };
        float[] array2 = { 4.0f, 5.0f, 6.0f };
        float[] array3 = { 7.0f, 8.0f, 9.0f };

        // Wrap each array in a ReadOnlyMemory<float> and add it to the list
        results.Add(new ReadOnlyMemory<float>(array1));
        results.Add(new ReadOnlyMemory<float>(array2));
        results.Add(new ReadOnlyMemory<float>(array3));

        return results;
    }

    public void Dispose() { }
}
```
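To see how fixed embeddings like these behave downstream, here is a small self-contained sketch (not part of GraphRag or Semantic Kernel; `EmbeddingDemo` and its methods are hypothetical) that ranks stored vectors against a query vector by cosine similarity, the way a vector store typically would:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class EmbeddingDemo
{
    // Cosine similarity between two equal-length vectors.
    public static double CosineSimilarity(float[] a, float[] b)
    {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
    }

    // Return the index of the stored vector most similar to the query.
    public static int MostSimilar(float[] query, List<float[]> stored)
    {
        return stored
            .Select((v, i) => (Index: i, Score: CosineSimilarity(query, v)))
            .OrderByDescending(x => x.Score)
            .First().Index;
    }
}
```

With the three mock vectors above, a query equal to the first vector would naturally rank it highest, which is a quick sanity check that the mock plumbing is wired up correctly.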
Seeing this, you may have realized that integrating custom and local models is very simple. Just follow the steps above, implement the appropriate interfaces, inject the configuration, and you can use these custom models in GraphRag.
Concluding Remarks
In this article, we have learned how to integrate custom and local models into GraphRag. I hope you can build more features that suit your own needs based on these examples. For more content, feel free to follow my official account and send "join group" to enter our exchange group, where you can share and learn with fellow community members!
Thanks for reading and we'll see you in the next installment!