
What is Microsoft.Extensions.AI Library and Will It Replace Semantic Kernel?

Image of fantasy library with vibrant color. Generated with Canva AI

Microsoft has published a new AI library in preview (well, technically it is just an LLM library, but nowadays AI = LLM). The library is called Microsoft.Extensions.AI. With it, developers can abstract away which LLM service (like OpenAI) they are using and easily leverage basic chat functionality from an LLM. Sounds a lot like Semantic Kernel, doesn’t it? So what is the difference, and when should you use this library over Semantic Kernel? Is Semantic Kernel now obsolete? Let’s find the answers to these questions by first exploring the Microsoft Extensions AI library.

Microsoft Extensions AI

Microsoft Extensions AI (MEAI from here on) consists of multiple NuGet packages, just like Semantic Kernel. Basically, you need to install the Microsoft.Extensions.AI package plus an LLM service package. For example, if you plan to use Ollama, you need to install the Microsoft.Extensions.AI and Microsoft.Extensions.AI.Ollama packages. This doesn’t differ much from SK, where you need to install Microsoft.SemanticKernel.Core and Microsoft.SemanticKernel.Connectors.Ollama. The bigger difference comes in coding practices.

Microsoft.Extensions.AI currently has only a few NuGet packages, fewer than Semantic Kernel does. It supports OpenAI, Ollama and AzureAIInference.

Code Sample

Ok, so the NuGet packages don’t differ much. MEAI has direct .NET Framework and .NET Standard support, whereas SK supports .NET Framework only indirectly through .NET Standard. This is not a big thing, but it is worth mentioning.

What about the coding style? This is where the differences start to appear.

Semantic Kernel

The whole idea of Semantic Kernel is built around the Kernel object, which is created with a KernelBuilder. You configure the builder with the features you want to use from LLM services. To use just the basic Ollama chat client, we add Ollama chat completion and create the Kernel object from that.

var builder = Kernel.CreateBuilder().AddOllamaChatCompletion("phi3", new Uri("http://localhost:11434/"));
var kernel = builder.Build();

var result = await kernel.InvokePromptAsync("What is AI?");

Console.WriteLine(result);

Honestly, I have never liked this abstraction. I think the Kernel object and builders are a bit quirky to use. In the Startup.cs file you need to create the builder, build the Kernel object and add it manually into DI. All of this feels like what we generally did with C# back in 2010.

Microsoft Extensions AI

In MEAI the code looks a bit different. We can directly create an OllamaChatClient, which implements the IChatClient interface. So if we want to use abstractions, we can program against the IChatClient interface. In MEAI the code is much cleaner and simpler looking.

OllamaChatClient client = new(new Uri("http://localhost:11434/"), "phi3");
var response = await client.CompleteAsync("What is AI?");

Console.WriteLine(response.Message);

Semantic Kernel didn’t have neat built-in support for DI. Yes, you could use the OpenAI settings classes to bind the configuration and then add the Kernel object as a singleton, but it still felt a bit frustrating. Microsoft published a whole page about how to add the Kernel object into DI, because it wasn’t that simple to do.
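For illustration, here is a minimal sketch of what that manual registration typically looked like, assuming an ASP.NET Core app and the Ollama connector (the exact extension method names vary between SK preview versions):

```csharp
// Program.cs – manually building a Kernel and registering it as a singleton (sketch)
using Microsoft.SemanticKernel;

var builder = WebApplication.CreateBuilder(args);

// You build the Kernel yourself and hand it to the container.
builder.Services.AddSingleton(sp =>
{
    var kernelBuilder = Kernel.CreateBuilder();
    kernelBuilder.AddOllamaChatCompletion("phi3", new Uri("http://localhost:11434/"));
    return kernelBuilder.Build();
});

var app = builder.Build();
```

Nothing here is hard, but the builder-inside-DI dance is exactly the boilerplate that MEAI’s registration helpers remove.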

MEAI has support for a “standard” middleware-style pipeline, so it is much simpler to feed into DI.

app.Services.AddChatClient(builder => builder
    .UseLogging()
    .Use(new OllamaChatClient(...)));

Why Two Different Libraries?

This is the big question. Why do we have two different (well, technically three if you count the direct OpenAI library…) LLM SDKs from Microsoft? First, I want to state that I don’t have any insider knowledge on this; I can only answer with my own opinion. I personally think there are a few reasons why Microsoft had to implement this new library.

1. Semantic Kernel is complicated

The problem with Semantic Kernel in general is that it is a bit complicated. It is not super complex to use, but it still has some weird abstractions of its own and its own ways of doing things. For example, the DI implementation is too complicated for most devs. Another example is streaming. If you want to use streaming with Semantic Kernel, you have to go through this weird KernelFunction abstraction:

var func = kernel.CreateFunctionFromPrompt("What is AI?");
var streaming = kernel.InvokeStreamingAsync(func);

// Compared to MEAI

var streaming = client.CompleteStreamingAsync("What is AI?");
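To make the comparison concrete, consuming the MEAI stream is a plain await foreach. This is a sketch against the preview API, where each streamed update exposes the text chunk:

```csharp
// Print tokens from the model as they arrive (MEAI preview API, sketch)
await foreach (var update in client.CompleteStreamingAsync("What is AI?"))
{
    Console.Write(update.Text);
}
```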

2. Naming is hard

Once you release something to the public, it is very hard to change the abstractions later on. If you made mistakes or assumptions that didn’t work out well in the end, you have a problem. Microsoft states in this blog post:

For those already familiar with Semantic Kernel, think of Microsoft.Extensions.AI as the evolution of the Microsoft.SemanticKernel.Abstractions package that you’re currently using to access remote AI services.

So Microsoft.Extensions.AI is essentially version 2.0 of Microsoft.SemanticKernel.Abstractions. If we compare the SK implementation against MEAI, we quickly find that the names are now much simpler. You have classes like ChatRole (previously AuthorRole) and ChatMessage (previously ChatMessageContent), etc. Overall the abstraction is simpler and more understandable, as it uses more “standard” names for things.

SemanticKernel.Abstractions on the left and Microsoft.Extensions.AI.Abstractions on the right
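As a hedged sketch of the preview API, a multi-turn conversation using the new, simpler names looks like this (reusing the OllamaChatClient from the earlier example):

```csharp
// A multi-turn chat with the simpler MEAI names (sketch, preview API)
List<ChatMessage> messages =
[
    new(ChatRole.System, "You are a helpful assistant."),
    new(ChatRole.User, "What is AI?")
];

var response = await client.CompleteAsync(messages);
Console.WriteLine(response.Message.Text);
```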

As stated in the MS blog post, Semantic Kernel will build on MEAI once MEAI goes GA. I read that as MEAI replacing Microsoft.SemanticKernel.Abstractions. I am not sure if that is the case, but it would feel a bit weird to maintain both of these libraries.

One thing to highlight here is that Semantic Kernel supports C#, Python and Java, while MEAI supports only .NET. So if you are working in a language other than .NET, MEAI will not affect you for now (I don’t know what happens once the SK abstractions are replaced…).

Summary

Ok, let’s summarize. Microsoft.Extensions.AI is a simpler, more “dotnetish” library for LLM solutions. Semantic Kernel is more suitable for enterprise-level AI implementations where AI is the central thing. If you want to add AI (LLM) support to your existing app, go with MEAI. If you want to build a brand new app that is 100% about AI, choose Semantic Kernel, as it has more powerful AI abstractions like Agents.

What do you think about MEAI and having these two libraries living side by side? Leave a comment below!
