llm

Access Real-Time Data from LLM using Semantic Kernel

LLMs have been around for years now, and we have been using them for data that doesn’t change frequently. We don’t ask them for current data, like today’s weather or the balance in our bank account. Instead, we ask evergreen questions, such as who the members of the band ABBA are or whether we can eat ice cream in space (the answer is yes!).

However, since GPT-4, we have gained …
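The post itself goes into the details, but the core idea is to expose a function to the model as a Semantic Kernel plugin so it can fetch live data on demand. A minimal sketch using the Python Semantic Kernel SDK might look like the following; the plugin and function names are my own illustration, not taken from the post:

```python
# Minimal sketch (Python Semantic Kernel SDK assumed); WeatherPlugin and
# get_current_weather are illustrative names, not from the original post.
from semantic_kernel import Kernel
from semantic_kernel.functions import kernel_function

class WeatherPlugin:
    """Exposes real-time data to the model as a callable function."""

    @kernel_function(name="get_current_weather",
                     description="Returns the current weather for a given city.")
    def get_current_weather(self, city: str) -> str:
        # A real implementation would call a weather API; hard-coded here.
        return f"It is 18 °C and partly cloudy in {city}."

kernel = Kernel()
kernel.add_plugin(WeatherPlugin(), plugin_name="weather")
# With automatic function calling enabled on the chat service, the model can
# invoke weather.get_current_weather when asked about today's weather.
```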

AI Toolkit for Visual Studio Code

AI Toolkit for Visual Studio Code is a “relatively” new toolset that was previously known as Windows AI Studio. The toolkit is an extension that lets developers run SLMs (Small Language Models) on their local machine or use Azure AI Studio LLM models from the cloud. Since you can easily use the Azure AI Studio models without an extension like this, I find the local models more interesting …
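Once the toolkit has downloaded a model, it can serve it over a local OpenAI-compatible REST endpoint, so you can talk to it with the standard openai Python client. The base URL, port, and model name below are assumptions for illustration; check your own AI Toolkit installation for the actual values:

```python
# Hedged sketch: base_url, port, and model name are assumptions; read the
# real values from the AI Toolkit UI for the model you actually downloaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:5272/v1/",  # local endpoint exposed by the toolkit (assumed)
    api_key="not-needed-for-local-models",
)

response = client.chat.completions.create(
    model="Phi-3-mini-4k-instruct",  # whichever SLM you picked in the toolkit
    messages=[{"role": "user", "content": "Explain in one sentence what an SLM is."}],
)
print(response.choices[0].message.content)
```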

Photo by David Bartus: https://www.pexels.com/photo/black-audio-mixer-690779/

Develop LLM Solutions In Local Environment

Sometimes you want to develop or experiment with LLM APIs without having to submit your data to Azure OpenAI or any other LLM service. Instead, you simply want to run a basic LLM locally to test whether your idea works. Jussi has authored some informative blog posts about Ollama, which indeed is a great tool for this situation, but it might not be the best…
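For comparison, this is roughly what talking to a locally running model looks like through Ollama’s REST API; the model name is just an example, and you would swap in whatever you have pulled locally:

```python
# Hedged sketch of a call to Ollama's local REST API; "llama3" is only an
# example model name.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",  # Ollama's default local endpoint
    json={
        "model": "llama3",
        "prompt": "Give me one reason to run an LLM locally.",
        "stream": False,                     # return a single JSON object
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```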

LM Studio

This image was generated with the Pixlr AI service

Azure AI Language Service

Azure AI Language Service is a cloud-hosted service that offers Natural Language Processing (NLP) capabilities for understanding and analyzing text. It supports summarization, entity linking, key phrase extraction, and many other features. It shares many features with OpenAI, albeit in a much simpler form.

In this post, I’m going to build an Azure Function that wraps the Azure AI Language Service into a nice and simple API. …
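Under the hood, a function like that ends up calling the Language service through its SDK. A hedged sketch of the kind of call it wraps, using the azure-ai-textanalytics Python package (the endpoint and key are placeholders, and key phrase extraction is just one of the features mentioned above):

```python
# Hedged sketch of the call an Azure Function like this would wrap; the
# endpoint and key environment variables are placeholders.
import os
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

client = TextAnalyticsClient(
    endpoint=os.environ["LANGUAGE_ENDPOINT"],  # e.g. https://<resource>.cognitiveservices.azure.com/
    credential=AzureKeyCredential(os.environ["LANGUAGE_KEY"]),
)

docs = ["Azure AI Language Service supports summarization, entity linking and key phrase extraction."]
for result in client.extract_key_phrases(docs):
    if not result.is_error:
        print(result.key_phrases)
```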