LM Studio - The Easiest Way to get Started with Hugging Face LLMs

Want to get started with a local Hugging Face Large Language Model (LLM) but aren’t ready to start writing code?

No worries! There is a particularly easy way to get started with local Hugging Face LLMs without needing to write a line of code. Just download LM Studio!


A screenshot of the LM Studio downloads page.

Now download the correct version for your operating system. If you are a Windows shop like us, then select this button:

LM Studio downloads page, zoomed in on the three download options. The top option is for Mac, the second is for Windows, and the third is for Linux (in beta). The Windows option is circled.

Or whichever is appropriate for you.

The tool itself is quite easy to use:

The first page of the LM Studio tool, which has a search bar and a news column. In the search bar is written gpt4.

Now select the little magnifying glass to find a model from the Hugging Face Hub:

The magnifying glass icon on the top left of the tool's interface is circled (second icon down). The page it presents has the search bar, wherein gpt4 is written. It has been searched, and the results showcase various gpt4-related models pulled from the Hugging Face Hub.

For fun, search for ‘Story’ and press Go (a good alternative is Mistral Instruct):

The same page as before, only the search term ‘story’ is used, and the results showcase various models matching the keyword story.

Now go to the ‘Chat’ tab (the little dialogue bubble) and select the model you just downloaded:

The dialogue-bubble icon, located right under the magnifying glass icon near the top left, is circled. The subsequent page is shown, showcasing the model that was previously downloaded. The model's name is located at the top and is circled: "TheBloke mistral instruct v0 1 7B Q4_0 gguf".

Once it is loaded, you can chat with your own private model.

There are hundreds of models to play with on the Hugging Face Hub!

Why might you want to download a local LLM? One obvious reason is that OpenAI’s ChatGPT and Google’s Gemini are both costly when used via an API, so learning to work with a local LLM might make sense. Another reason might be that you want to circumvent the sometimes overly protective ‘safety rails’ put on those hosted LLMs. Or maybe you need an LLM fine-tuned for a specific purpose. Hugging Face has an amazing array of those.
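And if you do eventually outgrow the chat window, LM Studio can also run a local server that speaks an OpenAI-compatible API, so you can query your downloaded model from code without any cloud costs. Here is a minimal sketch of building such a request, assuming the server is running at its default address (http://localhost:1234) with a model loaded; the model name and prompt below are just placeholders:

```python
import json
import urllib.request

# An OpenAI-style chat-completion payload for LM Studio's local server.
# The "model" value is a placeholder; LM Studio answers with whichever
# model is currently loaded.
payload = {
    "model": "local-model",
    "messages": [
        {"role": "user", "content": "Tell me a short story about a robot."}
    ],
    "temperature": 0.7,
}

# Build the request against the server's default local address.
request = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the local server is running to actually send the request:
# with urllib.request.urlopen(request) as response:
#     reply = json.load(response)
#     print(reply["choices"][0]["message"]["content"])
```

Because the endpoint mimics OpenAI's API shape, code written against a paid API can often be pointed at a free local model with little more than a URL change.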

But whatever your reason, welcome to the wide world of Hugging Face LLMs!

If you would like to learn more about Hugging Face, be sure to check out our other article on how to get started with Hugging Face and its pipelines!
