“The future of AI is bright, and it will continue to revolutionize the way we live and work. With advancements in machine learning and natural language processing, AI will become even more powerful and ubiquitous in the coming years.”
-GPT4All

As much as I harp on the current AI hype and its pace of advancement, I do believe there is a place and a use for these new tools. While we aren’t going to be replaced overnight, I do expect a big productivity boom from them. One of the most significant drawbacks of the current crop of Large Language Model AIs (LLM AIs) is that they are controlled by someone else. OpenAI has its popular ChatGPT, Facebook has its Meta AI, Google its Bard, and Microsoft has Bing AI. All of these companies have to make money, so whether through a subscription or the sale of your data, there is a cost. Most of them are also black boxes; we have little insight into how they work. If we want to live in a world where everyone is on equal footing with AI, people need to be able to run these models locally. In this post, I will share how you can run your own ChatGPT at home.
The Easiest Choice: GPT4All
- Easiest to set up
- No GPU required
- Large selection of language models to play with
- Can be a little taxing on older CPUs
When researching this article, I was surprised by how many choices there are if you want to run your own LLM AI at home. One of the easiest and quickest ways to get up and running comes from the folks over at GPT4All. They offer a one-click installer that puts a ChatGPT-like AI onto your Windows, Linux, or Mac computer. Once it’s downloaded, pick the language model you want to work with, and presto! I had zero issues setting this AI up and was asking it questions within minutes. Despite its simple interface, the settings menu has a lot of knobs you can play with to tweak your output. It also has one of the largest selections of language models to choose from. While it doesn’t require a GPU, it can tax your CPU, and responses are a little slower depending on the hardware you run it on. Overall, it is the easiest experience to set up and doesn’t require any real technical knowledge. The community support is great if you run into any issues as well.
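If you would rather script GPT4All than click around the desktop app, the project also publishes a Python binding. Here is a minimal sketch; the model filename below is one example from GPT4All's catalog (downloaded automatically, a couple of gigabytes, on first use), so treat it as an illustrative choice rather than a recommendation:

```python
# Minimal sketch of GPT4All's Python binding (pip install gpt4all).
# The model file below is fetched automatically on first use; any model
# from GPT4All's catalog can be substituted. Runs on CPU, no GPU required.
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")
with model.chat_session():
    reply = model.generate(
        "Explain what a language model is in one sentence.",
        max_tokens=100,
    )
    print(reply)
```

The same knobs the desktop app exposes in its settings menu (temperature, top-k sampling, and so on) are exposed here as keyword arguments to `generate()`.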

The Rabbit Hole Choice: Alpaca-LoRA
- Based on the fantastic work of Stanford’s Alpaca project
- Fast and customizable
- Huge community support
- Does require a bit of technical knowledge
Let’s say you are like me and want to get down in the dirt and really PLAY with an LLM AI. However, you don’t want to make a career out of it, and you would like to go home at the end of the day. My suggestion would be Alpaca-LoRA. Alpaca-LoRA (Alpaca with low-rank adaptation) is an LLM AI forked from Stanford’s Alpaca project. Stanford set out to make an LLM AI that fixed some of ChatGPT’s deficiencies, like generating false information and toxic language. Stanford released its assets to the open-source community, which then created Alpaca-LoRA. Once you have cloned the repo and installed the Python requirements, it’s pretty straightforward to get up and running. I found it to be more descriptive and better able to handle programming challenges than GPT4All. The downside was that everything is handled via the Python console instead of a nice interface like GPT4All’s. Not ideal, but this is where the amazing community support comes in. The GitHub repo has a resource section for all the projects Alpaca-LoRA has spawned. If you want a ChatGPT-style interface, someone has created that. Maybe you need Alpaca-LoRA in Spanish? Someone has done that too. The open-source community has embraced Alpaca-LoRA to the point that a leaked Google memo admits the company is falling behind the open-source community. This is the model I ended up going with at home. If you don’t mind getting your hands dirty, this is the one to pick. It’s not as easy to set up as GPT4All, but it has many more features.
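To give a feel for what the project is doing under the hood, here is a rough Python sketch of how a LoRA adapter gets layered on top of a base model with the peft library, following the pattern in Alpaca-LoRA's generate script. The model and adapter identifiers below follow the project's README and may have moved since; both are multi-gigabyte downloads, and you will want a GPU:

```python
# Rough sketch: load a LLaMA base model, then apply Alpaca-LoRA's low-rank
# adapter weights on top with the peft library. Identifiers follow the
# Alpaca-LoRA README and may have changed; large downloads, GPU advised.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf", torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, "tloen/alpaca-lora-7b")
tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf")

# Alpaca models expect this instruction/response prompt format.
prompt = "### Instruction:\nName three planets.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0]))
```

The appeal of the LoRA approach is that the adapter weights are tiny compared to the base model, which is why the community could fine-tune and share so many variants so quickly.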

The Explorer: HuggingFace
- Not an LLM but rather a large site hosting thousands of models
- Models large and small are available
- Try-before-you-download feature
- It can be a bit overwhelming
For this last one, I had difficulty narrowing it down to a specific model or program, so instead of just picking one, I’ll let you decide. HuggingFace is the place to go if you want to learn about or play with machine learning. Most of the LLM AIs you can play with today started on this site. You can find everything here, from conversational AI to image-to-text, text-to-video, object detection, and more. The best part is that most projects let you play with them before downloading anything. If you are looking for the best LLM AIs, I suggest starting here. HuggingFace isn’t just for grabbing the latest and greatest; it’s also a great place to learn about machine learning and language models. I often picked up on what was happening in AI just by browsing the site. It can be a bit overwhelming, but there is no better place to discover new AI models.
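As a taste of how easy it is to pull a model off the site, here is a short sketch using HuggingFace's transformers library. The model named below is a deliberately tiny demo model so the download finishes quickly; its output is gibberish, and you would swap in something larger (say, `distilgpt2`) for real text:

```python
# Pull a model straight from the HuggingFace Hub and run it locally.
# "sshleifer/tiny-gpt2" is a tiny randomly-initialized demo model chosen so
# the download is quick; substitute a larger model for sensible output.
from transformers import pipeline

generator = pipeline("text-generation", model="sshleifer/tiny-gpt2")
result = generator("Open-source AI is", max_new_tokens=10)
print(result[0]["generated_text"])
```

Thousands of models on the site can be loaded with these same two lines, which is exactly what makes HuggingFace such a good playground.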

Conclusion
I hope you found this helpful; I learned a lot researching this article. I honestly hope the open-source community keeps pushing the boundaries of AI. I would much rather have a future where everyone can access these models than one where they are reserved for those who can afford them or locked away in some company’s data center. Until next time!