Local Chatting with Ollama
Summary
How to install and use a local AI Chatbot.
Ollama is an open-source tool for running large language models on your own machine, and paired with Open WebUI it gives you a sophisticated AI chatbot. Running it locally means you don’t need to pay any API fees, and it’s great if you want to do local development. Following these simple instructions will set up a free-to-use local AI for you to chat with. This guide is based on the instructions found on the Open WebUI GitHub page.
Installing Software
Docker
If you are not an engineer, you may not have Docker installed. Docker is not the only container software you can run, but it is currently the most popular, and it is free for individual use. Head over to the Docker site to download and install it; the process is quite simple. The tool we are using is Docker Desktop, and the installation is currently very similar on Windows and Mac: download the appropriate installer, run it, and follow the wizard.
Now that you have Docker installed, you will not need to interact with it much for the rest of this setup; having it installed and running is all you need. Depending on your setup, you might need to restart your machine.
To verify that Docker is installed and working as expected, open a terminal window and type docker --version.
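For example, the checks below will confirm that both the Docker CLI and the Docker engine are responding. The hello-world step is optional and only assumed here as a quick sanity test; it is not part of the rest of this tutorial.

# print the installed Docker version
docker --version

# optional: run Docker's tiny hello-world image to confirm the engine can start containers
docker run hello-world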
Ollama
As I mentioned, this is a very short tutorial! The next step after installing Docker is to run Ollama and Open WebUI with a single command.
docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
The first time you run the command above, Docker will download all the required software and start a local container hosting the Open WebUI application, which gives you access to your local Ollama instance. You can verify that it is working by looking at the ‘Containers’ tab in the Docker Desktop UI, or from the terminal as shown below.
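If you prefer the terminal over the Docker Desktop UI, the following commands are one way to confirm the container is up and to watch its startup logs:

# list running containers whose name matches open-webui
docker ps --filter "name=open-webui"

# follow the container's logs while it finishes downloading and starting up (Ctrl+C to stop)
docker logs -f open-webui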
Chatting With AI Locally
Once the previous command completes, your local chat AI is ready to talk to you! To begin, open the Web UI at http://localhost:3000.
NOTE: Because the command uses --restart always, Docker will bring the WebUI container back up whenever Docker itself restarts. If you restart your computer, just make sure Docker Desktop is running; if the container is ever stopped, you can start it again with docker start open-webui rather than repeating the full command.
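For reference, here is a small sketch of how you might stop, start, and check the container manually. The curl check simply assumes the UI answers on port 3000, as mapped in the run command above.

# stop the container
docker stop open-webui

# start it again later (no need to repeat the full docker run command)
docker start open-webui

# quick check that the Web UI is answering on the mapped port
curl -I http://localhost:3000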
The first time you open this site, it will ask you to log in. Create an account and log in, and then you will be able to chat.
More Models
If you are interested, there are more models that you can install through the Open WebUI interface. If you write code, for example, you may want to install a model trained specifically for coding, such as codellama. Installing new models is quite easy; visit the link below.
http://localhost:3000/workspace/models
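Alternatively, since this image bundles the Ollama runtime, you should be able to pull a model from the terminal as well. This is a sketch that assumes the ollama binary is available inside the running container:

# pull the codellama model inside the running open-webui container
docker exec -it open-webui ollama pull codellama

# list the models Ollama currently has available
docker exec -it open-webui ollama list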
Happy Chatting!
Getting started with AI has never been easier; I hope you enjoy your new AI friend.
If you get stuck, or just want something easier to copy and paste, there is a GitHub repository with the example above.