20250201 - Ollama Docker Image Usage Guide - ollama/ollama Docker Image | Docker Hub¶
- Category: Clippings
- Created: 2025-02-01
- Tags: Ollama, Docker, language models, local deployment, GPU support
ollama/ollama - Docker Image | Docker Hub¶
Summary¶
Ollama is a Docker image that makes it easy to run large language models locally. It supports CPU-only operation as well as Nvidia and AMD GPUs, with corresponding installation and configuration instructions. Users can start a model with a single command and try out different models.
Key Facts¶
- The Ollama Docker image supports both CPU and GPU execution.
- Install the NVIDIA Container Toolkit and configure Docker to use the Nvidia driver.
- Instructions for running with AMD GPUs.
- Models such as llama3 can be run locally.
Content¶
Ollama Docker image¶
Ollama makes it easy to get up and running with large language models locally.
CPU only¶
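The clipping dropped the command for this step. A typical CPU-only quick start, matching the volume name (`ollama`) and default port (11434) used in the GPU examples below:

```shell
# Start Ollama with CPU only: persist models in the "ollama" volume
# and expose the API on the default port 11434.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```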
Nvidia GPU¶
Install the NVIDIA Container Toolkit.
Install with Apt
- Configure the repository
curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey \
| sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list \
| sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' \
| sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list
sudo apt-get update
- Install the NVIDIA Container Toolkit packages
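The install command for this step was not captured in the clipping; with the apt repository configured above, it is:

```shell
# Install the toolkit packages from the repository configured above.
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit
```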
Install with Yum or Dnf
- Configure the repository
curl -s -L https://nvidia.github.io/libnvidia-container/stable/rpm/nvidia-container-toolkit.repo \
| sudo tee /etc/yum.repos.d/nvidia-container-toolkit.repo
- Install the NVIDIA Container Toolkit packages
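The corresponding install command for Yum/Dnf systems, omitted from the clipping:

```shell
# Install the toolkit packages from the repository configured above.
sudo yum install -y nvidia-container-toolkit
```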
Configure Docker to use Nvidia driver
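The configuration commands for this step were not captured; the toolkit ships the `nvidia-ctk` helper for registering the Nvidia runtime with Docker:

```shell
# Register the Nvidia runtime in Docker's daemon config,
# then restart the daemon so the change takes effect.
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
```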
Start the container
==Run the Docker container securely, completely local and isolated from the rest of the OS==:
docker run -d --gpus all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --security-opt=no-new-privileges \
  --cap-drop=ALL --cap-add=SYS_NICE \
  --memory=8g --memory-swap=8g --cpus=4 \
  --read-only \
  --name ollama ollama/ollama
AMD GPU¶
To run Ollama using Docker with AMD GPUs, use the rocm tag and the following command:
docker run -d --device /dev/kfd --device /dev/dri -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama:rocm
Run model locally¶
Now you can run a model:
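The command itself was lost in the clipping. With a container named `ollama` already running (as in the examples above), a model is started by executing the `ollama` CLI inside it:

```shell
# Run the llama3 model interactively inside the running container.
docker exec -it ollama ollama run llama3
```

Replacing `llama3` with another model name pulls and runs that model instead.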
Try different models¶
More models can be found on the Ollama library.