
DeepSeek localized deployment on M-chip Mac

Popularity: 277 / 2025-03-07 12:54:37

Run DeepSeek-R1 with Ollama on your Mac and provide web-side access through Open-WebUI.

1. Install Ollama

Ollama official site: https://ollama.com/
Ollama is a lightweight inference framework that can run LLMs (large language models) locally. First, download and install Ollama.

Install it with Homebrew, macOS's native package manager:

$ brew install --cask ollama
Running `brew update --auto-update`...
==> Auto-updated Homebrew!
Updated 2 taps (homebrew/core and homebrew/cask).

==> Downloading /ollama/ollama/releases/download/v0.5.13/
==> Downloading from /github-production-release-asset-2e65be/658928958/2dc24c17-0bc0-487a-92d1-0265efd65a14?X-Amz-Algorithm=AWS4-
############################################################################################################################################################### 100.0%
==> Installing Cask ollama
==> Moving App '' to '/Applications/'
==> Linking Binary 'ollama' to '/opt/homebrew/bin/ollama'
🍺  ollama was successfully installed!

Check that Ollama installed successfully; the version number will be displayed, e.g. ollama version is 0.5.13

$ ollama --version
Warning: could not connect to a running Ollama instance
Warning: client version is 0.5.13

The warning only means the Ollama server is not running yet; launch the Ollama app (or run ollama serve) to start it.

2. Download the model

Download the DeepSeek-R1 model.
Model download address: https://ollama.com/library/deepseek-r1

The following command downloads the 7B variant of DeepSeek-R1 and stores it locally.

$  ollama pull deepseek-r1:7b
pulling manifest
pulling 96c415656d37... 100% ▕████████████████▏ 4.7 GB
pulling 369ca498f347... 100% ▕████████████████▏  387 B
pulling 6e4c38e1172f... 100% ▕████████████████▏ 1.1 KB
pulling f4d24e9138dd... 100% ▕████████████████▏  148 B
pulling 40fb844194b2... 100% ▕████████████████▏  487 B
verifying sha256 digest
writing manifest
success
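Once the pull succeeds, you can also confirm the model is registered by asking the local Ollama server for its model list over HTTP (Ollama listens on port 11434 by default). A minimal Python sketch, assuming the Ollama app or `ollama serve` is running:

```python
import json
import urllib.request


def parse_tags(payload: bytes) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in json.loads(payload)["models"]]


def list_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Query the local Ollama server for its downloaded models."""
    with urllib.request.urlopen(base_url + "/api/tags") as resp:
        return parse_tags(resp.read())


if __name__ == "__main__":
    print(list_models())  # the list should include 'deepseek-r1:7b'
```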

3. Run DeepSeek-R1 locally

Running the model drops the terminal into interactive mode, where you can type text directly to chat with the model.

# View downloaded models
$ ollama list
NAME              ID              SIZE      MODIFIED
deepseek-r1:7b    0a8c26691023    4.7 GB    24 seconds ago

# Run the model
$ ollama run deepseek-r1:7b
>>> Send a message (/? for help)
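The same conversation can be scripted: `ollama run` talks to a local server that also exposes an HTTP API on port 11434. A sketch of calling its /api/generate endpoint (non-streaming) from Python, assuming the server is running and the 7B model has been pulled:

```python
import json
import urllib.request

# Ollama's default local generate endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for a non-streaming /api/generate call."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the full reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(ask("deepseek-r1:7b", "Why is the sky blue?"))
```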

4. Run through Open-WebUI

To interact with the model through a web interface, install Open-WebUI. It provides a user-friendly web front-end that makes DeepSeek-R1 easier to use.

Clone Open-WebUI repository

$ git clone https://github.com/open-webui/open-webui.git
Cloning into 'open-webui'...
remote: Enumerating objects: 91391, done.
remote: Counting objects: 100% (131/131), done.
remote: Compressing objects: 100% (74/74), done.
remote: Total 91391 (delta 70), reused 57 (delta 57), pack-reused 91260 (from 2)
Receiving objects: 100% (91391/91391), 177.81 MiB | 3.98 MiB/s, done.
Resolving deltas: 100% (60008/60008), done.
Updating files: 100% (4575/4575), done.

Start the Open-WebUI container

Install Docker with Homebrew. After installation completes, a Docker app appears in Applications; click it to start Docker.

brew install --cask --appdir=/Applications docker

Run the container

docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
Unable to find image '/open-webui/open-webui:main' locally
main: Pulling from open-webui/open-webui
d51c377d94da: Pull complete
987cac002684: Pull complete
076b75118273: Pull complete
157e623d2984: Pull complete
40d5353a5918: Pull complete
4f4fb700ef54: Pull complete
aebeb0b4e5d0: Pull complete
03f562834d64: Pull complete
dc0f62a912f5: Pull complete
93fdf9ebd111: Pull complete
596be9ce6130: Pull complete
07dc67f42781: Pull complete
7c2ef53b15e7: Pull complete
e5511c24fa69: Pull complete
69de4f91fd38: Pull complete
Digest: sha256:74fc3c741a5f3959c116dd5abc61e4b27d36d97dff83a247dbb4209ffde56372
Status: Downloaded newer image for /open-webui/open-webui:main
26b786db658d187c2b82256fcbf33102c8c10c25b1087393483272e53708908b

• -p 3000:8080: maps the container's port 8080 to port 3000 on the host;
• --add-host=host.docker.internal:host-gateway: lets the container reach services running on the host, such as Ollama;
• -v open-webui:/app/backend/data: mounts a named volume so the container's state and data persist;
• --restart always: restarts the container automatically after a reboot;
• ghcr.io/open-webui/open-webui:main: pulls the latest Open-WebUI image.

Once the container is running, open http://localhost:3000 to access Open-WebUI.
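The container can take a little while to initialize on first start. A small Python sketch that polls until the web UI answers (the URL and timeout values are illustrative):

```python
import time
import urllib.error
import urllib.request


def wait_for(url: str, timeout: float = 60.0, poll: float = 2.0) -> bool:
    """Poll `url` until it answers with HTTP 200 or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=poll) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # not up yet; keep polling
        time.sleep(poll)
    return False


if __name__ == "__main__":
    # Check that the Open-WebUI container is serving on port 3000.
    print("Open-WebUI is up:", wait_for("http://localhost:3000"))
```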

# Stop the container
docker stop open-webui
# Delete the container
docker rm open-webui
# Delete the stored data
docker volume rm open-webui