Home > Local system based AI tools > Ollama > Ollama installation
Installing ollama on Rocky 9.x
To install Ollama on a local system use the following steps:
- Install ollama directly from the official site using:
- curl -fsSL https://ollama.com/install.sh | sh
- Check whether the ollama service is running via:
- systemctl status ollama
- After this, run ollama on the local system via the below command and test:
- ollama run deepseek-r1:1.5b
- The above command may take considerable time when run for the first time, as it downloads the entire model. The deepseek-r1:1.5b model in the above example is 1.1 GB in size, so the command first downloads the 1.1 GB model and only then shows the ">>>" prompt where we can type our queries (a sketch for pulling the model ahead of time is given after this list).
- The ollama service runs as the ollama user with home folder /usr/share/ollama. Consider moving /usr/share/ollama to a partition with more space and creating a symbolic link at the original location (see the sketch after this list).
- To exit the interactive session use:
- /bye
- Between any two unrelated queries we can clear the context using:
- /clear
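If the model should be available before the first interactive run, it can be pulled ahead of time so that ollama run starts without any download. A minimal sketch, assuming the same deepseek-r1:1.5b model used above (any other model tag can be substituted):

    ollama pull deepseek-r1:1.5b    # download the model without starting a chat
    ollama list                     # confirm the model is now available locally
    ollama run deepseek-r1:1.5b     # start the chat; no download is needed now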
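To move the ollama home folder to a partition with more space, as suggested in the step above, something along the following lines can be used. This is a minimal sketch assuming /data is the larger partition; stop the service before moving the folder and adjust paths as required:

    sudo systemctl stop ollama                   # stop the service before touching its home folder
    sudo mv /usr/share/ollama /data/ollama       # move data to the larger partition (assumed to be /data)
    sudo ln -s /data/ollama /usr/share/ollama    # create a symbolic link at the original place
    sudo systemctl start ollama                  # start the service again
    systemctl status ollama                      # confirm it is running

Ownership of the moved folder should remain with the ollama user so that the service can still write to it.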
Home > Local system based AI tools > Ollama > Ollama installation