Ollama installation
From Notes_Wiki
Installing Ollama on Rocky 9.x

To install Ollama on a local system use the following steps:

- Install Ollama directly from the official site using:
    curl -fsSL https://ollama.com/install.sh | sh
- We can also install a specific version of Ollama using syntax similar to the example below:
    curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.7 sh
  For the available version numbers look at the releases listed at https://github.com/ollama/ollama/releases
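After either install, the version that actually got installed can be confirmed from the command line; a minimal check:
    # Print the installed Ollama version
    ollama --version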
- Check whether the ollama service is running via:
    systemctl status ollama
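Independent of systemd, the server can also be probed over HTTP; on its default port 11434 the root URL answers with a plain-text status:
    # Expected reply: "Ollama is running"
    curl http://localhost:11434/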
- After this run a model on the local system via the command below and test:
    ollama run deepseek-r1:1.5b
  The above command may take considerable time when run for the first time as it downloads the entire model. The deepseek-r1:1.5b model in the above example is 1.1 GB in size, so the command first downloads the 1.1 GB model and only then shows the ">>>" prompt where we can type our queries.
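The same model can also be queried non-interactively, and the locally downloaded models listed, using the standard CLI subcommands (the prompt text below is only an example):
    # One-shot prompt without entering the interactive session
    ollama run deepseek-r1:1.5b "Why is the sky blue?"
    # List models already downloaded to this machine
    ollama list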
- The ollama service runs as the ollama user with home folder /usr/share/ollama, so downloaded models end up under that path. Consider moving /usr/share/ollama to a partition with more space and creating a symbolic link at the original place, as sketched below.
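A minimal sketch of that move, assuming /data is a mount point with enough free space (stop the service first so no files are in use):
    systemctl stop ollama
    mv /usr/share/ollama /data/ollama
    # Leave a symbolic link behind at the original path
    ln -s /data/ollama /usr/share/ollama
    systemctl start ollama
    # On SELinux-enforcing systems such as Rocky 9, restoring file
    # contexts may also be needed:
    # restorecon -R /data/ollama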
- Edit /etc/systemd/system/ollama.service and append one more Environment line in the [Service] section:
    Environment="OLLAMA_HOST=0.0.0.0"
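Alternatively, the same setting can be kept in a systemd drop-in override, which survives later reinstalls of the unit file:
    # Opens an editor on /etc/systemd/system/ollama.service.d/override.conf
    systemctl edit ollama
    # Put these two lines in the override file:
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"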
- Restart the service:
    systemctl daemon-reload
    systemctl restart ollama
  Without the above, Ollama listens only on 127.0.0.1, so clients such as n8n running in a container or on another machine cannot reach it on port 11434 (http://<server>:11434/).
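To verify the new binding, query the HTTP API from another machine; the address 192.168.1.10 below is only an illustrative placeholder for the actual server IP:
    # Should return a JSON list of locally available models
    curl http://192.168.1.10:11434/api/tags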
- To close the interactive session use:
    /bye
- Between any two unrelated queries we can clear the context using:
    /clear
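A typical interactive session then looks roughly as follows (model output elided; the exact confirmation text printed by /clear may differ between versions):
    >>> What is Rocky Linux?
    ... model output ...
    >>> /clear
    Cleared session context
    >>> /bye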
Home > Local system based AI tools > Ollama > Ollama installation

