Ollama installation

From Notes_Wiki
Latest revision as of 13:36, 27 July 2025

Home > Local system based AI tools > Ollama > Ollama installation

Installing Ollama on Rocky 9.x

To install Ollama on a local system, use the following steps:

  1. Install Ollama directly from the official site using:
    curl -fsSL https://ollama.com/install.sh | sh
    We can also install a specific version of Ollama using syntax similar to the example below:
    curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.7 sh
    For this, look at the Ollama release versions at https://github.com/ollama/ollama/releases
  2. Check whether the ollama service is running via:
    systemctl status ollama
  3. After this, run Ollama on the local system and test it via the below commands:
    ollama run deepseek-r1:1.5b
    The above command may take considerable time when run for the first time, as it downloads the entire model. The deepseek-r1:1.5b model in the above example is 1.1 GB in size. So the command will first download the 1.1 GB model, and only after that will it show the ">>>" prompt where we can type our queries.
  4. The ollama service runs as the ollama user with /usr/share/ollama as its home folder. Consider moving /usr/share/ollama to a partition with more space and creating a symbolic link at the original location.
  5. Edit /etc/systemd/system/ollama.service and append one more Environment line:
    Environment="OLLAMA_HOST=0.0.0.0"
  6. Restart the service:
    systemctl daemon-reload
    systemctl restart ollama
    Without the above, we cannot use Ollama from n8n etc. over http://localhost:11434/
  7. To exit the interactive session, use:
    /bye
  8. Between any two unrelated queries, we can clear the context using:
    /clear
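As an alternative to editing /etc/systemd/system/ollama.service directly in step 5, the same environment setting can live in a systemd drop-in, which survives upgrades that rewrite the main unit file. A minimal sketch (the override file is created and opened via systemctl edit ollama):

```
# /etc/systemd/system/ollama.service.d/override.conf
# created via: systemctl edit ollama
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```

This is followed by systemctl daemon-reload and systemctl restart ollama exactly as in step 6.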
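The move-and-symlink in step 4 above can be sketched as follows. This is a minimal sketch demonstrated on throwaway temporary directories, so it is safe to run anywhere; on a real host the source would be /usr/share/ollama, the destination would be a directory on the larger partition (the /data-ollama name below is only a stand-in), and the ollama service should be stopped before the move and restarted afterwards.

```shell
#!/bin/sh
# Sketch of relocating Ollama's data folder and leaving a symlink behind.
# Uses throwaway temp directories so it is safe to run; on a real host,
# substitute /usr/share/ollama and a path on the larger partition, and
# stop the service first with: systemctl stop ollama
set -e

old="$(mktemp -d)/usr-share-ollama"   # stands in for /usr/share/ollama
new="$(mktemp -d)/data-ollama"        # stands in for the roomier partition

mkdir -p "$old/models"
echo model-blob > "$old/models/blob"  # pretend model data

mv "$old" "$new"                      # move the data to the larger partition
ln -s "$new" "$old"                   # symlink at the original location

cat "$old/models/blob"                # original path still works: model-blob
```

Because the symlink preserves the original path, the ollama service needs no configuration change after the move.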
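The service check in step 2 can also be done over HTTP, since Ollama serves an API on port 11434; its /api/tags endpoint lists locally downloaded models. A small sketch, assuming curl is installed (the ollama_up helper name is ours, not part of Ollama):

```shell
#!/bin/sh
# Returns success if an Ollama server answers at the given base URL.
# With no argument, assumes the standard local port 11434.
ollama_up() {
    curl -fs --max-time 3 "${1:-http://localhost:11434}/api/tags" >/dev/null 2>&1
}

if ollama_up; then
    echo "ollama is reachable"
else
    echo "ollama is NOT reachable"
fi
```

The same helper, pointed at http://SERVER_IP:11434, verifies that the OLLAMA_HOST change from steps 5-6 made the service reachable from other machines.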
