Ollama installation

Installing Ollama on Rocky 9.x

To install Ollama on the local system use the following steps:

  1. Install Ollama directly from the official site using:
    curl -fsSL https://ollama.com/install.sh | sh
  2. Check whether the ollama service is running via:
    systemctl status ollama
    A sketch for also verifying that the local API responds follows this list.
  3. After this, run Ollama on the local system and test it via:
    ollama run deepseek-r1:1.5b
    The above command may take considerable time when run for the first time, as it downloads the entire model. The deepseek-r1:1.5b model in the above example is 1.1GB in size, so the command first downloads the 1.1GB model and only after that shows the ">>>" prompt where we can type our queries. To download the model ahead of time, see the pre-pull sketch after this list.
  4. The ollama service runs as the ollama user with /usr/share/ollama as its home folder. Consider moving /usr/share/ollama to a partition with more space and creating a symbolic link at the original location, as shown in the relocation sketch after this list.
  5. To exit the interactive session use:
    /bye
  6. Between any two unrelated queries we can clear the context using:
    /clear
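
To verify the service beyond systemctl (step 2), the local API can also be queried directly. This is a minimal sketch; 11434 is Ollama's default port and the check assumes it has not been changed:

    # Confirm the Ollama API answers on the default port; it should reply with a short "Ollama is running" message
    curl http://localhost:11434/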

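Models can also be downloaded ahead of the interactive run (step 3) and the locally available models listed. Both are standard ollama subcommands; the model name is the same one used above:

    # Download the model without starting an interactive session
    ollama pull deepseek-r1:1.5b
    # List models already downloaded on this system
    ollama list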

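For step 4, a minimal sketch of relocating the model store, assuming the larger partition is mounted at /data (a hypothetical mount point):

    # Stop the service before moving its home folder
    systemctl stop ollama
    mv /usr/share/ollama /data/ollama
    # Leave a symbolic link at the original location so existing paths keep working
    ln -s /data/ollama /usr/share/ollama
    systemctl start ollama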