Ollama
From Notes_Wiki
[[Main Page|Home]] > [[Local system based AI tools]] > [[Ollama]]
Ollama runs AI models locally, without requiring Internet access and without uploading our data to the Internet to get output from AI tools. It provides both a CLI and an API, so models can be used directly from the command line or through other AI tools.
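As a small sketch of the API mentioned above, a local Ollama daemon (started with <code>ollama serve</code>, default port 11434) can be queried over its REST endpoint <code>/api/generate</code>. The model name <code>llama3</code> below is only an example and must already have been downloaded with <code>ollama pull</code>:

```python
import json
import urllib.request

# Ollama's local REST API listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model, prompt):
    """Build the HTTP request for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON reply instead of a
    line-by-line streamed response.
    """
    body = json.dumps({"model": model,
                       "prompt": prompt,
                       "stream": False}).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL, data=body,
        headers={"Content-Type": "application/json"})

def ask(model, prompt):
    """Send the prompt to the local Ollama daemon and return its reply."""
    with urllib.request.urlopen(build_generate_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example (needs a running `ollama serve` with the model pulled):
# print(ask("llama3", "Why is the sky blue?"))
```

No data leaves the machine here: the request goes only to the local daemon, which is the point of running Ollama locally.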
*[[Ollama installation]]
*[[Using Ollama with Thunderbird ThunderAI plugin]]
*[[Ollama model feed-back]]
*[[Ollama GPU usage validation]]
[[Main Page|Home]] > [[Local system based AI tools]] > [[Ollama]]
Latest revision as of 10:05, 5 July 2025