Open Interpreter
Open Interpreter can write code and will prompt before running it on the system:
As root, ensure the python3-pip package is installed:
dnf -y install python3-pip
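Optionally, confirm that pip is available before the next step:
pip3 --version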
As a non-root (local) user, install open-interpreter via:
pip install open-interpreter
This installs the interpreter command in ~/.local/bin. If this directory is in PATH (see the note below if it is not), run interpreter via:
interpreter --local
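If ~/.local/bin is not already in PATH, it can be added for the current shell session, for example:
export PATH="$HOME/.local/bin:$PATH"
Add the same line to ~/.bashrc to make the change permanent.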
For this to work properly, a local Ollama should be set up with at least one model, e.g. phi4.
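For example, assuming Ollama is already installed and its service is running, the phi4 model can be pulled and then listed to confirm it is available:
ollama pull phi4
ollama list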
You can ask queries such as:
- What operating system is installed on this system?
- Which are the top 5 processes using the most RAM on this system?
- How much free RAM is there on this computer?
It quickly writes code to get the answer and prompts (y/n) before running it.
Use "%reset" to reset old chat and start new session.