Open Interpreter


About

Open Interpreter can write code to answer queries and prompts for confirmation before running anything on the system.


Installation

On Rocky 9.x

As root, ensure the python3-pip package is installed:

dnf -y install python3-pip

Then, as a non-root local user, install open-interpreter via:

pip install open-interpreter
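
To confirm the package installed correctly, pip's standard show subcommand can be used (optional):

# Optional check: display metadata of the installed open-interpreter package
pip show open-interpreter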


On Ubuntu 24.04

As root user:

apt -y install python3-pip
apt -y install python3.12-venv

Then as the non-root saurabh user:

python3 -m venv open-interpreter
./open-interpreter/bin/pip install open-interpreter
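
Alternatively, the virtual environment can be activated so its binaries are on PATH for the current shell session (standard venv workflow, not specific to open-interpreter):

# Activate the venv; pip and interpreter then refer to the venv copies
source open-interpreter/bin/activate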


Executing

Ideally, a local Ollama instance should be set up with at least one non-thinking model, e.g. phi4, for this to work properly.
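
For example, assuming Ollama is already installed and its service is running, phi4 can be downloaded with the standard ollama pull subcommand:

# Fetch the phi4 model so that interpreter --local can use it via Ollama
ollama pull phi4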

Rocky 9.x

pip installs the interpreter binary under ~/.local/bin. If that directory is already in PATH, run interpreter via:

interpreter --local
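
If ~/.local/bin is not yet on PATH, it can be added for the current shell session, for example:

# Prepend pip's user-install bin directory to PATH for this session
export PATH="$HOME/.local/bin:$PATH"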


Ubuntu 24.04

Execute it from the virtual environment via:

./open-interpreter/bin/interpreter --local
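
If the virtual environment was activated as shown in the Installation section above, the command shortens to:

interpreter --local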



Sample Queries

You can ask queries such as:

  • What operating system is installed on this system?
  • Which are the top 5 processes using the most RAM on this system?
  • How much free RAM is there on this computer?

and it quickly writes code to get the answer, prompting (y/n) before running it.
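
Command-line flags vary between open-interpreter releases; the options supported by the installed version can be listed with:

# Show the command-line options of the installed release
interpreter --help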


Usage

Use "%reset" to reset old chat and start new session.

Use 'Ctrl+C' to quit
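
Open Interpreter also ships other in-session "magic" commands; assuming the installed release supports it, they can be listed from within a running session via:

%help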

