Msty
About
Msty gives access to most modern AI backends, including a local Ollama instance, from an easy-to-use interface.
It also supports RAG for Ollama using Internet search. However, prefer models that fit completely in GPU memory when using RAG: the overall prompt, including the fetched web pages, becomes fairly long, and if the tokens/second rate is low the output will be too slow.
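As a quick check, assuming the Ollama CLI is installed (llama3.1 below is just an example model), we can verify whether a loaded model fits fully in GPU memory:

    # Load a model, then check the PROCESSOR column of "ollama ps";
    # "100% GPU" means the model fits fully in GPU memory, while a
    # CPU/GPU split means it spilled over and RAG will be slow.
    ollama run llama3.1 "hello" >/dev/null
    ollama ps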
Install and execute
To install Msty use the following steps:
- Download the AppImage from https://msty.app/
- Install it by making it executable with "chmod +x" and running it
- Choose the "Integrate and Run" option in the AppImageLauncher application
For a similar AppImage setup, see Rocky 9.x Owncloud client via AppImage.
In the case of Ubuntu 24.04, run it using ./Msty.AppImage --no-sandbox, as in the sketch below.
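A minimal sketch of the above steps, assuming the downloaded file is named Msty_x86_64.AppImage (the actual file name may differ):

    # Make the downloaded AppImage executable and run it;
    # AppImageLauncher should then offer "Integrate and Run"
    chmod +x Msty_x86_64.AppImage
    ./Msty_x86_64.AppImage
    # On Ubuntu 24.04 the sandbox has to be disabled
    ./Msty_x86_64.AppImage --no-sandbox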
Configure
- In Msty it is better to configure a remote Ollama with http://localhost:11434/ instead of using the built-in local service, so that we do not have to download the models again and the same in-memory cache also serves other products
- Optionally delete the ~/.config/Msty/models folder and make it point to the ~/.ollama/models folder. This matters only if you use Msty's local service, so that the same models do not need to be downloaded again; see the sketch after this list
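A sketch of both steps, assuming Ollama is already running as a service and keeps its models in ~/.ollama/models:

    # Verify the local Ollama endpoint that Msty should use as "remote"
    curl http://localhost:11434/api/tags
    # Optionally replace Msty's own models folder with a symlink to Ollama's
    rm -rf ~/.config/Msty/models
    ln -s ~/.ollama/models ~/.config/Msty/models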
Sample queries
We can ask queries using local knowledge, such as:
- How to check the Linux flavor installed on the current system using a shell command
To search the Internet before replying, click on the Globe icon to enable "Real Time Data" (RAG) and then ask, for example:
- What are the best Ollama models available today to run on a local system
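For reference, a typical correct answer to the first query above would suggest something like:

    # Show the distribution name and version on most modern Linux systems
    cat /etc/os-release
    # Common alternatives
    lsb_release -a
    hostnamectl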
Tool usage
The tool has options for cloning a chat, editing an already-given reply before asking a further query, splitting a chat, Context Shield, enabling/disabling real-time data, etc.
Add more remote options
We can add remote options such as:
- Perplexity via its API key
- OpenAI via its API key
and use these directly from Msty via their APIs.
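Before adding a key in Msty, it can be sanity-checked from the shell; a sketch for OpenAI, assuming the key is exported as OPENAI_API_KEY:

    # List the available models to confirm the API key works
    curl https://api.openai.com/v1/models \
        -H "Authorization: Bearer $OPENAI_API_KEY"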