diff --git a/README.md b/README.md
index 8b1d7c3..42d3d1b 100644
--- a/README.md
+++ b/README.md
@@ -83,3 +83,17 @@
 Do not use larger models on your CPU or you will die of old age! Please make sure that your shell doesn't use something like Starship or the posh packages! Otherwise VS Code cannot run terminal commands!
 
 ![Roo Code](images/2025-02-02_18-39.png)
+
+## Settings for Roo Code
+
+Select Ollama as the API provider and connect to it. If Ollama runs locally on your machine, you don't need to do anything here. When working from home,
+I use
+```
+ssh -p [PORT OLLAMA SERVER] -L 11434:127.0.0.1:11434 [USERNAME]@[IP OLLAMA SERVER]
+```
+to connect directly to the computer running Ollama and tunnel port 11434 to my local machine (in other words: I make the external Ollama available as a local service on port 11434).
+
+Select a code model that you created using the modelfiles.
+
+![Roo Code Setting](images/2025-02-02_18-45_1.png)
+
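The code models mentioned above are built from the repository's modelfiles. As a hedged illustration only — the base model name and parameter below are assumptions, not the repo's actual files — an Ollama Modelfile looks roughly like this:

```
# Illustrative Ollama Modelfile; base model and settings are assumptions
FROM qwen2.5-coder:7b
PARAMETER num_ctx 8192
```

A model built from such a file (e.g. with `ollama create my-coder -f Modelfile`) then shows up in Roo Code's model selection.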
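The setup above relies on the SSH tunnel making the remote Ollama reachable on `127.0.0.1:11434`. A quick way to check that before pointing Roo Code at it is a bash TCP probe — a sketch, not part of the repository, and it assumes your shell is bash with `/dev/tcp` support:

```shell
# Probe the port where the ssh tunnel (or a local Ollama) should be listening.
# Uses bash's built-in /dev/tcp redirection; no extra tools required.
if (exec 3<>/dev/tcp/127.0.0.1/11434) 2>/dev/null; then
  echo "Ollama reachable on 127.0.0.1:11434"
  exec 3>&-   # close the probe connection
else
  echo "Nothing listening on 127.0.0.1:11434 - is the tunnel up?"
fi
```

If the probe fails, re-check the ssh command and that Ollama is actually running on the remote machine.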