Update README.md

This commit is contained in:
David Rotermund 2025-02-02 18:52:07 +01:00
parent da3fe1ffe2
commit 3232bbaa1b


@@ -83,3 +83,17 @@ Do not use larger models on your CPU or you will die of old age!
Please make sure that your shell doesn't use something like Starship or the posh packages! Otherwise VS Code cannot run terminal commands!
![Roo Code](images/2025-02-02_18-39.png)
## Settings for Roo Code
You need to select Ollama and then connect to it. If Ollama runs locally on your machine, you don't need to do anything here. For home office I use
```
ssh -p [PORT OLLAMA SERVER] -L 11434:127.0.0.1:11434 [USERNAME]@[IP OLLAMA SERVER]
```
to connect directly to the computer running Ollama and tunnel port 11434 to my local machine. (In other words: I make the external Ollama available as a local service on port 11434.)
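To verify that the tunnel is working, you can probe the Ollama REST API on the forwarded port. This is a minimal sketch using Ollama's `/api/tags` endpoint (which lists the locally available models); it assumes the ssh tunnel above is already running:

```shell
# Probe the tunneled Ollama instance on the local port.
# /api/tags lists the models Ollama has available.
if curl -sf http://127.0.0.1:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama reachable"
else
  echo "Ollama not reachable - is the ssh tunnel up?"
fi
```

If the second message appears, check that the ssh session is still open and that nothing else is already bound to port 11434 locally.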
Select a code model that you created using the modelfiles.
![Roo Code Setting](images/2025-02-02_18-45_1.png)
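For reference, a modelfile is a small config that tells Ollama how to build a custom model. A minimal sketch (the model name `qwen2.5-coder:7b`, the custom name `my-coder`, and the context size are assumptions, not taken from this repository's modelfiles):

```
# Hypothetical Modelfile: base a custom coding model on an existing one
FROM qwen2.5-coder:7b
# Enlarge the context window so the model can see more of your code
PARAMETER num_ctx 8192
```

You register it with `ollama create my-coder -f Modelfile`, after which `my-coder` shows up in the model list that Roo Code lets you pick from.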