Update README.md
This commit is contained in:
parent 100d8bf516
commit 830de85151
1 changed file with 34 additions and 0 deletions: README.md
@@ -21,6 +21,7 @@ Check the status of the service:
```
systemctl status ollama.service
```
## Getting the models

Now we can get the models:
```
@@ -47,3 +48,36 @@ deepseek-r1:7b 0a8c26691023 4.7 GB
deepseek-r1:70b 0c1615a8ca32 42 GB
deepseek-r1:8b 28f8fd6cdc67 4.9 GB
```
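The pull commands themselves are cut off by the hunk above, but with Ollama a model from that listing can presumably be fetched and verified along these lines (the tag is just one example from the list):
```
# pull one of the DeepSeek-R1 variants listed above (example tag)
ollama pull deepseek-r1:7b

# check that the model now shows up locally
ollama list
```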
## Test

```
ollama run deepseek-r1:1.5b
```

```
>>> Hello
<think>

</think>

Hello! How can I assist you today? 😊

>>> /bye
```

# Using it with VS Code

To use Ollama with VS Code we need a special setting. Thus we need to create instances with different model parameters (in other words: go to the Modelfile subfolder and check the information there):
```
code_ds32b:latest 995e2d04e071 19 GB
code_ds70b:latest 4930f987452d 42 GB
code_ds7b:latest 0438bd669fa8 4.7 GB
code_ds8b:latest 643346a4074c 4.9 GB
code_ds1.5b:latest 2d66604e7b60 1.1 GB
code_ds14b:latest 76c930e3d70a 9.0 GB
```
```
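As a rough sketch of how such an instance can be built (the real definitions live in the Modelfile subfolder; the base model and parameter value below are assumptions, not the repository's actual settings):
```
# hypothetical Modelfile for code_ds7b -- see the Modelfile subfolder for the real one
FROM deepseek-r1:7b
# assumed context window size for coding tasks (placeholder value)
PARAMETER num_ctx 8192
```
The instance is then created with `ollama create code_ds7b -f Modelfile`.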

Do not use the larger models on your CPU or you will die of old age!