n4ze3m
fd654cafdb
feat: Add max tokens setting for model generations
Adds a new setting to control the maximum number of tokens the model generates, giving more control over response length and a way to limit output where shorter replies are needed.
2024-11-09 16:56:47 +05:30
n4ze3m
4ef17ff479
feat: support for GPU layer
2024-08-20 16:11:50 +05:30
n4ze3m
692c0887cc
new settings for RAG
2024-06-03 00:30:10 +05:30
n4ze3m
23435318c5
feat: Add localization support for Copilot resume chat and hide current chat model settings
2024-05-24 21:19:53 +05:30
n4ze3m
961f5180c6
Added ability to resume previous chat on Copilot
2024-05-24 21:01:10 +05:30
n4ze3m
9e2ef72486
chore: Update Lucide icons and improve Current Chat Model Settings
2024-05-24 20:00:09 +05:30
n4ze3m
b3a455382c
chore: Update version to 1.1.9 and add Model Settings to Ollama settings page
2024-05-23 00:39:44 +05:30