feat: Add multiple Ollama support
Adds support for using Ollama 2 as a model provider. This includes:

- Adding Ollama 2 to the list of supported providers in the UI
- Updating the model identification logic to properly handle Ollama 2 models
- Modifying the model loading and runtime configuration to work with Ollama 2
- Implementing Ollama 2 specific functionality in the embedding and chat models

This change allows users to leverage the capabilities of Ollama 2 for both embeddings and conversational AI tasks.
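The hunk below only touches the popup gate in the OpenAI provider form; the model identification logic mentioned above is not part of this hunk. A minimal TypeScript sketch of how an "ollama2" provider check could be wired in, assuming a hypothetical provider-prefix convention for model ids (the names `identifyProvider` and `shouldOpenModelModal` are illustrative, not taken from the codebase):

```ts
// Sketch only: assumes model ids carry a provider prefix such as "ollama2_...".
type Provider = "lmstudio" | "llamafile" | "ollama2" | "openai"

const identifyProvider = (modelId: string): Provider => {
  if (modelId.startsWith("ollama2_")) return "ollama2"
  if (modelId.startsWith("lmstudio_")) return "lmstudio"
  if (modelId.startsWith("llamafile_")) return "llamafile"
  return "openai"
}

// Providers that skip the model-selection popup after saving (mirrors the diff below).
const noPopupProvider: Provider[] = ["lmstudio", "llamafile", "ollama2"]

// Whether the UI should open the model modal for a newly saved provider entry.
export const shouldOpenModelModal = (modelId: string): boolean =>
  !noPopupProvider.includes(identifyProvider(modelId))
```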
@@ -47,7 +47,7 @@ export const OpenAIApp = () => {
       })
       setOpen(false)
       message.success(t("addSuccess"))
-      const noPopupProvider = ["lmstudio", "llamafile"]
+      const noPopupProvider = ["lmstudio", "llamafile", "ollama2"]
       if (!noPopupProvider.includes(provider)) {
         setOpenaiId(data)
         setOpenModelModal(true)