Feature Request: Add Ollama Support #6
I like the vision of the project, but to have it running fully locally I would love to see Ollama support in the future.
Just a field or an ENV variable for the Ollama URL would be cool :D
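For illustration, a minimal sketch of what such a setting could look like, assuming a hypothetical `OLLAMA_URL` environment variable and Ollama's stock HTTP API (`GET /api/tags` lists the locally pulled models); none of these names come from docext itself:

```python
import os

import requests

# Hypothetical variable name; the project may expose this differently.
OLLAMA_URL = os.environ.get("OLLAMA_URL", "http://localhost:11434")


def list_ollama_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the names of the models currently available on the Ollama server."""
    resp = requests.get(f"{base_url}/api/tags", timeout=10)
    resp.raise_for_status()
    return [m["name"] for m in resp.json().get("models", [])]


if __name__ == "__main__":
    print(list_ollama_models())
```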
👋 Welcome! Thanks for opening your first issue. If you'd like to take a crack at fixing it, feel free to open a pull request; otherwise, we'll take a look as soon as we can!
@techteam-pro-discount Have you looked into Unsloth inference? Unsloth supports many more vision-language models than Ollama. We will integrate one of these two.
This I can add for Ollama as well, but you will need to host the Ollama model yourself.
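To make the "host the model" part concrete: it means running an Ollama server locally and pulling a vision-capable model onto it. A rough sketch against Ollama's native `/api/generate` endpoint; the model name `llava` and the prompt are placeholders, not anything docext uses:

```python
import base64

import requests

OLLAMA_URL = "http://localhost:11434"  # default Ollama address


def extract_text(image_path: str, model: str = "llava") -> str:
    """Ask a locally hosted vision model to read a document image."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={
            "model": model,
            "prompt": "Extract all text from this document.",
            "images": [image_b64],  # base64 images work only with multimodal models
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```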
Never heard of that, I'll take a look at it later.
I already have Ollama running for some other services, so I would love to have your project join the "circle" :D
@techteam-pro-discount This is added. You can use your existing Ollama server: https://github.com/NanoNets/docext?tab=readme-ov-file#models-with-ollama-linux-and-macos. Feel free to reopen the issue if you face any problems.
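As a side note, an existing Ollama server also exposes an OpenAI-compatible API under `/v1`, so any generic client can reach it. The snippet below is only a generic sketch with a placeholder model name; the actual docext setup is described in the README section linked above:

```python
from openai import OpenAI

# Ollama serves an OpenAI-compatible API under /v1; the API key is unused,
# but the client requires a non-empty value.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

completion = client.chat.completions.create(
    model="llava",  # placeholder; use whatever model you have pulled
    messages=[{"role": "user", "content": "Summarize this project in one line."}],
)
print(completion.choices[0].message.content)
```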