Feature Request: Add Ollama Support #6


Closed
techteam-pro-discount opened this issue Apr 8, 2025 · 5 comments · Fixed by #7
Labels: enhancement (New feature or request)

Comments

techteam-pro-discount commented Apr 8, 2025

I like the vision of this project, but to run it fully locally I would love to see Ollama support in the future.
Just a field or an ENV variable for the Ollama URL would be cool :D
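For context, Ollama exposes an OpenAI-compatible endpoint on its default port, so the kind of wiring being requested might look roughly like the sketch below. The `OLLAMA_BASE_URL` variable name is hypothetical (not an actual docext setting); the endpoint path and the `openai` client usage are standard:

```python
import os
from openai import OpenAI

# Hypothetical env var for the Ollama server URL; falls back to Ollama's
# standard OpenAI-compatible endpoint on the default port.
base_url = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434/v1")

# Ollama ignores the API key, but the OpenAI client requires a non-empty one.
client = OpenAI(base_url=base_url, api_key="ollama")

response = client.chat.completions.create(
    model="llama3.2-vision",  # illustrative; any vision model pulled into Ollama
    messages=[{"role": "user", "content": "Summarize this document."}],
)
print(response.choices[0].message.content)
```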


github-actions bot commented Apr 8, 2025

👋 Welcome! Thanks for opening your first issue. If you'd like to take a crack at fixing it, feel free to open a pull request — otherwise, we'll take a look as soon as we can!

mandalsouvik3333 self-assigned this Apr 9, 2025
mandalsouvik3333 added the enhancement (New feature or request) label Apr 9, 2025
mandalsouvik3333 (Collaborator) commented Apr 9, 2025

@techteam-pro-discount Have you looked into Unsloth inference? Unsloth supports many more vision-language models than Ollama does. I will integrate one of these two.

mandalsouvik3333 (Collaborator)

> Just a field or an ENV variable for the Ollama URL would be cool :D

This I can add for Ollama as well, but you will need to host the Ollama model yourself.

techteam-pro-discount (Author)

> Have you looked into Unsloth inference? Unsloth supports many more vision-language models than Ollama does. I will integrate one of these two.

Never heard of it; I'll take a look later.

> > Just a field or an ENV variable for the Ollama URL would be cool :D
>
> This I can add for Ollama as well, but you will need to host the Ollama model yourself.

I already have Ollama running for some other services, so I would love to have your project join the "circle" :D

mandalsouvik3333 (Collaborator) commented Apr 10, 2025

@techteam-pro-discount This has been added. You can use your existing Ollama server: https://github.com/NanoNets/docext?tab=readme-ov-file#models-with-ollama-linux-and-macos

Feel free to reopen the issue if you run into any problems.
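For anyone landing here later: before pointing docext at an existing server, it can help to confirm the Ollama server is reachable and has a model pulled. A quick sanity check against Ollama's standard `/api/tags` endpoint (the URL assumes the default port; adjust for your setup):

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # change if your server runs elsewhere

# Ollama's native API lists locally available models at /api/tags.
resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
resp.raise_for_status()

for model in resp.json().get("models", []):
    print(model["name"])
```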
