Cannot support local hosted model #204

Open
Neildave999 opened this issue Feb 20, 2025 · 1 comment

Comments

@Neildave999

Hi Team,

I am trying to run the AI Suite against a locally served model, but I cannot see anything in the current source code that supports it. Is there any way to use a locally fine-tuned model, or a local model server, with the AI agent framework? If yes, please share more details.

@annuszulfiqar2021

Hi @Neildave999. I've tested locally hosted models with the Ollama provider and the API works fine. How are you hosting your models?
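
For reference, a minimal sketch of how a locally served model can be reached through the Ollama provider might look like the following. It assumes an Ollama server running on its default port with a model already pulled; the model name below is just an example, not a requirement of the library.

```python
import aisuite as ai

# Assumes a local Ollama server is running (default: http://localhost:11434)
# and that a model has been pulled, e.g. `ollama pull llama3.1`.
client = ai.Client()

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize this issue thread in one sentence."},
]

# The "ollama:" prefix routes the request to the Ollama provider,
# which forwards it to the locally hosted model.
response = client.chat.completions.create(
    model="ollama:llama3.1",
    messages=messages,
)
print(response.choices[0].message.content)
```

A locally fine-tuned model can be served the same way once it is registered with the local server, so the application code above does not need to change beyond the model name.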
