
V2 Module: AI #301

Closed
manparvesh opened this issue Oct 7, 2024 · 0 comments · Fixed by #299
@manparvesh (Member)

What would you like to be added: Use local LLM models and communicate with them through yoda.

Any ideas you might have for implementation: We can create a yoda plugin for this and have it communicate with local LLM models via ollama.
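A minimal sketch of what such a plugin could look like, assuming it talks to a locally running ollama server over its HTTP API (`POST /api/generate` on the default port 11434). The `ai` command name, the `llama3` default model, and the click-based command structure are assumptions for illustration, not yoda's actual plugin interface.

```python
# Hypothetical sketch of a yoda "ai" plugin backed by a local ollama server.
# Assumes ollama is running locally (default: http://localhost:11434) and that
# a model such as "llama3" has already been pulled with `ollama pull llama3`.
import click
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # ollama's local generate endpoint


def ask_local_llm(prompt, model="llama3"):
    """Send a prompt to the local ollama server and return the full completion."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    # With stream=False, ollama returns a single JSON object whose "response"
    # field holds the generated text.
    return response.json()["response"]


@click.command()
@click.argument("prompt")
@click.option("--model", default="llama3", help="Local model to query via ollama.")
def ai(prompt, model):
    """Ask a locally hosted LLM a question (hypothetical `yoda ai` command)."""
    click.echo(ask_local_llm(prompt, model=model))


if __name__ == "__main__":
    ai()
```

With ollama running, something like `yoda ai "summarize my todo list"` (command name hypothetical) would return the model's completion without any data leaving the machine.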

manparvesh added this to the V2 milestone on Oct 7, 2024
manparvesh mentioned this issue on Oct 7, 2024
manparvesh linked a pull request on Oct 9, 2024 that will close this issue
manparvesh mentioned this issue on Oct 10, 2024