Using Ollama via RunPod serverless #195
oschmidteu asked this question in Q&A (unanswered, 0 replies)
Hey guys,
Has anyone tried using Ollama via RunPod serverless? I'm not sure what the best price-to-performance option is for someone without a dedicated GPU. Most of the time I'm using the OpenAI API, but if I use it for my paperless archive, I'll be broke! ;)
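For reference, here's roughly what I imagine the client side would look like, assuming the Ollama worker exposes its OpenAI-compatible /v1 API through a RunPod serverless endpoint. The endpoint ID, base URL path, and model name below are placeholders for illustration, not a confirmed RunPod API:

```python
# Minimal sketch, assuming an Ollama worker is deployed as a RunPod serverless
# endpoint that forwards to Ollama's OpenAI-compatible /v1 API.
# YOUR_ENDPOINT_ID, the URL path, and the model name are hypothetical —
# substitute whatever your actual deployment uses.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.runpod.ai/v2/YOUR_ENDPOINT_ID/openai/v1",  # placeholder URL
    api_key="YOUR_RUNPOD_API_KEY",  # RunPod API key, not an OpenAI key
)

response = client.chat.completions.create(
    model="llama3",  # whichever model the Ollama worker has pulled
    messages=[{"role": "user", "content": "Summarize this document: ..."}],
)
print(response.choices[0].message.content)
```

If that's feasible, I'd only pay for GPU seconds while requests are running instead of a flat per-token API price, which is what I'm hoping makes it cheaper for bulk jobs like a paperless archive.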