-
Is there any MCP client example code that shows where to set up the configuration of an LLM other than Claude (e.g. Ollama, OpenAI, etc.) when not using the Claude Desktop app?
-
Servers provide context but usually don't call LLMs directly. They provide the context to clients, which are responsible for getting completions from the LLM. This means any application using MCP is responsible for that part itself. E.g. in Cody and Zed you can freely choose the LLM you use; please refer to their respective configuration.
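
A minimal sketch of that split, using the python-sdk's stdio client to pull context from a local server (here, its tool list) and an OpenAI model for the completion. The server command, model name, and prompts are placeholders, not part of any official example:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from openai import OpenAI

# Hypothetical local MCP server; replace with your own server command.
server_params = StdioServerParameters(command="python", args=["my_server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Pull context from the MCP server (here: the list of tools).
            tools = await session.list_tools()
            tool_summary = "\n".join(
                f"- {t.name}: {t.description}" for t in tools.tools
            )

            # The LLM choice lives entirely in the client. Here it's
            # OpenAI (reads OPENAI_API_KEY from the environment), but
            # nothing on the MCP side depends on this choice.
            llm = OpenAI()
            completion = llm.chat.completions.create(
                model="gpt-4o-mini",
                messages=[
                    {"role": "system", "content": f"Available tools:\n{tool_summary}"},
                    {"role": "user", "content": "What can you do for me?"},
                ],
            )
            print(completion.choices[0].message.content)

asyncio.run(main())
```

Swapping providers means changing only the completion call at the end; the MCP plumbing above it stays the same.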
-
Does that mean the client must connect to either the Claude Desktop app, Cody, or Zed to get completions from an LLM? Does the python-sdk provide any functionality that lets us implement our own host app without relying on any of them? If not, this doesn't seem very practical for custom integrations, because the user is forced to install the Claude Desktop app, Cody, or Zed.
Clients use MCP to obtain context. How they sample is up to them and not an MCP concern. Any app can implement an MCP client: it takes the context from the servers, combines it however it wants, and then gets a completion in any way it wants.
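
For example, pointing the same completion step at a local Ollama instance is just a different client configuration, since Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1. A sketch, where the model name and context string are placeholders for whatever your client assembled from its MCP servers:

```python
from openai import OpenAI

# Ollama's OpenAI-compatible endpoint; the api_key value is ignored
# by Ollama but required by the client constructor.
llm = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Stand-in for context gathered from MCP servers (resources, tool
# results, prompts) and formatted by your client however you like.
context = "...text assembled from MCP server resources..."

completion = llm.chat.completions.create(
    model="llama3.2",  # any model you have pulled locally with `ollama pull`
    messages=[
        {"role": "system", "content": f"Use this context:\n{context}"},
        {"role": "user", "content": "Summarize the context."},
    ],
)
print(completion.choices[0].message.content)
```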