Releases: intitni/CustomSuggestionServiceForCopilotForXcode
0.6.0
What's Changed
- fix: correct stream completion detection in OpenAI response by @gbfansheng in #19
- Add Anthropic API support (Claude) for code completion by @bastianX6 in #21
New Contributors
- @gbfansheng made their first contribution in #19
- @bastianX6 made their first contribution in #21
Full Changelog: 0.5.0...0.6.0
0.5.0
0.4.0
- The fill-in-the-middle strategy now supports custom prompt templates, so you can use it with other models.
- Support models that expose a FIM API, for example, the Mistral AI FIM endpoint.
- Support setting a suggestion token limit.
- OpenAI-compatible APIs now use streaming responses, so suggestions arrive more promptly when you set a line limit.
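The custom FIM prompt template feature can be pictured as simple placeholder substitution around the cursor. The sketch below is illustrative, not the app's actual template syntax; the `{prefix}`/`{suffix}` placeholder names are assumptions, though the `<PRE>`/`<SUF>`/`<MID>` tokens shown are CodeLlama's documented infilling format:

```python
def render_fim_prompt(template: str, prefix: str, suffix: str) -> str:
    """Fill a custom fill-in-the-middle template with the code around the cursor."""
    return template.replace("{prefix}", prefix).replace("{suffix}", suffix)

# A CodeLlama-style infilling template.
codellama_template = "<PRE> {prefix} <SUF>{suffix} <MID>"
prompt = render_fim_prompt(
    codellama_template,
    "def add(a, b):\n    return ",
    "\n\nprint(add(1, 2))",
)
```

A FIM endpoint like Mistral's instead accepts the prefix and suffix as separate request fields, so for such models the template reduces to passing the two pieces through unchanged.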
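Streaming is what makes a line limit cheap to enforce: the client can stop reading as soon as enough lines have arrived instead of waiting for the whole response. A minimal sketch of that early stop, assuming the stream yields plain text chunks (not the app's actual code):

```python
from typing import Iterable

def take_lines(chunks: Iterable[str], max_lines: int) -> str:
    """Accumulate streamed text chunks and stop once max_lines full lines arrive."""
    out: list[str] = []
    seen = 0
    for chunk in chunks:
        for ch in chunk:
            if ch == "\n":
                seen += 1
                if seen >= max_lines:
                    return "".join(out)  # drop everything past the limit
            out.append(ch)
    return "".join(out)
```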
0.3.0
0.2.0
- Support Ollama when setting up custom models.
- Added the CodeLlama fill-in-the-middle strategy, which works very well. It's recommended to use it with `codellama:xb-code` series models. You can try other models that support FIM, too, but I am not sure whether they share the same prompt format.
- Support trimming repeated suffixes from the response #2
- When Tabby is used, the app will use a dedicated Tabby strategy.
- Fix response parsing in strategies.
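The suffix-trimming fix (#2) can be pictured as removing the overlap between the end of the model's completion and the code that already follows the cursor, so the suggestion does not duplicate existing text. A hedged sketch of that idea, not the app's actual implementation:

```python
def trim_repeated_suffix(completion: str, suffix: str) -> str:
    """Drop the longest tail of `completion` that repeats the start of `suffix`."""
    for n in range(min(len(completion), len(suffix)), 0, -1):
        if completion.endswith(suffix[:n]):
            return completion[:-n]
    return completion

# Example: the model re-emitted the closing brace that already follows the cursor.
trimmed = trim_repeated_suffix("return a + b\n}", "\n}\n")
```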