
Releases: intitni/CustomSuggestionServiceForCopilotForXcode

0.6.0

30 Dec 06:37

What's Changed

  • fix: correct stream completion detection in OpenAI response by @gbfansheng in #19
  • Add Anthropic API support (Claude) for code completion by @bastianX6 in #21
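The stream-completion fix above is about detecting when an OpenAI streamed response has ended. As a rough sketch (the helper name and exact checks are illustrative, not the project's actual code): OpenAI-style SSE streams terminate with a literal `data: [DONE]` sentinel, and individual chunks carry a non-null `finish_reason` once a choice is complete.

```python
import json

def is_stream_finished(sse_line: str) -> bool:
    """Return True if an OpenAI-style SSE line signals the end of the stream.

    Streams send `data: {json}` lines and end with a `data: [DONE]` sentinel;
    chunks may also report a non-null `finish_reason` on a choice.
    """
    payload = sse_line.removeprefix("data:").strip()
    if payload == "[DONE]":
        return True
    try:
        chunk = json.loads(payload)
    except json.JSONDecodeError:
        return False
    choices = chunk.get("choices") or []
    return any(c.get("finish_reason") is not None for c in choices)
```

Checking both conditions avoids hanging on servers that only emit one of the two signals.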

Full Changelog: 0.5.0...0.6.0

0.5.0

24 Sep 09:00
  • Support Ollama FIM API
  • Support setting the prompt as raw when the strategy is set to Fill-in-the-middle
  • Update the Mistral FIM API support to be more generic, so that it also works with similar APIs such as DeepSeek's.
  • Add an error toast to the app.
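The reason one code path can cover both Mistral's and DeepSeek's FIM endpoints is that both accept the code before the cursor as `prompt` and the code after it as `suffix`. A minimal sketch of such a request body (the function name and default token limit are illustrative assumptions, not the app's actual code):

```python
def build_fim_request(model: str, prefix: str, suffix: str,
                      max_tokens: int = 64) -> dict:
    """Build a request body for a Mistral-style FIM completions endpoint.

    Both Mistral's FIM API and DeepSeek's FIM-capable completions API take
    the text before the cursor as `prompt` and the text after it as `suffix`,
    which is what makes a single client implementation possible.
    """
    return {
        "model": model,
        "prompt": prefix,
        "suffix": suffix,
        "max_tokens": max_tokens,
    }
```

The model then generates only the code that belongs between `prompt` and `suffix`.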

0.4.0

24 Jun 09:14
  • The fill-in-the-middle strategy now supports custom prompt templates, so you can use it with other models.
  • Support models with an FIM API, for example, Mistral AI FIM endpoint.
  • Support setting suggestion token limit.
  • OpenAI (compatible) APIs now use streaming responses, so suggestions arrive more promptly when you set a line limit.
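The reason streaming pairs well with a line limit: the client can accumulate chunks as they arrive and cut the request short once enough complete lines are in, instead of waiting for the full completion. A minimal sketch of that early-stop logic (names are illustrative, not the app's actual code):

```python
def take_lines(stream, line_limit: int) -> str:
    """Accumulate streamed text chunks and stop as soon as `line_limit`
    complete lines have arrived, returning only those lines."""
    text = ""
    for chunk in stream:
        text += chunk
        lines = text.split("\n")
        if len(lines) > line_limit:
            # Enough complete lines received; drop the rest of the stream.
            return "\n".join(lines[:line_limit])
    return text
```

With a non-streaming API, the same request would block until the model finished the whole completion, even though only the first line is shown.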

0.3.0

15 Apr 16:25
  • Support setting a line limit on suggestions.
    Running LLMs locally can be slow; if that is an issue for you, you can, for example, set the limit to 1 to get a faster full-line suggestion.

0.2.0

04 Mar 05:03
  • Support Ollama when setting up custom models.
  • Added a CodeLlama fill-in-the-middle strategy, which works very well. It's recommended for use with the codellama:xb-code series of models. You can try other models that support FIM, too, but I am not sure whether they share the same prompt format.
  • Support trimming repeated suffixes from the response #2
  • When Tabby is used, the app will use Tabby's own strategy.
  • Fix response parsing in strategies.
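For context on the FIM strategy above: CodeLlama was trained with special infilling markers, so the prompt interleaves the code before and after the cursor and the model generates the missing middle after the `<MID>` marker. A sketch of that prompt shape (the spacing follows the commonly documented CodeLlama infilling format; the helper name is illustrative):

```python
def codellama_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a CodeLlama infilling prompt.

    CodeLlama's infilling format wraps the code before the cursor in <PRE>,
    the code after it in <SUF>, and asks the model to continue after <MID>.
    Other FIM-capable models may use different marker tokens entirely.
    """
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"
```

This marker format is model-specific, which is why the note above hedges about other FIM models: a model trained on different markers will not infill correctly with this prompt.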

0.1.0

22 Feb 08:36

Initial release!