From 3a2c17946ed8d140f7e4427903f6c8d6f3bb599a Mon Sep 17 00:00:00 2001
From: Peter Banda
Date: Fri, 24 Feb 2023 12:46:26 +0100
Subject: [PATCH] Version 0.2.0 - README

---
 README.md                      | 33 ++++++++++++++++++++++++++++-----
 openai-client-stream/README.md | 19 +++++++++++++++----
 openai-client/README.md        |  8 ++++----
 openai-core/README.md          |  6 +++---
 openai-guice/README.md         |  6 +++---
 5 files changed, 53 insertions(+), 19 deletions(-)

diff --git a/README.md b/README.md
index cdec4883..140e624b 100755
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
-# OpenAI Scala Client 🤖 [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client 🤖 [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
-This is a no-nonsense async Scala client for OpenAI API supporting all the available endpoints and params (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
+This is a no-nonsense async Scala client for the OpenAI API supporting all the available endpoints and params **including streaming** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
 
 * **Models**: [listModels](https://beta.openai.com/docs/api-reference/models/list), and [retrieveModel](https://beta.openai.com/docs/api-reference/models/retrieve)
 * **Completions**: [createCompletion](https://beta.openai.com/docs/api-reference/completions/create)
@@ -28,7 +28,7 @@ The currently supported Scala versions are **2.12** and **2.13** ~~but **Scala 3
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "0.1.1"
+"io.cequence" %% "openai-scala-client" % "0.2.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -37,10 +37,12 @@
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>0.1.1</version>
+    <version>0.2.0</version>
 </dependency>
 ```
 
+If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "0.2.0"` instead.
+
 ## Config ⚙️
 
 - Env. variables: `OPENAI_SCALA_CLIENT_API_KEY` and optionally also `OPENAI_SCALA_CLIENT_ORG_ID` (if you have one)
@@ -57,7 +59,7 @@ First you need to provide an implicit execution context as well as akka material
   implicit val materializer = Materializer(ActorSystem())
   ```
 
-Then you can obtain a service in one of the following ways:
+Then you can obtain a service in one of the following ways.
 
 - Default config (expects env. variable(s) to be set as defined in `Config` section)
   ```scala
@@ -79,6 +81,8 @@ Then you can obtain a service in one of the following ways:
     )
   ```
 
+**✔️ Important**: If you want to use streaming, use `OpenAIServiceStreamedFactory` from the `openai-scala-client-stream` lib instead of `OpenAIServiceFactory` (in the three examples above). Two additional functions, `createCompletionStreamed` and `listFineTuneEventsStreamed`, provided by [OpenAIServiceStreamedExtra](./openai-client-stream/src/main/scala/io/cequence/openaiscala/service/OpenAIServiceStreamedExtra.scala), will then be available.
+
 - Via dependency injection (requires `openai-scala-guice` lib)
 
   ```scala
@@ -146,6 +150,25 @@ Examples:
     )
   ```
 
+- Create completion with streaming and custom settings
+
+```scala
+  val source = service.createCompletionStreamed(
+    prompt = "Write me a Shakespeare poem about two cats playing baseball in Russia using at least 2 pages",
+    settings = CreateCompletionSettings(
+      model = ModelId.text_davinci_003,
+      max_tokens = Some(1500),
+      temperature = Some(0.9),
+      presence_penalty = Some(0.2),
+      frequency_penalty = Some(0.2)
+    )
+  )
+
+  source.map(completion =>
+    println(completion.choices.head.text)
+  ).runWith(Sink.ignore)
+```
+(For this to work you need to use `OpenAIServiceStreamedFactory` from the `openai-scala-client-stream` lib.)
 
 ## FAQ 🤔
 

diff --git a/openai-client-stream/README.md b/openai-client-stream/README.md
index 828688bc..baed38ed 100755
--- a/openai-client-stream/README.md
+++ b/openai-client-stream/README.md
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Stream Support [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Stream Support [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides streaming support for the client. Note that the full project documentation can be found [here](../README.md).
 
@@ -6,7 +6,18 @@ This module provides streaming support for the client. Note that the full projec
 The currently supported Scala versions are **2.12** and **2.13**.
 
-~~To pull the library you have to add the following dependency to your *build.sbt*~~
+To pull the library you have to add the following dependency to your *build.sbt*
 
-**Not yet released**
-...
+```
+"io.cequence" %% "openai-scala-client-stream" % "0.2.0"
+```
+
+or to *pom.xml* (if you use maven)
+
+```
+<dependency>
+    <groupId>io.cequence</groupId>
+    <artifactId>openai-scala-client-stream_2.12</artifactId>
+    <version>0.2.0</version>
+</dependency>
+```

diff --git a/openai-client/README.md b/openai-client/README.md
index f6dbbc6b..d9427c3a 100755
--- a/openai-client/README.md
+++ b/openai-client/README.md
@@ -1,6 +1,6 @@
-# OpenAI Scala Client - Client [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Client [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
-This module provided the actual meat, i.e. WS client implementation ([OpenAIServiceImpl and OpenAIServiceFactory](./src/main/scala/io/cequence/openaiscala/service/OpenAIServiceImpl.scala)).
+This module provides the actual meat, i.e. WS client implementation ([OpenAIServiceImpl and OpenAIServiceFactory](./src/main/scala/io/cequence/openaiscala/service/OpenAIServiceImpl.scala)).
 Note that the full project documentation can be found [here](../README.md).
 
 ## Installation 🚀
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "0.1.1"
+"io.cequence" %% "openai-scala-client" % "0.2.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>0.1.1</version>
+    <version>0.2.0</version>
 </dependency>
 ```
\ No newline at end of file

diff --git a/openai-core/README.md b/openai-core/README.md
index f39dc60c..478484d0 100755
--- a/openai-core/README.md
+++ b/openai-core/README.md
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Core [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Core [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This is the core module, which contains mostly domain classes and the [OpenAIService](./src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala) definition. Note that the full project documentation can be found [here](../README.md).
 
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-core" % "0.1.1"
+"io.cequence" %% "openai-scala-core" % "0.2.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-core_2.12</artifactId>
-    <version>0.1.1</version>
+    <version>0.2.0</version>
 </dependency>
 ```

diff --git a/openai-guice/README.md b/openai-guice/README.md
index 8363e39e..5d18a1a9 100755
--- a/openai-guice/README.md
+++ b/openai-guice/README.md
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Guice [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Guice [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides dependency injection for the OpenAI Scala client with a help of `Guice` library. Note that the full project documentation can be found [here](../README.md).
 
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12** and **2.13**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-guice" % "0.1.1"
+"io.cequence" %% "openai-scala-guice" % "0.2.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-guice_2.12</artifactId>
-    <version>0.1.1</version>
+    <version>0.2.0</version>
 </dependency>
 ```
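For reviewers who want to try the bumped artifacts locally, the version changes in this patch can be collected into a single sbt sketch. This is an illustration only: the module names and the `0.2.0` version come from the hunks above, and in practice you would pick just the module(s) you need rather than all three.

```scala
// Hypothetical build.sbt fragment combining the 0.2.0 artifacts bumped in this patch.
// "openai-scala-client" is the full WS client; the stream and guice modules are
// optional add-ons for streaming and Guice dependency injection, respectively.
// ("openai-scala-core", also bumped above, is presumably pulled in transitively.)
libraryDependencies ++= Seq(
  "io.cequence" %% "openai-scala-client"        % "0.2.0",
  "io.cequence" %% "openai-scala-client-stream" % "0.2.0", // createCompletionStreamed etc.
  "io.cequence" %% "openai-scala-guice"         % "0.2.0"  // DI via Guice
)
```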