Version 0.2.0 - README
peterbanda committed Feb 24, 2023
1 parent 765c2ea commit 3a2c179
Showing 5 changed files with 53 additions and 19 deletions.
33 changes: 28 additions & 5 deletions README.md
@@ -1,6 +1,6 @@
# OpenAI Scala Client 🤖 [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
# OpenAI Scala Client 🤖 [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)

This is a no-nonsense async Scala client for OpenAI API supporting all the available endpoints and params (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
This is a no-nonsense async Scala client for the OpenAI API supporting all the available endpoints and params, **including streaming** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:

* **Models**: [listModels](https://beta.openai.com/docs/api-reference/models/list), and [retrieveModel](https://beta.openai.com/docs/api-reference/models/retrieve)
* **Completions**: [createCompletion](https://beta.openai.com/docs/api-reference/completions/create)
@@ -28,7 +28,7 @@ The currently supported Scala versions are **2.12** and **2.13** ~~but **Scala 3**
To pull the library you have to add the following dependency to your *build.sbt*

```
"io.cequence" %% "openai-scala-client" % "0.1.1"
"io.cequence" %% "openai-scala-client" % "0.2.0"
```

or to *pom.xml* (if you use maven)
@@ -37,10 +37,12 @@ or to *pom.xml* (if you use maven)
<dependency>
<groupId>io.cequence</groupId>
<artifactId>openai-scala-client_2.12</artifactId>
<version>0.1.1</version>
<version>0.2.0</version>
</dependency>
```

If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "0.2.0"` instead.
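Putting the two dependency coordinates together, a *build.sbt* entry could look like the following sketch (include the stream artifact only if you need streaming):

```scala
libraryDependencies ++= Seq(
  "io.cequence" %% "openai-scala-client" % "0.2.0",
  // optional: add only if you need streaming support
  "io.cequence" %% "openai-scala-client-stream" % "0.2.0"
)
```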

## Config ⚙️

- Env. variables: `OPENAI_SCALA_CLIENT_API_KEY` and optionally also `OPENAI_SCALA_CLIENT_ORG_ID` (if you have one)
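For example, in a POSIX shell these could be exported as follows (the values below are placeholders, not real credentials):

```shell
# placeholder values - replace with your actual API key and organization id
export OPENAI_SCALA_CLIENT_API_KEY="your_api_key"
export OPENAI_SCALA_CLIENT_ORG_ID="your_org_id"
```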
@@ -57,7 +59,7 @@ First you need to provide an implicit execution context as well as akka materializer
```scala
implicit val materializer = Materializer(ActorSystem())
```
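A complete, self-contained version of this setup might look like the following sketch (`ExecutionContext.global` is one reasonable choice for the implicit execution context, not the only one):

```scala
import akka.actor.ActorSystem
import akka.stream.Materializer
import scala.concurrent.ExecutionContext

// both implicits required by the service calls below
implicit val ec: ExecutionContext = ExecutionContext.global
implicit val materializer: Materializer = Materializer(ActorSystem())
```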

Then you can obtain a service in one of the following ways:
Then you can obtain a service in one of the following ways.

- Default config (expects env. variable(s) to be set as defined in `Config` section)
```scala
@@ -79,6 +81,8 @@ Then you can obtain a service in one of the following ways:
)
```

**✔️ Important**: If you want to use streaming, use `OpenAIServiceStreamedFactory` from the `openai-scala-client-stream` lib instead of `OpenAIServiceFactory` (in the three examples above). Two additional functions, `createCompletionStreamed` and `listFineTuneEventsStreamed`, provided by [OpenAIServiceStreamedExtra](./openai-client-stream/src/main/scala/io/cequence/openaiscala/service/OpenAIServiceStreamedExtra.scala), will then be available.
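As a sketch, and assuming `OpenAIServiceStreamedFactory` mirrors the `OpenAIServiceFactory` parameters (the `apiKey`/`orgId` parameter names here are illustrative):

```scala
import akka.actor.ActorSystem
import akka.stream.Materializer
import scala.concurrent.ExecutionContext.Implicits.global

import io.cequence.openaiscala.service.OpenAIServiceStreamedFactory

implicit val materializer: Materializer = Materializer(ActorSystem())

// assumption: the streamed factory accepts the same parameters as OpenAIServiceFactory
val service = OpenAIServiceStreamedFactory(
  apiKey = "your_api_key",
  orgId = Some("your_org_id") // if you have one
)
```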

- Via dependency injection (requires `openai-scala-guice` lib)

```scala
@@ -146,6 +150,25 @@ Examples:
)
```

- Create completion with streaming and a custom setting

```scala
val source = service.createCompletionStreamed(
prompt = "Write me a Shakespeare poem about two cats playing baseball in Russia using at least 2 pages",
settings = CreateCompletionSettings(
model = ModelId.text_davinci_003,
max_tokens = Some(1500),
temperature = Some(0.9),
presence_penalty = Some(0.2),
frequency_penalty = Some(0.2)
)
)

source.map(completion =>
println(completion.choices.head.text)
).runWith(Sink.ignore)
```
(For this to work you need to use `OpenAIServiceStreamedFactory` from `openai-scala-client-stream` lib)

## FAQ 🤔

19 changes: 15 additions & 4 deletions openai-client-stream/README.md
@@ -1,12 +1,23 @@
# OpenAI Scala Client - Stream Support [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
# OpenAI Scala Client - Stream Support [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)

This module provides streaming support for the client. Note that the full project documentation can be found [here](../README.md).

## Installation 🚀

The currently supported Scala versions are **2.12** and **2.13**.

~~To pull the library you have to add the following dependency to your *build.sbt*~~
To pull the library you have to add the following dependency to your *build.sbt*

**Not yet released**
...
```
"io.cequence" %% "openai-scala-client-stream" % "0.2.0"
```

or to *pom.xml* (if you use maven)

```
<dependency>
<groupId>io.cequence</groupId>
<artifactId>openai-scala-client-stream_2.12</artifactId>
<version>0.2.0</version>
</dependency>
```
8 changes: 4 additions & 4 deletions openai-client/README.md
@@ -1,6 +1,6 @@
# OpenAI Scala Client - Client [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
# OpenAI Scala Client - Client [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)

This module provided the actual meat, i.e. WS client implementation ([OpenAIServiceImpl and OpenAIServiceFactory](./src/main/scala/io/cequence/openaiscala/service/OpenAIServiceImpl.scala)).
This module provides the actual meat, i.e. the WS client implementation ([OpenAIServiceImpl and OpenAIServiceFactory](./src/main/scala/io/cequence/openaiscala/service/OpenAIServiceImpl.scala)).
Note that the full project documentation can be found [here](../README.md).

## Installation 🚀
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
To pull the library you have to add the following dependency to your *build.sbt*

```
"io.cequence" %% "openai-scala-client" % "0.1.1"
"io.cequence" %% "openai-scala-client" % "0.2.0"
```

or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
<dependency>
<groupId>io.cequence</groupId>
<artifactId>openai-scala-client_2.12</artifactId>
<version>0.1.1</version>
<version>0.2.0</version>
</dependency>
```
6 changes: 3 additions & 3 deletions openai-core/README.md
@@ -1,4 +1,4 @@
# OpenAI Scala Client - Core [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
# OpenAI Scala Client - Core [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)

This is the core module, which contains mostly domain classes and the [OpenAIService](./src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala) definition.
Note that the full project documentation can be found [here](../README.md).
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
To pull the library you have to add the following dependency to your *build.sbt*

```
"io.cequence" %% "openai-scala-core" % "0.1.1"
"io.cequence" %% "openai-scala-core" % "0.2.0"
```

or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
<dependency>
<groupId>io.cequence</groupId>
<artifactId>openai-scala-core_2.12</artifactId>
<version>0.1.1</version>
<version>0.2.0</version>
</dependency>
```
6 changes: 3 additions & 3 deletions openai-guice/README.md
@@ -1,4 +1,4 @@
# OpenAI Scala Client - Guice [![version](https://img.shields.io/badge/version-0.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
# OpenAI Scala Client - Guice [![version](https://img.shields.io/badge/version-0.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)

This module provides dependency injection for the OpenAI Scala client with the help of the `Guice` library.
Note that the full project documentation can be found [here](../README.md).
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12** and **2.13**.
To pull the library you have to add the following dependency to your *build.sbt*

```
"io.cequence" %% "openai-scala-guice" % "0.1.1"
"io.cequence" %% "openai-scala-guice" % "0.2.0"
```

or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
<dependency>
<groupId>io.cequence</groupId>
<artifactId>openai-scala-guice_2.12</artifactId>
<version>0.1.1</version>
<version>0.2.0</version>
</dependency>
```
