This is a no-nonsense async Scala client for the OpenAI API supporting all the available endpoints and params **including streaming**, the newest **ChatGPT completion**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
@@ -27,7 +27,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**. Note that
To pull the library you have to add the following dependency to your *build.sbt*
```
- "io.cequence" %% "openai-scala-client" % "0.3.0"
+ "io.cequence" %% "openai-scala-client" % "0.3.1"
```
or to *pom.xml* (if you use maven)
@@ -36,11 +36,11 @@ or to *pom.xml* (if you use maven)
<dependency>
<groupId>io.cequence</groupId>
<artifactId>openai-scala-client_2.12</artifactId>
- <version>0.3.0</version>
+ <version>0.3.1</version>
</dependency>
```
- If you want a streaming support use `"io.cequence" %% "openai-scala-client-stream" % "0.3.0"` instead.
+ If you want a streaming support use `"io.cequence" %% "openai-scala-client-stream" % "0.3.1"` instead.
## Config ⚙️
@@ -80,7 +80,7 @@ Then you can obtain a service in one of the following ways.
)
```
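Once a service is obtained by any of the ways above, each endpoint is an ordinary `Future`-returning method. The sketch below is a hedged illustration of a chat completion call, not copied from the project docs: the domain classes (`MessageSpec`, `ChatRole`), the default settings, and the implicits expected by the factory are assumptions about this version.

```
import akka.actor.ActorSystem
import akka.stream.Materializer

import scala.concurrent.ExecutionContext

import io.cequence.openaiscala.domain._
import io.cequence.openaiscala.service.OpenAIServiceFactory

object ChatCompletionSketch extends App {
  // Akka 2.6-style materializer; adjust if the project pins an older Akka
  implicit val ec: ExecutionContext = ExecutionContext.global
  implicit val materializer: Materializer = Materializer(ActorSystem())

  // picks up the API key (and optional org id) from the config described above
  val service = OpenAIServiceFactory()

  service
    .createChatCompletion(
      messages = Seq(
        MessageSpec(ChatRole.System, "You are a helpful assistant."),
        MessageSpec(ChatRole.User, "Say hello in three languages.")
      )
    )
    .map(completion => println(completion.choices.head.message.content))
}
```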
- **✔️ Important**: If you want streaming support use `OpenAIServiceStreamedFactory` from `openai-scala-client-stream` lib instead of `OpenAIServiceFactory` (in the three examples above). Two additional functions - `createCompletionStreamed` and `listFineTuneEventsStreamed` provided by [OpenAIServiceStreamedExtra](./openai-client-stream/src/main/scala/io/cequence/openaiscala/service/OpenAIServiceStreamedExtra.scala) will be then available.
+ **✔️ Important**: If you want streaming support use `OpenAIServiceStreamedFactory` from `openai-scala-client-stream` lib instead of `OpenAIServiceFactory` (in the three examples above). Three additional functions - `createCompletionStreamed`, `createChatCompletionStreamed`, and `listFineTuneEventsStreamed` provided by [OpenAIServiceStreamedExtra](./openai-client-stream/src/main/scala/io/cequence/openaiscala/service/OpenAIServiceStreamedExtra.scala) will be then available.
- Via dependency injection (requires `openai-scala-guice` lib)
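To make the streamed variants from the note above concrete, here is a hedged sketch of consuming `createChatCompletionStreamed` as an Akka Streams `Source`; the chunk and delta field names are assumptions about this version's domain classes.

```
import akka.actor.ActorSystem
import akka.stream.Materializer

import scala.concurrent.ExecutionContext

import io.cequence.openaiscala.domain._
import io.cequence.openaiscala.service.OpenAIServiceStreamedFactory

object StreamedChatSketch extends App {
  implicit val ec: ExecutionContext = ExecutionContext.global
  implicit val materializer: Materializer = Materializer(ActorSystem())

  // same configuration options as OpenAIServiceFactory, plus the *Streamed methods
  val service = OpenAIServiceStreamedFactory()

  // a Source of partial completions is returned instead of a single Future
  service
    .createChatCompletionStreamed(
      messages = Seq(MessageSpec(ChatRole.User, "Write a haiku about Scala."))
    )
    .runForeach { chunk =>
      // print each streamed delta as it arrives (field names are an assumption)
      chunk.choices.headOption.flatMap(_.delta.content).foreach(print)
    }
}
```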
This module provides the actual meat, i.e. WS client implementation ([OpenAIServiceImpl and OpenAIServiceFactory](./src/main/scala/io/cequence/openaiscala/service/OpenAIServiceImpl.scala)).
Note that the full project documentation can be found [here](../README.md).
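As a hedged illustration of this module's factory, the sketch below constructs the service with explicit credentials instead of the config file and lists the available models; the parameter names, environment variables, and result fields are assumptions rather than details verified against this exact release.

```
import akka.actor.ActorSystem
import akka.stream.Materializer

import scala.concurrent.ExecutionContext

import io.cequence.openaiscala.service.OpenAIServiceFactory

object ListModelsSketch extends App {
  implicit val ec: ExecutionContext = ExecutionContext.global
  implicit val materializer: Materializer = Materializer(ActorSystem())

  // credentials passed directly instead of being read from the config file
  val service = OpenAIServiceFactory(
    apiKey = sys.env("OPENAI_SCALA_CLIENT_API_KEY"),
    orgId = sys.env.get("OPENAI_SCALA_CLIENT_ORG_ID")
  )

  service.listModels.map(_.foreach(model => println(model.id)))
}
```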
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
To pull the library you have to add the following dependency to your *build.sbt*
```
- "io.cequence" %% "openai-scala-client" % "0.3.0"
+ "io.cequence" %% "openai-scala-client" % "0.3.1"
```
or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
This is the core module, which contains mostly domain classes and the [OpenAIService](./src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala) definition.
Note that the full project documentation can be found [here](../README.md).
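Because the service trait lives in this core module, downstream code can depend on `OpenAIService` alone and receive a concrete implementation (e.g. the WS client) from elsewhere. Below is a hedged sketch of that pattern; the `createCompletion` signature and response fields are assumptions.

```
import scala.concurrent.{ExecutionContext, Future}

import io.cequence.openaiscala.service.OpenAIService

// depends only on openai-scala-core; any implementation can be supplied
// at the call site or via dependency injection
class PromptRunner(service: OpenAIService)(implicit ec: ExecutionContext) {

  def run(prompt: String): Future[String] =
    service
      .createCompletion(prompt) // default completion settings assumed
      .map(_.choices.head.text)
}
```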
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
To pull the library you have to add the following dependency to your *build.sbt*
```
- "io.cequence" %% "openai-scala-core" % "0.3.0"
+ "io.cequence" %% "openai-scala-core" % "0.3.1"
```
or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)