Releases: valentinegb/openai
v1.0.0-alpha.8
Three new contributors! Nice. I've been paying a bit more attention to the `main` branch again and working less on the `generate-from-schema` branch. I think it might be more work than it's worth; it may just be easier to keep working on `main`.
What's Changed
- Bump openssl from 0.10.45 to 0.10.48 in /examples/completions_cli by @dependabot in #47
- Bump openssl from 0.10.45 to 0.10.48 in /examples/chat_cli by @dependabot in #48
- Add feature flags to use native-tls or rustls by @scottbot95 in #49
- Disable edit test since the edit models seem to have been pulled by @scottbot95 in #51
- fix: typo / extra context in pre prompt by @antonio-hickey in #52
- Add Moderations API by @rukolahasser in #54
New Contributors
- @scottbot95 made their first contribution in #49
- @antonio-hickey made their first contribution in #52
- @rukolahasser made their first contribution in #54
Full Changelog: v1.0.0-alpha.7...v1.0.0-alpha.8
v1.0.0-alpha.7
This version pretty much just adds parity with the `generate-from-schema` branch. The biggest change is the way authentication works now:
```rust
use openai::set_key;
use dotenvy::dotenv;
use std::env;

#[tokio::main]
async fn main() {
    dotenv().unwrap();
    set_key(env::var("OPENAI_KEY").unwrap());
}
```
This `set_key` function is much more flexible, so you don't have to use an environment variable. (You still should if you can, though.)
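For reference, keeping the key in an environment variable usually means a `.env` file at the project root (the convention dotenvy reads from), along these lines, with a placeholder standing in for your real key:

```
OPENAI_KEY=sk-your-key-here
```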
With this shift away from depending on an environment variable, the magical `ModelID` enumerator is gone. This may sadden some, but the truth is it caused more headaches than it prevented. All references to `ModelID` have been replaced with `String` or `&str`. Please make sure you replace code like `ModelID::TextDavinci003` with code similar to `"text-davinci-003"`.
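As a rough sketch of the migration (the `request_model` function below is a made-up stand-in for any API that previously took a `ModelID`, not part of the crate):

```rust
// Hypothetical stand-in for any function that used to take a ModelID;
// model names are now passed as plain string slices.
fn request_model(model: &str) -> String {
    format!("requesting completion from {model}")
}

fn main() {
    // Before: request_model(ModelID::TextDavinci003)
    // After: pass the model's name directly as a string:
    let message = request_model("text-davinci-003");
    println!("{message}");
}
```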
What's Changed
- Bump serde_json from 1.0.93 to 1.0.94 by @dependabot in #36
- Bump quote from 1.0.23 to 1.0.25 by @dependabot in #39
- Bump serde from 1.0.152 to 1.0.155 by @dependabot in #40
- fix(error): make error type public by @ashleygwilliams in #42
- Bump serde from 1.0.155 to 1.0.157 by @dependabot in #43
- Bump quote from 1.0.25 to 1.0.26 by @dependabot in #44
New Contributors
- @ashleygwilliams made their first contribution in #42
Full Changelog: v1.0.0-alpha.6...v1.0.0-alpha.7
Sponsors
A massive thank you to the individual who became my very first GitHub sponsor! The reward for sponsoring is supposed to be a mention in release notes, but they sponsored privately, so I hope that mentioning their contribution without mentioning their name is alright.
v1.0.0-alpha.6
What's Changed
- Make `ChatCompletion` use `openai::Usage` instead of a dedicated structure
- Make `Embeddings` use a dedicated structure, `EmbeddingsUsage`
- Switch to MIT license
- Skip `stream` field on applicable request structures (until properly implemented)
- Rename lots of structures and methods (see full changelog)
- Implement `Clone` for... pretty much all of the structures
- Implement `Copy` for `EmbeddingsUsage` and `Usage`

...and more that I skipped over. If you want to see everything that changed, look at the full changelog.
Full Changelog: v1.0.0-alpha.5...v1.0.0-alpha.6
v1.0.0-alpha.5
Red alert! Red alert! Wee woo wee woo!
This morning I woke up to find that OpenAI had finally released the API for ChatGPT! So, today I worked diligently to get this very important update out. I'm proud to say that this library is one of the very first unofficial OpenAI libraries to implement the new Chat API endpoint! So, no more waiting; the time is now to experiment to your heart's content and make some cool stuff. Here's how:
```rust
let chat_completion = ChatCompletion::builder(
    ModelID::Gpt3_5Turbo,
    [ChatCompletionRequestMessage {
        role: ChatCompletionMessageRole::User,
        content: "Hello!".to_string(),
        name: None,
    }],
)
.temperature(0.0)
.create()
.await
.unwrap()
.unwrap();

assert_eq!(
    chat_completion.choices.first().unwrap().message.content,
    "\n\nHello there! How can I assist you today?"
);
```
Full Changelog: v1.0.0-alpha.4...v1.0.0-alpha.5
v1.0.0-alpha.4
What's Changed
- Add some more keywords to `Cargo.toml`
- Host documentation with GitHub Pages (https://valentinegb.github.io/openai/)
Full Changelog: v1.0.0-alpha.3...v1.0.0-alpha.4
v1.0.0-alpha.3
With this release, I have an exciting announcement: this library is now available on crates.io as `openai`!
I'm very pleased that the previous owner kindly transferred ownership of the package to me. I think this will give the project a good boost in popularity, now that it can be discovered on crates.io and much more easily added as a dependency.
What's Changed
- Make `Completion.temperature`/`top_p` of type `f32` by @valentinegb in #17
- Add embeddings distance function by @valentinegb in #19
- Handle API errors by @valentinegb in #20
- Create builders for `Completion` and `Edit` structures by @valentinegb in #21
Full Changelog: v1.0.0-alpha.2...v1.0.0-alpha.3
v1.0.0-alpha.2
I'll be doing release notes a bit differently from here on, since it's a bit tedious to write them. I write code, not... normal... words! See? I'm bad at this.
Anyway, the instructions for installation have been moved to the README so I don't have to write them out every release. And instead of relying on my own recollection of what changed, I'll be depending on automatically generated release notes. To make sure the generated release notes actually list every notable change, I'll be opening issues and pull requests more often to separate and define features and bug fixes.
Now, without further ado, take it away, automatically-generated-release-notes! (Wow, that really rolls off the tongue, huh?)
What's Changed
- Include secrets in workflows triggered by pull requests by @valentinegb in #12
- Bump tokio from 1.24.1 to 1.25.0 by @dependabot in #10
Full Changelog: v1.0.0-alpha.1...v1.0.0-alpha.2
v1.0.0-alpha.1
As per usual, here's how you can add this release as a dependency via git:
```toml
[dependencies]
openai = { git = "https://github.com/valentinegb/openai", tag = "v1.0.0-alpha.1" }
```
and via its path:
```toml
[dependencies]
openai = { path = "openai", version = "1.0.0-alpha.1" }
```
There have been a lot of internal changes that will make future development easier, but here are the most important changes that affect developers using the library:
What's Changed
- `ModelID` is now procedurally generated, so whenever OpenAI adds a new model, it will immediately be available in the enum
- Added the `embeddings` module
- Added the `edits` module
Full Changelog: v1.0.0-alpha...v1.0.0-alpha.1
v1.0.0-alpha
It's all different now
Yep, this is a major version bump. We've gone from 0.x to 1.x, neat! If you were using v0.x in a project, for one, why? Secondly, upgrading to v1.x will break said project. This version is completely different!
In this version, lots of work has been done on the `models` module, and the `embeddings` module is almost complete!
So, I realized that you can't actually use the built binary as a local dependency. Whoops! The dependency should be built along with the rest of your project. As with the last release, you can add this project as a dependency via GitHub like so:
```toml
[dependencies]
openai = { git = "https://github.com/valentinegb/openai", tag = "v1.0.0-alpha" }
```
If you'd like to use the project as a local dependency, download the source code (not a built binary!) and reference it like so:
```toml
[dependencies]
openai = { path = "openai", version = "1.0.0-alpha" }
```
Full Changelog: v0.1.0...v1.0.0-alpha
v0.1.0
First release, woo! 🎉
The Models category has been completely implemented, as indicated in the project README. Unfortunately, the library is not yet available on crates.io, as `openai` has been taken by a crate that hasn't been touched in 6 years... so I need to figure out another name that makes sense and isn't redundant. Until then, you can add this package via git like so:
```toml
[dependencies]
openai = { git = "https://github.com/valentinegb/openai", tag = "v0.1.0" }
```
or if you'd prefer, you can download the built binary attached to this release and specify a path to it like this:
```toml
[dependencies]
openai = { path = "openai", version = "0.1.0" }
```
Full Changelog: https://github.com/valentinegb/openai/commits/v0.1.0