[Vertex AI] Refactor HarmBlockThreshold as a struct and add .off #13863

Merged · 3 commits · Oct 9, 2024
6 changes: 6 additions & 0 deletions FirebaseVertexAI/CHANGELOG.md
@@ -46,6 +46,12 @@
 - [changed] The response from `GenerativeModel.countTokens(...)` now includes
   `systemInstruction`, `tools` and `generationConfig` in the `totalTokens` and
   `totalBillableCharacters` counts, where applicable. (#13813)
+- [added] Added a new `HarmCategory` `.civicIntegrity` for filtering content
+  that may be used to harm civic integrity. (#13728)
+- [added] Added a new `HarmBlockThreshold` `.off`, which turns off the safety
+  filter. (#13863)
+- [added] Added new `FinishReason` values `.blocklist`, `.prohibitedContent`,
+  `.spii` and `.malformedFunctionCall` that may be reported. (#13860)
 
 # 11.3.0
 - [added] Added `Decodable` conformance for `FunctionResponse`. (#13606)
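To make the CHANGELOG entries above concrete, here is a minimal sketch of how an app might opt into the new `.civicIntegrity` category and the `.off` threshold. The `SafetySetting` and `generativeModel(...)` calls mirror the integration test later in this diff; `FirebaseApp.configure()`, `VertexAI.vertexAI()` and the model name are assumptions based on the SDK's usual setup rather than part of this change.

```swift
import FirebaseCore
import FirebaseVertexAI

// Assumed app setup; not part of this diff.
FirebaseApp.configure()

// Configure a model whose civic-integrity filtering is turned off entirely,
// while keeping a conventional threshold for another category.
let model = VertexAI.vertexAI().generativeModel(
  modelName: "gemini-1.5-flash", // illustrative model name
  safetySettings: [
    SafetySetting(harmCategory: .civicIntegrity, threshold: .off),
    SafetySetting(harmCategory: .dangerousContent, threshold: .blockOnlyHigh),
  ]
)
```

Per the doc comments in `Safety.swift` below, `.blockNone` still allows all content through the safety filter, while `.off` turns the safety filter off for that category entirely.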
35 changes: 29 additions & 6 deletions FirebaseVertexAI/Sources/Safety.swift
@@ -90,18 +90,41 @@ public struct SafetyRating: Equatable, Hashable, Sendable {
 @available(iOS 15.0, macOS 11.0, macCatalyst 15.0, tvOS 15.0, watchOS 8.0, *)
 public struct SafetySetting {
   /// Block at and beyond a specified ``SafetyRating/HarmProbability``.
-  public enum HarmBlockThreshold: String, Sendable {
-    // Content with `.negligible` will be allowed.
-    case blockLowAndAbove = "BLOCK_LOW_AND_ABOVE"
+  public struct HarmBlockThreshold: EncodableProtoEnum, Sendable {
+    enum Kind: String {
+      case blockLowAndAbove = "BLOCK_LOW_AND_ABOVE"
+      case blockMediumAndAbove = "BLOCK_MEDIUM_AND_ABOVE"
+      case blockOnlyHigh = "BLOCK_ONLY_HIGH"
+      case blockNone = "BLOCK_NONE"
+      case off = "OFF"
+    }
+
+    /// Content with `.negligible` will be allowed.
+    public static var blockLowAndAbove: HarmBlockThreshold {
+      return self.init(kind: .blockLowAndAbove)
+    }
 
     /// Content with `.negligible` and `.low` will be allowed.
-    case blockMediumAndAbove = "BLOCK_MEDIUM_AND_ABOVE"
+    public static var blockMediumAndAbove: HarmBlockThreshold {
+      return self.init(kind: .blockMediumAndAbove)
+    }
 
     /// Content with `.negligible`, `.low`, and `.medium` will be allowed.
-    case blockOnlyHigh = "BLOCK_ONLY_HIGH"
+    public static var blockOnlyHigh: HarmBlockThreshold {
+      return self.init(kind: .blockOnlyHigh)
+    }
 
     /// All content will be allowed.
-    case blockNone = "BLOCK_NONE"
+    public static var blockNone: HarmBlockThreshold {
+      return self.init(kind: .blockNone)
+    }
+
+    /// Turn off the safety filter.
+    public static var off: HarmBlockThreshold {
+      return self.init(kind: .off)
+    }
+
+    let rawValue: String
   }
 
   enum CodingKeys: String, CodingKey {
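For readers unfamiliar with the pattern, the switch from a `public enum` to a struct with static properties keeps enum-like call sites (`.off`, `.blockNone`, and so on) while allowing new thresholds to be added later without a source-breaking change. `EncodableProtoEnum` is internal to the SDK and its requirements are not shown in this diff, so the sketch below uses a hypothetical stand-in protocol purely to illustrate the shape of the pattern.

```swift
// Hypothetical stand-in for the SDK-internal `EncodableProtoEnum`; the real
// protocol's requirements are not shown in this diff.
protocol StringProtoEnum: Encodable {
  var rawValue: String { get }
}

// Sketch of the pattern: known values live in a nested `Kind` enum, public
// call sites use static properties, and the wire value is a plain string.
struct Threshold: StringProtoEnum {
  enum Kind: String {
    case blockNone = "BLOCK_NONE"
    case off = "OFF"
  }

  let rawValue: String

  init(kind: Kind) {
    rawValue = kind.rawValue
  }

  // Encodes as a bare string, matching the proto enum's wire format.
  func encode(to encoder: Encoder) throws {
    var container = encoder.singleValueContainer()
    try container.encode(rawValue)
  }

  /// All content will be allowed.
  static var blockNone: Threshold { Threshold(kind: .blockNone) }

  /// Turn off the safety filter.
  static var off: Threshold { Threshold(kind: .off) }
}
```

A caller still writes `Threshold.off`, but adding a future threshold is just another static property rather than a new enum case that existing exhaustive `switch` statements would have to handle.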
12 changes: 12 additions & 0 deletions FirebaseVertexAI/Tests/Integration/IntegrationTests.swift
@@ -84,6 +84,18 @@ final class IntegrationTests: XCTestCase {

   func testCountTokens_text() async throws {
     let prompt = "Why is the sky blue?"
+    model = vertex.generativeModel(
+      modelName: "gemini-1.5-pro",
+      generationConfig: generationConfig,
+      safetySettings: [
+        SafetySetting(harmCategory: .harassment, threshold: .blockLowAndAbove),
+        SafetySetting(harmCategory: .hateSpeech, threshold: .blockMediumAndAbove),
+        SafetySetting(harmCategory: .sexuallyExplicit, threshold: .blockOnlyHigh),
+        SafetySetting(harmCategory: .dangerousContent, threshold: .blockNone),
+        SafetySetting(harmCategory: .civicIntegrity, threshold: .off),
+      ],
+      systemInstruction: systemInstruction
+    )
 
     let response = try await model.countTokens(prompt)
 
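As a usage-level footnote to the test above, the token count can be read off the value returned by `countTokens(...)`. The `totalTokens` property and the `GenerativeModel` type come from the CHANGELOG entry in this PR; the helper function and prompt below are only a sketch and not part of the diff.

```swift
import FirebaseVertexAI

// Minimal sketch, not part of this diff: the model is assumed to be configured
// with safety settings as in the integration test above.
func logTokenCount(for model: GenerativeModel) async throws {
  let response = try await model.countTokens("Why is the sky blue?")
  // Per the CHANGELOG entry, this count now also reflects systemInstruction,
  // tools and generationConfig, where applicable.
  print("totalTokens: \(response.totalTokens)")
}
```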