
Swift 6 Complete Concurrency Checking #85

Draft · wants to merge 8 commits into base: main

Conversation

@jdisho (Collaborator) commented Dec 20, 2024

Addressing: #54

Swift 6 - Complete Concurrency Checking

♻️ Current situation & Problem

Xcode is emitting concurrency warnings for the SpeziLLM, SpeziLLMFog, SpeziLLMLocal, SpeziLLMLocalDownload, and SpeziLLMLocalOpenAI targets because these targets do not yet satisfy Swift 6's complete concurrency checking.

⚙️ Release Notes

📚 Documentation

No additional changes.

✅ Testing

Warnings will be resolved after #64 is merged.

📝 Code of Conduct & Contributing Guidelines

By creating this pull request, you agree to follow our Code of Conduct and Contributing Guidelines.


codecov bot commented Dec 20, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 38.57%. Comparing base (26b1e07) to head (ec16833).

Additional details and impacted files


@@           Coverage Diff           @@
##             main      #85   +/-   ##
=======================================
  Coverage   38.57%   38.57%           
=======================================
  Files          64       64           
  Lines        2331     2331           
=======================================
  Hits          899      899           
  Misses       1432     1432           
Files with missing lines Coverage Δ
Sources/SpeziLLM/Helpers/LLMContext+Chat.swift 73.69% <ø> (ø)
Sources/SpeziLLM/LLMSessionProvider.swift 87.50% <100.00%> (ø)
Sources/SpeziLLM/Models/LLMContextEntity.swift 35.00% <ø> (ø)

Continue to review full report in Codecov by Sentry.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 26b1e07...ec16833. Read the comment docs.

@philippzagar added the enhancement (New feature or request) label Dec 20, 2024
@philippzagar self-requested a review December 20, 2024 20:36
@philippzagar marked this pull request as draft December 20, 2024 20:37
@philippzagar (Member) left a comment


Thanks @jdisho for the PR, great work! 🚀
As you mentioned, it makes sense to wait for the SpeziLLM OpenAI layer to be revamped, but you can already start tackling the local layer in the meantime.

Also, as mentioned, SpeziSpeech and SpeziChat need to be lifted to Swift 6 language mode as well at some point; let me know if you're interested in that!
Edit: Tackled in StanfordSpezi/SpeziSpeech#10 and StanfordSpezi/SpeziChat#20

Package.swift Outdated
Comment on lines 13 to 18
#if swift(<6)
let swiftConcurrency: SwiftSetting = .enableExperimentalFeature("StrictConcurrency")
#else
let swiftConcurrency: SwiftSetting = .enableUpcomingFeature("StrictConcurrency")
#endif

Member

No need to do that if you just specify // swift-tools-version:6.0 at the top of the file.
See https://github.com/StanfordSpezi/Spezi/blob/main/Package.swift
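For illustration, a rough sketch of what the suggested manifest header could look like; the package name, platforms, and target below are placeholders and not this repository's actual Package.swift:

// swift-tools-version:6.0
// With tools version 6.0, targets build in the Swift 6 language mode by default,
// so complete concurrency checking applies without any conditional SwiftSetting.

import PackageDescription

let package = Package(
    name: "ExamplePackage",                    // placeholder name
    platforms: [.iOS(.v17), .macOS(.v14)],     // placeholder platforms
    targets: [
        .target(name: "ExampleTarget")
    ]
)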

Resolved review threads: Package.swift (outdated), Sources/SpeziLLM/Helpers/LLMContext+Chat.swift (outdated), Sources/SpeziLLM/LLMSessionProvider.swift
@jdisho (Collaborator, Author) commented Dec 21, 2024

@philippzagar - Thanks for the input! This is great 🚀
I'll take care of these two modules.

@jdisho (Collaborator, Author) commented Dec 21, 2024

Ah, I just saw that you opened the PRs already.

@philippzagar changed the title from "[WIP] Swift6 - Complete Concurrency Checking" to "Swift 6 Complete Concurrency Checking" Dec 21, 2024
}

@MainActor
private func downloadWithHub() async throws {
    let repo = Hub.Repo(id: model.hubID)
    let modelFiles = ["*.safetensors", "config.json"]

    try await HubApi.shared.snapshot(from: repo, matching: modelFiles) { progress in
@jdisho (Collaborator, Author)

This is forcing me to make the whole class @unchecked Sendable, because we can't pass progress safely without potentially causing data races. Do you have another idea?
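For context, a rough, self-contained sketch of the workaround being described here; the type and member names below are hypothetical and are not the actual SpeziLLM download manager:

// Conforming the manager to `@unchecked Sendable` silences the diagnostic, but
// the compiler no longer verifies that the mutable state is protected against
// data races — the burden of proof moves back to the developer.
final class UncheckedDownloadManagerSketch: @unchecked Sendable {
    enum State {
        case idle
        case downloading(fractionCompleted: Double)
    }

    // Mutable state without any locking; `@unchecked` is what keeps the
    // compiler from complaining about it.
    private(set) var state: State = .idle

    func handleProgress(_ fraction: Double) {
        // Capturing `self` in the actor-hopping Task is what forces the
        // Sendable requirement in the first place.
        Task { @MainActor in
            self.state = .downloading(fractionCompleted: fraction)
        }
    }
}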

Member

Yeah, we don't want to mark the manager Sendable because it isn't.
The issue is the double capture of self: once within the snapshot() callback and once within the Task.

Did you try something like (rough sketch)

try await HubApi.shared.snapshot(from: repo, matching: modelFiles) { progress in
        Task { @MainActor [mutate = self.mutate] in
            mutate(progress)
        }
    }
}

@MainActor private func mutate(progress: Progress) {
      self.state = .downloading(progress: progress)
}

Member

That could work; I agree that marking the complete class as Sendable without any manual locking implementation isn't great, so I would aim to avoid it if we can, or mark everything with @MainActor (which is also not great), which could basically also guarantee that.

Member

As you mentioned, marking everything as @MainActor would be the simple solution here. However, we should then double-check that the long-running operations in the download manager (e.g. the download of the model itself, copying the model from the temporary directory to a more permanent one, etc.) are still performed off the main actor.
That said, I'm not sure this is really necessary if one accepts some ugly workarounds (as shown above).
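To make the trade-off concrete, here is a rough, self-contained sketch of the "mark everything @MainActor" alternative; the type, members, and the stand-in download function are made up for illustration and are not the actual SpeziLLM code:

@MainActor
final class MainActorDownloadManagerSketch {
    enum State {
        case idle
        case downloading(fractionCompleted: Double)
        case finished
    }

    private(set) var state: State = .idle

    func download() async throws {
        // Awaiting a nonisolated async function suspends the main actor, so the
        // long-running work itself still runs on the cooperative thread pool and
        // does not block the main thread.
        try await Self.performLongRunningDownload { fraction in
            // Hop back onto the main actor only to mutate observable state.
            Task { @MainActor [weak self] in
                self?.state = .downloading(fractionCompleted: fraction)
            }
        }
        state = .finished
    }

    // Stand-in for the actual Hub snapshot / file-copy work.
    private nonisolated static func performLongRunningDownload(
        onProgress: @escaping @Sendable (Double) -> Void
    ) async throws {
        for step in 1...10 {
            try await Task.sleep(nanoseconds: 50_000_000)
            onProgress(Double(step) / 10)
        }
    }
}

Whether this is preferable to the targeted workaround sketched earlier mostly comes down to how much of the manager's work genuinely needs to stay off the main actor.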

@jdisho requested a review from philippzagar December 22, 2024 00:56
@PSchmiedmayer linked an issue Jan 20, 2025 that may be closed by this pull request
@PSchmiedmayer (Member) left a comment


Thank you for continuing to work on this @jdisho & @philippzagar! 🚀

@@ -61,7 +54,7 @@ extension LLMFogSession {
             }
         } catch let error as APIErrorResponse {
             // Sadly, there's no better way to check the error messages as there aren't any Ollama error codes as with the OpenAI API
-            if error.error.message.contains(Self.modelNotFoundRegex) {
+            if error.error.message.range(of: Self.modelNotFoundRegex, options: .regularExpression) != nil {
Member

Nice simplification!
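For readers skimming the diff, a small standalone illustration of the behavioral difference; the pattern below is a made-up stand-in for Self.modelNotFoundRegex:

import Foundation

let message = "model 'llama3:latest' not found, try pulling it first"
let pattern = "model '.*' not found"   // hypothetical stand-in pattern

// String.contains performs a literal substring match, so the regex pattern text
// itself is not found in the error message.
print(message.contains(pattern))
// false

// range(of:options:) with .regularExpression interprets the pattern as a regex
// and therefore matches.
print(message.range(of: pattern, options: .regularExpression) != nil)
// true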

@@ -35,7 +34,7 @@ let package = Package(
         .package(url: "https://github.com/StanfordSpezi/SpeziFoundation", from: "2.0.0"),
         .package(url: "https://github.com/StanfordSpezi/SpeziStorage", from: "1.0.2"),
         .package(url: "https://github.com/StanfordSpezi/SpeziOnboarding", from: "1.1.1"),
-        .package(url: "https://github.com/StanfordSpezi/SpeziChat", .upToNextMinor(from: "0.2.1")),
+        .package(url: "https://github.com/StanfordSpezi/SpeziChat", .upToNextMinor(from: "0.2.2")),
         .package(url: "https://github.com/StanfordSpezi/SpeziViews", from: "1.3.1")
Member

We should try to unify this across all packages; some of them use .git at the end, some of them don't. I would suggest adding .git for all of them for now.
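For illustration, the unified style would look roughly like the following manifest excerpt (URLs and versions taken from the diff above; this is not the full dependency list):

dependencies: [
    .package(url: "https://github.com/StanfordSpezi/SpeziChat.git", .upToNextMinor(from: "0.2.2")),
    .package(url: "https://github.com/StanfordSpezi/SpeziViews.git", from: "1.3.1")
],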

Labels: enhancement (New feature or request)
Project status: In Progress
Linked issues that may be closed by this pull request: Complete Concurrency Checking
3 participants