Update to latest event hubs client SDK #385
Conversation
…ncellation exceptions after cancellation
# Conflicts: # src/DurableTask.Netherite/DurableTask.Netherite.csproj
This change has been in a pending state for a long time. Is there any target date for this fix to get checked in? I am asking because our service started using Netherite, but if we don't have a timeline for the fix, then to meet compliance requirements we will have to move away from using the Netherite package.
@sebastianburckhardt any ETA for this PR?
@sebastianburckhardt, do you have an estimated timeline to check in this PR?
Tests seem to be running o.k.
@akhleshg, @saguiitay: apologies for the delayed response here. This is definitely on our radar, but its release is scheduled to occur after some breaking changes we're releasing in lower-level dependencies we own, namely DurableTask.Core (we're releasing v3) and the DurableTask Extension (also releasing v3). DurableTask.Core v3 released yesterday: https://github.com/Azure/durabletask/releases/tag/durabletask.core-v3.0.0. At that point, Netherite will upgrade these dependencies to v3, along with the Event Hubs dependency, and increase its own major version to reflect these breaking changes. That said, @akhleshg, @saguiitay: are you using Durable Functions Netherite or DurableTask Netherite? If you're using DurableTask directly, we may be able to expedite this by a few weeks, as that would only require upgrading DurableTask.Core, which is already out.
Thank you, David (@davidmrdavid), for your response and for providing additional details about the changes. We are currently using DurableTask Netherite and require the update to ensure security compliance. I would greatly appreciate it if you could expedite the necessary changes to DurableTask Netherite, as we are eagerly awaiting this update. We do have Durable Functions and plan to update them to use Netherite; however, we can wait for the Durable Functions Netherite update before migrating our service to Netherite.
Hi @davidmrdavid! We are in the early stages of adoption of DurableTask Netherite and ran into #417, so we could make use of that expedited release if possible :) Thank you!
Looking to unblock this, trying to see what we can do
Left some comments. The PartitionProcessor is a bit more complex than I would like (and thus hard to review), but I understand this is not the right time to ask for a big refactoring there. It would be good, though, to review it as a team to see if we can either simplify it or at least help document it.
I think we can be practical and consider the addition of the new SDK as not a breaking change, even though in the strictest sense it is. We'll see; let's discuss internally.
One test we should absolutely do is ensure that data generated by the old EH SDK is compatible with the new EH SDK, and vice versa. This was a problem in the Azure Storage SDK upgrade we're doing in DTFx and the WebJobs extension.
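To make that test concrete, a minimal cross-SDK round-trip probe could look like the sketch below. This is hypothetical test code, not part of this PR: it assumes a disposable test hub identified by connectionString and eventHubName, sends an event with the deprecated Microsoft.Azure.EventHubs client, and reads it back with the new Azure.Messaging.EventHubs consumer.

// Hypothetical compatibility probe (not part of this PR): write an event with
// the old Event Hubs SDK, read it back with the new one, and verify that the
// payload survives.
using System;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs.Consumer;
using OldSdk = Microsoft.Azure.EventHubs; // the deprecated package

static class EventHubSdkCompatProbe
{
    public static async Task VerifyOldToNewAsync(string connectionString, string eventHubName)
    {
        // send with the old SDK (which expects the entity path in the connection string)
        var oldClient = OldSdk.EventHubClient.CreateFromConnectionString(
            $"{connectionString};EntityPath={eventHubName}");
        await oldClient.SendAsync(new OldSdk.EventData(Encoding.UTF8.GetBytes("compat-probe")));
        await oldClient.CloseAsync();

        // read with the new SDK; ReadEventsAsync starts at the earliest event by default
        await using var consumer = new EventHubConsumerClient(
            EventHubConsumerClient.DefaultConsumerGroupName, connectionString, eventHubName);
        using var timeout = new CancellationTokenSource(TimeSpan.FromSeconds(30));
        try
        {
            await foreach (PartitionEvent pe in consumer.ReadEventsAsync(timeout.Token))
            {
                if (pe.Data.EventBody.ToString() == "compat-probe")
                {
                    return; // the old-SDK event is readable by the new SDK
                }
            }
        }
        catch (OperationCanceledException)
        {
            // nothing matched within the timeout; fall through to the failure below
        }
        throw new InvalidOperationException("old-SDK event was not readable by the new SDK");
    }
}

The reverse direction (write with the new SDK, read back with the old receiver) would mirror this shape.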
Review threads (since resolved) were opened on:
src/DurableTask.Netherite/TransportLayer/EventHubs/EventHubsPartitionManager.cs
src/DurableTask.Netherite/TransportLayer/EventHubs/EventHubsTransport.cs
src/DurableTask.Netherite/TransportLayer/EventHubs/PartitionProcessor.cs
One of these threads concerned the following snippet:
// dispose the lock, after quickly acquiring and releasing it to make sure no one is still holding it
(await this.deliveryLock.LockAsync()).Dispose();
this.deliveryLock.Dispose();

this.traceHelper.LogInformation("EventHubsProcessor {eventHubName}/{eventHubPartition} is shut down", this.eventHubName, this.eventHubPartition);
this is a bit strange. Can you please clarify, in the code, why this is needed?
Or ideally, can the code be refactored so we don't need this trick? I understand if not, it's just a bit odd
It's actually quite tricky. I will simplify this by modifying the AsyncLock itself so it supports a safe method of disposing.
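For illustration, here is a minimal sketch of that direction, assuming a SemaphoreSlim-based lock; the names and structure are hypothetical, not the actual Netherite AsyncLock. The idea is that disposal itself acquires the semaphore one last time, so the acquire-then-dispose trick from the snippet above moves inside the lock instead of living at the call site.

// Hypothetical sketch, not the actual Netherite AsyncLock: a lock whose
// disposal first acquires the underlying semaphore, guaranteeing that no
// holder is still inside the lock when the semaphore itself is disposed.
using System;
using System.Threading;
using System.Threading.Tasks;

class AsyncLock : IAsyncDisposable
{
    readonly SemaphoreSlim semaphore = new SemaphoreSlim(1, 1);

    public async ValueTask<Releaser> LockAsync()
    {
        await this.semaphore.WaitAsync();
        return new Releaser(this.semaphore);
    }

    public async ValueTask DisposeAsync()
    {
        // wait until any in-flight holder has released the lock,
        // then dispose without ever releasing again
        await this.semaphore.WaitAsync();
        this.semaphore.Dispose();
    }

    public readonly struct Releaser : IDisposable
    {
        readonly SemaphoreSlim semaphore;
        internal Releaser(SemaphoreSlim semaphore) => this.semaphore = semaphore;
        public void Dispose() => this.semaphore.Release();
    }
}

One caveat of this sketch: a LockAsync call that races with disposal will fault on the disposed semaphore, so shutdown still has to guarantee that no new acquisitions start once DisposeAsync has been called.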
Co-authored-by: David Justo <david.justo.1996@gmail.com>
Co-authored-by: David Justo <david.justo.1996@gmail.com>
…letask-netherite into pr/update-eh-sdk
…ve timeouts with nontrivial duration
…tate analysis of hangs in CI
Almost done, just need another pass over the PartitionProcessor before approving; all other files have been reviewed
LGTM. Before releasing, let's please rev the major version of Netherite in all csprojs.
I do want to emphasize again that I think refactoring our PartitionProcessor would be good; it's a tricky piece of code to review. FYI @AnatoliB, I think this file would be a good place to get the eng side to contribute and help document + refactor it to make it easier to follow. I think it would pay dividends.
Anyway, I've gone over all files; looks good to me.
Before releasing, let's also make sure there are no compatibility issues with the previous EH SDK, to avoid issues like the ones we had with the Azure Storage SDK upgrade. We can help with this.
Update to the latest Event Hubs SDK, to remove our dependency on deprecated packages.
This was not straightforward: there were lots of API surface changes, and some minor functionality changes as a result. In the end, though, the solution seems reasonably similar, and better in some ways.
For motivation and discussion, see #310.
Fixes #310.
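As a rough illustration of the kind of API surface change involved (a hedged sketch with placeholder values, not code from this PR), receiving a batch from a specific partition moves from the deprecated Microsoft.Azure.EventHubs client to the new SDK's PartitionReceiver primitive:

// Sketch only; connectionString, eventHubName, partitionId, seq, and
// maxBatchSize are placeholders, not values from this PR.
using System;
using System.Threading;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Messaging.EventHubs.Primitives;

string connectionString = "<event-hubs-connection-string>";
string eventHubName = "<hub>";
string partitionId = "0";
long seq = 0;
int maxBatchSize = 100;

// Old SDK (Microsoft.Azure.EventHubs, deprecated):
//   var client = EventHubClient.CreateFromConnectionString($"{connectionString};EntityPath={eventHubName}");
//   var receiver = client.CreateReceiver("$Default", partitionId, EventPosition.FromSequenceNumber(seq));
//   var batch = await receiver.ReceiveAsync(maxBatchSize);

// New SDK (Azure.Messaging.EventHubs): an explicit receiver primitive that
// takes a wait time and cancellation token on every receive call.
await using var receiver = new PartitionReceiver(
    EventHubConsumerClient.DefaultConsumerGroupName,
    partitionId,
    EventPosition.FromSequenceNumber(seq),
    connectionString,
    eventHubName);

var batch = await receiver.ReceiveBatchAsync(
    maxBatchSize, TimeSpan.FromSeconds(10), CancellationToken.None);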