Fix deadlock in messager and health streamer #17230
Conversation
Signed-off-by: Manan Gupta <manan@planetscale.com>
Review Checklist

Hello reviewers! 👋 Please follow this checklist when reviewing this Pull Request.

- General
- Tests
- Documentation
- New flags
- If a workflow is added or modified
- Backward compatibility
Codecov Report

Attention: Patch coverage is

@@ Coverage Diff @@
##             main   #17230   +/-   ##
=======================================
  Coverage   67.39%   67.39%
=======================================
  Files        1570     1570
  Lines      252903   252912    +9
=======================================
+ Hits       170438   170461   +23
+ Misses      82465    82451   -14

☔ View full report in Codecov by Sentry.
Thanks, @GuptaManan100 ! ❤️
// TestDeadlockBwCloseAndSchemaChange tests the deadlock observed between the Close and schemaChanged
// functions. More details can be found in the issue https://github.com/vitessio/vitess/issues/17229.
func TestDeadlockBwCloseAndSchemaChange(t *testing.T) {
I ran this on main in a loop while reviewing the PR. I didn't get any failures:
while true; do
go test -v -count 1 -timeout 5s -run ^TestDeadlockBwCloseAndSchemaChange$ vitess.io/vitess/go/vt/vttablet/tabletserver/messager
if [[ $? -ne 0 ]]; then
say "got one"
break
fi
done
But we already know that it's not easy to trigger and we can logically see how it can happen. Just noting that the test doesn't seem to be as useful as one would hope.
This tracks. I commented on the issue that we haven't seen this in practice, at least that I'm aware of.
#17229 (comment)
Really!? It fails for me:
while true; do
go test -v -count 1 -timeout 5s -run ^TestDeadlockBwCloseAndSchemaChange$ vitess.io/vitess/go/vt/vttablet/tabletserver/messager
if [[ $? -ne 0 ]]; then
say "got one"
break
fi
done
=== RUN TestDeadlockBwCloseAndSchemaChange
--- PASS: TestDeadlockBwCloseAndSchemaChange (0.00s)
PASS
ok vitess.io/vitess/go/vt/vttablet/tabletserver/messager 0.579s
=== RUN TestDeadlockBwCloseAndSchemaChange
panic: test timed out after 5s
running tests:
TestDeadlockBwCloseAndSchemaChange (5s)
goroutine 66 [running]:
testing.(*M).startAlarm.func1()
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/testing/testing.go:2373 +0x30c
created by time.goFunc
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/time/sleep.go:215 +0x38
goroutine 1 [chan receive]:
testing.(*T).Run(0x140001be820, {0x1012cb3fe?, 0x1400059fb48?}, 0x101ab0970)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/testing/testing.go:1751 +0x328
testing.runTests.func1(0x140001be820)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/testing/testing.go:2168 +0x40
testing.tRunner(0x140001be820, 0x1400059fc68)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/testing/testing.go:1690 +0xe4
testing.runTests(0x140001c7d58, {0x1024c7e00, 0x19, 0x19}, {0xfc00000000000000?, 0xfcaacd9be1871533?, 0x102a6b640?})
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/testing/testing.go:2166 +0x3ac
testing.(*M).Run(0x14000342f00)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/testing/testing.go:2034 +0x588
main.main()
_testmain.go:93 +0x90
goroutine 35 [select]:
github.com/golang/glog.(*fileSink).flushDaemon(0x102a6b6d8)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/github.com/golang/glog@v1.2.3/glog_file.go:352 +0xa4
created by github.com/golang/glog.init.1 in goroutine 1
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/github.com/golang/glog@v1.2.3/glog_file.go:167 +0x1c8
goroutine 10 [syscall]:
os/signal.signal_recv()
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/runtime/sigqueue.go:149 +0x2c
os/signal.loop()
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/os/signal/signal_unix.go:23 +0x1c
created by os/signal.Notify.func1.1 in goroutine 1
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/os/signal/signal.go:151 +0x28
goroutine 11 [chan receive]:
vitess.io/vitess/go/vt/dbconfigs.init.0.func1()
/Users/manangupta/vitess/go/vt/dbconfigs/credentials.go:90 +0x34
created by vitess.io/vitess/go/vt/dbconfigs.init.0 in goroutine 1
/Users/manangupta/vitess/go/vt/dbconfigs/credentials.go:89 +0x164
goroutine 12 [semacquire]:
sync.runtime_Semacquire(0x14000663340?)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/runtime/sema.go:71 +0x2c
sync.(*WaitGroup).Wait(0x140003b47d0)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/sync/waitgroup.go:118 +0x74
vitess.io/vitess/go/vt/vttablet/tabletserver/messager.TestDeadlockBwCloseAndSchemaChange(0x140001be9c0?)
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/messager/engine_test.go:201 +0x110
testing.tRunner(0x140001be9c0, 0x101ab0970)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/testing/testing.go:1690 +0xe4
created by testing.(*T).Run in goroutine 1
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/testing/testing.go:1743 +0x314
goroutine 13 [select]:
vitess.io/vitess/go/stats.(*Rates).track(0x14000465100)
/Users/manangupta/vitess/go/stats/rates.go:110 +0xac
created by vitess.io/vitess/go/stats.NewRates in goroutine 12
/Users/manangupta/vitess/go/stats/rates.go:97 +0x240
goroutine 14 [select]:
vitess.io/vitess/go/stats.(*Rates).track(0x14000465280)
/Users/manangupta/vitess/go/stats/rates.go:110 +0xac
created by vitess.io/vitess/go/stats.NewRates in goroutine 12
/Users/manangupta/vitess/go/stats/rates.go:97 +0x240
goroutine 15 [sync.Mutex.Lock]:
sync.runtime_SemacquireMutex(0x2?, 0x0?, 0x1012d7c7b?)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/runtime/sema.go:95 +0x28
sync.(*Mutex).lockSlow(0x14000410048)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/sync/mutex.go:173 +0x174
sync.(*Mutex).Lock(...)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/sync/mutex.go:92
vitess.io/vitess/go/vt/vttablet/tabletserver/schema.(*Engine).UnregisterNotifier(0x14000410000, {0x10129dacb, 0x8})
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/schema/engine.go:846 +0xc0
vitess.io/vitess/go/vt/vttablet/tabletserver/messager.(*Engine).Close(0x14000061c20)
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/messager/engine.go:108 +0x144
vitess.io/vitess/go/vt/vttablet/tabletserver/messager.TestDeadlockBwCloseAndSchemaChange.func1()
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/messager/engine_test.go:188 +0x5c
created by vitess.io/vitess/go/vt/vttablet/tabletserver/messager.TestDeadlockBwCloseAndSchemaChange in goroutine 12
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/messager/engine_test.go:185 +0xb8
goroutine 16 [sync.Mutex.Lock]:
sync.runtime_SemacquireMutex(0x1298a9bf0?, 0x50?, 0x1298a9bf0?)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/runtime/sema.go:95 +0x28
sync.(*Mutex).lockSlow(0x14000061c20)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/sync/mutex.go:173 +0x174
sync.(*Mutex).Lock(...)
/Users/manangupta/.gvm/pkgsets/go1.23.0/global/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.3.darwin-arm64/src/sync/mutex.go:92
vitess.io/vitess/go/vt/vttablet/tabletserver/messager.(*Engine).schemaChanged(0x14000061c20, 0x140001b7dc8?, {0x0, 0x0, 0x0}, {0x0, 0x0, 0x1012570f4?}, {0x0, 0x0, ...}, ...)
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/messager/engine.go:152 +0xa8
vitess.io/vitess/go/vt/vttablet/tabletserver/schema.(*Engine).broadcast(0x14000410000, {0x0, 0x0, 0x0}, {0x0, 0x0, 0x0}, {0x0, 0x0, 0x0}, ...)
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/schema/engine.go:864 +0x264
vitess.io/vitess/go/vt/vttablet/tabletserver/schema.(*Engine).BroadcastForTesting(0x0?, {0x0?, 0x0?, 0x0?}, {0x0?, 0x0?, 0x0?}, {0x0?, 0x0?, 0x0?}, ...)
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/schema/engine.go:872 +0x114
vitess.io/vitess/go/vt/vttablet/tabletserver/messager.TestDeadlockBwCloseAndSchemaChange.func2()
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/messager/engine_test.go:196 +0x88
created by vitess.io/vitess/go/vt/vttablet/tabletserver/messager.TestDeadlockBwCloseAndSchemaChange in goroutine 12
/Users/manangupta/vitess/go/vt/vttablet/tabletserver/messager/engine_test.go:193 +0x108
FAIL vitess.io/vitess/go/vt/vttablet/tabletserver/messager 5.648s
FAIL
I made sure the test was failing before I added any fixes. After the changes, the test doesn't fail across 1000 or so runs 😄
You can try running this on the e525c1a169 commit hash. In this PR I've added the test in one commit, before fixing the issue in the next. So it should be reproducible for you too.
@@ -76,15 +83,12 @@ func NewEngine(tsv TabletService, se *schema.Engine, vs VStreamer) *Engine {
// Open starts the Engine service.
func (me *Engine) Open() {
	me.mu.Lock()
	defer me.mu.Unlock()
👍
me.managersMu.Lock()
defer me.managersMu.Unlock()
👍
For some historical context, when the original code was written,
Signed-off-by: Manan Gupta <manan@planetscale.com>
Signed-off-by: Renan Rangel <rrangel@slack-corp.com>
Description
This PR fixes the deadlocks in the messager and health streamer, as pointed out in #17229.

The fix for both deadlocks was to use two different mutexes: one that only guards against concurrent calls to setup functions like Open and Close, and another that actually protects the fields. This removes the deadlock because the function that runs on a broadcast from the schema engine now acquires a different mutex than the one held by Close.

Related Issue(s)
Checklist
Deployment Notes