
Test merging IL_Conformance #80597

Merged 8 commits into dotnet:main on Feb 1, 2023

Conversation

BrianBohe
Member

@BrianBohe BrianBohe commented Jan 13, 2023

This is a first draft of what would be repeated over each leading subdirectory in src/tests/JIT. I followed the same steps @markples is doing using @trylek's ILTransform tool; this is my version of the tool. It is easier to review each commit separately. Each commit name is the command used with its parameters.

I am able to build the tests locally, but they are not placed under artifacts/tests/coreclr/<os>.<arch>.<build_type>/, so I am probably missing something. I am opening this PR to see whether some pipeline runs them.

I guess these tests are run, because I can see some being excluded, but I cannot find a *.proj file referencing the ilproj files inside IL_Conformance as I saw on @trylek's PR. Do you know, @trylek, how these tests are run? This is the only reference to IL_Conformance I can find, and it seems similar to the existing references to specific Methodical tests, so I guess we shouldn't remove it. Any opinions on that?

@BrianBohe
Member Author

CC @markples @trylek

@ghost

ghost commented Jan 13, 2023

Tagging subscribers to this area: @dotnet/area-system-reflection-metadata
See info in area-owners.md if you want to be subscribed.


@markples
Contributor

I'm pretty sure that a missing -priority 1 is the issue with your local testing so far. All csproj under src\tests will be included automatically (common\dirs.proj), but priority 1 tests will be filtered out if that isn't specified. I'm not sure how il_conformance is executed in the test jobs, but I see results for it and am guessing that there is some sort of default handling.

Your test runs have enough to indicate at least one fix - check out the logs. The ckfinite issue is that it is a keyword; it should be easy to add a _ to keywords in ILTransform (just add them on demand as breaks are found).
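The keyword collision described here (a test named after an ILAsm keyword such as ckfinite breaking the generated IL) can be handled by a rename pass. The following is a hypothetical Python sketch of that idea, not ILTransform's actual code; the keyword set is a small, partial sample.

```python
# Illustrative sketch: rename identifiers that collide with ILAsm
# keywords by appending '_'. The keyword set below is partial and
# illustrative; ILTransform's real list and logic may differ.
IL_KEYWORDS = {"add", "sub", "mul", "ret", "call", "ckfinite"}

def sanitize_identifier(name: str) -> str:
    """Append '_' when the name would clash with an ILAsm keyword."""
    return name + "_" if name.lower() in IL_KEYWORDS else name

print(sanitize_identifier("ckfinite"))  # ckfinite_
print(sanitize_identifier("my_test"))   # my_test
```

Adding keywords on demand, as the comment suggests, just means growing the set whenever a new break is found.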

@BrianBohe
Member Author

> I'm pretty sure that a missing -priority 1 is the issue with your local testing so far. All csproj under src\tests will be included automatically (common\dirs.proj), but priority 1 tests will be filtered out if that isn't specified. I'm not sure how il_conformance is executed in the test jobs, but I see results for it and am guessing that there is some sort of default handling.

Thanks @markples, you are right: -priority=1 is needed, along with the tree switch (not dir).

> Your test runs have enough to indicate at least one fix - check out the logs. The ckfinite issue is that it is a keyword.. it should be easy to add a _ to keywords in iltransform (just add on-demand as breaks are found).

Thanks, I fixed those. The remaining errors seem unrelated to this PR. Can I schedule the priority 1 tests with these failures?

@BrianBohe
Member Author

/azp run outerloop

@azure-pipelines

No pipelines are associated with this pull request.

@markples
Contributor

> Thanks, I fixed those. The remaining errors seem unrelated to this PR. Can I schedule the priority 1 tests with these failures?

I assume you mean test errors/timeouts on some platforms, in which case you'll be fine with outerloop. It should just be the obvious dependencies that would block it (if you can't build the runtime or can't build the tests, then you can't run them).

@BrianBohe
Member Author

/azp list

@BrianBohe
Member Author

/azp run runtime-coreclr outerloop

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

Member

@trylek trylek left a comment


This looks very promising, thanks Brian! I cannot wait for it to get merged in and analyze the CI perf results.

@@ -17,12 +17,12 @@
 //-----------------------------------------------------


-.assembly 'add'{
+.assembly 'add_'{
Contributor


I think something may have gone wrong in the dedup renaming. I don't think this and others need to be renamed at the project/assembly level, which will add a lot of churn to the test source history. Only the namespace should need the rewrite.

(old\base\add.il is the only add.il, but old\base\add.il, old\conformance_base\add_i.il, and others all have namespace 'add')
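Restricting the dedup rename to the namespace, as suggested above, amounts to a targeted rewrite of the .namespace directive only, leaving .assembly names and project files untouched. A hedged Python sketch; the regex and quoting convention are assumptions inferred from the diff above, not ILTransform's actual code:

```python
import re

def rename_namespace(il_source: str, old: str, new: str) -> str:
    """Rewrite only .namespace directives; .assembly names stay intact."""
    pattern = re.compile(r"(\.namespace\s+')" + re.escape(old) + r"(')")
    return pattern.sub(lambda m: m.group(1) + new + m.group(2), il_source)

source = ".assembly 'add'{\n.namespace 'add'\n"
print(rename_namespace(source, "add", "add_"))
# .assembly 'add'{
# .namespace 'add_'
```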

Member


IIRC the only limitations are that the assembly name must match the project name and that assembly names must be unique within a single merged test wrapper; otherwise we cannot generate functional C# code calling their entrypoints.

In the former case (assembly vs. project name mismatch), the Roslyn generator at some point thinks that the project (dll) name is the assembly name and uses that for module qualification of the method call, which subsequently fails at runtime because the runtime compares it with the logical assembly name.

In the latter case (assembly names unique within a test wrapper), this is due to the fact that all the test assemblies are ultimately copied to the merged test wrapper output folder, so same-named assemblies would stomp on each other; this is "silent" in the sense that nothing fails, but you'll see a lower number of tests compared to the non-merged version. A lower number of tests executed by src\tests\run can also indicate premature termination of the merged test wrapper, which happened to me for several JIT/Methodical tests that were using Environment.Exit to report success.
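The second constraint (assembly names must be unique within a merged wrapper, because all test assemblies land in one output folder) can be checked mechanically. A hypothetical sketch, assuming the assembly name matches the project file's base name as described above; the paths are made up for illustration:

```python
from collections import Counter
from pathlib import PurePosixPath

def find_stomping_assemblies(project_paths):
    """Report assembly (project) names that appear more than once and
    would overwrite each other in the merged wrapper's output folder."""
    names = Counter(PurePosixPath(p).stem for p in project_paths)
    return sorted(name for name, count in names.items() if count > 1)

projects = [
    "Old/Base/add/add.ilproj",                # illustrative paths
    "Old/Conformance_Base/add/add.ilproj",
    "Old/Conformance_Base/add_i/add_i.ilproj",
]
print(find_stomping_assemblies(projects))  # ['add']
```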

Member


This is the most annoying for silly recursive tests that need to exit early; in a handful of cases I had to do non-trivial rewrites for this reason. IIRC I typically introduced a static variable representing the exit code, changed the code calling Environment.Exit to set that instead, patched the recursive descents to check the variable prior to the call, and changed the main method to use the variable for the final return value. There were fewer than half a dozen cases like these, so I guess each of them is unique; feel free to devise your own method to overcome the Environment.Exit problem.
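The rewrite pattern described above (replace Environment.Exit with a static exit-code variable that the recursion checks and the main method returns) transliterates to this hedged Python sketch; the names and the depth-5 recursion are illustrative, not taken from any actual test:

```python
# Before the rewrite, the success path would call sys.exit(100)
# (Environment.Exit in C#), which kills a merged test wrapper.
# After, the status is recorded and the recursion unwinds normally.
exit_code = None  # static variable standing in for the exit code

def recurse(depth: int) -> None:
    global exit_code
    if exit_code is not None:  # check prior to descending further
        return
    if depth == 0:
        exit_code = 100        # was: Environment.Exit(100)
        return
    recurse(depth - 1)

def main() -> int:
    recurse(5)
    # the recorded status becomes the final return value
    return exit_code if exit_code is not None else -1

print(main())  # 100
```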

Member Author


You are right, I was matching namespaces with the .il file names, but there is no need to.

Contributor


I'm pretty sure that there is something required at the namespace level (or more specifically that type names need to be unique), but I'm not sure exactly what the restriction is. My original understanding (I think from ILTransform code) was that this applied to the type containing the test entrypoint (perhaps due to invoking it without assembly scoping?). However, I then hit build errors for types that weren't the container for test entrypoints, so I adjusted ILTransform to look at all of the type names.

It is quite possible that I misinterpreted what was going on.

That said, deduplicating the types will be necessary if we try to combine multiple tests into one C# compiler invocation with many files (as a future consolidation step to save time). Perhaps this could be relevant on the IL side though the tool invocation might be trickier.

We have ILTransform scanning for Environment.Exit, adding RequiresProcessIsolation for them, and reporting them in the log. I think we've been pretty lucky that they are rare / non-existent across jit\regression / il_conformance, but we will likely hit this again.
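The scan mentioned above, finding tests that call Environment.Exit so they can be flagged RequiresProcessIsolation, amounts to a simple source search. A hypothetical sketch, not ILTransform's actual code; the match patterns are assumptions covering both the C# call and its IL form:

```python
def needs_process_isolation(source: str) -> bool:
    """True if the test source appears to call Environment.Exit, in
    either C# ("Environment.Exit(") or IL ("System.Environment::Exit")
    form, and so should be marked RequiresProcessIsolation."""
    return ("Environment.Exit(" in source
            or "System.Environment::Exit" in source)

print(needs_process_isolation(
    "call void [mscorlib]System.Environment::Exit(int32)"))  # True
print(needs_process_isolation("ldc.i4 100\nret"))            # False
```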

Member


Sounds great. IIRC @jkoritzinsky was looking into that at some point; there's some hacky, partially supported form of syntax that lets you address the same type name in multiple assemblies via some obscure form of qualification. However, @jkotas suggested it would be better to clean these cases up, so that's what I did. Typically, in the Loader/classloader/TypeGeneratorTests, all tests were using the same namespace and class name, i.e. the fully qualified entrypoint name was the same for all of them; I believe that was the point of this particular deduplication.

@BrianBohe
Member Author

BrianBohe commented Jan 18, 2023

Running .\src\tests\build.cmd tree JIT\IL_Conformance -priority=1; .\src\tests\run.cmd in PowerShell.
This is the output I get on a clean build of main:

MSBuild version 17.5.0-preview-23056-01+d89d4bc91 for .NET
  "C:\runtimes\runtime2\\dotnet.cmd" --roll-forward latestmajor C:\runtimes\runtime2\artifacts\tests\coreclr\windows.x64.Debug\Tests\Core_Root\xunit\xunit.
  console.dll .\JIT\IL_Conformance\JIT.IL_Conformance.XUnitWrapper.dll -parallel collections -html C:\runtimes\runtime2\artifacts\log\TestRun.html -xml C:\
  runtimes\runtime2\artifacts\log\TestRun.xml  -notrait category=outerloop -notrait category=failing -nocolor
  C:\runtimes\runtime2\.dotnet
  Microsoft.DotNet.XUnitConsoleRunner v2.5.0 (64-bit .NET 8.0.0-alpha.1.23058.2)
    Discovering: JIT.IL_Conformance.XUnitWrapper (method display = ClassAndMethod, method display options = None)
    Discovered:  JIT.IL_Conformance.XUnitWrapper (found 405 test cases)
    Starting:    JIT.IL_Conformance.XUnitWrapper (parallel test collections = on, max threads = 20)
    Finished:    JIT.IL_Conformance.XUnitWrapper
  === TEST EXECUTION SUMMARY ===
     JIT.IL_Conformance.XUnitWrapper  Total: 405, Errors: 0, Failed: 0, Skipped: 0, Time: 188.654s

Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:03:10.80
Test run finished.
Parsing test results from (C:\runtimes\runtime2\artifacts\log\TestRunResults_windows_x64_Debug)
Analyzing C:\runtimes\runtime2\artifacts\log\TestRun.xml
Time [secs] | Total | Passed | Failed | Skipped | Assembly Execution Summary
============================================================================
    188.654 |   405 |    405 |      0 |       0 | JIT.IL_Conformance.XUnitWrapper.dll
----------------------------------------------------------------------------
    188.654 |   405 |    405 |      0 |       0 | (total)

This is what I get from this PR without the wrapper merge commit:

MSBuild version 17.5.0-preview-23056-01+d89d4bc91 for .NET
  "C:\runtimes\runtime\\dotnet.cmd" --roll-forward latestmajor C:\runtimes\runtime\artifacts\tests\coreclr\windows.x64.Debug\Tests\Core_Root\xunit\xunit.co
  nsole.dll .\JIT\IL_Conformance\JIT.IL_Conformance.XUnitWrapper.dll -parallel collections -html C:\runtimes\runtime\artifacts\log\TestRun.html -xml C:\run
  times\runtime\artifacts\log\TestRun.xml  -notrait category=outerloop -notrait category=failing -nocolor
  C:\runtimes\runtime\.dotnet
  Microsoft.DotNet.XUnitConsoleRunner v2.5.0 (64-bit .NET 8.0.0-alpha.1.23058.2)
    Discovering: JIT.IL_Conformance.XUnitWrapper (method display = ClassAndMethod, method display options = None)
    Discovered:  JIT.IL_Conformance.XUnitWrapper (found 405 test cases)
    Starting:    JIT.IL_Conformance.XUnitWrapper (parallel test collections = on, max threads = 20)
    Finished:    JIT.IL_Conformance.XUnitWrapper
  === TEST EXECUTION SUMMARY ===
     JIT.IL_Conformance.XUnitWrapper  Total: 405, Errors: 0, Failed: 0, Skipped: 0, Time: 150.940s

Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:02:32.26
Test run finished.
Parsing test results from (C:\runtimes\runtime\artifacts\log\TestRunResults_windows_x64_Debug)
Analyzing C:\runtimes\runtime\artifacts\log\JIT.IL_Conformance.IL_Conformance.testRun.xml
Analyzing C:\runtimes\runtime\artifacts\log\TestRun.xml
Time [secs] | Total | Passed | Failed | Skipped | Assembly Execution Summary
============================================================================
      0.453 |     0 |      0 |      0 |       0 | JIT.IL_Conformance.IL_Conformance
    150.940 |   405 |    405 |      0 |       0 | JIT.IL_Conformance.XUnitWrapper.dll
----------------------------------------------------------------------------
    151.393 |   405 |    405 |      0 |       0 | (total)

And now the current state of the PR:

MSBuild version 17.5.0-preview-23056-01+d89d4bc91 for .NET
  C:\runtimes\runtime\artifacts\tests\coreclr\windows.x64.Debug\JIT\IL_Conformance\IL_Conformance\IL_Conformance.cmd >C:\runtimes\runtime\artifacts\tests\c
  oreclr\windows.x64.Debug\JIT\IL_Conformance\IL_Conformance\IL_Conformance.log

Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:07.93
Test run finished.
Parsing test results from (C:\runtimes\runtime\artifacts\log\TestRunResults_windows_x64_Debug)
Analyzing C:\runtimes\runtime\artifacts\log\JIT.IL_Conformance.IL_Conformance.testRun.xml
Time [secs] | Total | Passed | Failed | Skipped | Assembly Execution Summary
============================================================================
      2.834 |     3 |      3 |      0 |       0 | JIT.IL_Conformance.IL_Conformance
----------------------------------------------------------------------------
      2.834 |     3 |      3 |      0 |       0 | (total)

PS C:\runtimes\runtime> .\src\tests\build.cmd tree JIT\IL_Conformance -priority=1; .\src\tests\run.cmd
BUILDTEST: Starting Build at 14:49:05.43
BUILDTEST: Commencing CoreCLR test build

@BrianBohe
Member Author

BrianBohe commented Jan 18, 2023

The 3 tests inside JIT.IL_Conformance.IL_Conformance (TestConvertFromIntegral, AutoInit, and conv_ovf_i1_un) are included in JIT.IL_Conformance.XUnitWrapper.dll.

@BrianBohe
Member Author

I can see all the test dlls under IL_Conformance being generated and placed under \artifacts\tests\coreclr\windows.x64.Debug\JIT\IL_Conformance\ subdirectories

@BrianBohe
Member Author

> I can see all the test dlls under IL_Conformance being generated and placed under \artifacts\tests\coreclr\windows.x64.Debug\JIT\IL_Conformance\ subdirectories

Removing the tests it does run does not help the others run, so there is no particular problem with those specific tests. Any guess what it could be, @trylek? How would you debug this?

@teo-tsirpanis teo-tsirpanis added area-CodeGen-coreclr CLR JIT compiler in src/coreclr/src/jit and related components such as SuperPMI and removed area-System.Reflection.Metadata labels Jan 22, 2023
@ghost

ghost commented Jan 22, 2023

Tagging subscribers to this area: @JulieLeeMSFT, @jakobbotsch, @kunalspathak
See info in area-owners.md if you want to be subscribed.


@BrianBohe
Member Author

After adding the xunit FactAttribute::.ctor custom attribute to every IL test entry point, it still runs the same number of tests.

MSBuild version 17.5.0-preview-23056-01+d89d4bc91 for .NET
  C:\runtimes\runtime\artifacts\tests\coreclr\windows.x64.Debug\JIT\IL_Conformance\IL_Conformance\IL_Conformance.cmd >C:\runtimes\runtime\artifacts\tests\c
  oreclr\windows.x64.Debug\JIT\IL_Conformance\IL_Conformance\IL_Conformance.log

Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:20.29
Test run finished.
Parsing test results from (C:\runtimes\runtime\artifacts\log\TestRunResults_windows_x64_Debug)
Analyzing C:\runtimes\runtime\artifacts\log\JIT.IL_Conformance.IL_Conformance.testRun.xml
Time [secs] | Total | Passed | Failed | Skipped | Assembly Execution Summary
============================================================================
     13.289 |   405 |    405 |      0 |       0 | JIT.IL_Conformance.IL_Conformance
----------------------------------------------------------------------------
     13.289 |   405 |    405 |      0 |       0 | (total)

@BrianBohe
Member Author

I reverted all the il/ilproj/assembly renames, removing the '_' that should have applied only to namespaces. All tests are still reachable.

@BrianBohe
Member Author

/azp run runtime-coreclr outerloop

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@BrianBohe BrianBohe force-pushed the testmerging_IL_Conformance branch from 115324f to aecff40 on January 31, 2023 18:49
@dotnet dotnet deleted a comment from azure-pipelines bot Jan 31, 2023
@BrianBohe
Member Author

It seems that merging to main now would run the tests both the legacy way and the merged way. I have not figured out why yet.

Time [secs] | Total | Passed | Failed | Skipped | Assembly Execution Summary
============================================================================
     12.860 |   405 |    405 |      0 |       0 | JIT.IL_Conformance.IL_Conformance
    159.358 |   405 |    404 |      1 |       0 | JIT.IL_Conformance.XUnitWrapper.dll
----------------------------------------------------------------------------
    172.218 |   810 |    809 |      1 |       0 | (total)

@trylek
Member

trylek commented Jan 31, 2023

Hmm, at the very least the encouraging news is that running the tests through the legacy wrapper takes about 12 times longer than the merged version. The legacy wrappers are based on locating cmd / sh test execution scripts; for out-of-process merged tests that do produce execution scripts, we emit a special file with the extension .OutOfProcessTest or something similar into the folder to make sure they get filtered out. When you run the tests locally using src\tests\run, do you also see them getting run twice? I'll try to poke around the binlog files from your run to see if I can spot something.

@trylek
Member

trylek commented Jan 31, 2023

I'm wondering where you're getting this table from as the runs haven't finished yet. If it's from your local testing, IIRC there's a weird catch that you need to delete the folder artifacts\log prior to running src\tests\run otherwise it potentially aggregates both the legacy and merged test results from that folder.

@trylek
Member

trylek commented Jan 31, 2023

Just to clarify, this only applies to local innerloop of developers actively working on test merging so that at some point they have the "non-merged" xUnit summary files in their artifacts\log folders and subsequently they start emitting the new-style xUnit summaries produced by the merged test wrappers. The Python script is very simple and just grabs whatever xUnit summaries it can find.

@trylek
Member

trylek commented Jan 31, 2023

The bottom line is that outputting stable-named test results into the logs folder along with "normal" timestamped logs is a very hacky thing to do.

@BrianBohe
Member Author

> I'm wondering where you're getting this table from as the runs haven't finished yet. If it's from your local testing, IIRC there's a weird catch that you need to delete the folder artifacts\log prior to running src\tests\run otherwise it potentially aggregates both the legacy and merged test results from that folder.

This is the result of running git checkout .; git clean -xdf; ./build clr+libs -rc debug -lc Release; .\src\tests\build debug generatelayoutonly; .\src\tests\build tree JIT\IL_Conformance -priority=1; .\src\tests\run locally.

@BrianBohe
Member Author

> This is the result of running git checkout .; git clean -xdf; ./build clr+libs -rc debug -lc Release; .\src\tests\build debug generatelayoutonly; .\src\tests\build tree JIT\IL_Conformance -priority=1; .\src\tests\run locally.

@trylek My mistake, it seems it is running only the new tests, so what you described probably happened. Thanks!

@BrianBohe BrianBohe marked this pull request as ready for review January 31, 2023 21:31
@BrianBohe
Member Author

/azp run runtime-coreclr outerloop

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@BrianBohe
Member Author

This now looks ready to merge to me @markples @trylek, final thoughts?

@markples
Contributor

markples commented Feb 1, 2023

Copying to the PR - test execution is 4 seconds. (This opens a new question as to whether this is worth an entire helix job, but that is beyond the scope of this PR.)

BEGIN EXECUTION
 "C:\h\w\A75B0933\p\corerun.exe" -p "System.Reflection.Metadata.MetadataUpdater.IsSupported=false"  IL_Conformance.dll 
00:04:25.577 Running test: JIT/IL_Conformance/Old/Base/add/add.dll
Test SUCCESS

...

00:04:29.514 Passed test: JIT/IL_Conformance/Old/Conformance_Base/conv_ovf_i8_i/conv_ovf_i8_i.cmd
Expected: 100
Actual: 100
END EXECUTION - PASSED

@BrianBohe
Member Author

> Copying to the PR - test execution is 4 seconds. (This opens a new question as to whether this is worth an entire helix job, but that is beyond the scope of this PR.)
>
> BEGIN EXECUTION
>  "C:\h\w\A75B0933\p\corerun.exe" -p "System.Reflection.Metadata.MetadataUpdater.IsSupported=false"  IL_Conformance.dll
> 00:04:25.577 Running test: JIT/IL_Conformance/Old/Base/add/add.dll
> Test SUCCESS
>
> ...
>
> 00:04:29.514 Passed test: JIT/IL_Conformance/Old/Conformance_Base/conv_ovf_i8_i/conv_ovf_i8_i.cmd
> Expected: 100
> Actual: 100
> END EXECUTION - PASSED

Should we follow this up in a new or related issue?

@BrianBohe
Member Author

Related issue #71732

@BrianBohe BrianBohe merged commit 3c47b2c into dotnet:main Feb 1, 2023
@BrianBohe BrianBohe added this to the 8.0.0 milestone Feb 1, 2023
@trylek
Member

trylek commented Feb 1, 2023

@BrianBohe - are you saying that the entire 400-something test suite takes just 4 seconds to execute? That would be better than I'd hoped. Anyway, the entire Pri1 suite is just about 10500 tests or so; that would mean approximately 25 Helix work items of this size, so I don't think we need to squash more tests into a single merged test. After all, please remember that in stress modes they'll take 10 to 100 times longer; the more tests, the more hassle when a test crashes, and there's potential for one test to affect a later test via GC allocation or whatnot. We're not merging the tests because that would make running them easier; we're doing it solely to reduce times and costs, and it has its own cost. My gut feeling is that several hundred tests in a single merged wrapper is more than enough. @BruceForstall has actually been pushing for a finer-grained split with fewer tests per wrapper, as that would allow for better Helix work balancing in the super-slow GC stress tests.
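The work-item arithmetic above can be sanity-checked: with wrappers the size of the one in this PR (405 tests) and roughly 10,500 Pri1 tests, the suite splits into about 26 wrappers, roughly matching the "approximately 25" estimate. A quick check (both figures are taken from this conversation, not from any authoritative count):

```python
import math

pri1_tests = 10_500       # "just about 10500 tests or so"
tests_per_wrapper = 405   # size of the IL_Conformance wrapper in this PR

work_items = math.ceil(pri1_tests / tests_per_wrapper)
print(work_items)  # 26
```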

@trylek
Member

trylek commented Feb 1, 2023

For the tests I merged previously, IIRC JIT/Methodical comprises about 2K tests and uses five wrappers, i.e. approximately 400 tests each; the TypeGeneratorTests use 15 wrappers of 100 tests each (the last wrapper actually has 101 tests, as there are 1501 tests total).

@markples
Contributor

markples commented Feb 1, 2023

4 seconds is plausible for il_conformance, as it doesn't do very much. For example, base\add is ~50 straight-line IL instructions. I suppose small repros for specific JIT bugs will be similarly quick, but then "real" tests will be much slower. Perhaps in the future we can be more sophisticated with groupings, but I agree that we don't need to do anything now or track anything.

@trylek
Member

trylek commented Feb 1, 2023

Sounds good. When I worked with Jeremy on xUnit reporting for merged tests by the end of 2021, I had to increase the number of decimal digits from 3 to 6 because it turned out that some tests are now taking something like 50 microseconds, such a thing was naturally not observed before the merging.

@BrianBohe BrianBohe deleted the testmerging_IL_Conformance branch February 10, 2023 23:07
@ghost ghost locked as resolved and limited conversation to collaborators Mar 13, 2023