Create a .NET Standard version of the Engine #10

Closed

CharliePoole opened this issue Aug 15, 2016 · 36 comments

@rprouse commented on Mon Dec 14 2015

This needs design, but it could be handled by platform-specific agents. If so, it will likely be blocked by #362.


@rprouse commented on Sat Jan 02 2016

In #1168, @CharliePoole said,

I'm not sure there is a problem here since this refers (in my mind anyway) to an engine running on the desktop. What we need to find out is whether an engine running under .NET 2.0/3.5 is able to analyze assemblies built for other targets. I don't see why not, since they are not actually loaded. Consider that we are already analyzing .NET 4.5 assemblies. Of course, this needs to be tested on a VM that has only .NET 2.0 or 3.5 installed.

Based on my spikes into this so far, I think this is the way we need to go. The engine can inspect the test assembly and determine its target platform. We just can't run it directly.

This is also mixed up with #677 and #362. My idea so far is as follows:

  • The current NUnit 3 engine inspects the test assembly in addition to the framework. It looks for the TargetFrameworkAttribute, which will tell you, for example, whether it is targeting .NETCore,Version=v5.0. (See the sketch after this list.)
  • If the target framework is a non-desktop framework, disallow --inprocess, since we can't run non-desktop targets. We might also want to disallow --process=Single, or else make sure all test assemblies target the same platform.
  • Modify TestAgency to allow it to launch platform-specific agents and communicate with them in a manner other than .NET Remoting. This is #362. If we are running those agents on the desktop machine, we could even capture Console.In/Out for communications.
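
As a concrete illustration of the inspection step, here is a minimal sketch (not the engine's actual implementation) of reading TargetFrameworkAttribute without executing the assembly, assuming desktop .NET 4.5+ for the reflection-only APIs used:

using System.Linq;
using System.Reflection;

static class TargetFrameworkInspector
{
    // Returns e.g. ".NETCore,Version=v5.0", or null when the attribute is
    // absent (typical of .NET 2.0 assemblies, which can then be assumed to
    // target the full framework).
    public static string GetTargetFramework(string assemblyPath)
    {
        // A reflection-only load reads metadata without running any code, so
        // the engine can inspect assemblies built for targets it cannot execute.
        var assembly = Assembly.ReflectionOnlyLoadFrom(assemblyPath);
        var attr = assembly.GetCustomAttributesData()
            .FirstOrDefault(a => a.AttributeType.FullName ==
                "System.Runtime.Versioning.TargetFrameworkAttribute");
        return attr == null ? null : (string)attr.ConstructorArguments[0].Value;
    }
}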

The platform specific agents are where decisions still need to be made. It would be nice to have a portable mini-engine (#677) that is capable of loading and running the portable framework. Like the full framework, this would allow us to run tests no matter which version of the NUnit 3 framework is used. It would likely be a stripped-down version of the engine that can only load from the agent directory and is only intended to be used by agents. This might even boil down to a portable IFrameworkDriver and some supporting classes?

A simplified and not mutually exclusive approach might be for the agents to actually be the nunitlite versions of the tests. We could modify nunitlite to run directly, or to be able to communicate with the engine.


@CharliePoole commented on Sat Jan 02 2016

I think we're pretty much on the same page at a high level. Some details...

  • Can we count on TargetFrameworkAttribute being reliably set?
  • We can't precisely disallow all those options, although we might be able to in some cases. Basically, we can only look at explicit options; if the defaults are used, we have to wait until we analyze the assembly and then reflect an error back to the runner.
  • I think we still have two very distinct options for running portable and device-based tests: a scaled down engine versus a scaled up driver. Neil has been pushing us strongly toward an engine for CF but I'm not sure that's the easiest approach. At some point, we have to spike both approaches. I think the easiest way to do that initially is right on the desktop itself.

Charlie


@rprouse commented on Mon Jan 04 2016

Can we count on TargetFrameworkAttribute being reliably set?

I have tested quite a few different assemblies and have not found it unset except for .NET 2.0 assemblies. I think that we can safely assume that if it is not set, it targets a full .NET framework. I assume that these days with so many potential targets, it needs to be set.

I think we still have two very distinct options for running portable and device-based tests: a scaled down engine versus a scaled up driver. Neil has been pushing us strongly toward an engine for CF but I'm not sure that's the easiest approach. At some point, we have to spike both approaches. I think the easiest way to do that initially is right on the desktop itself.

I installed the .NET Portability Analyzer extension for Visual Studio and analyzed the engine against the platforms that we want to support. Based on what is not available, I think approaching it from the driver end is probably the best option. At a minimum, we need to be able to load the framework in a version independent way, explore or run the tests and communicate the results. The majority of the functionality of the engine is used in the startup process, not in the agents themselves.


@xied75 commented on Wed Feb 17 2016

👍 to track.


@rprouse commented on Wed Feb 17 2016

@xied75, for future reference, you can just click the subscribe button at the bottom of the toolbar on the left. That way, all other subscribers don't get an email 😉

[screenshot of the Subscribe button in the issue sidebar]


@xied75 commented on Wed Feb 17 2016

Well, this is in fact a new trend on GitHub: since GitHub refuses to add a feature that lets you see what you have subscribed to, people now post +1 so they can find the issues they want to track.


@rprouse commented on Fri Jun 10 2016

I am going to move this out of 3.4. The new dotnet-test-nunit will run .NET Core and UWP apps and the promise of netstandard may change the way that we approach this. I want to see how things go after the release of .NET Core before committing to major changes in the engine.


@CharliePoole commented on Fri Jun 10 2016

That makes perfect sense.

@rprouse changed the title from "Extend the engine to run portable tests" to "Create a .NET Standard version of the Engine" on Feb 2, 2017
@rprouse commented Feb 2, 2017

I have been working on this and wanted to update people on my plans and progress so that they can provide feedback.

The VS Adapter, Xamarin runner and dotnet test all need a version of the engine that can load and run any .NET Standard based tests. For example, see nunit/nunit3-vs-adapter#297.

All of these runners run in-process, they do not launch an agent. Visual Studio inspects the assembly and launches a platform specific test host (desktop, UWP, .NET Core, etc) which then attempts to load our platform specific version of the VS adapter. (See https://github.com/Microsoft/vstest-docs/tree/master/RFCs).

Because of this, I am concentrating on the parts of the API/Engine that run in-process for the first pass. This will allow us to move forward with the Visual Studio Adapter and drop much of the duplicate code in the Xamarin runner. Future work could allow the desktop version of the engine to launch platform specific agents (.NET Core, Xamarin Android, etc), but I think we should hold off on that until we know more about how the new dotnet CLI plays out. AFAIK, Microsoft's plan is to have the dotnet test command be the standard way to run tests from the command line. If that takes off, then we may not need support in our console runner.

The Engine API converted easily. The only change to the API was dropping AppDomain from one of the method signatures. AppDomain is not supported in .NET Standard.

Conversion of the Engine is nearly done and going well. The only thing I am having trouble with is Assembly resolution in ProvidedPathsAssemblyResolver. Because there are no AppDomains, we will also run into problems with projects that have tests using different versions of NUnit.
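
Without AppDomains, the natural place to hook resolution is the load context. Below is a minimal sketch of one possible approach (not what ProvidedPathsAssemblyResolver actually does; AssemblyLoadContext requires .NET Core, above the 1.3 target discussed below, so it is illustrative only):

using System.Collections.Generic;
using System.IO;
using System.Runtime.Loader;

static class ProvidedPathsResolver
{
    // Probe caller-provided directories when the default load context
    // cannot find an assembly (roughly the AppDomain.AssemblyResolve
    // replacement on .NET Core).
    public static void Install(IReadOnlyList<string> providedPaths)
    {
        AssemblyLoadContext.Default.Resolving += (context, name) =>
        {
            foreach (var dir in providedPaths)
            {
                var candidate = Path.Combine(dir, name.Name + ".dll");
                if (File.Exists(candidate))
                    return context.LoadFromAssemblyPath(candidate);
            }
            return null; // fall through to the normal load failure
        };
    }
}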

We will need to decide if we want to have .NET Standard versions of any of our extensions. Potential candidates would likely be:

  • NUnit V2 result writer
  • NUnit project loader
  • TeamCity event listener

If we do convert them, the NuGet packages will need to be modified to include the various targets and our extension search will need to be updated to find the correct platform versions.

If anyone wants to see the work in progress, I can create a pull request for review.

@rprouse commented Feb 2, 2017

The change to the Engine API for .NET Standard that I mentioned is to IDriverFactory: GetDriver(AppDomain domain, AssemblyName reference) becomes GetDriver(AssemblyName reference).
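
A sketch of how that change can be expressed with conditional compilation (the #if symbol and the trimmed-down interface are illustrative, not the engine's exact source):

using System;
using System.Reflection;

public interface IDriverFactory
{
    // IFrameworkDriver is the driver abstraction from the engine API.
#if NETSTANDARD1_3
    // .NET Standard has no AppDomain type, so the parameter is dropped.
    IFrameworkDriver GetDriver(AssemblyName reference);
#else
    IFrameworkDriver GetDriver(AppDomain domain, AssemblyName reference);
#endif
}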

I should also mention that I am targeting .NET Standard 1.3. I tried to target 1.0, but the API is so limiting that I would have had to do a major rewrite. This means that the Engine will not support Windows and Windows Phone 8.x projects (https://docs.microsoft.com/en-us/dotnet/articles/standard/library).

@ChrisMaddock commented

I should also mention that I am targeting .NET Standard 1.3. I tried to target 1.0, but the API is so limiting that I would have had to do a major rewrite. This means that the Engine will not support Windows and Windows Phone 8.x projects (https://docs.microsoft.com/en-us/dotnet/articles/standard/library).

I think this is a good decision. 😄

Sounds like exciting progress! I'd be happy to look at some extensions - although it would be nice to leave out the NUnit v2 extensions, and see if we can start leaving them behind a little. 😁

I imagine @NikolayPianikov would be keen for TeamCity integration - I'd be happy to look at the NUnit project loader, if Nikolay is converting the TC extension to work with it?

@CharliePoole commented

@rprouse I continue to have doubts about this approach.

The original architecture for NUnit-xtp was to have a single engine, which could be used by all runners. By splitting into multiple engine builds, each used by different runners, it seems to me that our efforts are also divided. That has been the case over the last few years, and it seems it will only intensify in the future.

Essentially, the current engine has two ways to run code "elsewhere"...

  1. By use of an agent. This requires implementing ITestAgent and ITestRunner, but not the full ITestEngine interface. As with the current agent, which uses a reconfigured engine, there is no need for out-of-process capability or most extensions. Those things are all handled by the actual engine. (A rough sketch of these interfaces follows this list.)

  2. By use of a "split driver" i.e. a driver that has separate front and back ends, the front end launching the back end and communicating with it.
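
For context, here is a rough, abbreviated sketch of the contracts behind option 1 (member lists trimmed and from memory; consult nunit.engine.api for the authoritative definitions):

using System;
using System.Xml;

// TestPackage, TestFilter, ITestEventListener and ITestEngineRunner
// are all types from the engine API (nunit.engine.api).
public interface ITestAgent
{
    Guid Id { get; }
    bool Start();
    void Stop();
    ITestEngineRunner CreateRunner(TestPackage package);
}

public interface ITestRunner : IDisposable
{
    XmlNode Load();
    XmlNode Explore(TestFilter filter);
    XmlNode Run(ITestEventListener listener, TestFilter filter);
    void StopRun(bool force);
    // ...plus Unload, Reload, CountTestCases, RunAsync, etc.
}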

This was the original design of the "NUnit Extended Test Platform" with the notion that every runner would be able to run tests in every environment supported, separately or in combination. That is, you could run a test project that included tests using any version of desktop .NET, mobile platforms, and future (which seems to have arrived) platforms. Having different so-called engines for different runners seems to me to be a poor second.

I realize you know that this is my position but others don't so I thought I'd better express it at least once before I pull out.

@rprouse commented Feb 2, 2017

@CharliePoole I understand your approach and I see that as phase two, but that approach also has limitations. The main limitation is that the full desktop framework or mono needs to be installed to run the .NET 2.0 version of the engine before executing the platform specific code.

The second issue is that the new dotnet test command and the Visual Studio test runner will spin up a platform specific test host that loads our adapter. That test host will require a copy of the adapter that runs on that platform and the adapter requires some sort of engine. .NET 2.0 won't work except for full framework tests.

Going forward, I believe fewer and fewer platforms other than Windows will have a .NET Framework installed that can run .NET 2.0, so we would end up limiting the launching of tests using the engine to Windows. A second, .NET Standard version of the engine will open up nearly every other platform.

@CharliePoole commented

I see us supporting two kinds of platforms... as we now do, in fact.

  1. Platforms on which developers work. Heretofore, that platform has been Windows, macOS, or Linux. Do you see that changing?

  2. Platforms on which we want to run tests. That includes all the platforms on which developers work. It currently includes portable, netstandard, android, iphone, etc. It formerly included CF, Silverlight and even .NET 1.1.

My thinking is that the developer should be able to run on whatever platform they use to do their work, with the tests running on whatever testing platform they select. They should not have to actually run code on a test-only platform in order to do their work.

WRT the dotnet test command, I see your point but to my mind that's putting the cart before the horse. To put it into the terms of our current main pieces of software, it's as if we let the VS adapter drive the design of the engine or framework. The adapter is not our main thing. It's a severely limited bit of software, because of design compromises we don't have any control over. I'm concerned that the dotnet test command is more of the same. We should make our stuff work under it, as best we can, but without making it our main thing. We definitely shouldn't make an engine just to run under it, at least not in the sense that I understand the phrase "NUnit test engine."

@CharliePoole commented

To clarify the last bit in my previous comment: I don't think that dotnet test should run the engine, I think the engine should run dotnet test.

@rprouse commented Feb 2, 2017

I don't think that dotnet test should run the engine, I think the engine should run dotnet test.

That is fine by me, but on Linux or Mac, developers cannot run our engine unless they have Mono installed. If they are only interested in developing .NET Core on those platforms, we are forcing them to install Mono just to run our engine so that it can then spin up a .NET Core agent.

I also see the .NET Standard version of the engine as being the same as the way we currently use the engine when linked by nunit-agent, in that it is just running tests and aggregating results. We happen to use the same engine for both sides, but we use it in a lighter-weight manner on the agent side. The .NET Standard version is only intended to replace that side. (I should really draw a diagram of the various scenarios 😄 )

@CharliePoole commented

How would you develop on Linux for .NET Core without installing Mono? I agree that would change things, but I'm not clear about how it would work. Would you just edit and compile from the command line?

I think a diagram would be a fabulous idea. First a simple one and then maybe integrate it into the architectural diagram we already have.

@xied75 commented Feb 2, 2017

Sorry guys, I might be confused: why do you two keep mentioning Mono? It is NO longer needed for .NET Core on Linux; .NET Core on Linux can self-bootstrap now. That's how coreclr and corefx are built, with no Mono involved whatsoever.

@CharliePoole commented

Well, we mention mono because we support mono. If I'm developing on Linux for mono I already have it, and I'm probably using monodevelop, which runs under mono.

@rprouse is ahead of me on the .NET Core front and he took me by surprise in saying you could develop on Linux without mono so I was asking how. Note that I'm not asking how to run on .NET core on Linux, but how to develop and test. Feel free to answer if you know. 😄

@rprouse commented Feb 3, 2017

@CharliePoole most people who develop .NET Core on other platforms these days don't have Mono installed. .NET Core includes command line tools to create new projects, build, debug and run .NET Core and .NET Standard projects. Mono is not used and not needed.

To edit the files, most people use Visual Studio Code, not MonoDevelop. VS Code is a native application that does not require .NET to run. VS Code uses OmniSharp to provide intellisense which also works with other code editors.

I expect that on Linux and Mac, Mono will only be used for desktop apps and I am not seeing much going on there. Even Miguel recommends against using Mono for server side code which is what most people are using .NET Core for (ASP.NET MVC websites and REST APIs). Because of that, most Linux/Mac .NET developers these days don't need or want Mono installed. The only reason I have it installed on any machines is to test NUnit 😄

Even on Windows, .NET Core has nothing to do with the Full .NET Framework. It does not compile, debug or run using the framework and cannot run within the full CLR. It runs in the CoreCLR.

I should also mention that testing is built into the Core as a first class citizen now. It is a command just like build, debug or run and is the standard way of running unit tests in .NET Core. That is why we should support that scenario.

I will see if I can come up with some diagrams to explain how this version of the engine would be used.

@CharliePoole commented

Thanks Rob. I knew the stuff about Windows but not as applies to Linux.

How sad! End of an era... end of diversity... MS hegemony spreads. 😞

@CharliePoole commented

I think I'm leaving at the right time!

@rprouse commented Feb 3, 2017

How sad! End of an era... end of diversity... MS hegemony spreads.

I would disagree, I see this as the start of an exciting new era where .NET developers can develop for any platform. I don't see it as MS hegemony because .NET Core is fully open source and one of the most active projects on GitHub. Personally, I see it as a great time to be a .NET developer.

@CharliePoole commented

I guess we'll have to agree to disagree. Mono was a wonderful instance of David challenging Goliath and doing quite well at it for a while. On a smaller scale, our own project is somewhat the same, having survived the advent of MsTest for a good decade longer than people predicted.

So, because of that similarity and because of having worked closely with the guys who invented mono in the past, I can't help but be sad.

Whether it turns out to be a spread of accessibility and transparency, as you believe, or of MS hegemony as I do, only the future will tell.

Of course, all this depressive talk is coming from the contemporary USA, so you can make appropriate allowance.

@jcansdale commented

It's funny, it now seems to be MSTest that is fighting to remain relevant! Their test runner is now mostly a way to run your NUnit and xUnit tests. 😄

I can't help but be somewhat optimistic. Miguel is charismatic and opinionated and would be a dangerous hire for Microsoft if they haven't bought into the new vision. Fingers crossed...

Of course, all this depressive talk is coming from the contemporary USA, so you can make appropriate allowance.

My sympathies, Charlie!
https://goo.gl/photos/KSKp9rF74Y4QDVS66

BTW, I'm really sorry about the radio silence recently! I seem to have acquired a new badge on my GitHub profile and they've been keeping me super busy! 😉

@CharliePoole commented

Thanks @jcansdale !

@rprouse I guess I really have to pay more attention to the new technology if it's really cross-platform. Maybe I should set up a VM without Mono and find out what's possible.

@roji commented Feb 27, 2017

Very late to the discussion, but I just wanted to add my vote that NUnit should be runnable without Mono, as per the above. Mono is still very relevant for Xamarin and perhaps some desktop apps as well, but it is definitely going to disappear for a whole class of applications (probably the majority).

Note also that recent versions of mono have been replacing a lot of code with Microsoft's reference implementation, so in a way the implementations are already intertwined. I also share the excitement with .NET Core and think it's the start of a great new era for .NET.

(BTW, another great IDE option on non-Windows is JetBrains' Project Rider (still prerelease), which also brings R# to Linux...)

@ghost commented Mar 25, 2017

Guys, I was just wondering where this is at? I could not for the life of me get Castle.Core to compile and run tests using netstandard1.6 targeting netcoreapp1.1 via dotnet test. I eventually had to switch everything over to xUnit, and it worked great (VS2017 test runs, debugging, and dotnet test via the console). I think the guys at JetBrains are having the same issue; ReSharper is also stonewalling me with test discovery in VS2017 for NUnit. What is weird is that this happens for desktop CLR projects via VS2017 but works via the console runner.

@rprouse commented Mar 26, 2017

@Fir3pho3nixx I have a working .NET Standard 1.3 version of the engine. It is a lightweight version of the engine code that contains only what is required for the Visual Studio adapter and dotnet test. It still needs some work in the build script and minor cleaning up. For example, I need to figure out how I am going to inject things like version into the new CSPROJ.

Today I am starting work on creating a multi-target version of the adapter. I will start with the NuGet package only and may release a couple of alpha/beta releases of the adapter as NuGet only. I will not release alpha/beta versions of the VSIX because the marketplace does not support that. I may, however, release early versions to select testers.

Work on the adapter shouldn't be too invasive. It is mainly upgrading the CSPROJ file to the new format and multi-targeting.

For the adapter work, follow nunit/nunit3-vs-adapter#297

@ghost commented Mar 26, 2017

It still needs some work in the build script and minor cleaning up. For example, I need to figure out how I am going to inject things like version into the new CSPROJ.

Are you talking about how to inject/download the NuGet into a 'multi-targeting' CSPROJ structure authored in VS2017? If so, I have found a way of doing this.

@rprouse commented Mar 27, 2017

@Fir3pho3nixx yes, it is mainly the NuGet package version that I would like to inject into the CSPROJ file. If you can point me to examples, I would appreciate it. Thanks.

@ghost commented Mar 27, 2017

This is what the CSPROJ looks like for the Castle Windsor fork (fortress) I am working on:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFrameworks>netstandard1.6</TargetFrameworks>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.Http.Abstractions" Version="1.1.1" />
    <PackageReference Include="System.Collections" Version="4.3.0" />
    <PackageReference Include="System.ComponentModel" Version="4.3.0" />
    <PackageReference Include="System.ComponentModel.TypeConverter" Version="4.3.0" />
    <PackageReference Include="System.Reflection.Emit" Version="4.3.0" />
    <PackageReference Include="System.Runtime.Loader" Version="4.3.0" />
    <PackageReference Include="System.Threading.Thread" Version="4.3.0" />
    <PackageReference Include="System.Xml.ReaderWriter" Version="4.3.0" />
    <PackageReference Include="System.Xml.XmlDocument" Version="4.3.0" />
    <PackageReference Include="System.Xml.XmlSerializer" Version="4.3.0" />
  </ItemGroup>
  <ItemGroup>
    <ProjectReference Include="..\Castle.Core.DynamicProxy\Castle.Core.DynamicProxy.csproj" />
    <ProjectReference Include="..\Castle.Core\Castle.Core.csproj" />
  </ItemGroup>
</Project>

Your next problem is getting the packages to download in your build process. You can cheat by using the latest version of the dotnet CLI (I borrowed this from the ASP.NET Core guys). A bit of PowerShell will download it into your user profile on your build agent:

. { iwr -useb https://raw.githubusercontent.com/dotnet/cli/master/scripts/obtain/dotnet-install.ps1 } | iex; install

You can then bootstrap the executable from the following path:

%LOCALAPPDATA%\Microsoft\dotnet\dotnet.exe

Then you can get all the packages to install by running dotnet restore:

%LOCALAPPDATA%\Microsoft\dotnet\dotnet.exe restore src\Standard\Castle.Windsor\Castle.Windsor.csproj

Then building via MsBuild should work. Here is a reference to my FSharp FAKE file: https://github.com/cryosharp/fortress/blob/master/build.fsx

This took a few days of wading through builds for various dotnet oss projects before I settled on this approach.

Good luck!

@jnm2 commented Mar 28, 2017

@rprouse Do you mean the thing that replaces AssemblyInfo and also drives NuGet package creation?

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFrameworks>netstandard1.6</TargetFrameworks>
  </PropertyGroup>

  <PropertyGroup>
    <Product>(Windows Explorer product name - leave out to use the project name for this)</Product>
    <Version>(Windows Explorer product version string)</Version>
    <AssemblyVersion>(Windows Explorer file version)</AssemblyVersion>
    <AssemblyTitle>(Windows Explorer file description)</AssemblyTitle>
    <Authors>(Windows Explorer company)</Authors>
    <Copyright>(Windows Explorer copyright)</Copyright>
  </PropertyGroup>

</Project>

@jnm2 commented Mar 28, 2017

@Fir3pho3nixx Why doesn't msbuild /t:restore do everything you need?

@ghost commented Mar 28, 2017

@jnm2 👍, just noticed you can also msbuild /t:pack, per this: http://blog.nuget.org/20170316/NuGet-now-fully-integrated-into-MSBuild.html

@rprouse commented Mar 29, 2017

@jnm2 yes, that is what I meant. I did figure out how to turn most of the auto-generated values off and use our various AssemblyInfo files. Now I just need to inject the NuGet package version.

@ghost commented Mar 29, 2017

@rprouse, I managed to get this going by using the <BuildVersion> property.

Here is my example for appveyor:

<BuildVersion Condition="'$(APPVEYOR_BUILD_VERSION)'==''">0.5.0</BuildVersion>
<BuildVersion Condition="'$(APPVEYOR_BUILD_VERSION)'!=''">$(APPVEYOR_BUILD_VERSION)</BuildVersion>

@hikalkan helped me with this :)

@ghost commented Mar 29, 2017

Sorry, on second thought: this pipes through to <PackageVersion>, so use that instead.

@rprouse commented Mar 30, 2017

@Fir3pho3nixx that is pretty much exactly what I was thinking, good to see it works. Thanks.

@jnm2 commented Mar 31, 2017

See also NuGet pack and restore as MSBuild targets: https://docs.microsoft.com/en-us/nuget/schema/msbuild-targets

@rprouse Once you're ready to move away from AssemblyInfo.cs and put the info in the csproj, all the Cake script will have to do to create the nupkg is msbuild /t:pack.

@rprouse commented Mar 31, 2017

@jnm2 I am already creating a nuget package whenever it is built and the NuGet specific info is already in the csproj file. I prefer to keep some of it in the various assembly info files though because it is shared between projects and the existing projects need to continue to build in older versions of Visual Studio.

@yaakov-h commented

@rprouse with #205 merged and an alpha deployed to NuGet, can we use this with dotnet test yet, or is there further work to be done for that?

@rprouse commented Apr 30, 2017

@yaakov-h this is just half of the solution. We still need to release the alpha of the Visual Studio Adapter to NuGet to get everything working. See nunit/nunit3-vs-adapter#313.

@rprouse modified the milestone: 3.7 on Jul 12, 2017