
Add a VSTest Adapter #2438

Merged · 11 commits into dotnet:master · Dec 26, 2023

Conversation

@caaavik-msft (Contributor)

This is a feature that builds upon my recent work on adding support for event processors. It is a fully featured and working VSTest Adapter for running BenchmarkDotNet benchmarks. Below are two examples of what the experience looks like when running the BDN sample benchmarks in Visual Studio's Test Explorer and VS Code's C# Dev Kit Testing UI. Apologies if the images are difficult to read; I recommend opening them in another tab to view them.

[Screenshot: Visual Studio's Test Explorer]

[Screenshot: Visual Studio Code's Testing UI with C# Dev Kit]

Why VSTest?

While most people associate VSTest with running unit tests, integration tests, and other tests that have a boolean outcome of "Passed" or "Failed", nothing inherent to VSTest restricts it to these cases, and it turns out to be flexible enough to support performance tests as well. VSTest does not have any functionality built in today for storing numerical results and doing anything meaningful with them, but its benefit is that it is a standardised test discovery and execution protocol to build on top of, with good IDE support in the .NET ecosystem, and it may be preferable to using the CLI to invoke benchmarks.

Seeing the benchmark results

One feature of VSTest is that you can attach arbitrary messages to your test results, so I put the benchmark results in those messages. As you can see in the Visual Studio screenshot above, the results are easy to view from the Test Explorer window. Unfortunately, C# Dev Kit's testing functionality does not display these messages at the time of writing. As an alternative, the full BDN console output is forwarded to VSTest as well, and you can view it in the Tests Output Window in Visual Studio, or the Test Explorer Output Window in VS Code, as seen in the bottom-right of the VS Code screenshot above.
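
For anyone unfamiliar with how an adapter attaches such messages, here is a minimal sketch of the general VSTest pattern using the `Microsoft.VisualStudio.TestPlatform.ObjectModel` types. The method name, outcome, and summary string are illustrative placeholders, not the actual BenchmarkDotNet.TestAdapter code:

```csharp
using System;
using Microsoft.VisualStudio.TestPlatform.ObjectModel;
using Microsoft.VisualStudio.TestPlatform.ObjectModel.Adapter;

internal static class ResultReporting
{
    // Records a passing result for a test case and attaches free-form text
    // (e.g. a benchmark summary) that IDEs can show in the test detail pane.
    public static void ReportBenchmarkResult(
        IFrameworkHandle frameworkHandle, TestCase testCase, string summaryText)
    {
        var result = new TestResult(testCase)
        {
            Outcome = TestOutcome.Passed,
            Duration = TimeSpan.FromSeconds(1) // placeholder duration
        };

        // The attached message is what shows up under the test's details.
        result.Messages.Add(
            new TestResultMessage(TestResultMessage.StandardOutCategory, summaryText));

        frameworkHandle.RecordResult(result);
    }
}
```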

Quirks around .NET Framework support

I'd also like to point out that for .NET Framework it was quite difficult to get this working with proper assembly loading. The approach I ended up with was invoking BenchmarkDotNet in a new AppDomain that is configured to have the same base directory as the BenchmarkDotNet dll/exe. Since there is communication across AppDomains, quite a bit of code in this PR is spent on "remoting" and passing test cases and test results across this boundary. Note that this only applies when the BDN program itself is compiled for .NET Framework; it does not apply if, for example, you compile the BDN program for .NET 6 and have benchmarks that target .NET Framework.
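
As a rough illustration of the general pattern described above (not the actual code in this PR; the type and member names here are invented), creating a dedicated AppDomain with the benchmark assembly's base directory and calling into it through a MarshalByRefObject proxy looks roughly like this:

```csharp
#if NETFRAMEWORK
using System;
using System.IO;
using System.Reflection;
using BenchmarkDotNet.Running;

// Runs inside the child AppDomain; because it derives from MarshalByRefObject,
// calls made on it from the host domain cross the AppDomain boundary.
public class BenchmarkRunnerProxy : MarshalByRefObject
{
    public void RunBenchmarks(string benchmarkAssemblyPath)
    {
        var assembly = Assembly.LoadFrom(benchmarkAssemblyPath);
        BenchmarkSwitcher.FromAssembly(assembly).RunAll();
    }
}

public static class AppDomainRunner
{
    public static void RunInDedicatedDomain(string benchmarkAssemblyPath)
    {
        var setup = new AppDomainSetup
        {
            // Match the base directory (and .config file) of the benchmark
            // dll/exe so its dependencies resolve correctly.
            ApplicationBase = Path.GetDirectoryName(benchmarkAssemblyPath),
            ConfigurationFile = benchmarkAssemblyPath + ".config"
        };

        AppDomain domain = AppDomain.CreateDomain("BenchmarkDotNet", null, setup);
        try
        {
            var proxy = (BenchmarkRunnerProxy)domain.CreateInstanceFromAndUnwrap(
                typeof(BenchmarkRunnerProxy).Assembly.Location,
                typeof(BenchmarkRunnerProxy).FullName);

            proxy.RunBenchmarks(benchmarkAssemblyPath);
        }
        finally
        {
            AppDomain.Unload(domain);
        }
    }
}
#endif
```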

How to add VSTest support to an existing project

Only three lines need to be added to the project file of the BDN project; you can see these changes applied to the BenchmarkDotNet.Samples and BenchmarkDotNet.Samples.FSharp projects in this PR. The first change is setting the property GenerateProgramFile to false, which instructs the test SDK not to generate a program file. The other two changes are adding references to the BenchmarkDotNet.TestAdapter and Microsoft.NET.Test.Sdk packages. Should this PR be accepted, I imagine there would be some work needed to get this published as a new NuGet package, so I would need some assistance with that so that people can start using it.
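
As a sketch, a BDN project file with these changes could end up looking something like the following (the target framework and package versions are illustrative placeholders, not values taken from this PR):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net8.0</TargetFramework>
    <!-- Stop the test SDK from generating its own entry point;
         the BenchmarkDotNet project already provides one. -->
    <GenerateProgramFile>false</GenerateProgramFile>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="BenchmarkDotNet" Version="0.13.*" />
    <!-- The two additional references that enable VSTest support. -->
    <PackageReference Include="BenchmarkDotNet.TestAdapter" Version="0.13.*" />
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.*" />
  </ItemGroup>

</Project>
```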

Future work

There are three things that I did not complete in this PR which I would like to do in a future PR, after discussing them further in separate issues:

  • Adding CancellationToken support to BDN to cancel the current benchmark run at any time
    • My hack to get this working in this PR was to throw an exception inside the event processor on the OnStartRunBenchmark event, but I think in the long run having proper CancellationToken support would be preferable
  • Adding a way to specify a default global config so that the VSTest adapter can use it
    • Right now, it just loads everything with the default config with no way to modify this.
  • An entry-point-less BDN project structure, much like how unit test projects work today, where the entry point is generated at build time (see the sketch after this list). This would be combined with the above feature of specifying a default global config. The default entry point that is generated could just be a call to BenchmarkSwitcher.
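
For the last item, a sketch of what such a generated entry point could look like (not the exact code that would be generated, and the class name is made up) is simply a BenchmarkSwitcher call:

```csharp
using BenchmarkDotNet.Running;

// A sketch of an auto-generated entry point: it hands all benchmarks in the
// current assembly over to BenchmarkSwitcher, just like a hand-written Main.
public static class AutoGeneratedProgram
{
    public static void Main(string[] args)
    {
        BenchmarkSwitcher
            .FromAssembly(typeof(AutoGeneratedProgram).Assembly)
            .Run(args);
    }
}
```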

More reading about VSTest

If you are unfamiliar with VSTest and want to understand it better so that you can review this PR, the docs can be found in the docs folder of the microsoft/vstest repo. In particular, I would recommend reading the Test Platform Architecture and Adapter Extensibility pages. I also learnt a lot by reading the MSTest GitHub repo.

Other comments

  • As for the package name, I chose BenchmarkDotNet.TestAdapter because VSTest will by default load assemblies whose names end in .TestAdapter; the same approach is used almost universally by other testing libraries. If we want to change the name, it should be fine as long as it still ends in .TestAdapter.
  • The TestAdapter project is not too big, so I didn't add too many abstractions, but as a result the code is a bit messy in some places. There are definitely places that could be structured better, so if you have ideas I am happy to implement them to make this easier to maintain.
  • This PR was my first time creating AppDomains and passing things across AppDomain boundaries, so it's likely that some of my code in that area does not follow best practices or handle all the edge cases. I can confirm that the sample benchmarks in this repo all run fine on .NET Framework for me, but that was the extent of my testing.

@timcassell (Collaborator)

timcassell commented Sep 30, 2023

Very cool! I'm curious why, in your screenshots, the Visual Studio details are very minimal (not even including the summary table) compared to VS Code. [Edit] Oh, I guess it's because you selected an individual test run instead of the group. Makes sense.

As for creating a new NuGet package, that'd fall on @AndreyAkinshin.

@AndreyAkinshin (Member)

@caaavik-msft thanks for the contribution! The idea of using a VSTest adapter to run benchmarks looks interesting. However, I would need some time to properly review it and validate that the adapter doesn't produce a significant impact on the obtained measurements.

@adamsitnik (Member) left a comment

Big thanks for your contribution @caaavik-msft!

It works very smoothly! I added some comments; most of them are just questions.

Here are some extra questions unrelated to the code:

  1. Currently all tests and benchmarks are hard to distinguish. Is there any way we could categorize them? For example, by using Traits and always assigning all benchmarks to a "perf" trait so users could group them: tests vs benchmarks (sample).

  2. The "Test Detail Summary" does not contain the table that we all love. Why is that?

     Is it because the tooling does not support displaying tables? Does it support HTML (we have an exporter for that)?

     If it's impossible to support, could we swap the order and print the histogram first, so the results would be "somehow" visualized for those who have little space on the screen?

  3. Would it be possible to show the benchmarks only for the Release configuration, so that users don't try to run them in Debug just to get an error?

Thank you!

@timcassell (Collaborator)

  1. Currently all tests and benchmarks are hard to distinguish. Is there any way we could categorize them? For example, by using Traits and always assigning all benchmarks to a "perf" trait so users could group them: tests vs benchmarks (sample).

👍

  2. The "Test Detail Summary" does not contain the table that we all love. Why is that?

     Is it because the tooling does not support displaying tables? Does it support HTML (we have an exporter for that)?

The OP shows an image with the table. It looks to me like selecting an individual benchmark result displays only the histogram for that benchmark, but you can select the group, which displays the summary table of all the benchmarks in that group. That makes sense to me.

  3. Would it be possible to show the benchmarks only for the Release configuration, so that users don't try to run them in Debug just to get an error?

I think debugging could be useful; maybe there's a way to force a Debug config and display a warning about it? Or maybe we can add a [DebugBuild] attribute that does the same thing as DebugBuildConfig (and similar for DebugInProcess), and only show benchmarks with that attribute.

@caaavik-msft (Contributor, Author)

I've added some code to support VSTest traits. All benchmarks will have a BenchmarkDotNet trait, and all BenchmarkCategory attributes are surfaced as traits too.

Tim is correct: the reason we don't get the standard BDN output table is that each test case in the Test Explorer window corresponds to an individual set of arguments. The Test Explorer does not expose any way for us to provide information for a group of tests, so we cannot easily display a comparison for them. It is still possible to see all the tests compared by viewing the "Tests" output window.

Also, for now I have made the change so that when Debug mode is selected, the Test Explorer does not show any benchmarks. There might be a nicer way to allow benchmarks in debug mode: instead of preventing debug assemblies from being searched, another option would be to allow them when the InProcess toolchain is used, which the user might be able to set with an assembly-level IConfigSource that checks whether the current assembly is a debug build and loads the InProcess toolchain when that is the case.
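
To make that last idea concrete, a hypothetical assembly-level attribute along those lines could look like the following. This is a sketch, not code from this PR: the attribute name is made up, and it assumes assembly-level IConfigSource attributes are picked up as described above.

```csharp
using System;
using BenchmarkDotNet.Configs;
using BenchmarkDotNet.Jobs;
using BenchmarkDotNet.Toolchains.InProcess.Emit;

// Hypothetical attribute: applied at the assembly level, it switches the
// default job to the in-process toolchain for Debug builds so benchmarks can
// still be run (and debugged) through the adapter.
[AttributeUsage(AttributeTargets.Assembly)]
public sealed class DebugFriendlyConfigAttribute : Attribute, IConfigSource
{
    public IConfig Config { get; }

    public DebugFriendlyConfigAttribute()
    {
#if DEBUG
        // Debug build: prefer the in-process toolchain and relax the
        // optimizations validator so the benchmarks can still run.
        Config = ManualConfig.CreateEmpty()
            .AddJob(Job.Default.WithToolchain(InProcessEmitToolchain.Instance))
            .WithOptions(ConfigOptions.DisableOptimizationsValidator);
#else
        // Release build: add nothing, so the usual defaults apply.
        Config = ManualConfig.CreateEmpty();
#endif
    }
}

// Usage in the benchmark project:
// [assembly: DebugFriendlyConfig]
```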

@adamsitnik (Member) left a comment

Thank you for addressing my feedback. The last thing I need is a doc with a few screenshots and the most important info (build in Release, how to see the table, etc.). You could put it here: https://github.com/dotnet/BenchmarkDotNet/tree/master/docs/articles/features

Thank you for your amazing contribution @caaavik-msft!

@AndreyAkinshin (Member) left a comment

Thanks again for a great pull request! (And sorry for such a long review.)

In general, looks good to me. I left some minor comments. Once they are resolved, I'm ready to merge the PR.

Commit: Allow benchmarks to be run in Debug when in-process toolchain is used
@caaavik-msft (Contributor, Author)

I have made two more changes to the functionality:

  • The histogram now displays above the other statistics.
  • When the assembly is compiled in Debug, it will allow you to run benchmarks that use an InProcess toolchain. I've also tested that if I put a breakpoint inside a BDN method and run the test using Debug in Test Explorer, the breakpoint is hit, so this should make it easier to debug your benchmarks too.

@caaavik-msft (Contributor, Author)

@adamsitnik While writing up the feature doc, I realised there are two things that need to be done before people can really start using this feature and before it makes sense to document it:

  1. The BenchmarkDotNet.TestAdapter project needs to be published to NuGet
  2. This PR needs to be merged (or discussed if we even want it): Auto-generating the entry point #2445

At the very least, the PR from the second item makes it so that GenerateProgramFile gets set to false by any project that references it as a PackageReference, so people don't need to include that line in their project file. It would also be neat to document that GenerateBDNEntryPoint from this PR is available, and that it can be used in combination with an assembly-level IConfigSource to specify a default configuration for running through VSTest.

Another thing we could do, if we want to, is extend the PR above further so that referencing BenchmarkDotNet includes the TestAdapter project as a dependency, so that everyone gets the VSTest adapter without having to add any extra dependencies. I'm not sure exactly how that would look, but I think it can be achieved with a custom nuspec?

@adamsitnik (Member) left a comment

The BenchmarkDotNet.TestAdapter project needs to be published to NuGet

We will publish the docs to benchmarkdotnet.org after we ship new packages to nuget.org. So when writing the docs you can assume that this package exists.

This PR needs to be merged (or discussed if we even want it)

I've provided my feedback, sorry for the delay.

referencing BenchmarkDotNet includes the TestAdapter project as a dependency

Currently I am not convinced that it should be the default. For years we have been telling users to close all other apps (including VS) when running benchmarks, just to minimize the chance of other apps consuming resources during a benchmark run and affecting the stability of the results.
For now we can ship it as a standalone package with good docs and wait for user feedback.

FWIW, for me the most important aspect of the doc is what needs to be done to get it working and how to find the results.

@caaavik-msft (Contributor, Author)

I have included in this PR the functionality from #2445. From the discussion there, I realised that this was better to include as part of the Test Adapter itself.

Just to reply to some comments left on that PR here since they are applicable here too:

So far, we’ve supported only Console Apps. Will this change make it possible to support libraries?
If so, what command will be needed to run the benchmarks? dotnet test?
If so, what will be the behavior for libraries that target .NET standard rather than a specific runtime (example: netstandard2.0)?

Unfortunately, for the VSTest host to be launched, the Microsoft.NET.Test.Sdk package needs to be referenced, and referencing that package forces any project to be a Console Application even if it does not have OutputType set to exe. This is why it is necessary to always have an entry point, and so this functionality means that a BDN project can look like a library in the same way that unit test projects look like libraries, since they don't have a defined Main. When using the auto-generated entry point, an executable is still produced that just runs the benchmarks using the BenchmarkSwitcher.

Edge case: will it work with the "old" project files?

By old, do you mean before the SDK-style project structure? All this functionality does is inject a custom property through a NuGet package reference, so I think it should work, but I'm not sure how to create an older project file to test it on.

BTW I am not convinced that this is the right direction. Currently all BDN users are used to the fact that only Console Apps are supported (providing an entry point is a must-have). With this change, they won’t need to provide the entry point, but they will still have to do something to get it working (setting GenerateBDNEntryPoint to true). From my perspective it’s less intuitive (something new) and it does not solve the problem of passing the config.

I think that, now that I have moved this under the TestAdapter project, it should alleviate most of the concerns. This way it is not seen as the recommended way to go, but just an option if you intend to use the VSTest adapter and want an experience similar to that of other unit testing libraries.

I'll get the docs finished ASAP as well now.

@timcassell (Collaborator)

  • When the assembly is compiled in Debug, it will allow you to run benchmarks that use an InProcess toolchain. I've also tested that if I put a breakpoint inside a BDN method and run the test using Debug in Test Explorer, the breakpoint is hit, so this should make it easier to debug your benchmarks too.

(For future) it would be nice to also be able to debug build benchmarks (the default configuration). I wonder if there's a way to automatically attach a debugger to the benchmark process when it's launched if the host process has a debugger attached.

Commit: Update F# and VB entry points to use alternative approach to get current assembly
@caaavik-msft (Contributor, Author)

  • When the assembly is compiled in Debug, it will allow you to run benchmarks that use an InProcess toolchain. I've also tested that if I put a breakpoint inside a BDN method and run the test using Debug in Test Explorer, the breakpoint is hit, so this should make it easier to debug your benchmarks too.

(For future) it would be nice to also be able to debug build benchmarks (the default configuration). I wonder if there's a way to automatically attach a debugger to the benchmark process when it's launched if the host process has a debugger attached.

I know this doesn't solve the general problem, but I could maybe modify the auto-generated entry point so that if it's compiled with Debug, it runs the benchmarks with an InProcess toolchain by default. That way, most projects that use the VSTest adapter would get this functionality for free.

@timcassell (Collaborator)

I know this doesn't solve the general problem, but I could maybe modify the auto-generated entry point so that if it's compiled with Debug, it runs the benchmarks with an InProcess toolchain by default. That way, most projects that use the VSTest adapter would get this functionality for free.

Let's not do that. It will break any benchmarks that expect to be built. Let's just leave this as a future improvement.

@caaavik-msft (Contributor, Author)

One thing I forgot to mention is that there seem to be two ways to define what being in "debug" mode means. I originally did the check using AssemblyExtensions.IsDebug, which looks at IsJITTrackingEnabled on the Debuggable attribute, but I discovered that for F# projects this always returned true, even when compiled in Release mode, so they stopped working with the adapter. I have changed that code to check IsJITOptimizerDisabled instead to determine whether debug mode is on.
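
For reference, a minimal sketch of that kind of check (not the exact code in the PR, and the class and method names are made up) could look like this:

```csharp
using System.Diagnostics;
using System.Linq;
using System.Reflection;

internal static class DebugDetection
{
    // Treat an assembly as a "debug" build when JIT optimizations are disabled,
    // rather than when JIT tracking is enabled (per the comment above, F#
    // Release builds keep tracking enabled, which made the old check misfire).
    public static bool IsJitOptimizationDisabled(Assembly assembly)
    {
        var debuggable = assembly
            .GetCustomAttributes<DebuggableAttribute>()
            .FirstOrDefault();

        return debuggable?.IsJITOptimizerDisabled ?? false;
    }
}
```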

@caaavik-msft (Contributor, Author)

I've added the docs page now: https://github.com/dotnet/BenchmarkDotNet/blob/c432f887d274e9aa7f902ed75353a0d99615c804/docs/articles/features/vstest.md

I'm not sure how it will look on the website, but the markdown rendering looks good.

@timcassell (Collaborator)

timcassell commented Dec 21, 2023

@caaavik-msft It seems I misunderstood Adam's question earlier. I can see the summary table in the output window (with "show output from Tests" selected), but not in the group summary of the Test Explorer window. Can we also display it in the test group summary?

@caaavik-msft (Contributor, Author)

caaavik-msft commented Dec 22, 2023

I've tried a bunch of things but have not been able to find a way to display anything in the output section for a test group. It doesn't seem like there is any way to show information there.

@adamsitnik (Member) left a comment

Last comment from my side, I promise!

Thank you for providing great docs @caaavik-msft!

(Outdated review comment on docs/articles/features/vstest.md, resolved)
@adamsitnik (Member) left a comment

LGTM, thank you for your contribution @caaavik-msft!

@adamsitnik dismissed AndreyAkinshin’s stale review on December 26, 2023 at 17:03: "The feedback has been addressed"

@adamsitnik merged commit 91f3f7e into dotnet:master on Dec 26, 2023 (7 checks passed)

@adamsitnik added this to the v0.13.12 milestone on Dec 26, 2023
@CameronAavik

CameronAavik commented Jan 5, 2024

Thanks @AndreyAkinshin for getting this published and fixing the issues with the packing!

@AndreyAkinshin (Member)

@CameronAavik thanks again for the awesome PR!
