Fatal error. System.AccessViolationException #2503
https://andrewlock.net/tracking-down-a-hanging-xunit-test-in-ci-building-a-custom-test-framework/

If there were an actual general problem with xUnit itself, one of the millions of people affected would have done the legwork to work out the exact issue. Perhaps at the end of narrowing things down, you might be able to formulate a feature idea that would help you, e.g. an ordered list of which tests ran and/or how they overlapped. But even then, whether that's directly an xUnit feature will be debatable. If you have a heap of ideas and circumstantial evidence, logging it on stackoverflow.com might not be the worst idea. Even there, people will insist on you bringing more than a stack trace and a one-liner fixit/helpme plea; you need to supply more information about the sorts of things you are doing in your context.
@bartelink Thanks for the links. The point was asking for help and direction in localizing the problem, trying to provide what I'm able. I've been trying various paths to localize and characterize the error for the past week of 10-hour days, so it seemed time for a consult. I sent the stack trace and a detailed dump file, as you can see above. As noted, I can't deliver our proprietary code and have not found a smaller repro to hand off. I'm using a test framework already, so I'll work on generating messages and trying to eliminate or confirm the framework as a possible cause. The timing of the intermittent failure doesn't suggest a single test causing the failure, but perhaps eliminating the single-test cause will be possible.
The problem is that GH discussions and issues are both low traffic and rely on two things:
SO is better in terms of how many people will see them; only people who watch the repo will see these issues. While many people around the planet could, given time and a dump, infer things, the chances are they won't, as your work so far has yielded the universe the following contextual info:
TL;DR I'd hate you to think either
All there is here now is:
In general I'd be trying to narrow down my haystack by doing things like:
The stuff I linked to can give circumstantial evidence about test order etc., but my key tip would be: the test framework is definitely the last suspect, even if everything you can rule out is a win in the process of narrowing things down. I've sat in your seat debugging stuff like this and it's definitely no fun, as there's no real roadmap... So all I can do is wish you luck in your debugging.
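As a concrete sketch of that narrowing-down approach (project paths and filter strings below are placeholders, not from this thread): running each test assembly in isolation, then bisecting within an assembly using `dotnet test --filter`, can tell you whether the crash needs a particular assembly or a particular subset of tests to reproduce:

```shell
# Run each test assembly on its own to see which one(s) can crash in isolation
dotnet test tests/Project.A.Tests/Project.A.Tests.csproj
dotnet test tests/Project.B.Tests/Project.B.Tests.csproj

# Bisect within an assembly by namespace or class name
dotnet test tests/Project.A.Tests/Project.A.Tests.csproj \
  --filter "FullyQualifiedName~Project.A.Tests.Database"
```

If the crash only reproduces with the full parallel run, that points toward contention between tests rather than any single test.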
I was getting this error message recently. I guessed in my case that it was due to parallelised tests contending over shared data structures in linked native libraries (in this case one or more of the underlying format drivers that go along with the GDAL.Native NuGet package). The reason for the hunch was that I know some of the underlying native libraries are not properly threadsafe (the C/C++ codebases for them are 20+ years old in some cases). The underlying code was also opening file handles on the same source test datasets across multiple test collections with slightly tweaked fixtures, and so on.

Anyway, I couldn't figure out a way to get the tests to run safely in parallel, but overriding the default global collection behaviour did the trick:

```csharp
using Xunit;

[assembly: CollectionBehavior(DisableTestParallelization = true)]
```

I hope this comment will be helpful for someone else who drops past this thread experiencing one of these glitches.
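For anyone who would rather not touch the code, xUnit also reads the same setting from a `xunit.runner.json` configuration file placed next to the test assembly (a sketch; the file must be copied to the build output, e.g. via `CopyToOutputDirectory` in the test project file):

```json
{
  "$schema": "https://xunit.net/schema/current/xunit.runner.schema.json",
  "parallelizeTestCollections": false
}
```

This is equivalent to the assembly-level attribute above and is handy when the same setting needs to be toggled per environment.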
@attentive There are two issues here. First, xUnit.net should not be hard crashing here; that's something that should be fixable. Second, it appears that some library is corrupting memory. Catching this exception so that it doesn't hard crash isn't guaranteed to fix everything, since corrupted-memory issues may then spring up all over, depending on the source and extent of the corruption. So it may not necessarily be super helpful for us (in this case) to just return the original error with no stack trace attached, since it's the parsing of the stack trace that appears to be causing the problem. My gut feeling is that a corrupted stack trace is very bad.
Attempted fix available in v2:

(Note: there's a copy/paste error in the v2 tree that I didn't catch until after I pushed: 57af1d9)
Very responsive, @bradwilson 👏 … I have made a note to see how this goes soon (might be after Easter). I didn't really consider this an xUnit bug, though it was a bit confusing at first; but then again, I knew I was doing some pretty hectic stuff with some pretty old underlying libraries.
Discussed in #2501
Originally posted by paultobey March 29, 2022
xUnit 2.4.1
I'm getting intermittent crashes of a large test set spanning multiple test assemblies (1800+ tests, 4 test assemblies, 150 tests creating/using/dropping SQL Server databases):
The point in the process at which the test run crashes varies, and there does not appear to be a correlation with any individual tests in progress at the time. Are there any command-line switches that would deliver better information? Any other suggestions? I'll try to set up memory dumps for when it fails...
Thanks,
Paul T.
Update 1:
Here's a sample dump file when the error occurs:
dotnet.exe.34148.dmp.zip
Creating an issue from the unanswered discussion. Sorry I can't provide a simple repro. This happens about 1/4 of the time when running `dotnet test`, as noted above.
Thanks,
Paul T.
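On the question above about command-line switches that deliver better information: recent `dotnet test` versions support VSTest's blame mode, which records the order in which tests ran and can collect a process dump when the test host crashes (a sketch; flag availability and dump support depend on the SDK version and platform):

```shell
# Writes a sequence file under TestResults/ listing tests in execution order,
# and collects a full dump of the test host process if it crashes
dotnet test --blame --blame-crash --blame-crash-dump-type full
```

The sequence file shows which tests were in flight when the process died, which is one way to correlate an intermittent `AccessViolationException` with the tests running at that moment.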