
Roslyn source generator crash on mono/linux/arm64 #81123

Closed
radical opened this issue Jan 24, 2023 · 30 comments
Labels: area-VM-meta-mono · blocking-clean-ci (Blocking PR or rolling runs of 'runtime' or 'runtime-extra-platforms') · Known Build Error (Use this to report build issues in the .NET Helix tab) · source-generator (Indicates an issue with a source generator feature)


radical commented Jan 24, 2023

System.Text.RegularExpressions.Tests crashed natively in the Build Libraries Test Run release mono linux arm64 Debug lane, hit on the unrelated PR #81066.

Build and log:

  Starting:    System.Text.RegularExpressions.Tests (parallel test collections = on, max threads = 2)

=================================================================
	Native Crash Reporting
=================================================================
Got a SIGSEGV while executing native code. This usually indicates
a fatal error in the mono runtime or one of the native libraries 
used by your application.
=================================================================

=================================================================
	Native stacktrace:
=================================================================
	0xff736e1a68e8 - Unknown

=================================================================
	External Debugger Dump:
=================================================================
[New LWP 23]
[New LWP 24]
[New LWP 25]
[New LWP 26]
[New LWP 29]
[New LWP 30]
[New LWP 31]
[New LWP 32]
[New LWP 33]
[New LWP 34]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/aarch64-linux-gnu/libthread_db.so.1".
futex_wait_cancelable (private=0, expected=0, futex_word=0xab99e8177a2c) at ../sysdeps/nptl/futex-internal.h:186
186	../sysdeps/nptl/futex-internal.h: No such file or directory.
  Id   Target Id                                        Frame 
* 1    Thread 0xff736e844010 (LWP 22) "dotnet"          futex_wait_cancelable (private=0, expected=0, futex_word=0xab99e8177a2c) at ../sysdeps/nptl/futex-internal.h:186
  2    Thread 0xff736d3ff1b0 (LWP 23) "SGen worker"     futex_wait_cancelable (private=0, expected=0, futex_word=0xff736e265930 <work_cond+40>) at ../sysdeps/nptl/futex-internal.h:186
  3    Thread 0xff736b81a1b0 (LWP 24) ".NET EventPipe"  0x0000ff736e488ef4 in __GI___poll (fds=0xff7364003b20, nfds=1, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
  4    Thread 0xff736b6191b0 (LWP 25) "Finalizer"       futex_abstimed_wait_cancelable (private=0, abstime=0x0, clockid=0, expected=0, futex_word=0xff736e2569f8 <finalizer_sem>) at ../sysdeps/nptl/futex-internal.h:323
  5    Thread 0xff73629d11b0 (LWP 26) ".NET SigHandler" __libc_read (nbytes=1, buf=0xff73629d0977, fd=<optimized out>) at ../sysdeps/unix/sysv/linux/read.c:26
  6    Thread 0xff73625fb1b0 (LWP 29) ".NET Long Runni" futex_wait_cancelable (private=0, expected=0, futex_word=0xff73500683d8) at ../sysdeps/nptl/futex-internal.h:186
  7    Thread 0xff73623fa1b0 (LWP 30) ".NET TP Worker"  futex_abstimed_wait_cancelable (private=0, abstime=0xff73623f90e8, clockid=<optimized out>, expected=0, futex_word=0xff73540236d8) at ../sysdeps/nptl/futex-internal.h:323
  8    Thread 0xff73620b71b0 (LWP 31) ".NET TP Gate"    futex_abstimed_wait_cancelable (private=0, abstime=0xff73620b62a8, clockid=<optimized out>, expected=0, futex_word=0xff7348022dc8) at ../sysdeps/nptl/futex-internal.h:323
  9    Thread 0xff73620561b0 (LWP 32) ".NET TP Worker"  futex_abstimed_wait_cancelable (private=0, abstime=0xff7362055320, clockid=<optimized out>, expected=0, futex_word=0xff7362055398) at ../sysdeps/nptl/futex-internal.h:323
  10   Thread 0xff7361e251b0 (LWP 33) ".NET Long Runni" mono_arch_flush_icache (code=0xff7360f1da70 "\375{\272\251\375\003", size=396) at /__w/1/s/src/mono/mono/mini/mini-arm64.c:2030
  11   Thread 0xff7361c241b0 (LWP 34) ".NET Long Runni" 0x0000ff736e46477c in __GI___wait4 (pid=<optimized out>, stat_loc=0xff7361c1d0e0, options=0, usage=0x0) at ../sysdeps/unix/sysv/linux/wait4.c:27

Thread 11 (Thread 0xff7361c241b0 (LWP 34) ".NET Long Runni"):
#0  0x0000ff736e46477c in __GI___wait4 (pid=<optimized out>, stat_loc=0xff7361c1d0e0, options=0, usage=0x0) at ../sysdeps/unix/sysv/linux/wait4.c:27
#1  0x0000ff736e1a69e8 in dump_native_stacktrace (signal=<optimized out>, mctx=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:843
#2  mono_dump_native_crash_info (signal=<optimized out>, mctx=0xff7361c1db40, info=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:870
#3  0x0000ff736e1655c0 in mono_handle_native_crash (signal=0xff736df63191 "SIGSEGV", mctx=0xff7361c1db40, info=0xff7361c1dea0) at /__w/1/s/src/mono/mono/mini/mini-exceptions.c:3005
#4  0x0000ff736e0ceb90 in mono_sigsegv_signal_handler_debug (_dummy=11, _info=0xff7361c1dea0, context=0xff7361c1df20, debug_fault_addr=0x0) at /__w/1/s/src/mono/mono/mini/mini-runtime.c:3749
#5  <signal handler called>
#6  0x0000000000000000 in ?? ()
#7  0x0000ff7360f74e68 in ?? ()
#8  0x0000ff736d77ad68 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 10 (Thread 0xff7361e251b0 (LWP 33) ".NET Long Runni"):
#0  mono_arch_flush_icache (code=0xff7360f1da70 "\375{\272\251\375\003", size=396) at /__w/1/s/src/mono/mono/mini/mini-arm64.c:2030
#1  0x0000ff736e0c5790 in mono_codegen (cfg=0xff73408c0e10) at /__w/1/s/src/mono/mono/mini/mini.c:2212
#2  0x0000ff736e0c7df4 in mini_method_compile (method=<optimized out>, opts=374417919, flags=JIT_FLAG_RUN_CCTORS, parts=0, aot_method_index=-1) at /__w/1/s/src/mono/mono/mini/mini.c:3935
#3  0x0000ff736e0c912c in mono_jit_compile_method_inner (method=0xff734096fcc0, opt=396, error=0xff7361e20070) at /__w/1/s/src/mono/mono/mini/mini.c:4129
#4  0x0000ff736e0cdcb0 in mono_jit_compile_method_with_opt (method=0xff734096fcc0, opt=374417919, jit_only=0, error=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-runtime.c:2709
#5  jit_compile_method_with_opt_cb (arg=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-runtime.c:2764
#6  jit_compile_method_with_opt (params=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-runtime.c:2780
#7  0x0000ff736e0cd0dc in mono_jit_compile_method (method=<optimized out>, error=0xff7361e20070) at /__w/1/s/src/mono/mono/mini/mini-runtime.c:2799
#8  0x0000ff736e167a2c in common_call_trampoline (regs=0xff7361e20120, code=0xff7360f1cd04 "\001\034", m=0xff734096fcc0, vt=0xff73409531d0, vtable_slot=<optimized out>, error=0xff7361e20070) at /__w/1/s/src/mono/mono/mini/mini-trampolines.c:618
#9  0x0000ff736e16879c in mono_vcall_trampoline (regs=0xff7361e20120, code=0xff7360f1cd04 "\001\034", slot=-15, tramp=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-trampolines.c:840
#10 0x0000ff736dbf5504 in ?? ()
#11 0x0000ff73409531d0 in ?? ()
#12 0x0000000000000003 in ?? ()
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Thread 9 (Thread 0xff73620561b0 (LWP 32) ".NET TP Worker"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0xff7362055320, clockid=<optimized out>, expected=0, futex_word=0xff7362055398) at ../sysdeps/nptl/futex-internal.h:323
#1  __pthread_cond_wait_common (abstime=0xff7362055320, clockid=<optimized out>, mutex=0xff73500a8520, cond=0xff7362055370) at pthread_cond_wait.c:520
#2  __pthread_cond_timedwait (cond=0xff7362055370, mutex=0xff73500a8520, abstime=0xff7362055320) at pthread_cond_wait.c:656
#3  0x0000ff736e061d4c in mono_os_cond_timedwait (cond=0xff7362055370, mutex=0xff73500a8520, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/mono-os-mutex.c:75
#4  0x0000ff736e066e6c in mono_coop_cond_timedwait (cond=0xff7362055370, mutex=<optimized out>, timeout_ms=20000) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-mutex.h:103
#5  mono_lifo_semaphore_timed_wait (semaphore=0xff73500a8520, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/lifo-semaphore.c:48
#6  0x0000ff7361e4f340 in ?? ()
#7  0x0000ff734c002180 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 8 (Thread 0xff73620b71b0 (LWP 31) ".NET TP Gate"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0xff73620b62a8, clockid=<optimized out>, expected=0, futex_word=0xff7348022dc8) at ../sysdeps/nptl/futex-internal.h:323
#1  __pthread_cond_wait_common (abstime=0xff73620b62a8, clockid=<optimized out>, mutex=0xff7348022d70, cond=0xff7348022da0) at pthread_cond_wait.c:520
#2  __pthread_cond_timedwait (cond=0xff7348022da0, mutex=0xff7348022d70, abstime=0xff73620b62a8) at pthread_cond_wait.c:656
#3  0x0000ff736b25ac4c in SystemNative_LowLevelMonitor_TimedWait (monitor=0xff7348022d70, timeoutMilliseconds=500) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:195
#4  0x0000ff736205d0b4 in ?? ()
#5  0x0000ff7362607f10 in ?? ()
#6  0x0000ff73620b68e0 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 7 (Thread 0xff73623fa1b0 (LWP 30) ".NET TP Worker"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0xff73623f90e8, clockid=<optimized out>, expected=0, futex_word=0xff73540236d8) at ../sysdeps/nptl/futex-internal.h:323
#1  __pthread_cond_wait_common (abstime=0xff73623f90e8, clockid=<optimized out>, mutex=0xff7354023680, cond=0xff73540236b0) at pthread_cond_wait.c:520
#2  __pthread_cond_timedwait (cond=0xff73540236b0, mutex=0xff7354023680, abstime=0xff73623f90e8) at pthread_cond_wait.c:656
#3  0x0000ff736b25ac4c in SystemNative_LowLevelMonitor_TimedWait (monitor=0xff7354023680, timeoutMilliseconds=12000) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:195
#4  0x0000ff736205d0b4 in ?? ()
#5  0x0000ff7362607f10 in ?? ()
#6  0x0000ff73623f98e0 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 6 (Thread 0xff73625fb1b0 (LWP 29) ".NET Long Runni"):
#0  futex_wait_cancelable (private=0, expected=0, futex_word=0xff73500683d8) at ../sysdeps/nptl/futex-internal.h:186
#1  __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0xff7350068380, cond=0xff73500683b0) at pthread_cond_wait.c:508
#2  __pthread_cond_wait (cond=0xff73500683b0, mutex=0xff7350068380) at pthread_cond_wait.c:638
#3  0x0000ff736b25aae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xff7350068380) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ff73626084d4 in ?? ()
#5  0x0000ff736d610bd8 in ?? ()
#6  0x0000ff736d6111c8 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 5 (Thread 0xff73629d11b0 (LWP 26) ".NET SigHandler"):
#0  __libc_read (nbytes=1, buf=0xff73629d0977, fd=<optimized out>) at ../sysdeps/unix/sysv/linux/read.c:26
#1  __libc_read (fd=<optimized out>, buf=0xff73629d0977, nbytes=1) at ../sysdeps/unix/sysv/linux/read.c:24
#2  0x0000ff736b25a19c in SignalHandlerLoop (arg=0xab99e83d8fd0) at /__w/1/s/src/native/libs/System.Native/pal_signal.c:331
#3  0x0000ff736e7f7648 in start_thread (arg=0xff73629d0ab0) at pthread_create.c:477
#4  0x0000ff736e492c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 4 (Thread 0xff736b6191b0 (LWP 25) "Finalizer"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0x0, clockid=0, expected=0, futex_word=0xff736e2569f8 <finalizer_sem>) at ../sysdeps/nptl/futex-internal.h:323
#1  do_futex_wait (sem=sem@entry=0xff736e2569f8 <finalizer_sem>, abstime=0x0, clockid=0) at sem_waitcommon.c:112
#2  0x0000ff736e80133c in __new_sem_wait_slow (sem=0xff736e2569f8 <finalizer_sem>, abstime=0x0, clockid=0) at sem_waitcommon.c:184
#3  0x0000ff736e03c874 in mono_os_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../utils/mono-os-semaphore.h:204
#4  mono_coop_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-semaphore.h:41
#5  finalizer_thread (unused=<optimized out>) at /__w/1/s/src/mono/mono/metadata/gc.c:891
#6  0x0000ff736e016ce4 in start_wrapper_internal (start_info=0x0, stack_ptr=<optimized out>) at /__w/1/s/src/mono/mono/metadata/threads.c:1202
#7  0x0000ff736e016b90 in start_wrapper (data=0xab99e7dcc130) at /__w/1/s/src/mono/mono/metadata/threads.c:1264
#8  0x0000ff736e7f7648 in start_thread (arg=0xff736b618ab0) at pthread_create.c:477
#9  0x0000ff736e492c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 3 (Thread 0xff736b81a1b0 (LWP 24) ".NET EventPipe"):
#0  0x0000ff736e488ef4 in __GI___poll (fds=0xff7364003b20, nfds=1, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
#1  0x0000ff736e233c54 in ipc_poll_fds (fds=<optimized out>, nfds=1, timeout=4294967295) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:470
#2  ds_ipc_poll (poll_handles_data=0xff7364003310, poll_handles_data_len=1, timeout_ms=4294967295, callback=0xff736e233004 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:1096
#3  0x0000ff736e231200 in ds_ipc_stream_factory_get_next_available_stream (callback=0xff736e233004 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc.c:395
#4  0x0000ff736e22fa78 in server_thread (data=<optimized out>) at /__w/1/s/src/native/eventpipe/ds-server.c:129
#5  0x0000ff736e232fe4 in ep_rt_thread_mono_start_func (data=0xab99e7da4bb0) at /__w/1/s/src/mono/mono/mini/../eventpipe/ep-rt-mono.h:1332
#6  0x0000ff736e7f7648 in start_thread (arg=0xff736b819ab0) at pthread_create.c:477
#7  0x0000ff736e492c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 2 (Thread 0xff736d3ff1b0 (LWP 23) "SGen worker"):
#0  futex_wait_cancelable (private=0, expected=0, futex_word=0xff736e265930 <work_cond+40>) at ../sysdeps/nptl/futex-internal.h:186
#1  __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0xff736e2658d8 <lock>, cond=0xff736e265908 <work_cond>) at pthread_cond_wait.c:508
#2  __pthread_cond_wait (cond=0xff736e265908 <work_cond>, mutex=0xff736e2658d8 <lock>) at pthread_cond_wait.c:638
#3  0x0000ff736e0b6b70 in mono_os_cond_wait (cond=0xff736e265930 <work_cond+40>, mutex=<optimized out>) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-os-mutex.h:219
#4  get_work (worker_index=<optimized out>, work_context=<optimized out>, do_idle=<optimized out>, job=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:167
#5  thread_func (data=0x0) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:198
#6  0x0000ff736e7f7648 in start_thread (arg=0xff736d3feab0) at pthread_create.c:477
#7  0x0000ff736e492c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 1 (Thread 0xff736e844010 (LWP 22) "dotnet"):
#0  futex_wait_cancelable (private=0, expected=0, futex_word=0xab99e8177a2c) at ../sysdeps/nptl/futex-internal.h:186
#1  __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0xab99e81779d0, cond=0xab99e8177a00) at pthread_cond_wait.c:508
#2  __pthread_cond_wait (cond=0xab99e8177a00, mutex=0xab99e81779d0) at pthread_cond_wait.c:638
#3  0x0000ff736b25aae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xab99e81779d0) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ff73626084d4 in ?? ()
#5  0x0000ff736d60c620 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)
[Inferior 1 (process 22) detached]

=================================================================
	Basic Fault Address Reporting
=================================================================
instruction pointer is NULL, skip dumping
=================================================================
	Managed Stacktrace:
=================================================================
=================================================================
./RunTests.sh: line 168:    22 Aborted                 (core dumped) "$RUNTIME_PATH/dotnet" exec --runtimeconfig System.Text.RegularExpressions.Tests.runtimeconfig.json --depsfile System.Text.RegularExpressions.Tests.deps.json xunit.console.dll System.Text.RegularExpressions.Tests.dll -xml testResults.xml -nologo -nocolor -notrait category=IgnoreForCI -notrait category=OuterLoop -notrait category=failing $RSP_FILE
{
  "ErrorMessage": "futex-internal.h: No such file or directory.",
  "BuildRetry": false
}

Report

Build Definition Test Pull Request
360753 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.WorkItemExecution
360692 dotnet/runtime System.Linq.Queryable.Tests.WorkItemExecution #87773
359812 dotnet/runtime System.Linq.Queryable.Tests.WorkItemExecution #89809
359797 dotnet/runtime LibraryImportGenerator.Unit.Tests.WorkItemExecution #89810
359758 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution #89815
359097 dotnet/runtime System.Linq.Queryable.Tests.WorkItemExecution #89260
357600 dotnet/runtime System.Linq.Queryable.Tests.WorkItemExecution #89728
356702 dotnet/runtime System.Collections.Immutable.Tests.WorkItemExecution #89689
356078 dotnet/runtime System.Text.Json.Tests.WorkItemExecution #89662
355031 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.WorkItemExecution #89620
353986 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.WorkItemExecution #89568
353580 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.WorkItemExecution #89412
353281 dotnet/runtime HardwareIntrinsics_General_r.WorkItemExecution #89223
352565 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.WorkItemExecution #88705
350985 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.WorkItemExecution #89388
350399 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution
349938 dotnet/runtime System.Linq.Parallel.Tests.WorkItemExecution #89387
348668 dotnet/runtime ComInterfaceGenerator.Unit.Tests.WorkItemExecution #86391
346619 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89036
346698 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89258
346702 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89225
346689 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89267
346679 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89204
346610 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #88912
346586 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89260
346558 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89255
346553 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89255
346441 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89220
345466 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89151
346315 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #86089
346241 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89235
346115 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #86089
346082 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89184
346037 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #86089
345082 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89184
345964 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #87865
345957 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #88970
345864 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89203
345438 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89205
345767 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89204
345745 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #88929
345719 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89108
345670 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89217
345664 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89185
345660 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89216
345630 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #88838
345621 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89214
345573 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89211
345567 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #86815
345563 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution
345402 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89197
345382 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #87773
345350 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #86089
345298 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89147
345198 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89150
345194 dotnet/runtime System.Net.Quic.Functional.Tests.WorkItemExecution #88970
345180 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89192
345107 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89188
345077 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89187
345017 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #88743
344968 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89185
344881 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #88167
344880 dotnet/runtime System.Net.Ping.Functional.Tests.WorkItemExecution #88167
344829 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #86089
344826 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89176
344762 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #87108
344737 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #87865
344719 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #88986
344713 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #89172
344704 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #87108
344683 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution
342971 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.WorkItemExecution
342225 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.WorkItemExecution #88723
342099 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.WorkItemExecution
341426 dotnet/runtime System.Net.WebClient.Tests.WorkItemExecution #88970
341231 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #88892
341146 dotnet/runtime System.Private.Xml.Tests.WorkItemExecution #88892
340997 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution #88892
340956 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution #88892
340893 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution #88892
340282 dotnet/runtime System.Runtime.InteropServices.Tests.WorkItemExecution #88892
340231 dotnet/runtime LibraryImportGenerator.Unit.Tests.WorkItemExecution #88929
340185 dotnet/runtime LibraryImportGenerator.Unit.Tests.WorkItemExecution #88930
339986 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution #88892
339453 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution #88892
339408 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution
338964 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.WorkItemExecution #88766
338734 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.WorkItemExecution #88854
338584 dotnet/runtime LibraryImportGenerator.Unit.Tests.WorkItemExecution #86815
337982 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.WorkItemExecution #88809
337963 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.WorkItemExecution #88787
337926 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution
337553 dotnet/runtime LibraryImportGenerator.Unit.Tests.WorkItemExecution #88786
337356 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.WorkItemExecution #88768
337256 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.WorkItemExecution #88758
336840 dotnet/runtime ComInterfaceGenerator.Unit.Tests.WorkItemExecution #85328
335895 dotnet/runtime System.Net.Mail.Unit.Tests.WorkItemExecution #88626
335569 dotnet/runtime Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.WorkItemExecution #88682
335489 dotnet/runtime System.Text.RegularExpressions.Tests.WorkItemExecution #87319
334916 dotnet/runtime PayloadGroup0.WorkItemExecution #88268
Displaying 100 of 110 results

Summary

24-Hour Hit Count: 2 · 7-Day Hit Count: 11 · 1-Month Count: 110

Known issue validation

Build: 🔎
Result validation: ⚠️ Validation could not be done without an Azure DevOps build URL on the issue. Please add it to the "Build: 🔎" line.
Validation performed at: 6/28/2023 10:04:05 PM UTC

@dotnet-issue-labeler

I couldn't figure out the best area label to add to this issue. If you have write-permissions please help me learn by adding exactly one area label.

@ghost ghost added the untriaged New issue has not been triaged by the area owner label Jan 24, 2023
@radical radical added blocking-clean-ci Blocking PR or rolling runs of 'runtime' or 'runtime-extra-platforms' area-VM-meta-mono area-System.Text.RegularExpressions labels Jan 24, 2023
@ghost

ghost commented Jan 24, 2023

Tagging subscribers to this area: @dotnet/area-system-text-regularexpressions
See info in area-owners.md if you want to be subscribed.

#0  futex_abstimed_wait_cancelable (private=0, abstime=0xff73623f90e8, clockid=<optimized out>, expected=0, futex_word=0xff73540236d8) at ../sysdeps/nptl/futex-internal.h:323
#1  __pthread_cond_wait_common (abstime=0xff73623f90e8, clockid=<optimized out>, mutex=0xff7354023680, cond=0xff73540236b0) at pthread_cond_wait.c:520
#2  __pthread_cond_timedwait (cond=0xff73540236b0, mutex=0xff7354023680, abstime=0xff73623f90e8) at pthread_cond_wait.c:656
#3  0x0000ff736b25ac4c in SystemNative_LowLevelMonitor_TimedWait (monitor=0xff7354023680, timeoutMilliseconds=12000) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:195
#4  0x0000ff736205d0b4 in ?? ()
#5  0x0000ff7362607f10 in ?? ()
#6  0x0000ff73623f98e0 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 6 (Thread 0xff73625fb1b0 (LWP 29) ".NET Long Runni"):
#0  futex_wait_cancelable (private=0, expected=0, futex_word=0xff73500683d8) at ../sysdeps/nptl/futex-internal.h:186
#1  __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0xff7350068380, cond=0xff73500683b0) at pthread_cond_wait.c:508
#2  __pthread_cond_wait (cond=0xff73500683b0, mutex=0xff7350068380) at pthread_cond_wait.c:638
#3  0x0000ff736b25aae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xff7350068380) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ff73626084d4 in ?? ()
#5  0x0000ff736d610bd8 in ?? ()
#6  0x0000ff736d6111c8 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 5 (Thread 0xff73629d11b0 (LWP 26) ".NET SigHandler"):
#0  __libc_read (nbytes=1, buf=0xff73629d0977, fd=<optimized out>) at ../sysdeps/unix/sysv/linux/read.c:26
#1  __libc_read (fd=<optimized out>, buf=0xff73629d0977, nbytes=1) at ../sysdeps/unix/sysv/linux/read.c:24
#2  0x0000ff736b25a19c in SignalHandlerLoop (arg=0xab99e83d8fd0) at /__w/1/s/src/native/libs/System.Native/pal_signal.c:331
#3  0x0000ff736e7f7648 in start_thread (arg=0xff73629d0ab0) at pthread_create.c:477
#4  0x0000ff736e492c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 4 (Thread 0xff736b6191b0 (LWP 25) "Finalizer"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0x0, clockid=0, expected=0, futex_word=0xff736e2569f8 <finalizer_sem>) at ../sysdeps/nptl/futex-internal.h:323
#1  do_futex_wait (sem=sem@entry=0xff736e2569f8 <finalizer_sem>, abstime=0x0, clockid=0) at sem_waitcommon.c:112
#2  0x0000ff736e80133c in __new_sem_wait_slow (sem=0xff736e2569f8 <finalizer_sem>, abstime=0x0, clockid=0) at sem_waitcommon.c:184
#3  0x0000ff736e03c874 in mono_os_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../utils/mono-os-semaphore.h:204
#4  mono_coop_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-semaphore.h:41
#5  finalizer_thread (unused=<optimized out>) at /__w/1/s/src/mono/mono/metadata/gc.c:891
#6  0x0000ff736e016ce4 in start_wrapper_internal (start_info=0x0, stack_ptr=<optimized out>) at /__w/1/s/src/mono/mono/metadata/threads.c:1202
#7  0x0000ff736e016b90 in start_wrapper (data=0xab99e7dcc130) at /__w/1/s/src/mono/mono/metadata/threads.c:1264
#8  0x0000ff736e7f7648 in start_thread (arg=0xff736b618ab0) at pthread_create.c:477
#9  0x0000ff736e492c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 3 (Thread 0xff736b81a1b0 (LWP 24) ".NET EventPipe"):
#0  0x0000ff736e488ef4 in __GI___poll (fds=0xff7364003b20, nfds=1, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
#1  0x0000ff736e233c54 in ipc_poll_fds (fds=<optimized out>, nfds=1, timeout=4294967295) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:470
#2  ds_ipc_poll (poll_handles_data=0xff7364003310, poll_handles_data_len=1, timeout_ms=4294967295, callback=0xff736e233004 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:1096
#3  0x0000ff736e231200 in ds_ipc_stream_factory_get_next_available_stream (callback=0xff736e233004 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc.c:395
#4  0x0000ff736e22fa78 in server_thread (data=<optimized out>) at /__w/1/s/src/native/eventpipe/ds-server.c:129
#5  0x0000ff736e232fe4 in ep_rt_thread_mono_start_func (data=0xab99e7da4bb0) at /__w/1/s/src/mono/mono/mini/../eventpipe/ep-rt-mono.h:1332
#6  0x0000ff736e7f7648 in start_thread (arg=0xff736b819ab0) at pthread_create.c:477
#7  0x0000ff736e492c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 2 (Thread 0xff736d3ff1b0 (LWP 23) "SGen worker"):
#0  futex_wait_cancelable (private=0, expected=0, futex_word=0xff736e265930 <work_cond+40>) at ../sysdeps/nptl/futex-internal.h:186
#1  __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0xff736e2658d8 <lock>, cond=0xff736e265908 <work_cond>) at pthread_cond_wait.c:508
#2  __pthread_cond_wait (cond=0xff736e265908 <work_cond>, mutex=0xff736e2658d8 <lock>) at pthread_cond_wait.c:638
#3  0x0000ff736e0b6b70 in mono_os_cond_wait (cond=0xff736e265930 <work_cond+40>, mutex=<optimized out>) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-os-mutex.h:219
#4  get_work (worker_index=<optimized out>, work_context=<optimized out>, do_idle=<optimized out>, job=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:167
#5  thread_func (data=0x0) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:198
#6  0x0000ff736e7f7648 in start_thread (arg=0xff736d3feab0) at pthread_create.c:477
#7  0x0000ff736e492c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 1 (Thread 0xff736e844010 (LWP 22) "dotnet"):
#0  futex_wait_cancelable (private=0, expected=0, futex_word=0xab99e8177a2c) at ../sysdeps/nptl/futex-internal.h:186
#1  __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0xab99e81779d0, cond=0xab99e8177a00) at pthread_cond_wait.c:508
#2  __pthread_cond_wait (cond=0xab99e8177a00, mutex=0xab99e81779d0) at pthread_cond_wait.c:638
#3  0x0000ff736b25aae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xab99e81779d0) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ff73626084d4 in ?? ()
#5  0x0000ff736d60c620 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)
[Inferior 1 (process 22) detached]

=================================================================
	Basic Fault Address Reporting
=================================================================
instruction pointer is NULL, skip dumping
=================================================================
	Managed Stacktrace:
=================================================================
=================================================================
./RunTests.sh: line 168:    22 Aborted                 (core dumped) "$RUNTIME_PATH/dotnet" exec --runtimeconfig System.Text.RegularExpressions.Tests.runtimeconfig.json --depsfile System.Text.RegularExpressions.Tests.deps.json xunit.console.dll System.Text.RegularExpressions.Tests.dll -xml testResults.xml -nologo -nocolor -notrait category=IgnoreForCI -notrait category=OuterLoop -notrait category=failing $RSP_FILE
Author: radical
Assignees: -
Labels: area-System.Text.RegularExpressions, blocking-clean-ci, untriaged, area-VM-meta-mono
Milestone: -

@carlossanlop

I also hit this in an unrelated PR: #81070

The shared failure message seems to be:

../sysdeps/unix/sysv/linux/futex-internal.h: No such file or directory.

It's affecting at least 3 Roslyn generator tests:

Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests

Callstack
----- start Tue Jan 24 20:15:55 UTC 2023 =============== To repro directly: =====================================================
pushd .
/root/helix/work/correlation/dotnet exec --runtimeconfig Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.runtimeconfig.json --depsfile Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.deps.json xunit.console.dll Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.dll -xml testResults.xml -nologo -nocolor -notrait category=IgnoreForCI -notrait category=OuterLoop -notrait category=failing 
popd
===========================================================================================================
/root/helix/work/workitem/e /root/helix/work/workitem/e
  Discovering: Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests (method display = ClassAndMethod, method display options = None)
  Discovered:  Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests (found 66 of 67 test cases)
  Starting:    Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests (parallel test collections = on, max threads = 2)

=================================================================
	Native Crash Reporting
=================================================================
Got a SIGSEGV while executing native code. This usually indicates
a fatal error in the mono runtime or one of the native libraries 
used by your application.
=================================================================

=================================================================
	Native stacktrace:
=================================================================
	0xffa4edf1d8e8 - Unknown

=================================================================
	External Debugger Dump:
=================================================================
[New LWP 26]
[New LWP 27]
[New LWP 28]
[New LWP 29]
[New LWP 32]
[New LWP 33]
[New LWP 34]
[New LWP 35]
[New LWP 36]
[New LWP 37]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/aarch64-linux-gnu/libthread_db.so.1".
0x0000ffa4ee5232a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xab821bcbc24c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
88	../sysdeps/unix/sysv/linux/futex-internal.h: No such file or directory.
  Id   Target Id         Frame 
* 1    Thread 0xffa4ee562fd0 (LWP 25) "dotnet" 0x0000ffa4ee5232a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xab821bcbc24c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
  2    Thread 0xffa4ed3ff1c0 (LWP 26) "SGen worker" 0x0000ffa4ee5232a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xffa4edfdc930 <work_cond+40>) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
  3    Thread 0xffa4eb81a1c0 (LWP 27) ".NET EventPipe" 0x0000ffa4ee1fdef8 in __GI___poll (fds=0xffa4e4003ae0, nfds=281083829414887, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
  4    Thread 0xffa4eb6191c0 (LWP 28) "Finalizer" 0x0000ffa4ee525a40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xffa4edfcd9f8 <finalizer_sem>) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
  5    Thread 0xffa4e343b1c0 (LWP 29) ".NET SigHandler" 0x0000ffa4ee526ac0 in __libc_read (fd=<optimized out>, buf=0xffa4e343a9a7, nbytes=1) at ../sysdeps/unix/sysv/linux/read.c:27
  6    Thread 0xffa4e2e7a1c0 (LWP 32) ".NET Long Runni" 0x0000ffa4ee5232a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xffa4d00cbf0c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
  7    Thread 0xffa4e307b1c0 (LWP 33) ".NET TP Worker" 0x0000ffa4ee5235bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0xffa400000000, expected=0, futex_word=0xffa4d4002cf8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
  8    Thread 0xffa4e2c091c0 (LWP 34) ".NET TP Gate" 0x0000ffa4ee525a40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xffa4c8000b80) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
  9    Thread 0xffa4e2ba81c0 (LWP 35) ".NET TP Worker" 0x0000ffa4ee527d5c in __waitpid (pid=<optimized out>, stat_loc=0xffa4e2ba29d0, options=<optimized out>) at ../sysdeps/unix/sysv/linux/waitpid.c:30
  10   Thread 0xffa4e29771c0 (LWP 36) ".NET Long Runni" 0x0000ffa4ee5232a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xffa4c000e7ac) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
  11   Thread 0xffa4e27761c0 (LWP 37) ".NET Long Runni" 0x0000ffa4ee525a40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xffa4edfd29f8 <suspend_semaphore>) at ../sysdeps/unix/sysv/linux/futex-internal.h:205

Thread 11 (Thread 0xffa4e27761c0 (LWP 37)):
#0  0x0000ffa4ee525a40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xffa4edfd29f8 <suspend_semaphore>) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
#1  do_futex_wait (sem=sem@entry=0xffa4edfd29f8 <suspend_semaphore>, abstime=0x0) at sem_waitcommon.c:111
#2  0x0000ffa4ee525b60 in __new_sem_wait_slow (sem=0xffa4edfd29f8 <suspend_semaphore>, abstime=0x0) at sem_waitcommon.c:181
#3  0x0000ffa4edde055c in mono_os_sem_wait (sem=<optimized out>, flags=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-os-semaphore.h:204
#4  mono_os_sem_timedwait (sem=0xffa4edfd29f8 <suspend_semaphore>, timeout_ms=4294967295, flags=MONO_SEM_FLAGS_NONE) at /__w/1/s/src/mono/mono/utils/mono-os-semaphore.h:237
#5  0x0000ffa4edde0280 in mono_threads_wait_pending_operations () at /__w/1/s/src/mono/mono/utils/mono-threads.c:323
#6  0x0000ffa4eddbd948 in unified_suspend_stop_world (flags=MONO_THREAD_INFO_FLAGS_NO_GC, thread_stopped_callback=0xffa4eddbdd74 <sgen_client_stop_world_thread_stopped_callback>) at /__w/1/s/src/mono/mono/metadata/sgen-stw.c:345
#7  0x0000ffa4eddbd630 in sgen_client_stop_world (generation=0, serial_collection=0) at /__w/1/s/src/mono/mono/metadata/sgen-stw.c:155
#8  0x0000ffa4eddf8e80 in sgen_stop_world (generation=0, serial_collection=0) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:3995
#9  0x0000ffa4eddf5958 in sgen_perform_collection_inner (requested_size=<optimized out>, generation_to_collect=<optimized out>, reason=<optimized out>, forced_serial=<optimized out>, stw=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:2643
#10 sgen_perform_collection (requested_size=4096, generation_to_collect=0, reason=0xffa4edccb429 "Nursery full", forced_serial=0, stw=1) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:2766
#11 0x0000ffa4eddf5894 in sgen_ensure_free_space (size=4096, generation=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:2622
#12 0x0000ffa4edde9f28 in sgen_alloc_obj_nolock (vtable=0xab821b89bd88, size=1216) at /__w/1/s/src/mono/mono/sgen/sgen-alloc.c:279
#13 0x0000ffa4eddc029c in mono_gc_alloc_vector (vtable=0xab821b89bd88, size=1216, max_length=37) at /__w/1/s/src/mono/mono/metadata/sgen-mono.c:1333
#14 0x0000ffa4e927af34 in ?? ()
#15 0x0000ffa4e276f300 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 10 (Thread 0xffa4e29771c0 (LWP 36)):
#0  0x0000ffa4ee5232a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xffa4c000e7ac) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
#1  __pthread_cond_wait_common (abstime=0x0, mutex=0xffa4c000e750, cond=0xffa4c000e780) at pthread_cond_wait.c:502
#2  __pthread_cond_wait (cond=0xffa4c000e780, mutex=0xffa4c000e750) at pthread_cond_wait.c:655
#3  0x0000ffa4eb130ae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xffa4c000e750) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ffa4e30884e4 in ?? ()
#5  0x0000000000000002 in ?? ()
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Thread 9 (Thread 0xffa4e2ba81c0 (LWP 35)):
#0  0x0000ffa4ee527d5c in __waitpid (pid=<optimized out>, stat_loc=0xffa4e2ba29d0, options=<optimized out>) at ../sysdeps/unix/sysv/linux/waitpid.c:30
#1  0x0000ffa4edf1d9e8 in dump_native_stacktrace (signal=<optimized out>, mctx=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:843
#2  mono_dump_native_crash_info (signal=<optimized out>, mctx=0xffa4e2ba3430, info=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:870
#3  0x0000ffa4ededc5c0 in mono_handle_native_crash (signal=0xffa4edcda1d2 "SIGSEGV", mctx=0xffa4e2ba3430, info=0xffa4e2ba3790) at /__w/1/s/src/mono/mono/mini/mini-exceptions.c:3005
#4  0x0000ffa4ede45b90 in mono_sigsegv_signal_handler_debug (_dummy=11, _info=0xffa4e2ba3790, context=0xffa4e2ba3810, debug_fault_addr=0x0) at /__w/1/s/src/mono/mono/mini/mini-runtime.c:3749
#5  <signal handler called>
#6  0x0000000000000000 in ?? ()
#7  0x0000ffa4e1a05fe8 in ?? ()
#8  0x0000ffa4ed6cdfc8 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 8 (Thread 0xffa4e2c091c0 (LWP 34)):
#0  0x0000ffa4ee525a40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xffa4c8000b80) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
#1  do_futex_wait (sem=sem@entry=0xffa4c8000b80, abstime=0x0) at sem_waitcommon.c:111
#2  0x0000ffa4ee525b60 in __new_sem_wait_slow (sem=0xffa4c8000b80, abstime=0x0) at sem_waitcommon.c:181
#3  0x0000ffa4edde00a4 in mono_os_sem_wait (sem=0xffa4c8000b80, flags=MONO_SEM_FLAGS_NONE) at /__w/1/s/src/mono/mono/utils/mono-os-semaphore.h:204
#4  mono_thread_info_wait_for_resume (info=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-threads.c:238
#5  0x0000ffa4edde6408 in mono_threads_exit_gc_safe_region_unbalanced_internal (cookie=0xffa4c8000b20, stackdata=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-threads-coop.c:389
#6  mono_threads_exit_gc_safe_region_unbalanced (cookie=0xffa4c8000b20, stackpointer=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-threads-coop.c:409
#7  0x0000ffa4e2bc3d6c in ?? ()
#8  0x0000ffa4e3087d10 in ?? ()
#9  0x0000ffa4e2c08910 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 7 (Thread 0xffa4e307b1c0 (LWP 33)):
#0  0x0000ffa4ee5235bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0xffa400000000, expected=0, futex_word=0xffa4d4002cf8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
#1  __pthread_cond_wait_common (abstime=0xffa4e307a118, mutex=0xffa4d4002ca0, cond=0xffa4d4002cd0) at pthread_cond_wait.c:533
#2  __pthread_cond_timedwait (cond=0xffa4d4002cd0, mutex=0xffa4d4002ca0, abstime=0xffa4e307a118) at pthread_cond_wait.c:667
#3  0x0000ffa4eb130c4c in SystemNative_LowLevelMonitor_TimedWait (monitor=0xffa4d4002ca0, timeoutMilliseconds=12000) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:195
#4  0x0000ffa4e2bc3d54 in ?? ()
#5  0x0000ffa4e3087d10 in ?? ()
#6  0x0000ffa4e307a910 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 6 (Thread 0xffa4e2e7a1c0 (LWP 32)):
#0  0x0000ffa4ee5232a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xffa4d00cbf0c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
#1  __pthread_cond_wait_common (abstime=0x0, mutex=0xffa4d00cbeb0, cond=0xffa4d00cbee0) at pthread_cond_wait.c:502
#2  __pthread_cond_wait (cond=0xffa4d00cbee0, mutex=0xffa4d00cbeb0) at pthread_cond_wait.c:655
#3  0x0000ffa4eb130ae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xffa4d00cbeb0) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ffa4e30884e4 in ?? ()
#5  0x0000ffa4ed5114b0 in ?? ()
#6  0x0000ffa4ed512120 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 5 (Thread 0xffa4e343b1c0 (LWP 29)):
#0  0x0000ffa4ee526ac0 in __libc_read (fd=<optimized out>, buf=0xffa4e343a9a7, nbytes=1) at ../sysdeps/unix/sysv/linux/read.c:27
#1  0x0000ffa4eb13019c in SignalHandlerLoop (arg=0xab821b88f390) at /__w/1/s/src/native/libs/System.Native/pal_signal.c:331
#2  0x0000ffa4ee51d088 in start_thread (arg=0xffffdf304a3f) at pthread_create.c:463
#3  0x0000ffa4ee2070cc in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 4 (Thread 0xffa4eb6191c0 (LWP 28)):
#0  0x0000ffa4ee525a40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xffa4edfcd9f8 <finalizer_sem>) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
#1  do_futex_wait (sem=sem@entry=0xffa4edfcd9f8 <finalizer_sem>, abstime=0x0) at sem_waitcommon.c:111
#2  0x0000ffa4ee525b60 in __new_sem_wait_slow (sem=0xffa4edfcd9f8 <finalizer_sem>, abstime=0x0) at sem_waitcommon.c:181
#3  0x0000ffa4eddb3874 in mono_os_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../utils/mono-os-semaphore.h:204
#4  mono_coop_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-semaphore.h:41
#5  finalizer_thread (unused=<optimized out>) at /__w/1/s/src/mono/mono/metadata/gc.c:891
#6  0x0000ffa4edd8dce4 in start_wrapper_internal (start_info=0x0, stack_ptr=<optimized out>) at /__w/1/s/src/mono/mono/metadata/threads.c:1202
#7  0x0000ffa4edd8db90 in start_wrapper (data=0xab821ad62590) at /__w/1/s/src/mono/mono/metadata/threads.c:1264
#8  0x0000ffa4ee51d088 in start_thread (arg=0xffffdf304caf) at pthread_create.c:463
#9  0x0000ffa4ee2070cc in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 3 (Thread 0xffa4eb81a1c0 (LWP 27)):
#0  0x0000ffa4ee1fdef8 in __GI___poll (fds=0xffa4e4003ae0, nfds=281083829414887, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
#1  0x0000ffa4edfaaa88 in ipc_poll_fds (fds=<optimized out>, nfds=1, timeout=4294967295) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:470
#2  ds_ipc_poll (poll_handles_data=0xffa4e40032d0, poll_handles_data_len=1, timeout_ms=4294967295, callback=0xffa4edfa9e38 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:1096
#3  0x0000ffa4edfa8034 in ds_ipc_stream_factory_get_next_available_stream (callback=0xffa4edfa9e38 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc.c:395
#4  0x0000ffa4edfa68ac in server_thread (data=<optimized out>) at /__w/1/s/src/native/eventpipe/ds-server.c:129
#5  0x0000ffa4edfa9e18 in ep_rt_thread_mono_start_func (data=0xab821ad38380) at /__w/1/s/src/mono/mono/mini/../eventpipe/ep-rt-mono.h:1332
#6  0x0000ffa4ee51d088 in start_thread (arg=0xffffdf304dff) at pthread_create.c:463
#7  0x0000ffa4ee2070cc in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 2 (Thread 0xffa4ed3ff1c0 (LWP 26)):
#0  0x0000ffa4ee5232a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xffa4edfdc930 <work_cond+40>) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
#1  __pthread_cond_wait_common (abstime=0x0, mutex=0xffa4edfdc8d8 <lock>, cond=0xffa4edfdc908 <work_cond>) at pthread_cond_wait.c:502
#2  __pthread_cond_wait (cond=0xffa4edfdc908 <work_cond>, mutex=0xffa4edfdc8d8 <lock>) at pthread_cond_wait.c:655
#3  0x0000ffa4ede2db70 in mono_os_cond_wait (cond=0xffa4edfdc930 <work_cond+40>, mutex=<optimized out>) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-os-mutex.h:219
#4  get_work (worker_index=<optimized out>, work_context=<optimized out>, do_idle=<optimized out>, job=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:167
#5  thread_func (data=0x0) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:198
#6  0x0000ffa4ee51d088 in start_thread (arg=0xffffdf304d9f) at pthread_create.c:463
#7  0x0000ffa4ee2070cc in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 1 (Thread 0xffa4ee562fd0 (LWP 25)):
#0  0x0000ffa4ee5232a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xab821bcbc24c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
#1  __pthread_cond_wait_common (abstime=0x0, mutex=0xab821bcbc1f0, cond=0xab821bcbc220) at pthread_cond_wait.c:502
#2  __pthread_cond_wait (cond=0xab821bcbc220, mutex=0xab821bcbc1f0) at pthread_cond_wait.c:655
#3  0x0000ffa4eb130ae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xab821bcbc1f0) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ffa4e30884e4 in ?? ()
#5  0x0000ffa4ed50cd48 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

=================================================================
	Basic Fault Address Reporting
=================================================================
instruction pointer is NULL, skip dumping
=================================================================
	Managed Stacktrace:
=================================================================
=================================================================
./RunTests.sh: line 168:    25 Aborted                 (core dumped) "$RUNTIME_PATH/dotnet" exec --runtimeconfig Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.runtimeconfig.json --depsfile Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.deps.json xunit.console.dll Microsoft.Extensions.Logging.Generators.Roslyn3.11.Tests.dll -xml testResults.xml -nologo -nocolor -notrait category=IgnoreForCI -notrait category=OuterLoop -notrait category=failing $RSP_FILE
/root/helix/work/workitem/e
----- end Tue Jan 24 20:16:00 UTC 2023 ----- exit code 134 ----------------------------------------------------------
exit code 134 means SIGABRT Abort. Managed or native assert, or runtime check such as heap corruption, caused call to abort(). Core dumped.
ulimit -c value: unlimited
Waiting a few seconds for any dump to be written..
cat /proc/sys/kernel/core_pattern: /home/helixbot/dotnetbuild/dumps/core.%u.%p
cat /proc/sys/kernel/core_uses_pid: 0
cat: /proc/sys/kernel/coredump_filter: No such file or directory
cat /proc/sys/kernel/coredump_filter:
Looking around for any Linux dump..
... found no dump in /root/helix/work/workitem/e
+ export _commandExitCode=134
+ python /root/helix/work/correlation/reporter/run.py https://dev.azure.com/dnceng-public/ public 3144091 eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6Im9PdmN6NU1fN3AtSGpJS2xGWHo5M3VfVjBabyJ9.eyJuYW1laWQiOiJjNzczZjJjMi01MTIwLTQyMDctYWZlMi1hZmFmMzVhOGJjMGEiLCJzY3AiOiJhcHBfdG9rZW4iLCJhdWkiOiI4MDk2NTRkNC1iMjdkLTQ4ODItODhiMC01MjdiNDEzMThkMmQiLCJzaWQiOiI4YzBlODU1MS00YmY3LTQ0ZTUtYjAxNi04NjIzNmJkNmQyYzEiLCJCdWlsZElkIjoiY2JiMTgyNjEtYzQ4Zi00YWJiLTg2NTEtOGNkY2I1NDc0NjQ5OzE0Njk5OSIsInBwaWQiOiJ2c3RmczovLy9CdWlsZC9CdWlsZC8xNDY5OTkiLCJvcmNoaWQiOiJiMGIxMWI5MS1kZDgyLTQ2ZGEtYmY5My03ZDMyMTFmOTFlOWMuYnVpbGQubGlicmFyaWVzX3Rlc3RfcnVuX3JlbGVhc2VfbW9ub19saW51eF9hcm02NF9kZWJ1Zy5fX2RlZmF1bHQiLCJyZXBvSWRzIjoiIiwiaXNzIjoiYXBwLnZzdG9rZW4udmlzdWFsc3R1ZGlvLmNvbSIsImF1ZCI6ImFwcC52c3Rva2VuLnZpc3VhbHN0dWRpby5jb218dnNvOjZmY2M5MmU1LTczYTctNGY4OC04ZDEzLWQ5MDQ1YjQ1ZmIyNyIsIm5iZiI6MTY3NDU4ODQzNSwiZXhwIjoxNjc0NTk4NjM1fQ.2F4kxlj-2u_b13NpFwPNbLvm6BUC1xJ64_ZrdCLyvY7s0IjbJ9RR9IP-P0w_BMdZQQIHDCcSN4-Vd2IQuD58uR_Zr4mFyxl9JpCPSzz-SYZlDzAqdELf3S2t12AIvrHE0myXSCjXZxoe9OpK-rpfeXthdrnjf1x9LddoxcruIkzTqjWjJm4vlGZrE4fdazQIRwwBwDiQVPmnKyV4CK3xO64iHbJRUn_cHiESOzu5ueGsvkxkXmUNfLhiLlTSL6h_r39eq2ctuVx5J9Fz5OSz2_WYIZsQGl2XsYYlgEa6OXKFvRNzFPLO5A6ZGzM2bcPBLQzthe7gSb3dSbi9h7HnJQ
2023-01-24T20:16:10.429Z	INFO   	run.py	run(48)	main	Beginning reading of test results.
2023-01-24T20:16:10.429Z	INFO   	run.py	__init__(42)	read_results	Searching '/root/helix/work/workitem/e' for test results files
2023-01-24T20:16:10.430Z	INFO   	run.py	__init__(42)	read_results	Searching '/root/helix/work/workitem/uploads' for test results files
2023-01-24T20:16:10.431Z	WARNING	run.py	__init__(55)	read_results	No results file found in any of the following formats: xunit, junit, trx
2023-01-24T20:16:10.431Z	INFO   	run.py	packing_test_reporter(30)	report_results	Packing 0 test reports to '/root/helix/work/workitem/e/__test_report.json'
2023-01-24T20:16:10.431Z	INFO   	run.py	packing_test_reporter(33)	report_results	Packed 1466 bytes
+ exit 134
+ export _commandExitCode=134
+ chmod -R 777 /home/helixbot/dotnetbuild/dumps
+ exit 134

[END EXECUTION]
Exit Code:134

System.Text.Json.SourceGeneration.Roslyn4.4.Tests

Callstack
----- start Tue Jan 24 20:16:50 UTC 2023 =============== To repro directly: =====================================================
pushd .
/root/helix/work/correlation/dotnet exec --runtimeconfig System.Text.Json.SourceGeneration.Roslyn4.4.Tests.runtimeconfig.json --depsfile System.Text.Json.SourceGeneration.Roslyn4.4.Tests.deps.json xunit.console.dll System.Text.Json.SourceGeneration.Roslyn4.4.Tests.dll -xml testResults.xml -nologo -nocolor -notrait category=IgnoreForCI -notrait category=OuterLoop -notrait category=failing 
popd
===========================================================================================================
/root/helix/work/workitem/e /root/helix/work/workitem/e
  Discovering: System.Text.Json.SourceGeneration.Roslyn4.4.Tests (method display = ClassAndMethod, method display options = None)
  Discovered:  System.Text.Json.SourceGeneration.Roslyn4.4.Tests (found 2746 of 2838 test cases)
  Starting:    System.Text.Json.SourceGeneration.Roslyn4.4.Tests (parallel test collections = on, max threads = 2)

=================================================================
	Native Crash Reporting
=================================================================
Got a SIGSEGV while executing native code. This usually indicates
a fatal error in the mono runtime or one of the native libraries 
used by your application.
=================================================================

=================================================================
	Native stacktrace:
=================================================================
	0xff55188268e8 - Unknown

=================================================================
	External Debugger Dump:
=================================================================
[New LWP 26]
[New LWP 27]
[New LWP 28]
[New LWP 29]
[New LWP 32]
[New LWP 33]
[New LWP 34]
[New LWP 35]
[New LWP 36]
[New LWP 37]
[New LWP 42]
[New LWP 43]
[New LWP 44]
[New LWP 45]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/aarch64-linux-gnu/libthread_db.so.1".
0x0000ff5518e2c2a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xab49a470585c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
88	../sysdeps/unix/sysv/linux/futex-internal.h: No such file or directory.
  Id   Target Id         Frame 
* 1    Thread 0xff5518e6bfd0 (LWP 25) "dotnet" 0x0000ff5518e2c2a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xab49a470585c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
  2    Thread 0xff5517bff1c0 (LWP 26) "SGen worker" 0x0000ff5518e2c2a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xff55188e5930 <work_cond+40>) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
  3    Thread 0xff551601a1c0 (LWP 27) ".NET EventPipe" 0x0000ff5518b06ef8 in __GI___poll (fds=0xff5510003ae0, nfds=280740946148327, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
  4    Thread 0xff5515e191c0 (LWP 28) "Finalizer" 0x0000ff5518e2ea40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xff55188d69f8 <finalizer_sem>) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
  5    Thread 0xff550def91c0 (LWP 29) ".NET SigHandler" 0x0000ff5518e2fac0 in __libc_read (fd=<optimized out>, buf=0xff550def89a7, nbytes=1) at ../sysdeps/unix/sysv/linux/read.c:27
  6    Thread 0xff550db7f1c0 (LWP 32) ".NET Long Runni" 0x0000ff5518e2ea40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xff55188db9f8 <suspend_semaphore>) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
  7    Thread 0xff550d96e1c0 (LWP 33) ".NET TP Worker" 0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0xff5500000000, expected=0, futex_word=0xff55000870d8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
  8    Thread 0xff550d0eb1c0 (LWP 34) ".NET TP Gate" 0x0000ff5518e2ea40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xff54f4000b80) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
  9    Thread 0xff550d08a1c0 (LWP 35) ".NET TP Worker" 0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0x0, expected=0, futex_word=0xff550d0893c8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
  10   Thread 0xff550ce591c0 (LWP 36) ".NET Long Runni" 0x0000ff5518e30d5c in __waitpid (pid=<optimized out>, stat_loc=0xff550ce50490, options=<optimized out>) at ../sysdeps/unix/sysv/linux/waitpid.c:30
  11   Thread 0xff550cc581c0 (LWP 37) ".NET Long Runni" 0x0000ff5518e2f690 in __lll_lock_wait (futex=futex@entry=0xff55188ed328 <sgen_gc_mutex>, private=0) at lowlevellock.c:46
  12   Thread 0xff54eb95b1c0 (LWP 42) ".NET Timer" 0x0000ff5518e2c2a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xff54e401450c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
  13   Thread 0xff54ead2b1c0 (LWP 43) ".NET TP Worker" 0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0x0, expected=0, futex_word=0xff54ead2a3c8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
  14   Thread 0xff54e85831c0 (LWP 44) ".NET TP Worker" 0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0x0, expected=0, futex_word=0xff54e85823c8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
  15   Thread 0xff54e83821c0 (LWP 45) ".NET TP Worker" 0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0x0, expected=0, futex_word=0xff54e83813c8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142

Thread 15 (Thread 0xff54e83821c0 (LWP 45)):
#0  0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0x0, expected=0, futex_word=0xff54e83813c8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
#1  __pthread_cond_wait_common (abstime=0xff54e8381350, mutex=0xff54fc02e430, cond=0xff54e83813a0) at pthread_cond_wait.c:533
#2  __pthread_cond_timedwait (cond=0xff54e83813a0, mutex=0xff54fc02e430, abstime=0xff54e8381350) at pthread_cond_wait.c:667
#3  0x0000ff55186e1d4c in mono_os_cond_timedwait (cond=0xff54e83813a0, mutex=0xff54fc02e430, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/mono-os-mutex.c:75
#4  0x0000ff55186e6e6c in mono_coop_cond_timedwait (cond=0xff54e83813a0, mutex=<optimized out>, timeout_ms=20000) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-mutex.h:103
#5  mono_lifo_semaphore_timed_wait (semaphore=0xff54fc02e430, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/lifo-semaphore.c:48
#6  0x0000ff550d090b20 in ?? ()
#7  0x0000ff5517c0c218 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 14 (Thread 0xff54e85831c0 (LWP 44)):
#0  0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0x0, expected=0, futex_word=0xff54e85823c8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
#1  __pthread_cond_wait_common (abstime=0xff54e8582350, mutex=0xff54fc02e430, cond=0xff54e85823a0) at pthread_cond_wait.c:533
#2  __pthread_cond_timedwait (cond=0xff54e85823a0, mutex=0xff54fc02e430, abstime=0xff54e8582350) at pthread_cond_wait.c:667
#3  0x0000ff55186e1d4c in mono_os_cond_timedwait (cond=0xff54e85823a0, mutex=0xff54fc02e430, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/mono-os-mutex.c:75
#4  0x0000ff55186e6e6c in mono_coop_cond_timedwait (cond=0xff54e85823a0, mutex=<optimized out>, timeout_ms=20000) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-mutex.h:103
#5  mono_lifo_semaphore_timed_wait (semaphore=0xff54fc02e430, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/lifo-semaphore.c:48
#6  0x0000ff550d090b20 in ?? ()
#7  0x0000000000000001 in ?? ()
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Thread 13 (Thread 0xff54ead2b1c0 (LWP 43)):
#0  0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0x0, expected=0, futex_word=0xff54ead2a3c8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
#1  __pthread_cond_wait_common (abstime=0xff54ead2a350, mutex=0xff54fc02e430, cond=0xff54ead2a3a0) at pthread_cond_wait.c:533
#2  __pthread_cond_timedwait (cond=0xff54ead2a3a0, mutex=0xff54fc02e430, abstime=0xff54ead2a350) at pthread_cond_wait.c:667
#3  0x0000ff55186e1d4c in mono_os_cond_timedwait (cond=0xff54ead2a3a0, mutex=0xff54fc02e430, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/mono-os-mutex.c:75
#4  0x0000ff55186e6e6c in mono_coop_cond_timedwait (cond=0xff54ead2a3a0, mutex=<optimized out>, timeout_ms=20000) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-mutex.h:103
#5  mono_lifo_semaphore_timed_wait (semaphore=0xff54fc02e430, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/lifo-semaphore.c:48
#6  0x0000ff550d090b20 in ?? ()
#7  0x0000000000000001 in ?? ()
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Thread 12 (Thread 0xff54eb95b1c0 (LWP 42)):
#0  0x0000ff5518e2c2a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xff54e401450c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
#1  __pthread_cond_wait_common (abstime=0x0, mutex=0xff54e40144b0, cond=0xff54e40144e0) at pthread_cond_wait.c:502
#2  __pthread_cond_wait (cond=0xff54e40144e0, mutex=0xff54e40144b0) at pthread_cond_wait.c:655
#3  0x0000ff5515a2bae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xff54e40144b0) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ff550d978274 in ?? ()
#5  0x0000ff5517d2a9b8 in ?? ()
#6  0x0000ff5517d400e8 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 11 (Thread 0xff550cc581c0 (LWP 37)):
#0  0x0000ff5518e2f690 in __lll_lock_wait (futex=futex@entry=0xff55188ed328 <sgen_gc_mutex>, private=0) at lowlevellock.c:46
#1  0x0000ff5518e287c8 in __GI___pthread_mutex_lock (mutex=0xff55188ed328 <sgen_gc_mutex>) at pthread_mutex_lock.c:80
#2  0x0000ff55186ff338 in mono_os_mutex_lock (mutex=<optimized out>) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-os-mutex.h:105
#3  mono_coop_mutex_lock (mutex=<optimized out>) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-mutex.h:57
#4  sgen_gc_lock () at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:3937
#5  0x0000ff55186f3434 in sgen_alloc_obj (vtable=0xff54f0069760, size=128) at /__w/1/s/src/mono/mono/sgen/sgen-alloc.c:453
#6  0x0000ff55186c87a4 in mono_gc_alloc_obj (vtable=0xff55188ed328 <sgen_gc_mutex>, size=128) at /__w/1/s/src/mono/mono/metadata/sgen-mono.c:904
#7  0x0000ff551810c5a4 in ?? ()
#8  0x0000ff55155284e0 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 10 (Thread 0xff550ce591c0 (LWP 36)):
#0  0x0000ff5518e30d5c in __waitpid (pid=<optimized out>, stat_loc=0xff550ce50490, options=<optimized out>) at ../sysdeps/unix/sysv/linux/waitpid.c:30
#1  0x0000ff55188269e8 in dump_native_stacktrace (signal=<optimized out>, mctx=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:843
#2  mono_dump_native_crash_info (signal=<optimized out>, mctx=0xff550ce50ef0, info=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:870
#3  0x0000ff55187e55c0 in mono_handle_native_crash (signal=0xff55185e31d2 "SIGSEGV", mctx=0xff550ce50ef0, info=0xff550ce51250) at /__w/1/s/src/mono/mono/mini/mini-exceptions.c:3005
#4  0x0000ff551874eb90 in mono_sigsegv_signal_handler_debug (_dummy=11, _info=0xff550ce51250, context=0xff550ce512d0, debug_fault_addr=0xff55ebfb7d70) at /__w/1/s/src/mono/mono/mini/mini-runtime.c:3749
#5  <signal handler called>
#6  0x0000ff55ebfb7d70 in ?? ()
#7  0x0000ff54daeb1624 in ?? ()
#8  0x0000ff5517c5a410 in ?? ()
#9  0x0000000300000002 in ?? ()
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Thread 9 (Thread 0xff550d08a1c0 (LWP 35)):
#0  0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0x0, expected=0, futex_word=0xff550d0893c8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
#1  __pthread_cond_wait_common (abstime=0xff550d089350, mutex=0xff54fc02e430, cond=0xff550d0893a0) at pthread_cond_wait.c:533
#2  __pthread_cond_timedwait (cond=0xff550d0893a0, mutex=0xff54fc02e430, abstime=0xff550d089350) at pthread_cond_wait.c:667
#3  0x0000ff55186e1d4c in mono_os_cond_timedwait (cond=0xff550d0893a0, mutex=0xff54fc02e430, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/mono-os-mutex.c:75
#4  0x0000ff55186e6e6c in mono_coop_cond_timedwait (cond=0xff550d0893a0, mutex=<optimized out>, timeout_ms=20000) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-mutex.h:103
#5  mono_lifo_semaphore_timed_wait (semaphore=0xff54fc02e430, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/lifo-semaphore.c:48
#6  0x0000ff550d090b20 in ?? ()
#7  0x0000000000000001 in ?? ()
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Thread 8 (Thread 0xff550d0eb1c0 (LWP 34)):
#0  0x0000ff5518e2ea40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xff54f4000b80) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
#1  do_futex_wait (sem=sem@entry=0xff54f4000b80, abstime=0x0) at sem_waitcommon.c:111
#2  0x0000ff5518e2eb60 in __new_sem_wait_slow (sem=0xff54f4000b80, abstime=0x0) at sem_waitcommon.c:181
#3  0x0000ff55186e90a4 in mono_os_sem_wait (sem=0xff54f4000b80, flags=MONO_SEM_FLAGS_NONE) at /__w/1/s/src/mono/mono/utils/mono-os-semaphore.h:204
#4  mono_thread_info_wait_for_resume (info=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-threads.c:238
#5  0x0000ff55186ef408 in mono_threads_exit_gc_safe_region_unbalanced_internal (cookie=0xff54f4000b20, stackdata=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-threads-coop.c:389
#6  mono_threads_exit_gc_safe_region_unbalanced (cookie=0xff54f4000b20, stackpointer=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-threads-coop.c:409
#7  0x0000ff550d0a2b04 in ?? ()
#8  0x0000ff550d9778f8 in ?? ()
#9  0x0000ff550d0ea910 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 7 (Thread 0xff550d96e1c0 (LWP 33)):
#0  0x0000ff5518e2c5bc in futex_reltimed_wait_cancelable (private=<optimized out>, reltime=0xff5500000000, expected=0, futex_word=0xff55000870d8) at ../sysdeps/unix/sysv/linux/futex-internal.h:142
#1  __pthread_cond_wait_common (abstime=0xff550d96d118, mutex=0xff5500087080, cond=0xff55000870b0) at pthread_cond_wait.c:533
#2  __pthread_cond_timedwait (cond=0xff55000870b0, mutex=0xff5500087080, abstime=0xff550d96d118) at pthread_cond_wait.c:667
#3  0x0000ff5515a2bc4c in SystemNative_LowLevelMonitor_TimedWait (monitor=0xff5500087080, timeoutMilliseconds=12000) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:195
#4  0x0000ff550d0a2aec in ?? ()
#5  0x0000ff550d9778f8 in ?? ()
#6  0x0000ff550d96d910 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 6 (Thread 0xff550db7f1c0 (LWP 32)):
#0  0x0000ff5518e2ea40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xff55188db9f8 <suspend_semaphore>) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
#1  do_futex_wait (sem=sem@entry=0xff55188db9f8 <suspend_semaphore>, abstime=0x0) at sem_waitcommon.c:111
#2  0x0000ff5518e2eb60 in __new_sem_wait_slow (sem=0xff55188db9f8 <suspend_semaphore>, abstime=0x0) at sem_waitcommon.c:181
#3  0x0000ff55186e955c in mono_os_sem_wait (sem=<optimized out>, flags=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-os-semaphore.h:204
#4  mono_os_sem_timedwait (sem=0xff55188db9f8 <suspend_semaphore>, timeout_ms=4294967295, flags=MONO_SEM_FLAGS_NONE) at /__w/1/s/src/mono/mono/utils/mono-os-semaphore.h:237
#5  0x0000ff55186e9280 in mono_threads_wait_pending_operations () at /__w/1/s/src/mono/mono/utils/mono-threads.c:323
#6  0x0000ff55186c6948 in unified_suspend_stop_world (flags=MONO_THREAD_INFO_FLAGS_NO_GC, thread_stopped_callback=0xff55186c6d74 <sgen_client_stop_world_thread_stopped_callback>) at /__w/1/s/src/mono/mono/metadata/sgen-stw.c:345
#7  0x0000ff55186c6630 in sgen_client_stop_world (generation=0, serial_collection=0) at /__w/1/s/src/mono/mono/metadata/sgen-stw.c:155
#8  0x0000ff5518701e80 in sgen_stop_world (generation=0, serial_collection=0) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:3995
#9  0x0000ff55186fe958 in sgen_perform_collection_inner (requested_size=<optimized out>, generation_to_collect=<optimized out>, reason=<optimized out>, forced_serial=<optimized out>, stw=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:2643
#10 sgen_perform_collection (requested_size=4096, generation_to_collect=0, reason=0xff55185d4429 "Nursery full", forced_serial=0, stw=1) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:2766
#11 0x0000ff55186fe894 in sgen_ensure_free_space (size=4096, generation=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:2622
#12 0x0000ff55186f2f28 in sgen_alloc_obj_nolock (vtable=0xab49a3f1fc88, size=352) at /__w/1/s/src/mono/mono/sgen/sgen-alloc.c:279
#13 0x0000ff55186c929c in mono_gc_alloc_vector (vtable=0xab49a3f1fc88, size=352, max_length=158) at /__w/1/s/src/mono/mono/metadata/sgen-mono.c:1333
#14 0x0000ff5515535fec in ?? ()
#15 0x0000ff54fc12fd90 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 5 (Thread 0xff550def91c0 (LWP 29)):
#0  0x0000ff5518e2fac0 in __libc_read (fd=<optimized out>, buf=0xff550def89a7, nbytes=1) at ../sysdeps/unix/sysv/linux/read.c:27
#1  0x0000ff5515a2b19c in SignalHandlerLoop (arg=0xab49a4604ca0) at /__w/1/s/src/native/libs/System.Native/pal_signal.c:331
#2  0x0000ff5518e26088 in start_thread (arg=0xffffd743586f) at pthread_create.c:463
#3  0x0000ff5518b100cc in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 4 (Thread 0xff5515e191c0 (LWP 28)):
#0  0x0000ff5518e2ea40 in futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0, futex_word=0xff55188d69f8 <finalizer_sem>) at ../sysdeps/unix/sysv/linux/futex-internal.h:205
#1  do_futex_wait (sem=sem@entry=0xff55188d69f8 <finalizer_sem>, abstime=0x0) at sem_waitcommon.c:111
#2  0x0000ff5518e2eb60 in __new_sem_wait_slow (sem=0xff55188d69f8 <finalizer_sem>, abstime=0x0) at sem_waitcommon.c:181
#3  0x0000ff55186bc874 in mono_os_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../utils/mono-os-semaphore.h:204
#4  mono_coop_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-semaphore.h:41
#5  finalizer_thread (unused=<optimized out>) at /__w/1/s/src/mono/mono/metadata/gc.c:891
#6  0x0000ff5518696ce4 in start_wrapper_internal (start_info=0x0, stack_ptr=<optimized out>) at /__w/1/s/src/mono/mono/metadata/threads.c:1202
#7  0x0000ff5518696b90 in start_wrapper (data=0xab49a3d13280) at /__w/1/s/src/mono/mono/metadata/threads.c:1264
#8  0x0000ff5518e26088 in start_thread (arg=0xffffd7435adf) at pthread_create.c:463
#9  0x0000ff5518b100cc in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 3 (Thread 0xff551601a1c0 (LWP 27)):
#0  0x0000ff5518b06ef8 in __GI___poll (fds=0xff5510003ae0, nfds=280740946148327, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
#1  0x0000ff55188b3a88 in ipc_poll_fds (fds=<optimized out>, nfds=1, timeout=4294967295) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:470
#2  ds_ipc_poll (poll_handles_data=0xff55100032d0, poll_handles_data_len=1, timeout_ms=4294967295, callback=0xff55188b2e38 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:1096
#3  0x0000ff55188b1034 in ds_ipc_stream_factory_get_next_available_stream (callback=0xff55188b2e38 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc.c:395
#4  0x0000ff55188af8ac in server_thread (data=<optimized out>) at /__w/1/s/src/native/eventpipe/ds-server.c:129
#5  0x0000ff55188b2e18 in ep_rt_thread_mono_start_func (data=0xab49a3d47550) at /__w/1/s/src/mono/mono/mini/../eventpipe/ep-rt-mono.h:1332
#6  0x0000ff5518e26088 in start_thread (arg=0xffffd7435c2f) at pthread_create.c:463
#7  0x0000ff5518b100cc in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 2 (Thread 0xff5517bff1c0 (LWP 26)):
#0  0x0000ff5518e2c2a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xff55188e5930 <work_cond+40>) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
#1  __pthread_cond_wait_common (abstime=0x0, mutex=0xff55188e58d8 <lock>, cond=0xff55188e5908 <work_cond>) at pthread_cond_wait.c:502
#2  __pthread_cond_wait (cond=0xff55188e5908 <work_cond>, mutex=0xff55188e58d8 <lock>) at pthread_cond_wait.c:655
#3  0x0000ff5518736b70 in mono_os_cond_wait (cond=0xff55188e5930 <work_cond+40>, mutex=<optimized out>) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-os-mutex.h:219
#4  get_work (worker_index=<optimized out>, work_context=<optimized out>, do_idle=<optimized out>, job=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:167
#5  thread_func (data=0x0) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:198
#6  0x0000ff5518e26088 in start_thread (arg=0xffffd7435bcf) at pthread_create.c:463
#7  0x0000ff5518b100cc in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 1 (Thread 0xff5518e6bfd0 (LWP 25)):
#0  0x0000ff5518e2c2a4 in futex_wait_cancelable (private=<optimized out>, expected=0, futex_word=0xab49a470585c) at ../sysdeps/unix/sysv/linux/futex-internal.h:88
#1  __pthread_cond_wait_common (abstime=0x0, mutex=0xab49a4705800, cond=0xab49a4705830) at pthread_cond_wait.c:502
#2  __pthread_cond_wait (cond=0xab49a4705830, mutex=0xab49a4705800) at pthread_cond_wait.c:655
#3  0x0000ff5515a2bae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xab49a4705800) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ff550d978274 in ?? ()
#5  0x0000ff550d1f50b0 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

=================================================================
	Basic Fault Address Reporting
=================================================================
./RunTests.sh: line 168:    25 Segmentation fault      (core dumped) "$RUNTIME_PATH/dotnet" exec --runtimeconfig System.Text.Json.SourceGeneration.Roslyn4.4.Tests.runtimeconfig.json --depsfile System.Text.Json.SourceGeneration.Roslyn4.4.Tests.deps.json xunit.console.dll System.Text.Json.SourceGeneration.Roslyn4.4.Tests.dll -xml testResults.xml -nologo -nocolor -notrait category=IgnoreForCI -notrait category=OuterLoop -notrait category=failing $RSP_FILE
Memory around native instruction pointer (0xff55ebfb7d70):0xff55ebfb7d60  /root/helix/work/workitem/e
----- end Tue Jan 24 20:17:06 UTC 2023 ----- exit code 139 ----------------------------------------------------------
exit code 139 means SIGSEGV Illegal memory access. Deref invalid pointer, overrunning buffer, stack overflow etc. Core dumped.
ulimit -c value: unlimited
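As a quick sanity check (not taken from the log above): POSIX shells report `128 + signal number` for a child killed by a signal, and SIGSEGV is signal 11, which is how the wrapper arrives at exit code 139. A minimal reproduction:

```shell
# A process terminated by SIGSEGV (signal 11) yields exit status 128 + 11 = 139.
sh -c 'kill -SEGV $$'
echo "exit code: $?"
```

The same arithmetic explains the earlier exit code 134 (128 + 6, i.e. SIGABRT).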
Waiting a few seconds for any dump to be written..
cat /proc/sys/kernel/core_pattern: /home/helixbot/dotnetbuild/dumps/core.%u.%p
cat /proc/sys/kernel/core_uses_pid: 0
cat: /proc/sys/kernel/coredump_filter: No such file or directory
cat /proc/sys/kernel/coredump_filter:
Looking around for any Linux dump..
... found no dump in /root/helix/work/workitem/e
+ export _commandExitCode=139
+ python /root/helix/work/correlation/reporter/run.py https://dev.azure.com/dnceng-public/ public 3144091 eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6Im9PdmN6NU1fN3AtSGpJS2xGWHo5M3VfVjBabyJ9.eyJuYW1laWQiOiJjNzczZjJjMi01MTIwLTQyMDctYWZlMi1hZmFmMzVhOGJjMGEiLCJzY3AiOiJhcHBfdG9rZW4iLCJhdWkiOiI4MDk2NTRkNC1iMjdkLTQ4ODItODhiMC01MjdiNDEzMThkMmQiLCJzaWQiOiI4YzBlODU1MS00YmY3LTQ0ZTUtYjAxNi04NjIzNmJkNmQyYzEiLCJCdWlsZElkIjoiY2JiMTgyNjEtYzQ4Zi00YWJiLTg2NTEtOGNkY2I1NDc0NjQ5OzE0Njk5OSIsInBwaWQiOiJ2c3RmczovLy9CdWlsZC9CdWlsZC8xNDY5OTkiLCJvcmNoaWQiOiJiMGIxMWI5MS1kZDgyLTQ2ZGEtYmY5My03ZDMyMTFmOTFlOWMuYnVpbGQubGlicmFyaWVzX3Rlc3RfcnVuX3JlbGVhc2VfbW9ub19saW51eF9hcm02NF9kZWJ1Zy5fX2RlZmF1bHQiLCJyZXBvSWRzIjoiIiwiaXNzIjoiYXBwLnZzdG9rZW4udmlzdWFsc3R1ZGlvLmNvbSIsImF1ZCI6ImFwcC52c3Rva2VuLnZpc3VhbHN0dWRpby5jb218dnNvOjZmY2M5MmU1LTczYTctNGY4OC04ZDEzLWQ5MDQ1YjQ1ZmIyNyIsIm5iZiI6MTY3NDU4ODQzNSwiZXhwIjoxNjc0NTk4NjM1fQ.2F4kxlj-2u_b13NpFwPNbLvm6BUC1xJ64_ZrdCLyvY7s0IjbJ9RR9IP-P0w_BMdZQQIHDCcSN4-Vd2IQuD58uR_Zr4mFyxl9JpCPSzz-SYZlDzAqdELf3S2t12AIvrHE0myXSCjXZxoe9OpK-rpfeXthdrnjf1x9LddoxcruIkzTqjWjJm4vlGZrE4fdazQIRwwBwDiQVPmnKyV4CK3xO64iHbJRUn_cHiESOzu5ueGsvkxkXmUNfLhiLlTSL6h_r39eq2ctuVx5J9Fz5OSz2_WYIZsQGl2XsYYlgEa6OXKFvRNzFPLO5A6ZGzM2bcPBLQzthe7gSb3dSbi9h7HnJQ
2023-01-24T20:17:17.601Z	INFO   	run.py	run(48)	main	Beginning reading of test results.
2023-01-24T20:17:17.602Z	INFO   	run.py	__init__(42)	read_results	Searching '/root/helix/work/workitem/e' for test results files
2023-01-24T20:17:17.602Z	INFO   	run.py	__init__(42)	read_results	Searching '/root/helix/work/workitem/uploads' for test results files
2023-01-24T20:17:17.602Z	WARNING	run.py	__init__(55)	read_results	No results file found in any of the following formats: xunit, junit, trx
2023-01-24T20:17:17.603Z	INFO   	run.py	packing_test_reporter(30)	report_results	Packing 0 test reports to '/root/helix/work/workitem/e/__test_report.json'
2023-01-24T20:17:17.603Z	INFO   	run.py	packing_test_reporter(33)	report_results	Packed 1466 bytes
+ exit 139
+ export _commandExitCode=139
+ chmod -R 777 /home/helixbot/dotnetbuild/dumps
+ exit 139

[END EXECUTION]
Exit Code:139

Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests

Callstack
----- start Tue 24 Jan 2023 08:17:23 PM UTC =============== To repro directly: =====================================================
pushd .
/root/helix/work/correlation/dotnet exec --runtimeconfig Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.runtimeconfig.json --depsfile Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.deps.json xunit.console.dll Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.dll -xml testResults.xml -nologo -nocolor -notrait category=IgnoreForCI -notrait category=OuterLoop -notrait category=failing 
popd
===========================================================================================================
/root/helix/work/workitem/e /root/helix/work/workitem/e
  Discovering: Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests (method display = ClassAndMethod, method display options = None)
  Discovered:  Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests (found 68 of 69 test cases)
  Starting:    Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests (parallel test collections = on, max threads = 2)

=================================================================
	Native Crash Reporting
=================================================================
Got a SIGSEGV while executing native code. This usually indicates
a fatal error in the mono runtime or one of the native libraries 
used by your application.
=================================================================

=================================================================
	Native stacktrace:
=================================================================
	0xffb88835b8e8 - Unknown

=================================================================
	External Debugger Dump:
=================================================================
[New LWP 23]
[New LWP 24]
[New LWP 25]
[New LWP 26]
[New LWP 29]
[New LWP 30]
[New LWP 31]
[New LWP 32]
[New LWP 33]
[New LWP 34]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/aarch64-linux-gnu/libthread_db.so.1".
futex_wait_cancelable (private=0, expected=0, futex_word=0xaba741d5bb4c) at ../sysdeps/nptl/futex-internal.h:186
186	../sysdeps/nptl/futex-internal.h: No such file or directory.
  Id   Target Id                                        Frame 
* 1    Thread 0xffb8889f9010 (LWP 22) "dotnet"          futex_wait_cancelable (private=0, expected=0, futex_word=0xaba741d5bb4c) at ../sysdeps/nptl/futex-internal.h:186
  2    Thread 0xffb8877ff1b0 (LWP 23) "SGen worker"     futex_wait_cancelable (private=0, expected=0, futex_word=0xffb88841a930 <work_cond+40>) at ../sysdeps/nptl/futex-internal.h:186
  3    Thread 0xffb885a6d1b0 (LWP 24) ".NET EventPipe"  0x0000ffb88863def4 in __GI___poll (fds=0xffb878003b20, nfds=1, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
  4    Thread 0xffb88586c1b0 (LWP 25) "Finalizer"       futex_abstimed_wait_cancelable (private=0, abstime=0x0, clockid=0, expected=0, futex_word=0xffb88840b9f8 <finalizer_sem>) at ../sysdeps/nptl/futex-internal.h:323
  5    Thread 0xffb87d4e71b0 (LWP 26) ".NET SigHandler" __libc_read (nbytes=1, buf=0xffb87d4e6977, fd=<optimized out>) at ../sysdeps/unix/sysv/linux/read.c:26
  6    Thread 0xffb87d1271b0 (LWP 29) ".NET Long Runni" futex_wait_cancelable (private=0, expected=0, futex_word=0xffb87400d168) at ../sysdeps/nptl/futex-internal.h:186
  7    Thread 0xffb87cf261b0 (LWP 30) ".NET TP Worker"  futex_abstimed_wait_cancelable (private=0, abstime=0xffb87cf250e8, clockid=<optimized out>, expected=0, futex_word=0xffb868008a58) at ../sysdeps/nptl/futex-internal.h:323
  8    Thread 0xffb87ccb51b0 (LWP 31) ".NET TP Gate"    futex_abstimed_wait_cancelable (private=0, abstime=0x0, clockid=0, expected=0, futex_word=0xffb86c000bc0) at ../sysdeps/nptl/futex-internal.h:323
  9    Thread 0xffb87cc541b0 (LWP 32) ".NET TP Worker"  futex_abstimed_wait_cancelable (private=0, abstime=0xffb87cc53320, clockid=<optimized out>, expected=0, futex_word=0xffb87cc53398) at ../sysdeps/nptl/futex-internal.h:323
  10   Thread 0xffb87ca231b0 (LWP 33) ".NET Long Runni" 0x0000ffb88861977c in __GI___wait4 (pid=<optimized out>, stat_loc=0xffb87ca1c510, options=0, usage=0x0) at ../sysdeps/unix/sysv/linux/wait4.c:27
  11   Thread 0xffb87c8221b0 (LWP 34) ".NET Long Runni" futex_abstimed_wait_cancelable (private=0, abstime=0x0, clockid=0, expected=0, futex_word=0xffb8884109f8 <suspend_semaphore>) at ../sysdeps/nptl/futex-internal.h:323

Thread 11 (Thread 0xffb87c8221b0 (LWP 34) ".NET Long Runni"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0x0, clockid=0, expected=0, futex_word=0xffb8884109f8 <suspend_semaphore>) at ../sysdeps/nptl/futex-internal.h:323
#1  do_futex_wait (sem=sem@entry=0xffb8884109f8 <suspend_semaphore>, abstime=0x0, clockid=0) at sem_waitcommon.c:112
#2  0x0000ffb8889b633c in __new_sem_wait_slow (sem=0xffb8884109f8 <suspend_semaphore>, abstime=0x0, clockid=0) at sem_waitcommon.c:184
#3  0x0000ffb88821e55c in mono_os_sem_wait (sem=<optimized out>, flags=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-os-semaphore.h:204
#4  mono_os_sem_timedwait (sem=0xffb8884109f8 <suspend_semaphore>, timeout_ms=4294967295, flags=MONO_SEM_FLAGS_NONE) at /__w/1/s/src/mono/mono/utils/mono-os-semaphore.h:237
#5  0x0000ffb88821e280 in mono_threads_wait_pending_operations () at /__w/1/s/src/mono/mono/utils/mono-threads.c:323
#6  0x0000ffb8881fb948 in unified_suspend_stop_world (flags=MONO_THREAD_INFO_FLAGS_NO_GC, thread_stopped_callback=0xffb8881fbd74 <sgen_client_stop_world_thread_stopped_callback>) at /__w/1/s/src/mono/mono/metadata/sgen-stw.c:345
#7  0x0000ffb8881fb630 in sgen_client_stop_world (generation=0, serial_collection=0) at /__w/1/s/src/mono/mono/metadata/sgen-stw.c:155
#8  0x0000ffb888236e80 in sgen_stop_world (generation=0, serial_collection=0) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:3995
#9  0x0000ffb888233958 in sgen_perform_collection_inner (requested_size=<optimized out>, generation_to_collect=<optimized out>, reason=<optimized out>, forced_serial=<optimized out>, stw=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:2643
#10 sgen_perform_collection (requested_size=4096, generation_to_collect=0, reason=0xffb888109429 "Nursery full", forced_serial=0, stw=1) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:2766
#11 0x0000ffb888233894 in sgen_ensure_free_space (size=4096, generation=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-gc.c:2622
#12 0x0000ffb888227f28 in sgen_alloc_obj_nolock (vtable=0xaba74283bc58, size=80) at /__w/1/s/src/mono/mono/sgen/sgen-alloc.c:279
#13 0x0000ffb888228440 in sgen_alloc_obj (vtable=0xaba74283bc58, size=80) at /__w/1/s/src/mono/mono/sgen/sgen-alloc.c:454
#14 0x0000ffb8881fd7a4 in mono_gc_alloc_obj (vtable=0xffb8884109f8 <suspend_semaphore>, size=393) at /__w/1/s/src/mono/mono/metadata/sgen-mono.c:904
#15 0x0000ffb884ebf044 in ?? ()
#16 0x0000000000000004 in ?? ()
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Thread 10 (Thread 0xffb87ca231b0 (LWP 33) ".NET Long Runni"):
#0  0x0000ffb88861977c in __GI___wait4 (pid=<optimized out>, stat_loc=0xffb87ca1c510, options=0, usage=0x0) at ../sysdeps/unix/sysv/linux/wait4.c:27
#1  0x0000ffb88835b9e8 in dump_native_stacktrace (signal=<optimized out>, mctx=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:843
#2  mono_dump_native_crash_info (signal=<optimized out>, mctx=0xffb87ca1cf70, info=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:870
#3  0x0000ffb88831a5c0 in mono_handle_native_crash (signal=0xffb8881181d2 "SIGSEGV", mctx=0xffb87ca1cf70, info=0xffb87ca1d2d0) at /__w/1/s/src/mono/mono/mini/mini-exceptions.c:3005
#4  0x0000ffb888283b90 in mono_sigsegv_signal_handler_debug (_dummy=11, _info=0xffb87ca1d2d0, context=0xffb87ca1d350, debug_fault_addr=0x0) at /__w/1/s/src/mono/mono/mini/mini-runtime.c:3749
#5  <signal handler called>
#6  0x0000000000000000 in ?? ()
#7  0x0000ffb85facdf48 in ?? ()
#8  0x0000ffb887bfa150 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 9 (Thread 0xffb87cc541b0 (LWP 32) ".NET TP Worker"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0xffb87cc53320, clockid=<optimized out>, expected=0, futex_word=0xffb87cc53398) at ../sysdeps/nptl/futex-internal.h:323
#1  __pthread_cond_wait_common (abstime=0xffb87cc53320, clockid=<optimized out>, mutex=0xffb87400bc80, cond=0xffb87cc53370) at pthread_cond_wait.c:520
#2  __pthread_cond_timedwait (cond=0xffb87cc53370, mutex=0xffb87400bc80, abstime=0xffb87cc53320) at pthread_cond_wait.c:656
#3  0x0000ffb888216d4c in mono_os_cond_timedwait (cond=0xffb87cc53370, mutex=0xffb87400bc80, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/mono-os-mutex.c:75
#4  0x0000ffb88821be6c in mono_coop_cond_timedwait (cond=0xffb87cc53370, mutex=<optimized out>, timeout_ms=20000) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-mutex.h:103
#5  mono_lifo_semaphore_timed_wait (semaphore=0xffb87400bc80, timeout_ms=20000) at /__w/1/s/src/mono/mono/utils/lifo-semaphore.c:48
#6  0x0000ffb87cc64480 in ?? ()
#7  0x0000ffb8879f2838 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 8 (Thread 0xffb87ccb51b0 (LWP 31) ".NET TP Gate"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0x0, clockid=0, expected=0, futex_word=0xffb86c000bc0) at ../sysdeps/nptl/futex-internal.h:323
#1  do_futex_wait (sem=sem@entry=0xffb86c000bc0, abstime=0x0, clockid=0) at sem_waitcommon.c:112
#2  0x0000ffb8889b633c in __new_sem_wait_slow (sem=0xffb86c000bc0, abstime=0x0, clockid=0) at sem_waitcommon.c:184
#3  0x0000ffb88821e0a4 in mono_os_sem_wait (sem=0xffb86c000bc0, flags=MONO_SEM_FLAGS_NONE) at /__w/1/s/src/mono/mono/utils/mono-os-semaphore.h:204
#4  mono_thread_info_wait_for_resume (info=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-threads.c:238
#5  0x0000ffb888224408 in mono_threads_exit_gc_safe_region_unbalanced_internal (cookie=0xffb86c000b60, stackdata=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-threads-coop.c:389
#6  mono_threads_exit_gc_safe_region_unbalanced (cookie=0xffb86c000b60, stackpointer=<optimized out>) at /__w/1/s/src/mono/mono/utils/mono-threads-coop.c:409
#7  0x0000ffb87cc6f7ac in ?? ()
#8  0x0000ffb87d133f50 in ?? ()
#9  0x0000ffb87ccb48e0 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 7 (Thread 0xffb87cf261b0 (LWP 30) ".NET TP Worker"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0xffb87cf250e8, clockid=<optimized out>, expected=0, futex_word=0xffb868008a58) at ../sysdeps/nptl/futex-internal.h:323
#1  __pthread_cond_wait_common (abstime=0xffb87cf250e8, clockid=<optimized out>, mutex=0xffb868008a00, cond=0xffb868008a30) at pthread_cond_wait.c:520
#2  __pthread_cond_timedwait (cond=0xffb868008a30, mutex=0xffb868008a00, abstime=0xffb87cf250e8) at pthread_cond_wait.c:656
#3  0x0000ffb88541cc4c in SystemNative_LowLevelMonitor_TimedWait (monitor=0xffb868008a00, timeoutMilliseconds=12000) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:195
#4  0x0000ffb87cc6f794 in ?? ()
#5  0x0000ffb87d133f50 in ?? ()
#6  0x0000ffb87cf258e0 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 6 (Thread 0xffb87d1271b0 (LWP 29) ".NET Long Runni"):
#0  futex_wait_cancelable (private=0, expected=0, futex_word=0xffb87400d168) at ../sysdeps/nptl/futex-internal.h:186
#1  __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0xffb87400d110, cond=0xffb87400d140) at pthread_cond_wait.c:508
#2  __pthread_cond_wait (cond=0xffb87400d140, mutex=0xffb87400d110) at pthread_cond_wait.c:638
#3  0x0000ffb88541cae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xffb87400d110) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ffb87d134514 in ?? ()
#5  0x0000ffb88791dba0 in ?? ()
#6  0x0000ffb88791e120 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

Thread 5 (Thread 0xffb87d4e71b0 (LWP 26) ".NET SigHandler"):
#0  __libc_read (nbytes=1, buf=0xffb87d4e6977, fd=<optimized out>) at ../sysdeps/unix/sysv/linux/read.c:26
#1  __libc_read (fd=<optimized out>, buf=0xffb87d4e6977, nbytes=1) at ../sysdeps/unix/sysv/linux/read.c:24
#2  0x0000ffb88541c19c in SignalHandlerLoop (arg=0xaba7424ceb70) at /__w/1/s/src/native/libs/System.Native/pal_signal.c:331
#3  0x0000ffb8889ac648 in start_thread (arg=0xffb87d4e6ab0) at pthread_create.c:477
#4  0x0000ffb888647c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 4 (Thread 0xffb88586c1b0 (LWP 25) "Finalizer"):
#0  futex_abstimed_wait_cancelable (private=0, abstime=0x0, clockid=0, expected=0, futex_word=0xffb88840b9f8 <finalizer_sem>) at ../sysdeps/nptl/futex-internal.h:323
#1  do_futex_wait (sem=sem@entry=0xffb88840b9f8 <finalizer_sem>, abstime=0x0, clockid=0) at sem_waitcommon.c:112
#2  0x0000ffb8889b633c in __new_sem_wait_slow (sem=0xffb88840b9f8 <finalizer_sem>, abstime=0x0, clockid=0) at sem_waitcommon.c:184
#3  0x0000ffb8881f1874 in mono_os_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../utils/mono-os-semaphore.h:204
#4  mono_coop_sem_wait (sem=<optimized out>, flags=MONO_SEM_FLAGS_ALERTABLE) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-coop-semaphore.h:41
#5  finalizer_thread (unused=<optimized out>) at /__w/1/s/src/mono/mono/metadata/gc.c:891
#6  0x0000ffb8881cbce4 in start_wrapper_internal (start_info=0x0, stack_ptr=<optimized out>) at /__w/1/s/src/mono/mono/metadata/threads.c:1202
#7  0x0000ffb8881cbb90 in start_wrapper (data=0xaba741665a20) at /__w/1/s/src/mono/mono/metadata/threads.c:1264
#8  0x0000ffb8889ac648 in start_thread (arg=0xffb88586bab0) at pthread_create.c:477
#9  0x0000ffb888647c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 3 (Thread 0xffb885a6d1b0 (LWP 24) ".NET EventPipe"):
#0  0x0000ffb88863def4 in __GI___poll (fds=0xffb878003b20, nfds=1, timeout=<optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:41
#1  0x0000ffb8883e8a88 in ipc_poll_fds (fds=<optimized out>, nfds=1, timeout=4294967295) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:470
#2  ds_ipc_poll (poll_handles_data=0xffb878003310, poll_handles_data_len=1, timeout_ms=4294967295, callback=0xffb8883e7e38 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc-pal-socket.c:1096
#3  0x0000ffb8883e6034 in ds_ipc_stream_factory_get_next_available_stream (callback=0xffb8883e7e38 <server_warning_callback>) at /__w/1/s/src/native/eventpipe/ds-ipc.c:395
#4  0x0000ffb8883e48ac in server_thread (data=<optimized out>) at /__w/1/s/src/native/eventpipe/ds-server.c:129
#5  0x0000ffb8883e7e18 in ep_rt_thread_mono_start_func (data=0xaba74163e3a0) at /__w/1/s/src/mono/mono/mini/../eventpipe/ep-rt-mono.h:1332
#6  0x0000ffb8889ac648 in start_thread (arg=0xffb885a6cab0) at pthread_create.c:477
#7  0x0000ffb888647c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 2 (Thread 0xffb8877ff1b0 (LWP 23) "SGen worker"):
#0  futex_wait_cancelable (private=0, expected=0, futex_word=0xffb88841a930 <work_cond+40>) at ../sysdeps/nptl/futex-internal.h:186
#1  __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0xffb88841a8d8 <lock>, cond=0xffb88841a908 <work_cond>) at pthread_cond_wait.c:508
#2  __pthread_cond_wait (cond=0xffb88841a908 <work_cond>, mutex=0xffb88841a8d8 <lock>) at pthread_cond_wait.c:638
#3  0x0000ffb88826bb70 in mono_os_cond_wait (cond=0xffb88841a930 <work_cond+40>, mutex=<optimized out>) at /__w/1/s/src/mono/mono/mini/../../mono/utils/mono-os-mutex.h:219
#4  get_work (worker_index=<optimized out>, work_context=<optimized out>, do_idle=<optimized out>, job=<optimized out>) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:167
#5  thread_func (data=0x0) at /__w/1/s/src/mono/mono/sgen/sgen-thread-pool.c:198
#6  0x0000ffb8889ac648 in start_thread (arg=0xffb8877feab0) at pthread_create.c:477
#7  0x0000ffb888647c1c in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78

Thread 1 (Thread 0xffb8889f9010 (LWP 22) "dotnet"):
#0  futex_wait_cancelable (private=0, expected=0, futex_word=0xaba741d5bb4c) at ../sysdeps/nptl/futex-internal.h:186
#1  __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0xaba741d5baf0, cond=0xaba741d5bb20) at pthread_cond_wait.c:508
#2  __pthread_cond_wait (cond=0xaba741d5bb20, mutex=0xaba741d5baf0) at pthread_cond_wait.c:638
#3  0x0000ffb88541cae8 in SystemNative_LowLevelMonitor_Wait (monitor=0xaba741d5baf0) at /__w/1/s/src/native/libs/System.Native/pal_threading.c:155
#4  0x0000ffb87d134514 in ?? ()
#5  0x0000ffb887919498 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)
[Inferior 1 (process 22) detached]

=================================================================
	Basic Fault Address Reporting
=================================================================
instruction pointer is NULL, skip dumping
=================================================================
	Managed Stacktrace:
=================================================================
	  at <unknown> <0xffffffff>
	  at System.Reflection.RuntimeMethodInfo:InternalInvoke <0x00007>
	  at System.Reflection.MethodInvoker:InterpretedInvoke <0x00073>
	  at System.Reflection.MethodInvoker:Invoke <0x00107>
	  at System.Reflection.RuntimeMethodInfo:Invoke <0x001ab>
	  at System.Reflection.MethodBase:Invoke <0x00053>
	  at Xunit.Sdk.TestInvoker`1:CallTestMethod <0x00047>
	  at <<InvokeTestMethodAsync>b__1>d:MoveNext <0x00377>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x0009f>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder:Start <0x0002f>
	  at <>c__DisplayClass48_0:<InvokeTestMethodAsync>b__1 <0x000cf>
	  at <AggregateAsync>d__4:MoveNext <0x000c3>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x0009b>
	  at Xunit.Sdk.ExecutionTimer:AggregateAsync <0x000ef>
	  at <>c__DisplayClass48_0:<InvokeTestMethodAsync>b__0 <0x00177>
	  at <RunAsync>d__9:MoveNext <0x00067>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x00093>
	  at Xunit.Sdk.ExceptionAggregator:RunAsync <0x000eb>
	  at <InvokeTestMethodAsync>d__48:MoveNext <0x0028b>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000a7>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestInvoker`1:InvokeTestMethodAsync <0x00117>
	  at Xunit.Sdk.XunitTestInvoker:InvokeTestMethodAsync <0x00127>
	  at <<RunAsync>b__47_0>d:MoveNext <0x0068f>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000af>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestInvoker`1:<RunAsync>b__47_0 <0x000d7>
	  at <RunAsync>d__10`1:MoveNext <0x00073>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x00093>
	  at Xunit.Sdk.ExceptionAggregator:RunAsync <0x000eb>
	  at Xunit.Sdk.TestInvoker`1:RunAsync <0x00103>
	  at Xunit.Sdk.XunitTestRunner:InvokeTestMethodAsync <0x000cb>
	  at <InvokeTestAsync>d__4:MoveNext <0x001c3>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000a3>
	  at Xunit.Sdk.XunitTestRunner:InvokeTestAsync <0x000f3>
	  at <>c__DisplayClass43_0:<RunAsync>b__0 <0x00043>
	  at <RunAsync>d__10`1:MoveNext <0x00083>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x0009f>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.ExceptionAggregator:RunAsync <0x00117>
	  at <RunAsync>d__43:MoveNext <0x0049b>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000af>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestRunner`1:RunAsync <0x000d7>
	  at Xunit.Sdk.XunitTestCaseRunner:RunTestAsync <0x000df>
	  at <RunAsync>d__19:MoveNext <0x0032f>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000a7>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestCaseRunner`1:RunAsync <0x000d3>
	  at Xunit.Sdk.XunitTestCase:RunAsync <0x000b3>
	  at Xunit.Sdk.XunitTestMethodRunner:RunTestCaseAsync <0x000a3>
	  at <RunTestCasesAsync>d__32:MoveNext <0x001cb>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000af>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestMethodRunner`1:RunTestCasesAsync <0x000d7>
	  at <RunAsync>d__31:MoveNext <0x001a3>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x0009f>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestMethodRunner`1:RunAsync <0x000cf>
	  at Xunit.Sdk.XunitTestClassRunner:RunTestMethodAsync <0x000df>
	  at <RunTestMethodsAsync>d__38:MoveNext <0x007d3>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000b7>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestClassRunner`1:RunTestMethodsAsync <0x000db>
	  at <RunAsync>d__37:MoveNext <0x0034f>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000a7>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestClassRunner`1:RunAsync <0x000d3>
	  at Xunit.Sdk.XunitTestCollectionRunner:RunTestClassAsync <0x000e3>
	  at <RunTestClassesAsync>d__28:MoveNext <0x003e7>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000af>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestCollectionRunner`1:RunTestClassesAsync <0x000d7>
	  at <RunAsync>d__27:MoveNext <0x0034f>
	  at System.Runtime.CompilerServices.AsyncMethodBuilderCore:Start <0x000a7>
	  at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1:Start <0x0002f>
	  at Xunit.Sdk.TestCollectionRunner`1:RunAsync <0x000d3>
	  at Xunit.Sdk.XunitTestAssemblyRunner:RunTestCollectionAsync <0x000af>
	  at <>c__DisplayClass14_2:<RunTestCollectionsAsync>b__2 <0x0006f>
	  at System.Threading.Tasks.Task`1:InnerInvoke <0x0006f>
	  at <>c:<.cctor>b__273_0 <0x0003b>
	  at System.Threading.ExecutionContext:RunInternal <0x000bf>
	  at System.Threading.Tasks.Task:ExecuteWithThreadLocal <0x00257>
	  at System.Threading.Tasks.Task:ExecuteEntry <0x000c7>
	  at <>c:<.cctor>b__8_0 <0x00067>
	  at Xunit.Sdk.MaxConcurrencySyncContext:RunOnSyncContext <0x0004f>
	  at <>c__DisplayClass11_0:<WorkerThreadProc>b__0 <0x00053>
	  at System.Threading.ExecutionContext:RunInternal <0x000bf>
	  at System.Threading.ExecutionContext:Run <0x00047>
	  at System.Object:lambda_method2 <0x0008b>
	  at Xunit.Sdk.ExecutionContextHelper:Run <0x00063>
	  at Xunit.Sdk.MaxConcurrencySyncContext:WorkerThreadProc <0x00233>
	  at <>c:<QueueUserWorkItem>b__5_0 <0x00077>
	  at System.Threading.Tasks.Task:InnerInvoke <0x000b7>
	  at <>c:<.cctor>b__273_0 <0x0003b>
	  at System.Threading.ExecutionContext:RunInternal <0x000bf>
	  at System.Threading.Tasks.Task:ExecuteWithThreadLocal <0x00257>
	  at System.Threading.Tasks.Task:ExecuteEntryUnsafe <0x000b3>
	  at <>c:<.cctor>b__10_0 <0x0006b>
	  at System.Threading.Thread:StartCallback <0x0012b>
	  at System.Object:runtime_invoke_void__this__ <0x00087>
=================================================================
./RunTests.sh: line 168:    22 Aborted                 (core dumped) "$RUNTIME_PATH/dotnet" exec --runtimeconfig Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.runtimeconfig.json --depsfile Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.deps.json xunit.console.dll Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.dll -xml testResults.xml -nologo -nocolor -notrait category=IgnoreForCI -notrait category=OuterLoop -notrait category=failing $RSP_FILE
/root/helix/work/workitem/e
----- end Tue 24 Jan 2023 08:17:50 PM UTC ----- exit code 134 ----------------------------------------------------------
exit code 134 means SIGABRT Abort. Managed or native assert, or runtime check such as heap corruption, caused call to abort(). Core dumped.
ulimit -c value: unlimited
[ 6739.965300] docker0: port 1(vethb8d78aa) entered disabled state
[ 6739.967232] device vethb8d78aa left promiscuous mode
[ 6739.967243] docker0: port 1(vethb8d78aa) entered disabled state
[ 6766.376926] docker0: port 1(veth4a00c91) entered blocking state
[ 6766.376930] docker0: port 1(veth4a00c91) entered disabled state
[ 6766.377002] device veth4a00c91 entered promiscuous mode
[ 6767.379352] eth0: renamed from vethf99e3ff
[ 6767.430950] IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready
[ 6767.430973] IPv6: ADDRCONF(NETDEV_CHANGE): veth4a00c91: link becomes ready
[ 6767.431002] docker0: port 1(veth4a00c91) entered blocking state
[ 6767.431004] docker0: port 1(veth4a00c91) entered forwarding state
[ 6770.235476] docker0: port 1(veth4a00c91) entered disabled state
[ 6770.237020] vethf99e3ff: renamed from eth0
[ 6770.595680] docker0: port 1(veth4a00c91) entered disabled state
[ 6770.597219] device veth4a00c91 left promiscuous mode
[ 6770.597231] docker0: port 1(veth4a00c91) entered disabled state
[ 6787.060065] docker0: port 1(vethaeeee3b) entered blocking state
[ 6787.060067] docker0: port 1(vethaeeee3b) entered disabled state
[ 6787.060193] device vethaeeee3b entered promiscuous mode
[ 6787.734917] eth0: renamed from vethe49151d
[ 6787.787958] IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready
[ 6787.787981] IPv6: ADDRCONF(NETDEV_CHANGE): vethaeeee3b: link becomes ready
[ 6787.788013] docker0: port 1(vethaeeee3b) entered blocking state
[ 6787.788014] docker0: port 1(vethaeeee3b) entered forwarding state
[ 6790.278797] docker0: port 1(vethaeeee3b) entered disabled state
[ 6790.279554] vethe49151d: renamed from eth0
[ 6790.524430] docker0: port 1(vethaeeee3b) entered disabled state
[ 6790.526891] device vethaeeee3b left promiscuous mode
[ 6790.526903] docker0: port 1(vethaeeee3b) entered disabled state
[ 6800.652166] docker0: port 1(vethb3d8eb6) entered blocking state
[ 6800.652168] docker0: port 1(vethb3d8eb6) entered disabled state
[ 6800.652533] device vethb3d8eb6 entered promiscuous mode
[ 6801.109553] eth0: renamed from veth84d4264
[ 6801.163032] IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready
[ 6801.163054] IPv6: ADDRCONF(NETDEV_CHANGE): vethb3d8eb6: link becomes ready
[ 6801.163080] docker0: port 1(vethb3d8eb6) entered blocking state
[ 6801.163081] docker0: port 1(vethb3d8eb6) entered forwarding state
[ 6807.710574] docker0: port 1(vethb3d8eb6) entered disabled state
[ 6807.710631] veth84d4264: renamed from eth0
[ 6807.954694] docker0: port 1(vethb3d8eb6) entered disabled state
[ 6807.957459] device vethb3d8eb6 left promiscuous mode
[ 6807.957471] docker0: port 1(vethb3d8eb6) entered disabled state
[ 6828.749568] docker0: port 1(veth75e5c29) entered blocking state
[ 6828.749571] docker0: port 1(veth75e5c29) entered disabled state
[ 6828.749701] device veth75e5c29 entered promiscuous mode
[ 6829.566756] eth0: renamed from veth0b3f5bd
[ 6829.618375] IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready
[ 6829.618396] IPv6: ADDRCONF(NETDEV_CHANGE): veth75e5c29: link becomes ready
[ 6829.618429] docker0: port 1(veth75e5c29) entered blocking state
[ 6829.618431] docker0: port 1(veth75e5c29) entered forwarding state
Waiting a few seconds for any dump to be written..
cat /proc/sys/kernel/core_pattern: /home/helixbot/dotnetbuild/dumps/core.%u.%p
cat /proc/sys/kernel/core_uses_pid: 0
cat /proc/sys/kernel/coredump_filter:
Looking around for any Linux dump..
cat: /proc/sys/kernel/coredump_filter: No such file or directory
... found no dump in /root/helix/work/workitem/e
+ export _commandExitCode=134
+ python /root/helix/work/correlation/reporter/run.py https://dev.azure.com/dnceng-public/ public 3144092 eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6Im9PdmN6NU1fN3AtSGpJS2xGWHo5M3VfVjBabyJ9.eyJuYW1laWQiOiJjNzczZjJjMi01MTIwLTQyMDctYWZlMi1hZmFmMzVhOGJjMGEiLCJzY3AiOiJhcHBfdG9rZW4iLCJhdWkiOiI4MDk2NTRkNC1iMjdkLTQ4ODItODhiMC01MjdiNDEzMThkMmQiLCJzaWQiOiI4YzBlODU1MS00YmY3LTQ0ZTUtYjAxNi04NjIzNmJkNmQyYzEiLCJCdWlsZElkIjoiY2JiMTgyNjEtYzQ4Zi00YWJiLTg2NTEtOGNkY2I1NDc0NjQ5OzE0Njk5OSIsInBwaWQiOiJ2c3RmczovLy9CdWlsZC9CdWlsZC8xNDY5OTkiLCJvcmNoaWQiOiJiMGIxMWI5MS1kZDgyLTQ2ZGEtYmY5My03ZDMyMTFmOTFlOWMuYnVpbGQubGlicmFyaWVzX3Rlc3RfcnVuX3JlbGVhc2VfbW9ub19saW51eF9hcm02NF9kZWJ1Zy5fX2RlZmF1bHQiLCJyZXBvSWRzIjoiIiwiaXNzIjoiYXBwLnZzdG9rZW4udmlzdWFsc3R1ZGlvLmNvbSIsImF1ZCI6ImFwcC52c3Rva2VuLnZpc3VhbHN0dWRpby5jb218dnNvOjZmY2M5MmU1LTczYTctNGY4OC04ZDEzLWQ5MDQ1YjQ1ZmIyNyIsIm5iZiI6MTY3NDU4ODQzNSwiZXhwIjoxNjc0NTk4NjM1fQ.2F4kxlj-2u_b13NpFwPNbLvm6BUC1xJ64_ZrdCLyvY7s0IjbJ9RR9IP-P0w_BMdZQQIHDCcSN4-Vd2IQuD58uR_Zr4mFyxl9JpCPSzz-SYZlDzAqdELf3S2t12AIvrHE0myXSCjXZxoe9OpK-rpfeXthdrnjf1x9LddoxcruIkzTqjWjJm4vlGZrE4fdazQIRwwBwDiQVPmnKyV4CK3xO64iHbJRUn_cHiESOzu5ueGsvkxkXmUNfLhiLlTSL6h_r39eq2ctuVx5J9Fz5OSz2_WYIZsQGl2XsYYlgEa6OXKFvRNzFPLO5A6ZGzM2bcPBLQzthe7gSb3dSbi9h7HnJQ
2023-01-24T20:18:01.170Z	INFO   	run.py	run(48)	main	Beginning reading of test results.
2023-01-24T20:18:01.171Z	INFO   	run.py	__init__(42)	read_results	Searching '/root/helix/work/workitem/e' for test results files
2023-01-24T20:18:01.172Z	INFO   	run.py	__init__(42)	read_results	Searching '/root/helix/work/workitem/uploads' for test results files
2023-01-24T20:18:01.172Z	WARNING	run.py	__init__(55)	read_results	No results file found in any of the following formats: xunit, junit, trx
2023-01-24T20:18:01.172Z	INFO   	run.py	packing_test_reporter(30)	report_results	Packing 0 test reports to '/root/helix/work/workitem/e/__test_report.json'
2023-01-24T20:18:01.173Z	INFO   	run.py	packing_test_reporter(33)	report_results	Packed 1439 bytes
+ exit 134
+ export _commandExitCode=134
+ chmod -R 777 /home/helixbot/dotnetbuild/dumps
+ exit 134

[END EXECUTION]
Exit Code:134

@carlossanlop

This issue has a heavy impact on many PRs. Looking for an area owner.

@steveisok

steveisok commented Mar 9, 2023

> Can't repro it either. Would suggest disabling these for the time being.

This seems to happen in different suites, so disabling may turn into a whack-a-mole game. For example, I spotted one happening in System.Security.Cryptography tests.

https://helixre8s23ayyeko0k025g8.blob.core.windows.net/dotnet-runtime-refs-pull-83182-merge-d76124716c4942d188/System.Security.Cryptography.Tests/1/console.e84626ab.log?helixlogtype=result

@lambdageek

Going to try to decode the crashing IP address in-process in the crash handler: #83219

@vargaz

vargaz commented Mar 10, 2023

I can reproduce a crash on macOS with the Microsoft.Extensions.Logging.Abstractions/tests/Microsoft.Extensions.Logging.Generators.Tests/Microsoft.Extensions.Logging.Generators.Roslyn4.0.Tests.csproj test suite.
It happens much more often there, though it could be a different issue.

@vargaz

vargaz commented Mar 18, 2023

Some of the failures appear to be races with delegate invocation.
Testcase:

using System;
using System.Threading;

public class Tests
{
	public void foo () {
	}

	static Action[] delegates;

	private static void Main(string[] args)
	{
		var t = new Tests ();
		delegates = new Action [100000];
		for (int i = 0; i < delegates.Length; ++i)
			delegates [i] = new Action(t.foo);

		var arr = new Thread[10];
		for (int i = 0; i < 10; ++i) {
			arr [i] = new Thread (delegate () { for (int j = 0; j < delegates.Length; ++j) delegates [j] (); });
		}
		for (int i = 0; i < 10; ++i)
			arr [i].Start ();
    }
}

This fails when run in a loop on an Apple M1.

@vargaz

vargaz commented Mar 18, 2023

So the problem seems to be related to the invoke trampolines, which do:

	arm_ldrx (code, ARMREG_IP0, ARMREG_R0, MONO_STRUCT_OFFSET (MonoDelegate, method_ptr));
	arm_ldrx (code, ARMREG_R0, ARMREG_R0, MONO_STRUCT_OFFSET (MonoDelegate, target));
	code = mono_arm_emit_brx (code, ARMREG_IP0);

These trampolines are stored into delegate->invoke_impl by mono_delegate_trampoline (). Another thread makes an indirect call through delegate->invoke_impl and observes delegate->method_ptr being NULL. delegate->method_ptr is also set by mono_delegate_trampoline (), but even with a memory barrier between the stores, it's not guaranteed that another thread will not observe the previous NULL value.

@lambdageek

lambdageek commented Mar 21, 2023

We think #83673 addresses this issue. Keeping the issue open a little longer to see if the runfo numbers go down.

The numbers on March 21, 2023, before the change were:

	24-Hour Hit Count   7-Day Hit Count   1-Month Count
	30                  153               535

@carlossanlop

Thanks @lambdageek and @vargaz.

Once we confirm that this fixes the issue, please consider backporting it to 7.0; we are seeing the failure happen there too. [Pending verification of benchmark results, as @lambdageek told me offline.]

@vargaz

vargaz commented Mar 21, 2023

#83688 is probably needed too.


@lambdageek

Most of the recent hits are either on the 7.0 branch or are unrelated false positives (the query catches too many other native crashes).

There is one hit from #84004 in System.Security.Cryptography.Tests.WorkItemExecution that seems real:

Thread 12 (Thread 0xffae348e11c0 (LWP 40)):
#0  0x0000ffae3efead5c in __waitpid (pid=<optimized out>, stat_loc=0xffae348de0b0, options=<optimized out>) at ../sysdeps/unix/sysv/linux/waitpid.c:30
#1  0x0000ffae3e8a1708 in dump_native_stacktrace (signal=<optimized out>, mctx=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:843
#2  mono_dump_native_crash_info (signal=<optimized out>, mctx=0xffae348deb10, info=<optimized out>) at /__w/1/s/src/mono/mono/mini/mini-posix.c:870
#3  0x0000ffae3e857f5c in mono_handle_native_crash (signal=0xffae3e771720 "SIGSEGV", mctx=0xffae348deb10, info=0xffae348dee70) at /__w/1/s/src/mono/mono/mini/mini-exceptions.c:2979
#4  0x0000ffae3e7bfa60 in mono_sigsegv_signal_handler_debug (_dummy=11, _info=0xffae348dee70, context=0xffae348deef0, debug_fault_addr=0x0) at /__w/1/s/src/mono/mono/mini/mini-runtime.c:3758
#5  <signal handler called>
#6  0x0000000000000000 in ?? ()
#7  0x0000ffae25efcf00 in ?? ()
#8  0x0000ffae2618eb60 in ?? ()
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

@vargaz

vargaz commented Mar 29, 2023

It still seems to crash, but the crashes are less frequent.

@hoyosjs

hoyosjs commented Jul 21, 2023

The XML tests that have been failing lately are showing something like:

Thread 9 (Thread 0x7f488d899700 (LWP 318) "dotnet"):
#0  0x00007f4899231747 in __GI___wait4 (pid=323, stat_loc=0x7f488d896140, options=0, usage=0x0) at ../sysdeps/unix/sysv/linux/wait4.c:27
#1  0x00007f4898e4b412 in mono_dump_native_crash_info () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#2  0x00007f4898df29ae in mono_handle_native_crash () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#3  0x00007f4898e4ab81 in sigabrt_signal_handler () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#4  <signal handler called>
#5  __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:50
#6  0x00007f489918c537 in __GI_abort () at abort.c:79
#7  0x00007f4898ee8085 in monoeg_assert_abort () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#8  0x00007f4898ef9cf6 in mono_log_write_logfile () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#9  0x00007f4898ee84ef in monoeg_g_logv_nofree () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#10 0x00007f4898ee8655 in monoeg_assertion_message () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#11 0x00007f4898ee8697 in mono_assertion_message () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#12 0x00007f4898e7b1a7 in generate () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#13 0x00007f4898e75daf in mono_interp_transform_method () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#14 0x00007f4898e5d3ed in do_transform_method () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#15 0x00007f4898e4f7c6 in mono_interp_exec_method () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#16 0x00007f4898e4ce36 in interp_runtime_invoke () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#17 0x00007f4898f79767 in mono_runtime_invoke_checked () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#18 0x00007f4898f91984 in start_wrapper () from /root/helix/work/correlation/shared/Microsoft.NETCore.App/8.0.0/libcoreclr.so
#19 0x00007f4899676ea7 in start_thread (arg=<optimized out>) at pthread_create.c:477
#20 0x00007f4899265a2f in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:95

with reported native stack:

=================================================================
	Native stacktrace:
=================================================================
	0x7f4898e4b2e1 - Unknown
	0x7f4898df29ae - Unknown
	0x7f4898e4ab81 - Unknown
	0x7f4899682140 - Unknown
	0x7f48991a2ce1 - Unknown
	0x7f489918c537 - Unknown
	0x7f4898ee8085 - Unknown
	0x7f4898ef9cf6 - Unknown
	0x7f4898ee84ef - Unknown
	0x7f4898ee8655 - Unknown
	0x7f4898ee8697 - Unknown
	0x7f4898e7b1a7 - Unknown
	0x7f4898e75daf - Unknown
	0x7f4898e5d3ed - Unknown
	0x7f4898e4f7c6 - Unknown
	0x7f4898e4ce36 - Unknown
	0x7f4898f79767 - Unknown
	0x7f4898f91984 - Unknown
	0x7f4899676ea7 - Unknown
	0x7f4899265a2f - Unknown
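The "Unknown" frames above are absolute runtime addresses, so they can still be symbolized after the fact: subtract the load base of libcoreclr.so (taken from `/proc/<pid>/maps` or a core dump) and pass the resulting offsets to `addr2line` against the matching unstripped binary. A minimal sketch of the rebasing step, using an assumed example load base (not the real one from this crash):

```python
# Hypothetical helper: convert absolute addresses from a mono
# "Native stacktrace" dump into module-relative offsets for addr2line.
# The load base used below is an assumed example value.

def parse_crash_addresses(dump: str) -> list[int]:
    """Extract hex addresses from lines like '0x7f4898e4b2e1 - Unknown'."""
    addrs = []
    for line in dump.splitlines():
        line = line.strip()
        if line.endswith("- Unknown"):
            addrs.append(int(line.split()[0], 16))
    return addrs


def to_offsets(addrs: list[int], base: int) -> list[int]:
    """Rebase absolute runtime addresses to offsets inside the shared library."""
    return [a - base for a in addrs]


if __name__ == "__main__":
    dump = """
        0x7f4898e4b2e1 - Unknown
        0x7f4898df29ae - Unknown
    """
    base = 0x7F4898C00000  # assumed libcoreclr.so base from /proc/<pid>/maps
    for off in to_offsets(parse_crash_addresses(dump), base):
        # Each offset could then be fed to:
        #   addr2line -f -C -e libcoreclr.so <offset>
        print(hex(off))
```

This only recovers symbols when the binary on hand matches the build that crashed; for stripped CI binaries the matching symbol files are needed as well.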

and managed stack:

	Managed Stacktrace:
=================================================================
	  at <unknown> <0xffffffff>
	  at System.Object:<xsl:template match="/"> <0x00132>
	  at System.Object:Root <0x00038>
	  at System.Object:Execute <0x0000c>
	  at System.Xml.Xsl.XmlILCommand:Execute <0x0007e>
	  at System.Xml.Xsl.XmlILCommand:Execute <0x000c4>
	  at System.Xml.Xsl.XslCompiledTransform:Transform <0x00068>
	  at System.Xml.Xsl.XslCompiledTransform:Transform <0x00170>
	  at System.Xml.XslCompiledTransformApiTests.SameInstanceXslTransformReader:Transform <0x0003e>
	  at System.Xml.Tests.CThread:InternalThreadStart <0x0006a>
	  at System.Object:runtime_invoke_direct_void__this__ <0x001fc>
	  at <unknown> <0x00000>
=================================================================

These are fixed by #88892.

@SamMonoRT
Member

These two PRs, #88892 and #89231, should have fixed the issue. If you still encounter it, please re-open.

@ghost ghost locked as resolved and limited conversation to collaborators Sep 2, 2023