
[CI][Python] Some macOS wheels are failing with a segmentation fault when running test_parquet_dataset_lazy_filtering #39562

Closed
raulcd opened this issue Jan 11, 2024 · 7 comments · Fixed by #39632
Labels: Component: Continuous Integration · Component: Python · Critical Fix (Bugfixes for security vulnerabilities, crashes, or invalid data) · Type: bug

raulcd (Member) commented Jan 11, 2024

Describe the bug, including details regarding any error messages, version, and platform.

Some jobs intermittently fail with the following error:

Fatal Python error: Aborted

Current thread 0x00000001e28b2100 (most recent call first):
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pyarrow/tests/test_dataset.py", line 3753 in test_parquet_dataset_lazy_filtering
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/python.py", line 194 in pytest_pyfunc_call
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/python.py", line 1792 in runtest
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/runner.py", line 169 in pytest_runtest_call
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/runner.py", line 262 in <lambda>
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/runner.py", line 341 in from_call
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/runner.py", line 261 in call_runtest_hook
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/runner.py", line 222 in call_and_report
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/runner.py", line 133 in runtestprotocol
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/runner.py", line 114 in pytest_runtest_protocol
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/main.py", line 350 in pytest_runtestloop
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/main.py", line 325 in _main
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/main.py", line 271 in wrap_session
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/main.py", line 318 in pytest_cmdline_main
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/config/__init__.py", line 169 in main
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/_pytest/config/__init__.py", line 192 in console_main
  File "/Users/voltrondata/github-actions-runner/_work/crossbow/crossbow/test-arm64-env/lib/python3.11/site-packages/pytest/__main__.py", line 5 in <module>
  File "<frozen runpy>", line 88 in _run_code
  File "<frozen runpy>", line 198 in _run_module_as_main

Extension modules: numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, pyarrow.lib, pyarrow._hdfsio, pyarrow._fs, pyarrow._hdfs, pyarrow._gcsfs, pyarrow._s3fs, cython.cimports.libc.math, pyarrow._compute, pyarrow._acero, pyarrow._csv, pyarrow._json, pandas._libs.tslibs.np_datetime, pandas._libs.tslibs.dtypes, pandas._libs.tslibs.base, pandas._libs.tslibs.nattype, pandas._libs.tslibs.timezones, pandas._libs.tslibs.ccalendar, pandas._libs.tslibs.fields, pandas._libs.tslibs.timedeltas, pandas._libs.tslibs.tzconversion, pandas._libs.tslibs.timestamps, pandas._libs.properties, pandas._libs.tslibs.offsets, pandas._libs.tslibs.strptime, pandas._libs.tslibs.parsing, pandas._libs.tslibs.conversion, pandas._libs.tslibs.period, pandas._libs.tslibs.vectorized, pandas._libs.ops_dispatch, pandas._libs.missing, pandas._libs.hashtable, pandas._libs.algos, pandas._libs.interval, pandas._libs.lib, pandas._libs.ops, pandas._libs.arrays, pandas._libs.tslib, pandas._libs.sparse, pandas._libs.indexing, pandas._libs.index, pandas._libs.internals, pandas._libs.join, pandas._libs.writers, pandas._libs.window.aggregations, pandas._libs.window.indexers, pandas._libs.reshape, pandas._libs.groupby, pandas._libs.json, pandas._libs.parsers, pandas._libs.testing, pyarrow._dataset, pyarrow._dataset_orc, pyarrow._parquet, pyarrow._parquet_encryption, pyarrow._dataset_parquet_encryption, pyarrow._dataset_parquet, pyarrow._orc, pyarrow._flight, pyarrow._substrait, _cffi_backend, pyarrow._pyarrow_cpp_tests, pyarrow._feather, numpy.linalg.lapack_lite, pandas._libs.hashing, google._upb._message, markupsafe._speedups, grpc._cython.cygrpc, charset_normalizer.md, crc32c (total: 82)
arrow/ci/scripts/python_wheel_unix_test.sh: line 95: 13092 Abort trap: 6           python -m pytest -r s --pyargs pyarrow
...............................................s.....

Component(s)

Continuous Integration, Python

jorisvandenbossche (Member) commented:
The current assumption is that this started with #39065 (the crashing test wasn't added in that PR, but it does exercise functionality that PR touched).

I am trying to reproduce on a macOS M2 AWS instance by installing our nightly wheel, but I am unable to get a failure (either by running the test or by running a script that mimics it).

jorisvandenbossche changed the title from "[CI][Python] Come macOS wheels are failing with a segmentation fault when running test_parquet_dataset_lazy_filtering" to "[CI][Python] Some macOS wheels are failing with a segmentation fault when running test_parquet_dataset_lazy_filtering" on Jan 11, 2024
jorisvandenbossche (Member) commented Jan 11, 2024

I was using Python 3.11, for which our latest uploaded nightly wheel is from Monday, i.e. from before the change. After switching to Python 3.10 (which has a more recently uploaded wheel), I can reproduce the failure.

Running with lldb:

ec2-user@ip-172-31-44-53 ~ % lldb python3.10 -- -m pytest --pyargs pyarrow -k "test_parquet_dataset_lazy_filtering"
(lldb) target create "python3.10"
Current executable set to 'python3.10' (arm64).
(lldb) settings set -- target.run-args  "-m" "pytest" "--pyargs" "pyarrow" "-k" "test_parquet_dataset_lazy_filtering"
(lldb) run
Process 3716 launched: '/opt/homebrew/Cellar/python@3.10/3.10.13_1/Frameworks/Python.framework/Versions/3.10/Resources/Python.app/Contents/MacOS/Python' (arm64)
============================= test session starts ==============================
platform darwin -- Python 3.10.13, pytest-7.4.4, pluggy-1.3.0
rootdir: /Users/ec2-user
plugins: hypothesis-6.92.8, lazy-fixture-0.6.3
Process 3716 stopped
* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_INSTRUCTION (code=1, subcode=0x4a03000)
    frame #0: 0x000000011912d308 libarrow.1500.dylib`_armv7_neon_probe + 72
libarrow.1500.dylib`:
->  0x11912d308 <+0>: eor    z0.d, z0.d, z0.d
    0x11912d30c <+4>: ret    

libarrow.1500.dylib`:
    0x11912d310 <+0>: xar    z0.d, z0.d, z0.d, #0x20
    0x11912d314 <+4>: ret    
Target 0: (Python) stopped.

Our wheel of course has no debug symbols, so I am not sure whether the above is informative (is there a way to find out what those addresses point to?).

jorisvandenbossche (Member) commented:
Ignore the above lldb output; it is useless because of #37589. Thanks to the workaround mentioned in https://stackoverflow.com/questions/74059978/why-is-lldb-generating-exc-bad-instruction-with-user-compiled-library-on-macos/76032052#76032052 (settings set platform.plugin.darwin.ignored-exceptions EXC_BAD_INSTRUCTION), I could get an actual backtrace:

(lldb) process launch
Process 2066 launched: '/opt/homebrew/Cellar/python@3.10/3.10.13_1/Frameworks/Python.framework/Versions/3.10/Resources/Python.app/Contents/MacOS/Python' (arm64)
libc++abi: terminating due to uncaught exception of type std::length_error: vector
Process 2066 stopped
* thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGABRT
    frame #0: 0x00000001a9b30744 libsystem_kernel.dylib`__pthread_kill + 8
libsystem_kernel.dylib`:
->  0x1a9b30744 <+8>:  b.lo   0x1a9b30764               ; <+40>
    0x1a9b30748 <+12>: pacibsp 
    0x1a9b3074c <+16>: stp    x29, x30, [sp, #-0x10]!
    0x1a9b30750 <+20>: mov    x29, sp
Target 0: (Python) stopped.
(lldb) bt
* thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGABRT
  * frame #0: 0x00000001a9b30744 libsystem_kernel.dylib`__pthread_kill + 8
    frame #1: 0x00000001a9b67c28 libsystem_pthread.dylib`pthread_kill + 288
    frame #2: 0x00000001a9a75ae8 libsystem_c.dylib`abort + 180
    frame #3: 0x00000001a9b20b84 libc++abi.dylib`abort_message + 132
    frame #4: 0x00000001a9b103b4 libc++abi.dylib`demangling_terminate_handler() + 320
    frame #5: 0x00000001a97e6e68 libobjc.A.dylib`_objc_terminate() + 160
    frame #6: 0x00000001a9b1ff48 libc++abi.dylib`std::__terminate(void (*)()) + 16
    frame #7: 0x00000001a9b22d34 libc++abi.dylib`__cxxabiv1::failed_throw(__cxxabiv1::__cxa_exception*) + 36
    frame #8: 0x00000001a9b22ce0 libc++abi.dylib`__cxa_throw + 140
    frame #9: 0x0000000148022f90 libarrow_dataset.1500.dylib`std::__1::__throw_length_error[abi:v160006](char const*) + 60
    frame #10: 0x00000001480635f8 libarrow_dataset.1500.dylib`std::__1::vector<bool, std::__1::allocator<bool>>::__throw_length_error[abi:v160006]() const + 20
    frame #11: 0x000000014801d4f8 libarrow_dataset.1500.dylib`std::__1::vector<bool, std::__1::allocator<bool>>::resize(unsigned long, bool) + 600
    frame #12: 0x000000014801d14c libarrow_dataset.1500.dylib`arrow::dataset::ParquetFileFragment::SetMetadata(std::__1::shared_ptr<parquet::FileMetaData>, std::__1::shared_ptr<parquet::arrow::SchemaManifest>) + 432
    frame #13: 0x000000014801d7e4 libarrow_dataset.1500.dylib`arrow::dataset::ParquetFileFragment::SplitByRowGroup(arrow::compute::Expression) + 720
    frame #14: 0x000000010763b824 _dataset_parquet.cpython-310-darwin.so`__pyx_pw_7pyarrow_16_dataset_parquet_19ParquetFileFragment_5split_by_row_group(_object*, _object* const*, long, _object*) + 1428

So it is aborting with "terminating due to uncaught exception of type std::length_error: vector" on the vector resize in ParquetFileFragment::SetMetadata, presumably the one that I changed in #39065:

-    statistics_expressions_complete_.resize(physical_schema_->num_fields(), false);
+    statistics_expressions_complete_.resize(manifest_->descr->num_columns(), false);

I am wondering whether manifest_->descr->num_columns() could sometimes be undefined? The crash also happens specifically in a test where the dataset is created with ParquetDatasetFactory.

It's still very strange that this only occurs in the macOS wheels. I found a potentially similar issue (pyg-team/pytorch_geometric#4419), but also without a clear solution (the guess there was that it was related to interference from system libraries, and it was typically solved by using a (virtual) environment).

raulcd (Member, Author) commented Jan 12, 2024

Thanks @jorisvandenbossche for the investigation!

jorisvandenbossche (Member) commented:
So I could narrow the failure down to the following:

Status ParquetFileFragment::SetMetadata(
    std::shared_ptr<parquet::FileMetaData> metadata,
    std::shared_ptr<parquet::arrow::SchemaManifest> manifest) {
  DCHECK(row_groups_.has_value());
  metadata_ = std::move(metadata);
  manifest_ = std::move(manifest);
  statistics_expressions_.resize(row_groups_->size(), compute::literal(true));
  statistics_expressions_complete_.resize(manifest_->descr->num_columns(), false);

In the above snippet, manifest_->descr->num_columns() sometimes returns -1, so we do a vector resize with -1, which triggers the std::length_error crash.
(See #39567 for the reproducer: I added a Status check that the value is positive, and now the tests sometimes fail with that error instead of crashing.)

But I have no idea why it would sometimes return -1, and only on macOS when running the tests from an installed wheel (not in any of the other macOS builds where we build Arrow directly).

cc @pitrou @mapleFU in case you have any clue why parquet::arrow::SchemaManifest::descr->num_columns() would sometimes return -1. This method returns the size of an underlying vector:

// The number of physical columns appearing in the file
int num_columns() const { return static_cast<int>(leaves_.size()); }
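
As a standalone illustration of the failure mode (a minimal sketch, not Arrow code): passing a negative int to std::vector<bool>::resize converts it to a huge size_t, which exceeds the vector's max_size() and makes resize() throw std::length_error, matching the "std::length_error: vector" message in the backtrace above (observed here with libc++).

#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
  std::vector<bool> statistics_expressions_complete;
  int num_columns = -1;  // stand-in for a bogus descr->num_columns() result
  try {
    // -1 is implicitly converted to size_t(-1), far beyond the vector's max_size().
    statistics_expressions_complete.resize(num_columns, false);
  } catch (const std::length_error& e) {
    std::cerr << "std::length_error: " << e.what() << std::endl;  // prints "vector" with libc++
  }
  return 0;
}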

pitrou (Member) commented Jan 16, 2024

cc @pitrou @mapleFU in case you have any clue why parquet::arrow::SchemaManifest::descr->num_columns() would sometimes return -1.

Well, theoretically, leaves_ could have more than 2**31 elements, which, when cast to int, would result in a negative number. But of course that doesn't make sense in a unit test, and besides, it would then happen deterministically.

So the likely explanation is that the leaves_ vector is corrupted. And that's quite plausible, given that SchemaManifest holds its SchemaDescriptor through a raw, non-owning pointer (descr).
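
To make the lifetime hazard concrete, here is a minimal sketch with hypothetical stand-in types (not the actual Arrow/Parquet classes): a non-owning raw pointer that outlives the object it points into dangles, and reading through it is undefined behaviour that can surface as garbage values such as -1.

#include <memory>
#include <vector>

struct Descriptor {               // stand-in for parquet::SchemaDescriptor
  std::vector<int> leaves;
  int num_columns() const { return static_cast<int>(leaves.size()); }
};

struct Metadata {                 // stand-in for parquet::FileMetaData; owns the descriptor
  Descriptor descr;
};

struct Manifest {                 // stand-in for parquet::arrow::SchemaManifest
  const Descriptor* descr = nullptr;  // raw, non-owning pointer into a Metadata
};

int main() {
  auto metadata = std::make_shared<Metadata>();
  metadata->descr.leaves = {1, 2, 3};

  Manifest manifest;
  manifest.descr = &metadata->descr;

  metadata.reset();  // the owner is destroyed; manifest.descr now dangles

  // Undefined behaviour if executed: may crash or return a bogus value.
  // int n = manifest.descr->num_columns();
  return 0;
}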

pitrou added a commit to pitrou/arrow that referenced this issue Jan 16, 2024
TODO complete this description
pitrou (Member) commented Jan 16, 2024

Submitted a candidate fix before lunch:
#39632

pitrou added a commit that referenced this issue Jan 16, 2024
…ring (#39632)

### Rationale for this change

`ParquetFileFragment` stores a `SchemaManifest` that has a raw pointer to a `SchemaDescriptor`. The `SchemaDescriptor` is originally provided by a `FileMetadata` instance but, in some cases, the `FileMetadata` instance can be destroyed while the `ParquetFileFragment` is still in use. This can typically lead to bugs or crashes.

### What changes are included in this PR?

Ensure that `ParquetFileFragment` keeps an owning pointer to the `FileMetadata` instance that provides its `SchemaManifest`'s schema descriptor.

### Are these changes tested?

An assertion is added that would fail deterministically in the Python test suite.

### Are there any user-facing changes?

No.

* Closes: #39562

Authored-by: Antoine Pitrou <antoine@python.org>
Signed-off-by: Antoine Pitrou <antoine@python.org>
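
For reference, a sketch of the ownership pattern this fix describes, again with hypothetical stand-in types rather than the actual PR code: the fragment stores an owning shared_ptr to the metadata instance whose descriptor the manifest points into, so the manifest's raw pointer cannot outlive its owner.

#include <memory>

struct Descriptor {};                   // stand-in for parquet::SchemaDescriptor
struct Metadata { Descriptor descr; };  // stand-in for parquet::FileMetaData
struct Manifest {                       // stand-in for parquet::arrow::SchemaManifest
  const Descriptor* descr = nullptr;    // raw, non-owning pointer
};

class Fragment {                        // stand-in for ParquetFileFragment
 public:
  void SetMetadata(std::shared_ptr<Metadata> metadata,
                   std::shared_ptr<Manifest> manifest) {
    // Key point: metadata_ must be the very instance that owns the descriptor
    // manifest_->descr points into, so the raw pointer stays valid for as
    // long as this fragment is alive.
    metadata_ = std::move(metadata);
    manifest_ = std::move(manifest);
  }

 private:
  std::shared_ptr<Metadata> metadata_;
  std::shared_ptr<Manifest> manifest_;
};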
pitrou modified the milestones: 15.0.0, 16.0.0 (Jan 16, 2024)
raulcd pushed a commit that referenced this issue Jan 16, 2024
idailylife pushed a commit to idailylife/arrow that referenced this issue Jan 18, 2024
amoeba added the Critical Fix (Bugfixes for security vulnerabilities, crashes, or invalid data) label on Jan 22, 2024
clayburn pushed a commit to clayburn/arrow that referenced this issue Jan 23, 2024
dgreiss pushed a commit to dgreiss/arrow that referenced this issue Feb 19, 2024
zanmato1984 pushed a commit to zanmato1984/arrow that referenced this issue Feb 28, 2024
thisisnic pushed a commit to thisisnic/arrow that referenced this issue Mar 8, 2024