
Spurious "previous failures in test files that were not rerun" in watch mode #3295

Closed
mildmojo opened this issue Jan 27, 2024 · 2 comments · Fixed by #3297
Labels
bug current functionality does not work as desired

Comments

@mildmojo

Actual

When AVA v6.1.0 runs in watch mode, it prints a line like "4 previous failures in test files that were not rerun" on every rerun. The count accumulates every failure from every previous run, and it persists even after rerunning all tests with 100% passing results.

Expected

When AVA runs in watch mode, it should not report previous test failures. When all tests are passing, it should report that all tests are passing, full stop.

Reproduction

Minimal reproduction steps below, with annotations about observed behavior:

$ mkdir /tmp/ava-repro && cd /tmp/ava-repro && npm install --save-dev ava
$ npx ava --version
6.1.0
$ echo "require('ava')('repro', t => t.fail());" > test.js
$ npx ava -w

  ✘ [fail]: test › repro Test failed via `t.fail()`
  ─

  test › repro

  test.js:1

   1: require('ava')('repro', t => t.fail());
   2:                                        

  Test failed via `t.fail()`

  › test.js:1:32

  ─

  1 test failed [16:17:04]

Touch test.js to let AVA re-run the test:

  ✘ [fail]: test › repro Test failed via `t.fail()`
  ─

  test › repro

  test.js:1

   1: require('ava')('repro', t => t.fail());
   2:                                        

  Test failed via `t.fail()`

  › test.js:1:32

  ─

  1 test failed [16:39:56]
  1 previous failure in test files that were not rerun

Change t.fail() to t.pass() and save:

  ✔ test › repro
  ─

  1 test passed [16:19:00]
  2 previous failures in test files that were not rerun

Type 'r' in the test console and hit enter to re-run tests:

  ✔ test › repro
  ─

  1 test passed [16:19:51]
  2 previous failures in test files that were not rerun

The "previous failures" message never seems to clear. I can't figure out what it means or how it would be useful; all test files were rerun, and none had failures.

  1. Why does it say test files were not rerun, even when there's only one test file that was definitely rerun?
  2. Why does the "previous failures" count persist, going up every time a test fails during a run, even when there's only a single test in the suite?

There are some previous reports of similar issues (#2821, #2069, #2040), but they've all been closed, and this still seems to happen in the current release.

novemberborn added a commit that referenced this issue Jan 28, 2024
The counters used absolute paths for the test files, but the clearing logic used relative paths. Count using relative paths instead.

The number of previous failures is not observable to the test harness, so this does not come with test coverage.

Fixes #3295.
@novemberborn novemberborn added bug current functionality does not work as desired and removed needs triage labels Jan 28, 2024
@novemberborn
Member

Thanks for the great bug report @mildmojo!

See #3297: the counters were never cleared correctly, so they increased on every run.

Previous reports were likely closed because watch mode was rewritten; that rewrite should have resolved any remaining issues, except for this path inconsistency.

@mildmojo
Author

Thanks for the fast fix! 🙌
