Test retries, when retrying on several failed tests, causes hanging in cypress run #9040

Closed
radDunin opened this issue Oct 30, 2020 · 19 comments · Fixed by #14381

@radDunin

radDunin commented Oct 30, 2020

Hi,
I am running Cypress tests in Docker inside a Jenkins pipeline. Unfortunately, when a test fails, Cypress hangs during the retry attempt. I suppose the test fails on both attempts, but I have no idea why Cypress doesn't continue afterwards. Could anyone help me with this, or explain whether it is a bug or incorrect usage? I searched for an existing issue covering this case but didn't find one.
Screenshot from the pipeline:
[screenshot]

I've experimented with adding test retries to my tests in several ways:

it('should emit event search_triggered', { retries: 1 }, () => {...})
it('should emit event search_triggered', { retries: { runMode: 1, openMode: 1 } }, () => {...})

// and in the global cypress.json

{
  "retries": {
    "runMode": 1,
    "openMode": 3
  }
}
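
For reference, retries can also be set on a whole suite (a minimal sketch, assuming Cypress 5+ suite-level configuration; the suite name is illustrative):

describe('search events', { retries: { runMode: 1, openMode: 3 } }, () => {
  it('should emit event search_triggered', () => {...});
});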

Docker image:

FROM cypress/browsers:node12.6.0-chrome77
# install Cypress and dependencies
COPY e2e ./e2e
WORKDIR ./e2e
RUN npm ci

Command used to run Docker:

docker run --rm --ipc=host -m 4GB -e CYPRESS_VIDEO=false \
  --entrypoint=npm ".../cypress-test:5.5.0-1" \
  run testOnJenkins -- --env configFile="cypress-$ENVIRONMENT_ID"


Cypress:    5.5.0                                                                      
Browser:    Electron 85 
@danmaftei

danmaftei commented Oct 30, 2020

I don't have an example to reproduce, but this happened to us as well. Cypress hung indefinitely in a test spec after retrying a test case. We turned off retries, and the test spec ran to completion.

I ran with DEBUG=cypress:* and it looks like there might be a memory leak. I've pasted the last portion of the logs below; the table at the end kept being printed indefinitely, each time with higher memory usage for Cypress:

  cypress:server:reporter test retried: Should allow half hour derived attributes +0ms
  cypress:server:server Got CONNECT request from localhost:8080 +4m
  cypress:https-proxy Writing browserSocket connection headers { url: 'localhost:8080', headLength: 0, headers: { host: 'localhost:8080', 'proxy-connection': 'keep-alive', 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Cypress/5.0.0 Chrome/83.0.4103.122 Electron/9.2.0 Safari/537.36' } } +4m
  cypress:https-proxy Got first head bytes { url: 'localhost:8080', head: 'GET /__socket.io/?EIO=3&transport=websocket HTTP/1.1\r\nHost: loca' } +7ms
  cypress:network:cors Parsed URL { port: '8080', tld: 'localhost', domain: '' } +4m
  cypress:server:server HTTPS request does match URL: https://localhost:8080 with props: { port: '8080', tld: 'localhost', domain: '' } +11ms
  cypress:https-proxy Not making direct connection { url: 'localhost:8080' } +3ms
  cypress:https-proxy Making intercepted connection to 62973 +0ms
  cypress:https-proxy getting proxy URL { port: 62973, serverPort: 62973, sniPort: 62975, url: 'https://localhost:62973' } +1ms
  cypress:server:util:socket_allowed allowed socket closed, removing { localPort: 62988 } +4m
  cypress:network:connect successfully connected { opts: { port: 62973, host: 'localhost', getDelayMsForRetry: [Function: getDelayForRetry] }, iteration: 0 } +4m
  cypress:https-proxy received upstreamSocket callback for request { port: 62973, hostname: 'localhost', err: undefined } +9ms
  cypress:server:util:socket_allowed allowing socket { localPort: 64160 } +1ms
  cypress:server:server Got UPGRADE request from /__socket.io/?EIO=3&transport=websocket +11ms
  cypress:server:util:socket_allowed is incoming request allowed? { isAllowed: true, reqUrl: '/__socket.io/?EIO=3&transport=websocket', remotePort: 64160, remoteAddress: '127.0.0.1' } +2ms
  cypress:server:socket socket connected +42s
  cypress:server:util:process_profiler current & mean memory and CPU usage by process group:
  cypress:server:util:process_profiler ┌─────────┬───────────────────┬──────────────┬────────────────┬────────────┬────────────────┬──────────┬──────────────┬─────────────┐
  cypress:server:util:process_profiler │ (index) │       group       │ processCount │      pids      │ cpuPercent │ meanCpuPercent │ memRssMb │ meanMemRssMb │ maxMemRssMb │
  cypress:server:util:process_profiler ├─────────┼───────────────────┼──────────────┼────────────────┼────────────┼────────────────┼──────────┼──────────────┼─────────────┤
  cypress:server:util:process_profiler │    0    │     'cypress'     │      1       │    '69136'     │   214.7    │     107.01     │ 1283.01  │    389.71    │   1396.57   │
  cypress:server:util:process_profiler │    1    │    'Electron'     │      1       │    '69171'     │    53.6    │     77.41      │  204.77  │    509.12    │   547.44    │
  cypress:server:util:process_profiler │    2    │ 'electron-shared' │      2       │ '69139, 69141' │    58.3    │      56.8      │  157.65  │    158.97    │   177.23    │
  cypress:server:util:process_profiler │    3    │     'plugin'      │      1       │    '69168'     │    0.1     │      0.04      │   57.3   │    108.71    │   288.49    │
  cypress:server:util:process_profiler │    4    │      'other'      │      2       │ '74592, 74593' │     0      │      0.2       │   1.86   │     1.86     │    1.91     │
  cypress:server:util:process_profiler │    5    │      'TOTAL'      │      7       │      '-'       │   326.7    │     235.36     │ 1704.58  │   1133.06    │   2196.36   │
  cypress:server:util:process_profiler └─────────┴───────────────────┴──────────────┴────────────────┴────────────┴────────────────┴──────────┴──────────────┴─────────────┘ +52s

After ~20 of these tables had been printed, the last one before I quit the run was:

 cypress:server:util:process_profiler current & mean memory and CPU usage by process group:
  cypress:server:util:process_profiler ┌─────────┬───────────────────┬──────────────┬────────────────┬────────────┬────────────────┬──────────┬──────────────┬─────────────┐
  cypress:server:util:process_profiler │ (index) │       group       │ processCount │      pids      │ cpuPercent │ meanCpuPercent │ memRssMb │ meanMemRssMb │ maxMemRssMb │
  cypress:server:util:process_profiler ├─────────┼───────────────────┼──────────────┼────────────────┼────────────┼────────────────┼──────────┼──────────────┼─────────────┤
  cypress:server:util:process_profiler │    0    │     'cypress'     │      1       │    '69136'     │   205.9    │     164.02     │ 2575.36  │    1620.7    │   3129.61   │
  cypress:server:util:process_profiler │    1    │    'Electron'     │      1       │    '69171'     │     10     │     41.19      │  182.31  │    326.14    │   547.44    │
  cypress:server:util:process_profiler │    2    │ 'electron-shared' │      2       │ '69139, 69141' │    53.2    │     55.25      │  132.46  │    144.09    │   177.23    │
  cypress:server:util:process_profiler │    3    │     'plugin'      │      1       │    '69168'     │     0      │      0.02      │  31.83   │    66.18     │   288.49    │
  cypress:server:util:process_profiler │    4    │      'other'      │      2       │ '81203, 81204' │     0      │      0.09      │   1.85   │     1.86     │    1.94     │
  cypress:server:util:process_profiler │    5    │      'TOTAL'      │      7       │      '-'       │   269.1    │     258.56     │ 2923.82  │   2147.79    │   3476.9    │
  cypress:server:util:process_profiler └─────────┴───────────────────┴──────────────┴────────────────┴────────────┴────────────────┴──────────┴──────────────┴─────────────┘ +10s

@jennifer-shehane
Member

Likely this issue is very specific to the structure of the test code, perhaps the combination of hooks and suites, or the test code itself.

We don't have anything to investigate until a reproducible example is provided. Here are some tips for providing a Short, Self Contained, Correct Example, and our own Troubleshooting Cypress guide.

@jennifer-shehane jennifer-shehane added the stage: needs information Not enough info to reproduce the issue label Nov 2, 2020
@c32hedge

c32hedge commented Nov 20, 2020

We're seeing the same problem in Cypress 5.5 using Chrome 83 on Ubuntu 20.04. I had three different Cypress projects all start hanging shortly after upgrading to 5.5 and enabling retries. Removing the retries configuration immediately resolved the issue. Some observations:

  • For the same set of failing tests, the hang seems to occur at the same point fairly deterministically.
  • We're seeing this when there are many failures (I've seen it occur after 9-15 tests failed in a single spec file with retries set to 2).

And I think I just got a reproducible example:

spec file:

describe('page', () => {
  for (let i = 0; i < 200; i++) {
    it('fails', () => {
      expect(true).to.be.false;
    });
  }
});

cypress.json:

{
  "retries": 10
}

The above works fine using cypress open, but cypress run gets very sluggish around the 4th and 5th tests (the second attempt for the 4th test is quite slow to finish, and it has been hanging on the third attempt for 15+ minutes). I tried cypress run with the example above in both Electron 85 and Chrome 86 on Ubuntu 20.04, with the same result.

[screenshot]

@c32hedge

@jennifer-shehane this was supposed to be the marquee feature of Cypress 5. I have a trivial example above that reproduces the hang, but it's now been 10 days with no response, and Cypress 6 has come out in that time. I love Cypress, but the pattern of unresponsiveness to, let alone fixing of, existing issues amid the apparent rush to deliver new features is becoming a real problem for us.

> Likely this issue is very specific to the structure of the test code, perhaps the combination of hooks and suites, or the test code itself.

To be frank, as a professional tester I'm disappointed that literally the first thing I tried to reproduce the hang worked. I'm even more disappointed that the Cypress team didn't even try something simple like this and instead made the assumption that it was an obscure edge case. To me, that suggests that new Cypress features aren't really being subjected to more than shallow confirmatory testing, which for a testing tool is scary.

Sorry for the rant. Again, I've been using Cypress for a couple of years now and it's overall been a positive experience. I just haven't known how best to voice my frustration with the seeming lack of support, or with not knowing which bug reports will simply disappear into a black hole. Please take this as constructive feedback from a tester who would love to see Cypress become even better.

@bahunov

bahunov commented Dec 1, 2020

I'm facing the exact same issue. Could it be CPU or memory constraints?

My logs from just before it hangs and starts printing endless CPU & memory usage tables:

[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.312Z cypress:proxy:http:response-middleware received response { req: { method: 'POST', proxiedUrl: 'http://localhost:9000/app/internalRemarks/fetchRemarks', headers: { host: 'localhost:9000', 'proxy-connection': 'keep-alive', 'content-length': '202', accept: 'application/json, text/plain, */*', 'user-agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/85.0.4183.102 Safari/537.36', 'content-type': 'application/json;charset=UTF-8', 'sec-fetch-site': 'same-origin', 'sec-fetch-mode': 'cors', 'sec-fetch-dest': 'empty', referer: 'http://localhost:9000/dpr', 'accept-encoding': 'gzip', 'accept-language': 'en-US' } }, incomingRes: { headers: { 'x-powered-by': 'Express', 'x-content-type-options': 'nosniff', 'x-xss-protection': '1; mode=block', 'cache-control': 'no-cache, no-store, max-age=0, must-revalidate', pragma: 'no-cache', expires: '0', 'content-type': 'application/json;charset=UTF-8', 'transfer-encoding': 'chunked', date: 'Tue, 01 Dec 2020 19:52:47 GMT', connection: 'close' }, statusCode: 200 } }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.312Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'AttachPlainTextStreamFn' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'InterceptResponse' }
[2020-12-01T19:52:47.336Z] Tue, 01 Dec 2020 19:52:47 GMT cypress:net-stubbing:server:intercept-response InterceptResponse { req: { url: '/app/internalRemarks/fetchRemarks' }, backendRequest: undefined }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'PatchExpressSetHeader' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'SetInjectionLevel' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http:response-middleware injection levels: { isInitial: false, wantsInjection: false, wantsSecurityRemoved: false }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'OmitProblematicHeaders' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'MaybePreventCaching' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'MaybeStripDocumentDomainFeaturePolicy' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'CopyCookiesFromIncomingRes' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'MaybeSendRedirectToClient' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'CopyResponseStatusCode' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'ClearCyInitialCookie' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:server:server Getting remote state: { auth: null, props: { port: '9000', tld: 'localhost', domain: '' }, origin: 'http://localhost:9000', strategy: 'http', visiting: false, domainName: 'localhost', fileServer: null }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'MaybeEndWithEmptyBody' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'MaybeInjectHtml' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'MaybeRemoveSecurity' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'GzipBody' }
[2020-12-01T19:52:47.336Z] 2020-12-01T19:52:47.313Z cypress:proxy:http Running middleware { stage: 'IncomingResponse', middlewareName: 'SendResponseBodyToClient' }
[2020-12-01T19:52:49.848Z] 2020-12-01T19:52:49.317Z cypress:server:util:socket_allowed allowed socket closed, removing { localPort: 40408 }
[2020-12-01T19:52:49.848Z] 2020-12-01T19:52:49.318Z cypress:server:util:socket_allowed allowed socket closed, removing { localPort: 42302 }
[2020-12-01T19:52:57.920Z] 2020-12-01T19:52:57.290Z cypress:server:util:process_profiler current & mean memory and CPU usage by process group:
[2020-12-01T19:52:57.920Z] ┌─────────┬───────────────────┬──────────────┬───────────────────────────────────────────────────────┬────────────┬────────────────┬──────────┬──────────────┬─────────────┐
[2020-12-01T19:52:57.920Z] │ (index) │       group       │ processCount │                         pids                          │ cpuPercent │ meanCpuPercent │ memRssMb │ meanMemRssMb │ maxMemRssMb │
[2020-12-01T19:52:57.920Z] ├─────────┼───────────────────┼──────────────┼───────────────────────────────────────────────────────┼────────────┼────────────────┼──────────┼──────────────┼─────────────┤
[2020-12-01T19:52:57.920Z] │    0    │     'Chrome'      │      7       │ '3804, 3817, 3818, 3839, 3869, 3841 ... 1 more items' │   460.02   │     20.25      │  291.2   │   1282.73    │   2524.43   │
[2020-12-01T19:52:57.920Z] │    1    │     'cypress'     │      1       │                        '1081'                         │    0.11    │      1.06      │  224.51  │    307.71    │   369.85    │
[2020-12-01T19:52:57.920Z] │    2    │ 'electron-shared' │      4       │               '1093, 1253, 1094, 1289'                │     0      │       0        │  172.04  │    170.94    │   172.04    │
[2020-12-01T19:52:57.920Z] │    3    │     'plugin'      │      1       │                        '1337'                         │     0      │       0        │  83.38   │    85.78     │    91.05    │
[2020-12-01T19:52:57.920Z] │    4    │      'other'      │      2       │                     '5499, 5500'                      │     0      │       0        │   3.49   │     3.57     │    5.46     │
[2020-12-01T19:52:57.920Z] │    5    │      'TOTAL'      │      15      │                          '-'                          │   460.13   │     20.81      │  774.62  │   1818.01    │   3022.21   │
[2020-12-01T19:52:57.920Z] └─────────┴───────────────────┴──────────────┴───────────────────────────────────────────────────────┴────────────┴

@jennifer-shehane
Member

jennifer-shehane commented Dec 16, 2020

We have lots of tests covering retries, but likely none retrying upwards of 10 times; perhaps that is the case that causes the hanging. Trying every permutation of test suites is not possible, which is why we rely on bug reports from users.

Thanks for providing this example and I understand the frustration. We're doing what we can.

I can reproduce the hanging with the example. We haven't previously seen this behavior with regard to retries.

Each retry slows down the run. You can even see this if you add retries: 4 with the example below. Each test gets slower and slower until it hangs.

This has been present since 5.0

describe('page', () => {
  for (let i = 0; i < 10; i++) {
    it(`test ${i}`, () => {
      expect(true).to.be.false
    })
  }
})
cypress run

[screenshot]

DEBUG logs with video turned off: run.log

This is a strange log from the end of a run with video turned on, but it doesn't explain the lock-up when video is turned off:

2020-12-16T04:44:06.789Z cypress:server:video capture stderr log { message: 'More than 1000 frames duplicated' }
2020-12-16T04:44:06.854Z cypress:server:browsers:electron debugger: sending Page.screencastFrameAck with params { sessionId: 1 }
2020-12-16T04:44:06.854Z cypress:server:browsers:electron debugger: received response to Page.screencastFrameAck: {}
2020-12-16T04:44:06.935Z cypress:server:browsers:electron debugger: sending Page.screencastFrameAck with params { sessionId: 1 }
2020-12-16T04:44:06.935Z cypress:server:browsers:electron debugger: received response to Page.screencastFrameAck: {}
2020-12-16T04:44:06.985Z cypress:server:browsers:electron debugger: sending Page.screencastFrameAck with params { sessionId: 1 }
2020-12-16T04:44:06.985Z cypress:server:browsers:electron debugger: received response to Page.screencastFrameAck: {}
2020-12-16T04:44:07.029Z cypress:server:browsers:electron debugger: sending Page.screencastFrameAck with params { sessionId: 1 }

I tried to reduce the problem. A few things that did not prevent the hanging (see the sketch after this list):

  • Setting video: false - no change
  • Setting numTestsKeptInMemory: 0 - ran a few more tests before hanging
  • Setting CYPRESS_NO_COMMAND_LOG=1 - ran a few more tests before hanging
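
For reference, a minimal sketch of how those three settings are applied; the first two are cypress.json options, the third is an environment variable:

cypress.json:

{
  "video": false,
  "numTestsKeptInMemory": 0
}

shell:

CYPRESS_NO_COMMAND_LOG=1 npx cypress run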

@jennifer-shehane jennifer-shehane changed the title Test retries is hanging on first attempt in CI at Jenkins pipeline Test retries, when retrying on several failed tests, causes hanging in cypress run Dec 16, 2020
@cypress-bot cypress-bot bot added stage: needs investigating Someone from Cypress needs to look at this and removed stage: needs information Not enough info to reproduce the issue labels Dec 16, 2020
@bahmutov
Contributor

bahmutov commented Dec 17, 2020

Reproduced the slowdown and hanging with Jennifer's example in https://github.com/cypress-io/cypress-test-tiny/tree/retries-slowdown-9040

Interestingly, the test is really slow in the "No commands were issued in this test" phase. When running with --headed:

[screen recording]

@bahmutov
Contributor

Stopping the debugger at random several times while the GUI is spinning shows it stopping in WebSocket communication, without anything meaningful being sent:

[screenshot]

@bahmutov
Contributor

The WebSocket communications do not look out of the ordinary:

[screenshot]

@bahmutov
Contributor

Note: retrying the same single test many times does not seem to show an unusual slowdown:

describe('page', () => {
  for (let i = 0; i < 1; i++) {
    it(`test ${i}`, { retries: 140 }, () => {
      expect(true).to.be.false
    })
  }
})

It keeps running pretty snappily. On the other hand, 2 retries per test slows down a lot after only 10 or so tests:

describe('page', () => {
  for (let i = 0; i < 100; i++) {
    it(`test ${i}`, { retries: 2 }, () => {
      expect(true).to.be.false
    })
  }
})

@bahmutov
Contributor

Stopping the debugger a few more times shows that it stops inside MobX code:

[screenshot]

The values at the call site:

[screenshot]

[screenshot]

@bahmutov
Contributor

Enabled debug logs in the browser with localStorage.debug = 'cypress*', and look at this call!

[screenshot]

@bahmutov
Contributor

bahmutov commented Dec 17, 2020

Setting the debug logs to be more specific with localStorage.debug='cypress:driver' shows that commands slow down

[screenshot]

@bahmutov
Contributor

Using the regular expression /command:log:added|backend:request/, we can filter the output to show both kinds of messages:

[screenshot]

@bahmutov
Contributor

Went to the suspicious code in lib/reporter.js:

const mergeRunnable = (eventName) => {
  return (function (testProps, runnables) {
    debug('merging runnable %s', eventName)
    toMochaProps(testProps)

    const runnable = runnables[testProps.id]

    if (eventName === 'test:before:run') {
      // a retry is starting: snapshot the previous, failed attempt
      if (testProps._currentRetry > runnable._currentRetry) {
        debug('test retried:', testProps.title)
        const prevAttempts = runnable.prevAttempts || []

        delete runnable.prevAttempts
        // deep-clone the entire runnable to preserve the failed attempt
        const prevAttempt = _.cloneDeep(runnable)

        delete runnable.failedFromHookId
        delete runnable.err
        delete runnable.hookName
        testProps.prevAttempts = prevAttempts.concat([prevAttempt])
      }
    }

    const merged = _.extend(runnable, testProps)
    debug('merging done')
    return merged
  })
}

Added debug logs to see how long it takes to merge events; the pauses in the test runner seem to align with the messages about merging:

[screenshot]
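
A minimal sketch of this kind of timing instrumentation (the wrapper and its usage are illustrative, not the actual change):

// wrap any function and log how long each call takes
const timed = (name, fn) => {
  return (...args) => {
    const start = Date.now()
    const result = fn(...args)
    console.log('%s took %dms', name, Date.now() - start) // reporter.js would use its debug() logger instead
    return result
  }
}

// illustrative usage: wrap the merge handler before registering it
// const onBeforeRun = timed('mergeRunnable', mergeRunnable('test:before:run'))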

@cypress-bot cypress-bot bot added stage: work in progress and removed stage: needs investigating Someone from Cypress needs to look at this labels Dec 17, 2020
@bahmutov
Contributor

Added timing to the merge code in #14222 and saw the following:

[screenshot]

The test run keeps all those objects around and the merge is slow. Maybe we should do our own lightweight extend here, since we know the structure (a rough sketch follows).
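
A rough sketch of what such a targeted merge could look like, assuming the set of keys that change between events is known (the key list is illustrative, not from the eventual fix):

// copy only a known, shallow set of properties instead of running a
// generic _.extend over an ever-growing runnable object
const MERGE_KEYS = ['title', 'state', 'err', '_currentRetry', 'prevAttempts']

const lightweightMerge = (runnable, testProps) => {
  for (const key of MERGE_KEYS) {
    if (testProps[key] !== undefined) {
      runnable[key] = testProps[key]
    }
  }

  return runnable
}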

@bahmutov bahmutov self-assigned this Dec 17, 2020
@bahmutov
Contributor

The slowdown can be attributed to this line:

const prevAttempt = _.cloneDeep(runnable)

The runnable is becoming larger and larger, so each deep clone gets more expensive (a standalone sketch of the effect follows).
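
A standalone sketch (requires lodash; the object shape is simulated) showing that _.cloneDeep cost scales with the number of nested objects the runnable carries:

const _ = require('lodash')

// simulate a runnable that has accumulated many nested objects
const makeRunnable = (count) => {
  return {
    title: 'fails',
    prevAttempts: Array.from({ length: count }, (unused, i) => ({ attempt: i, state: 'failed' })),
  }
}

for (const n of [1000, 10000, 100000]) {
  const runnable = makeRunnable(n)
  const start = Date.now()
  _.cloneDeep(runnable) // the slow line from reporter.js
  console.log('%d nested objects: cloneDeep took %dms', n, Date.now() - start)
}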

@cypress-bot cypress-bot bot added stage: ready for work The issue is reproducible and in scope stage: needs review The PR code is done & tested, needs review and removed stage: work in progress stage: ready for work The issue is reproducible and in scope labels Jan 4, 2021
@cypress-bot cypress-bot bot added stage: pending release and removed stage: needs review The PR code is done & tested, needs review labels Jan 4, 2021
@cypress-bot
Contributor

cypress-bot bot commented Jan 4, 2021

The code for this is done in cypress-io/cypress#14381, but has yet to be released.
We'll update this issue and reference the changelog when it's released.

@cypress-bot
Contributor

cypress-bot bot commented Jan 4, 2021

Released in 6.2.1.

This comment thread has been locked. If you are still experiencing this issue after upgrading to
Cypress v6.2.1, please open a new issue.

@cypress-bot cypress-bot bot locked as resolved and limited conversation to collaborators Jan 4, 2021