
azure-pipelines.yml: test important packages #769

Open

wants to merge 11 commits into base: master

Conversation

alaviss
Contributor

@alaviss alaviss commented Feb 5, 2020

No description provided.

@alaviss changed the title from "travis: test nimble against the important packages list" to "azure-pipelines.yml: test important packages" on Feb 6, 2020
@dom96
Collaborator

dom96 commented Feb 7, 2020

Do we need another CI for this? Can't we do it in travis?

@dom96
Collaborator

dom96 commented Feb 7, 2020

To be honest, I wonder whether this should even be done every commit. It seems like it would be very flaky. We should have it run continuously every day or so instead to give us general signal, but every commit sounds like a pain.

@alaviss
Contributor Author

alaviss commented Feb 7, 2020

Do we need another CI for this? Can't we do it in travis?

It usually takes over 10 minutes for testament to produce any output (I guess it was doing all the cloning in the background), which causes Travis to terminate the build due to its timeout.

To be honest, I wonder whether this should even be done every commit. It seems like it would be very flaky. We should have it run continuously every day or so instead to give us general signal, but every commit sounds like a pain.

It appears to me that nimble has been failing the "important packages test" for a while: https://dev.azure.com/alaviss/Nim/_build/results?buildId=280&view=logs&j=12f1170f-54f2-53f3-20dd-22fc7dff55f9&t=356bb04c-cb4a-5f04-82ca-d3b102917eba

I can reproduce the errors, and they aren't there with stable nimble (which is used by Nim devel for testing). So yes, I think we should do this kind of testing more often.

@dom96
Collaborator

dom96 commented Feb 7, 2020

If we can get a list of packages that Nimble master fails on but where the last stable Nimble succeeds then that would be very high signal. But setting up and navigating all these different kinds of CIs is quite frustrating.

@timotheecour
Member

It seems like it would be very flaky.

If it becomes too flaky we can simply change

    bash: testament cat nimble-packages

to

    bash: testament cat nimble-packages || echo "FAILED (allowing failure)"

but this is strictly better than not testing at all.
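The effect of that `||` fallback can be sketched in plain shell (the `run_nonblocking` helper name is hypothetical, and `false` stands in for a failing `testament cat nimble-packages` run):

```shell
# Sketch of the "allow failure" pattern from this thread. The helper name
# run_nonblocking is hypothetical; "false" stands in for a failing test run.
run_nonblocking() {
  # Run the given command; if it fails, print a marker and return 0,
  # so a CI step using this is still reported as successful.
  "$@" || echo "FAILED (allowing failure)"
}

run_nonblocking false                 # prints the failure marker, exits 0
run_nonblocking echo "suite passed"   # prints "suite passed"
```

Because the `||` list as a whole exits with the status of `echo` (zero), the CI runner only ever sees success; the failure is visible in the step's log, not in its status.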

@dom96
Collaborator

dom96 commented Mar 15, 2020

It can give good signal, but I'd rather it not block PRs/commits which is why I think it should be run separately. Also, I would prefer if it didn't depend on testament.

@timotheecour
Member

timotheecour commented Mar 15, 2020

It wouldn't block anything, because the || echo "FAILED (allowing failure)" causes the step to never fail. If by "block" you mean "takes too long" (in practice it should take about as long as it does in the Nim repo, i.e. ~32 min), you can always decide to squash & merge a commit without waiting for it to complete.

@dom96
Collaborator

dom96 commented Mar 17, 2020

That could work but only if we can avoid noise in GitHub's UI.

@timotheecour
Member

As long as a task succeeds (e.g. via || echo "FAILED (allowing failure)"), IIUC GitHub's UI would just show a successful check, i.e. no noise, but the logs would still be inspectable for failures.

@dom96
Collaborator

dom96 commented Jul 4, 2021

We now have GitHub Actions, can we do it there?

@dom96
Collaborator

dom96 commented Jul 4, 2021

To be honest, I still prefer a separate repo for this so that we can avoid delaying the CI status.
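For illustration, a scheduled GitHub Actions workflow along the lines discussed above might look like the following sketch (the workflow name, cron schedule, and checkout step are assumptions, not part of this PR; it also assumes testament is available on the runner):

```yaml
# Hypothetical workflow sketch: run the important-packages suite once a
# day instead of on every commit, so failures give signal without
# blocking PRs.
name: important-packages-daily

on:
  schedule:
    - cron: '0 4 * * *'   # once a day; time chosen arbitrarily
  workflow_dispatch: {}    # allow manual runs

jobs:
  test-important-packages:
    runs-on: ubuntu-latest
    # Non-blocking: a failed job is recorded in the logs but does not
    # turn the overall workflow status red.
    continue-on-error: true
    steps:
      - uses: actions/checkout@v4
      - name: Run important packages suite
        run: testament cat nimble-packages
```

A scheduled workflow like this runs against the default branch rather than individual PRs, which matches the "daily signal, not per-commit gate" approach suggested earlier in the thread.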

3 participants