
build: disable cache for golangci-lint #537

Merged
merged 1 commit into from May 17, 2023

Conversation


@lzap lzap commented May 17, 2023

CI currently fails with

/usr/bin/tar -xf /home/runner/work/_temp/9d82feee-0932-4ee7-82bf-2f8f8b676785/cache.tgz -P -C /home/runner/work/provisioning-backend/provisioning-backend -z
  /usr/bin/tar: ../../../go/pkg/mod/go.opentelemetry.io/otel@v1.14.0/VERSIONING.md: Cannot open: File exists
  Error: /usr/bin/tar: ../../../go/pkg/mod/go.opentelemetry.io/otel@v1.14.0/propagation/trace_context_test.go: Cannot open: File exists

This happens because the cache is stored twice: once as the Go cache and once as the linter cache. The linter's documentation recommends explicitly disabling the Go cache for exactly this reason, and this patch does that.
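A minimal sketch of what the fix looks like in a workflow (job and step layout are illustrative, not copied from the actual patch; the key part is `cache: false` on setup-go, so that only the golangci-lint action's own cache is used):

```yaml
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-go@v4
        with:
          go-version-file: go.mod
          cache: false  # avoid double-caching; golangci-lint-action maintains its own cache
      - uses: golangci/golangci-lint-action@v3
        with:
          version: latest
```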

While editing the workflow I realized that we do not build the artifacts at all. So there could be a build error in main.go and we would not notice until the container failed to build. This adds a new job for that. Unfortunately, the Go build cache cannot be shared between the build, test, and integration test jobs via setup-go, so I disabled it for now. There is a workaround, but that is a lot of YAML; I would rather keep it simple.
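A hedged sketch of such a build job (the job name "Build Go" comes from the later discussion; action versions and the exact build command are assumptions):

```yaml
  build:
    name: "Build Go"
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-go@v4
        with:
          go-version-file: go.mod  # this job keeps the default setup-go cache
      - run: go build ./...  # surfaces compile errors that tests alone would miss
```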

Signed-off-by: Lukáš Zapletal <lzap+git@redhat.com>
@lzap lzap changed the title build: use own liter build: disable cache for golangci-lint May 17, 2023
@lzap
Member Author

lzap commented May 17, 2023

The test "Code format, imports and style" will not pass until this is merged; I renamed the jobs for better consistency.

@lzap
Member Author

lzap commented May 17, 2023

Interesting observation:

The golangci-lint action uses its own cache for both the Go packages and the linter itself. On every run, the cache is reused (downloaded), used, and then uploaded back. It grows over time (e.g. when dependencies are bumped): the starting point was 390 MB, and I have seen it as big as 750 MB recently. Every 7 days the cache is deleted and a new one is created.

When there is no cache (or the cache was deleted), golangci-lint can actually time out (the 5-minute limit of GHA), but the cache is uploaded even after a failed run, so the subsequent job will likely pass because most of the files will already be in the cache.

As much as I would like to use our own golangci-lint from the bin directory, this caching is quite useful and maintaining it ourselves would probably take too much time, so let's continue using the action for now.

Member

@ezr-ondrej ezr-ondrej left a comment


Looks like it works and the reasoning makes sense. I do not understand all the caches, but let's merge and see 👀 :)

@ezr-ondrej ezr-ondrej merged commit b74b9c2 into RHEnVision:main May 17, 2023
7 checks passed
@lzap
Member Author

lzap commented May 17, 2023

Just to explain and elaborate on what is going on:

Action setup-go@v4 (which we upgraded to recently) has a Go package cache that is enabled by default. It automatically caches compiled Go packages until go.mod changes. With this patch, we only use it in the newly introduced "Build Go" job. Unfortunately, the test and integration test jobs cannot easily utilize the cache, so it is turned off there. This is a reported issue: one report and another report.
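To make the split concrete, a sketch of what a test job with the setup-go cache disabled might look like (again an assumption about the patch, not a quote from it; only the build job keeps the default cache):

```yaml
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-go@v4
        with:
          go-version-file: go.mod
          cache: false  # cache cannot easily be shared with the build job via setup-go
      - run: go test ./...
```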

Action golangci-lint uses its own cache that contains the Go libraries plus the linter's parser cache. It collided with the Go cache while that was turned on, hence the error; this patch fixes it. The cache grows over time, and the action's strategy is to clean it every 7 days.

Hope this makes it clearer.

@lzap lzap deleted the linter-make branch May 17, 2023 13:07
@ezr-ondrej
Member

It does, even I understand it! :)
