API for accessing all defined specs #1225

Open
pohly opened this issue Jun 19, 2023 · 13 comments

pohly (Contributor) commented Jun 19, 2023

ginkgo --dry-run can be useful for users to see what specs are defined. But sometimes test suite authors may want to provide other ways of listing specs or to derive information about them (e.g. all defined labels). For that, a PreviewSpecs function that returns a full Report would be useful.

A less useful alternative: add a "don't produce any output" field to ReportConfig and then call RunSpecs with a ReportAfterSuite callback. That is more complicated to set up.

Originally discussed in https://gophers.slack.com/archives/CQQ50BBNW/p1686938649240809.
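
A rough sketch of the kind of usage I have in mind (the PreviewSpecs name and signature are assumptions here, not an existing API; the Report/SpecReport helpers are the ones suite reporters already receive):

```go
package mysuite_test

import (
	"fmt"
	"testing"

	"github.com/onsi/ginkgo/v2"
	"github.com/onsi/gomega"
)

func TestMySuite(t *testing.T) {
	gomega.RegisterFailHandler(ginkgo.Fail)

	// Hypothetical API: build the spec tree without running it and
	// return the same Report type that suite reporters get.
	report := ginkgo.PreviewSpecs("My Suite")

	// List every defined spec and collect all labels used anywhere.
	labels := map[string]bool{}
	for _, spec := range report.SpecReports {
		fmt.Println(spec.FullText())
		for _, label := range spec.Labels() {
			labels[label] = true
		}
	}
	fmt.Println("defined labels:", labels)
}
```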

onsi (Owner) commented Jun 22, 2023

Thanks for adding the issue, @pohly. I'd like your input on some design questions. I can imagine two approaches:

  1. PreviewSpecs simply constructs the spec tree and returns a Report object. It does not honor any configuration and always produces the complete spec list. The order of specs is not well-defined.
  2. PreviewSpecs operates more like --dry-run: if called without arguments it uses the global Ginkgo config (configured via the cli/flags) to construct, order, and filter the tree. The various randomization flags are honored, as are all the filter flags (note that if a spec is filtered out it still appears in the tree - however it will have SpecStateSkipped). As with RunSpecs you can provide custom GinkgoConfiguration()s to PreviewSpecs to see the effects of a given configuration on the specs.
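
To make option 2 concrete, a sketch of what a suite author might write (everything here is provisional, in particular the assumption that PreviewSpecs accepts the same configuration arguments as RunSpecs):

```go
package mysuite_test

import (
	"fmt"
	"testing"

	"github.com/onsi/ginkgo/v2"
	"github.com/onsi/ginkgo/v2/types"
)

func TestPreviewWithConfig(t *testing.T) {
	// Start from the configuration derived from the CLI flags...
	suiteConfig, _ := ginkgo.GinkgoConfiguration()
	// ...and tweak it, e.g. as if the user had passed --label-filter=integration.
	suiteConfig.LabelFilter = "integration"

	// Provisional per option 2: PreviewSpecs takes configuration overrides like RunSpecs.
	report := ginkgo.PreviewSpecs("My Suite", suiteConfig)

	// Filtered-out specs still appear in the tree, just marked as skipped.
	for _, spec := range report.SpecReports {
		if spec.State == types.SpecStateSkipped {
			fmt.Println("would skip:", spec.FullText())
		} else {
			fmt.Println("would run: ", spec.FullText())
		}
	}
}
```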

I haven't considered the implementation yet and it may prove that one of these is much cheaper than the other - but I wanted to discuss the design without that bias. Thoughts?

pohly (Contributor, Author) commented Jun 22, 2023

Can we do both?

Option 2 may be useful for users to quickly try out the effect of the CLI flags. Option 1 can be achieved by not setting any flags, so option 2 is more capable. However, if in that same run one also wants to report "x out of y specs would run", then one needs both pieces of information: the complete spec list and the filtered result.
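
For instance, deriving "x out of y specs would run" from a single preview report might look like this (a sketch only, relying on option 2's behavior of keeping filtered specs in the report as skipped):

```go
package mysuite_test

import (
	"fmt"
	"testing"

	"github.com/onsi/ginkgo/v2"
	"github.com/onsi/ginkgo/v2/types"
)

func TestCountSpecs(t *testing.T) {
	// Preview with whatever flags the user passed on the CLI.
	report := ginkgo.PreviewSpecs("My Suite")

	// Filtered specs stay in the report with state "skipped" (option 2),
	// so one pass yields both the total and the would-run count.
	total, wouldRun := 0, 0
	for _, spec := range report.SpecReports {
		total++
		if spec.State != types.SpecStateSkipped {
			wouldRun++
		}
	}
	fmt.Printf("%d out of %d specs would run\n", wouldRun, total)
}
```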

Dannyb48 (Contributor) commented

Just curious, has there been any further progress on this? I would be interested in this functionality.

On my team we've switched to leveraging ginkgo outline as a way to see what tests will be run, because we've shifted perspective and treat the outline output like a BDD/Gherkin-style "feature scenario" to see what is being tested. Yet this kind of breaks for us, since the outline command relies on the AST and some of our tests are generated dynamically, so sometimes we end up with a big graph of unknown text. But having an interface to preview specs would help us out a lot.

onsi (Owner) commented Oct 5, 2023

I'm finally working on this and want to confirm that, just like --dry-run, PreviewSpecs will be mutually exclusive with RunSpecs and will require you to run in series. These constraints could conceivably be relaxed in the future (i.e. in a backward-compatible way), but in the interest of getting this out: if I can make those simplifying constraints, I can ship it sooner. Any concerns?

pohly (Contributor, Author) commented Oct 5, 2023

So you mean a process can invoke either PreviewSpecs or RunSpecs, but not both (i.e. first PreviewSpecs, then RunSpecs)?

The -list-tests and -list-labels flags that I am implementing in kubernetes/kubernetes#112894 would be okay with that constraint.

But the PR also adds sanity checking of the test registration. I'm still discussing with @aojea whether it is better to panic when some bad call is made (my original approach) or to use a more elaborate "collect all errors during registration, report them together" approach (the current content of the PR). If I want to do the latter as a prerequisite before running tests, then I would have to do PreviewSpecs + "check for errors" + RunSpecs.

Having said that, shipping it sooner with the constraint and later relaxing it sounds good to me.
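
Roughly the flow I have in mind, once both calls are allowed in one invocation (the "must have a label" check below is purely illustrative; the real checks live in kubernetes/kubernetes#112894):

```go
package e2e_test

import (
	"strings"
	"testing"

	"github.com/onsi/ginkgo/v2"
	"github.com/onsi/ginkgo/v2/types"
	"github.com/onsi/gomega"
)

func TestE2E(t *testing.T) {
	gomega.RegisterFailHandler(ginkgo.Fail)

	// Step 1: preview all registered specs without running them.
	report := ginkgo.PreviewSpecs("E2E Suite")

	// Step 2: sanity-check the registration and collect all problems
	// instead of panicking on the first one. The label rule below is
	// only a stand-in for the real checks.
	var errs []string
	for _, spec := range report.SpecReports {
		if spec.LeafNodeType == types.NodeTypeIt && len(spec.Labels()) == 0 {
			errs = append(errs, "spec has no labels: "+spec.FullText())
		}
	}
	if len(errs) > 0 {
		t.Fatalf("registration problems:\n%s", strings.Join(errs, "\n"))
	}

	// Step 3: actually run the suite.
	ginkgo.RunSpecs(t, "E2E Suite")
}
```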

onsi (Owner) commented Oct 5, 2023

Hey @pohly and @Dannyb48, I now have a... preview of PreviewSpecs up on the master branch. The docs are here, PTAL.

onsi (Owner) commented Oct 5, 2023

I'll take a look at allowing PreviewSpecs and RunSpecs to both run.

onsi (Owner) commented Oct 5, 2023

@pohly what about the constraint to run in series only?

onsi (Owner) commented Oct 5, 2023

Actually - never mind. I think I've found a way around both constraints that isn't too expensive or too ugly. I'll push it to master after I add some tests.

onsi (Owner) commented Oct 5, 2023

Alrighty - sorry for all the noise. The latest code is now on master and both constraints are gone. You can call PreviewSpecs and RunSpecs in the same invocation of ginkgo, and you can call PreviewSpecs when running in parallel. Each parallel process will run PreviewSpecs and get back what should be a basically identical (modulo minor timestamp differences) Report of the whole suite (i.e. not the subset of specs that the particular process will run, which cannot be predicted deterministically and is probably not what you want anyway).

pohly (Contributor, Author) commented Oct 5, 2023

Excellent! I'll take a look.

pohly (Contributor, Author) commented Oct 9, 2023

I've now also tested with PreviewSpecs followed by RunSpecs. Everything is working as expected, so as far as I am concerned, this is ready for tagging a release.

onsi (Owner) commented Oct 9, 2023

👍 thanks @pohly - I'll cut a release now
