Export metric rating thresholds, add explicit `MetricRatingThresholds` type #323

Conversation
Seems a reasonable change, though it does make the built files ever so slightly bigger since we're exporting 6 new values now. A couple of points:
For the second, maybe a `tests/unit.js` file with the following:

```js
import {CLSThresholds, LCPThresholds} from 'web-vitals';
import assert from 'assert';

assert.strictEqual(CLSThresholds[0], 0.1);
assert.strictEqual(CLSThresholds[1], 0.25);
assert.strictEqual(LCPThresholds[0], 2500);
assert.strictEqual(LCPThresholds[1], 4000);
```

I'm in two minds about putting all 12 test cases in there (two for each metric). On one hand it's a bit repetitive, but on the other it does ensure they are all exported properly, so it's probably for the best despite the repetition. Besides, I don't expect these thresholds to change often. WDYT?
I think it's a good idea to verify all thresholds are being exported correctly even if it's a little repetitive, especially since they're not expected to change often. I added unit tests for this. BTW, should tests be running in the CI/CD pipeline? Looks like it's currently only running lint checks.
This LGTM.
@robatron could you update the initial comment of this PR, since it's changed a good bit since you opened it? Just in case anyone comes across it later and is confused.
@brendankenny could you give this another review?
Done! Also squashed the commits (→ 3909d10) and should be ready to merge anytime 👍
Ta!
FYI, for next time: we Squash and Merge, so there's no need to squash yourself (and we'd actually prefer you not force-push, as it makes changes easier to track in the PR!).
Cool, would like @brendankenny's review before merging.
Hey @brendankenny, any other changes you'd like me to make before @tunetheweb merges?
@robatron sorry to jump in here at the last minute, but could you explain a bit more about the use case for wanting to know the metric thresholds outside of the report callback function? I'm concerned that this is adding functionality that 99.9% of users won't need and probably shouldn't use, so I just wanted to make sure I better understand the use case.

Another implementation option, given that this likely isn't an API that most users will need, would be to create a new top-level module export that doesn't contribute to the main bundle, e.g.:

```js
import {onLCP} from 'web-vitals';
import {LCPThreshold} from 'web-vitals/thresholds';
```

Potentially we could even have a generic

WDYT?
Sure, happy to explain our use cases! We use Web Vitals thresholds in our analytics and performance monitoring systems to calculate and track Web Vitals score ratings over time. This allows us to do things like:
Additionally, I noticed a number of hard-coded Web Vitals thresholds on GitHub, e.g.,
While some examples could potentially be rewritten to use the

```js
import {onLCP} from 'web-vitals';
import {LCPThreshold} from 'web-vitals/thresholds';
```

I like it! An API for the official Web Vitals rating thresholds would be highly valuable regardless of origin. I'd be happy to update the pull request if everyone's satisfied with this solution.
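To make the analytics use case above concrete, here's a hypothetical sketch (not code from this PR) of bucketing field samples into ratings using an exported threshold pair; the `rate` helper, the threshold values, and the sample data are all invented for illustration:

```javascript
// Assumed LCP thresholds, standing in for an export like LCPThresholds.
const LCPThresholds = [2500, 4000];

// Hypothetical helper: classify a value against a [good, poor] pair.
function rate(value, [good, poor]) {
  if (value <= good) return 'good';
  if (value <= poor) return 'needs-improvement';
  return 'poor';
}

// Bucket a batch of field LCP samples into a rating distribution.
const samples = [1200, 2600, 5000, 2400];
const distribution = samples.reduce((acc, value) => {
  const rating = rate(value, LCPThresholds);
  acc[rating] = (acc[rating] ?? 0) + 1;
  return acc;
}, {});

console.log(distribution); // { good: 2, 'needs-improvement': 1, poor: 1 }
```

With the thresholds exported by the library itself, monitoring code like this stays in sync with the official values instead of hard-coding them.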
Ok, I thought about this a bit more, and I think having a separate export (e.g.

Also, I'm not sure the format

Given this, and the fact that

And if we go this route, I think it makes the most sense for the metric modules to export the thresholds, so that folks only using one or two metrics don't pay the cost of all of the thresholds. So I'm thinking something like:

```js
// File: ./onLCP.js

// The thresholds are a top-level module export for each metric.
export const LCPThresholds = [2500, 4000];

// The metric function exported as it is today.
export const onLCP = () => { /* ... */ };
```

Also, I understand @brendankenny's concern about people modifying the thresholds, but honestly I'm not too worried about that. If we do want to prevent that from happening, we could always export
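The last sentence above is cut off in this transcript, but one way to prevent modification of exported thresholds is `Object.freeze`. A minimal sketch, with threshold values assumed from the rest of the thread:

```javascript
const thresholds = [2500, 4000]; // private copy used internally

// Export a frozen copy so consumers can't mutate the shared values.
const LCPThresholds = Object.freeze([...thresholds]);

// Writes to a frozen array throw in strict mode (which ES modules always
// use) and fail silently otherwise; either way the value is unchanged.
try {
  LCPThresholds[0] = 0;
} catch (e) {
  // TypeError in strict mode
}

console.log(Object.isFrozen(LCPThresholds)); // true
console.log(LCPThresholds[0]); // 2500
```

Freezing a copy (rather than the private array) also keeps the library's internal comparisons unaffected by anything consumers do.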
I do think being able to export the thresholds is a good thing. And I agree the Array without labels is probably sufficient.

But that would be 6 exports (with quite long names), whereas a function to get the exports (based on a
That cost seems pretty small to me, doesn't it?
All modern bundlers will remove those exports when building application code, and assign them to a local variable (which could be minified, so in that sense they're basically free). I just tried a build that exported all of the symbols I outlined above, and it increased the brotli'd size of the final bundle by 38 bytes. That would be the maximum size increase when pulling the build from a CDN; if bundling it locally, it shouldn't increase the size at all (I'm pretty sure).
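That kind of measurement can be reproduced in principle with Node's built-in brotli support; the "bundle" contents below are toy stand-ins, not the real web-vitals build, so the resulting number is only illustrative:

```javascript
import {brotliCompressSync} from 'node:zlib';

// Compare brotli-compressed sizes to estimate real delivery cost.
function compressedSize(source) {
  return brotliCompressSync(Buffer.from(source)).length;
}

// Toy "bundles": the same code with and without the extra exports.
const withoutExports = 'export const onLCP=()=>{/* ... */};';
const withExports = withoutExports +
  'export const LCPThresholds=[2500,4000];' +
  'export const CLSThresholds=[0.1,0.25];';

const delta = compressedSize(withExports) - compressedSize(withoutExports);
console.log(`compressed size increase: ${delta} bytes`);
```

Comparing compressed rather than raw sizes matters here, since repetitive export boilerplate compresses extremely well.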
Compression is amazing! Funny, I was just making that point (to compare the compressed size of a bundle rather than the uncompressed size) today, and then forgot my own advice! OK then, so back to the original implementation basically, @robatron. Apologies for all the iterations here, ending up basically back at the beginning!
- Replace exported Web Vitals metric thresholds with a new public function `getMetricRatingThresholds()`
- Refactor `MetricRatingThresholds` type to be more self-explanatory
Lol. Ok, back to the original implementation then! (Another reason I shouldn't have pre-squashed my commits 😆) I unsquashed my commits, reverted the relevant code, and updated the PR title and description accordingly. WDYT?
Hi @philipwalton :) FWIW I was thinking more like:

```js
const thresholds = [0.1, 0.25];

export const onCLS = () => {
  // use `thresholds` as before.
};

export const CLSThresholds = [...thresholds];
```

at which point

```js
export const CLSThresholds = {good: thresholds[0], poor: thresholds[1]};
```

doesn't matter much, size-wise. This would prevent you having to consult the docs on which number is which, at least, though I do take the point that we don't really have names we use for the thresholds, and those names conflate thresholds with buckets.
Yeah, I had something similar in the previous implementation to prevent modification:

```js
export const getMetricRatingThresholds = () => {
  try {
    // Return a copy to prevent changes
    const {good, needsImprovement} = metricRatingThresholds[metricName];
    return {good, needsImprovement};
  } catch (e) { /* ... */ }
};
```
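The snippet above references `metricRatingThresholds` and `metricName` defined elsewhere in that implementation; a self-contained sketch of the same copy-on-read pattern, with an assumed threshold table, might look like:

```javascript
// Assumed threshold table; names and values are for illustration only.
const metricRatingThresholds = {
  CLS: {good: 0.1, needsImprovement: 0.25},
  LCP: {good: 2500, needsImprovement: 4000},
};

function getMetricRatingThresholds(metricName) {
  const entry = metricRatingThresholds[metricName];
  if (!entry) throw new Error(`Unknown metric: ${metricName}`);
  // Destructure into a fresh object so callers can't mutate the table.
  const {good, needsImprovement} = entry;
  return {good, needsImprovement};
}

const lcp = getMetricRatingThresholds('LCP');
lcp.good = 0; // mutating the returned copy...

console.log(metricRatingThresholds.LCP.good); // ...leaves the table at 2500
```

Returning a fresh object on every call trades a tiny allocation for immunity to caller mutation, which is the trade-off the comment describes.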
LGTM. Thanks for hanging in there with us on this!
OK, let's merge this. Will do a release then.
What?

This PR exports the existing rating thresholds for all metrics, e.g.,

... and defines the existing rating thresholds format with an explicit `MetricRatingThresholds` type.

Why?

Exporting typed metric rating thresholds enables referencing threshold values directly, reducing potential duplication issues and improving long-term codebase maintainability for consumers 💪

Details

Exported metric rating thresholds

The thresholds for each metric's "good", "needs improvement", and "poor" ratings are made available as top-level exports. These exported metric rating thresholds are simply the existing private thresholds hoisted to module scope and exported alongside their `on*()` functions.

New `MetricRatingThresholds` type

This new type simply defines the existing format explicitly (formerly `number[]`). It represents the thresholds of a metric's "good", "needs improvement", and "poor" ratings.