docs: fix typos of the 'A Guided Tour' page (#3236)
Summary:
I'm learning a lot about Relay from the great (and very long) _A Guided Tour_ article. While working through it, I also found some typos, so this PR fixes them.
Pull Request resolved: #3236
Reviewed By: jstejada
Differential Revision: D24621867
Pulled By: kassens
fbshipit-source-id: 1a5b57af00b149a8595bbe6a1beab724a3e3ae28
website/versioned_docs/version-experimental/RelayHooks-AGuidedTourOfRelay.md
+24 −18
@@ -520,7 +520,7 @@ function App() {
Note that:
- * The ***fragment reference*** that `UserComponent` expects is is the result of reading a parent query that includes its fragment, which in our case means a query that includes `...UsernameSection_user`. In other words, the `data` obtained as a result of `useLazyLoadQuery` also serves as the fragment reference for any child fragments included in that query.
+ * The ***fragment reference*** that `UserComponent` expects is the result of reading a parent query that includes its fragment, which in our case means a query that includes `...UsernameSection_user`. In other words, the `data` obtained as a result of `useLazyLoadQuery` also serves as the fragment reference for any child fragments included in that query.
* As mentioned previously, ***all fragments must belong to a query when they are rendered,*** which means that all fragment components *must* be descendants of a query. This guarantees that you will always be able to provide a fragment reference for `useFragment`, by starting from the result of reading a root query with `useLazyLoadQuery`.
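For context, here is a minimal sketch (not part of the diff) of the pattern the corrected bullet describes: the `data` returned by `useLazyLoadQuery` acting as the fragment reference for a child `useFragment`. The query, component, and module paths are illustrative assumptions, not code copied from the guide.

```javascript
const React = require('React');
const {graphql} = require('react-relay');
const {useLazyLoadQuery, useFragment} = require('react-relay/hooks');

function App() {
  // data is both the query result and the fragment reference for any
  // fragments spread inside the query (here ...UsernameSection_user).
  const data = useLazyLoadQuery(
    graphql`
      query AppQuery($id: ID!) {
        user(id: $id) {
          ...UsernameSection_user
        }
      }
    `,
    {id: '4'},
  );
  return <UsernameSection user={data.user} />;
}

function UsernameSection(props) {
  // props.user is the fragment reference produced by reading the parent query.
  const user = useFragment(
    graphql`
      fragment UsernameSection_user on User {
        username
      }
    `,
    props.user,
  );
  return <div>{user.username}</div>;
}
```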
### Variables
@@ -743,7 +743,7 @@ Relay currently does not expose the resolved variables (i.e. after applying argu
As you may have noticed, we mentioned that using `useLazyLoadQuery` will ***fetch*** a query from the server, but we didn't elaborate on how to render a loading UI while the query is being loaded. We will cover that in this section.
- To render loading states while a query is being fetched, we rely on [React Suspense](https://reactjs.org/docs/concurrent-mode-suspense.html). Suspense is a new feature in React that allows components to interrupt or *"suspend"* rendering in order to wait for some asynchronous resource (such as code, images or data) to be loaded; when a component "suspends", it indicates to React that the component isn't *"ready"* to be rendered yet, and wont be until the asynchronous resource it's waiting for is loaded. When the resource finally loads, React will try to render the component again.
+ To render loading states while a query is being fetched, we rely on [React Suspense](https://reactjs.org/docs/concurrent-mode-suspense.html). Suspense is a new feature in React that allows components to interrupt or *"suspend"* rendering in order to wait for some asynchronous resource (such as code, images or data) to be loaded; when a component "suspends", it indicates to React that the component isn't *"ready"* to be rendered yet, and won't be until the asynchronous resource it's waiting for is loaded. When the resource finally loads, React will try to render the component again.
This capability is useful for components to express asynchronous dependencies like data, code, or images that they require in order to render, and lets React coordinate rendering the loading states across a component tree as these asynchronous resources become available. More generally, the use of Suspense give us better control to implement more deliberately designed loading states when our app is loading for the first time or when it's transitioning to different states, and helps prevent accidental flickering of loading elements (such as spinners), which can commonly occur when loading sequences aren't explicitly designed and coordinated.
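As a quick illustration (outside the diff), a Suspense boundary wrapping a component that may suspend could look like the following; `LoadingSpinner` and `MainContent` are placeholder components assumed for the sketch:

```javascript
const React = require('React');
const {Suspense} = require('React');

function App() {
  return (
    // While anything below this boundary suspends (e.g. a component calling
    // useLazyLoadQuery whose data hasn't loaded yet), React renders the fallback.
    <Suspense fallback={<LoadingSpinner />}>
      <MainContent />
    </Suspense>
  );
}
```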
@@ -870,7 +870,10 @@ function App() {
Whenever we're going to make a transition that might cause new content to suspend, we should use the [**`useTransition`**](https://reactjs.org/docs/concurrent-mode-patterns.html#transitions) to schedule the update for transition:
```javascript
- const {useTransition} = require('React');
+ const {
+   useState,
+   useTransition,
+ } = require('React');
function TabSwitcher() {
  // We use startTransition to schedule the update
@@ -908,7 +911,10 @@ The ***pending*** stage is the first state in a transition, and is usually rende
By default, when a suspense transition occurs, if the new content suspends, React will automatically transition to the loading state and show the fallbacks from any `Suspense` boundaries that are in place for the new content. However, if we want to delay showing the loading state, and show a *pending* state instead, we can also use [**`useTransition`**](https://reactjs.org/docs/concurrent-mode-patterns.html#transitions) to do so:
```javascript
- const {useTransition} = require('React');
+ const {
+   useState,
+   useTransition,
+ } = require('React');
const SUSPENSE_CONFIG = {
  // timeoutMs allows us to delay showing the "loading" state for a while
@@ -949,7 +955,7 @@ function TabSwitcher() {
Let's take a look at what's happening here:
* In this case, we're passing the **`SUSPENSE_CONFIG`** config object to `useTransition` in order to configure how we want this transition to behave. Specifically, we can pass a **`timeoutMs`** property in the config, which will dictate how long React should wait before transitioning to the *"loading"* state (i.e. transition to showing the fallbacks from the `Suspense` boundaries), in favor of showing a ***pending*** state controlled locally by the component during that time.
- * `useTransition` will also return a **`isPending`** boolean value, which captures the pending state. That is, this value will become `true` ***immediately*** when the transition starts, and will become `false` when the transition reaches the fully *"completed"* stage, that is, when all the new content has been fully loaded. As mentioned above, the pending state should be used to give immediate post to the user that they're action has been received, and we can do so by using the `isPending` value to control what we render; for example, we can use that value to render a spinner next to the button, or in this case, disable the button immediately after it is clicked.
+ * `useTransition` will also return a **`isPending`** boolean value, which captures the pending state. That is, this value will become `true` ***immediately*** when the transition starts, and will become `false` when the transition reaches the fully *"completed"* stage, that is, when all the new content has been fully loaded. As mentioned above, the pending state should be used to give immediate post to the user that the action has been received, and we can do so by using the `isPending` value to control what we render; for example, we can use that value to render a spinner next to the button, or in this case, disable the button immediately after it is clicked.
For more details, check out the [React docs on Suspense](https://reactjs.org/docs/concurrent-mode-suspense.html).
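Pulling the pieces from the surrounding hunks together, a sketch of how `useTransition` with a Suspense config and `isPending` might be wired up. This follows the experimental concurrent-mode API the guide targets; `MainContent` and the tab values are assumptions for illustration.

```javascript
const React = require('React');
const {useState, useTransition} = require('React');

const SUSPENSE_CONFIG = {
  // Wait up to 3 seconds before falling back to the Suspense "loading" state.
  timeoutMs: 3000,
};

function TabSwitcher() {
  // startTransition schedules updates that may suspend; isPending is true from
  // the moment the transition starts until the new content has fully loaded.
  const [startTransition, isPending] = useTransition(SUSPENSE_CONFIG);
  const [selectedTab, setSelectedTab] = useState('Home');

  return (
    <div>
      <button
        disabled={isPending}
        onClick={() => {
          startTransition(() => {
            setSelectedTab('Photos');
          });
        }}>
        Photos
      </button>
      <MainContent tab={selectedTab} />
    </div>
  );
}
```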
@@ -1313,7 +1319,7 @@ const store = new Store(source, {gcScheduler});
The Relay Store internally holds a release buffer to keep a specific (configurable) number of queries temporarily retained even after they have been released by their original owner (i.e., usually when a component rendering that query unmounts). This makes it possible (and more likely) to reuse data when navigating back to a page, tab or piece of content that has been visited before.
- In order to configure the size of the release buffer, we can you can **`gcReleaseBufferSize`** option to the Relay Store:
+ In order to configure the size of the release buffer, you can provide a **`gcReleaseBufferSize`** option to the Relay Store:
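For reference, a sketch of what that configuration might look like when setting up the environment; the buffer size of 10 and the `fetchGraphQL` network function are assumptions for illustration.

```javascript
const {Environment, Network, RecordSource, Store} = require('relay-runtime');

const source = new RecordSource();
// Keep up to 10 released queries in the release buffer so recently visited
// content can be reused without an immediate refetch.
const store = new Store(source, {gcReleaseBufferSize: 10});

const environment = new Environment({
  network: Network.create(fetchGraphQL), // fetchGraphQL: your fetch function
  store,
});
```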
- Just marking the store or records as stale will cause queries to be refetched they next time they are evaluated; so for example, the next time you navigate back to a page that renders a stale query, the query will be refetched even if the data is cached, since the query references stale data.
+ Just marking the store or records as stale will cause queries to be refetched the next time they are evaluated; so for example, the next time you navigate back to a page that renders a stale query, the query will be refetched even if the data is cached, since the query references stale data.
This is useful for a lot of use cases, but there are some times when we’d like to immediately refetch some data upon invalidation, for example:
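As an illustrative sketch of the invalidation APIs this passage refers to (assuming an `environment` is already set up; the record ID is a placeholder):

```javascript
const {commitLocalUpdate} = require('relay-runtime');

commitLocalUpdate(environment, store => {
  // Mark the entire store as stale: affected queries will be refetched
  // the next time they are evaluated.
  store.invalidateStore();

  // Or mark a single record as stale instead:
  // const user = store.get('<user-record-id>');
  // if (user != null) {
  //   user.invalidateRecord();
  // }
});
```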
@@ -1957,7 +1963,7 @@ Let's distill what's happening in this example:
## Rendering List Data and Pagination
- There are several scenarios in which we'll want to query a list of data from the GraphQL server. Often times we wont want to query the *entire* set of data up front, but rather discrete sub-parts of the list, incrementally, usually in response to user input or other events. Querying a list of data in discrete parts is usually known as [Pagination](https://graphql.github.io/learn/pagination/).
+ There are several scenarios in which we'll want to query a list of data from the GraphQL server. Often times we won't want to query the *entire* set of data up front, but rather discrete sub-parts of the list, incrementally, usually in response to user input or other events. Querying a list of data in discrete parts is usually known as [Pagination](https://graphql.github.io/learn/pagination/).
* `usePaginationFragment` behaves the same way as a `useFragment` ([Fragments](#fragments)), so our list of friends is available under **`data.friends.edges.node`**, as declared by the fragment. However, it also has a few additions:
* It expects a fragment that is a connection field annotated with the `@connection` directive
* It expects a fragment that is annotated with the `@refetchable` directive. Note that `@refetchable` directive can only be added to fragments that are "refetchable", that is, on fragments that are on `Viewer`, or on `Query`, or on a type that implements `Node` (i.e. a type that has an `id` field).
- * It takes to Flow type parameters: the type of the generated query (in our case `FriendsListPaginationQuery`), and a second type which can always be inferred, so you only need to pass underscore (`_`).
+ * It takes two Flow type parameters: the type of the generated query (in our case `FriendsListPaginationQuery`), and a second type which can always be inferred, so you only need to pass underscore (`_`).
* Note that we're using `[SuspenseList](https://reactjs.org/docs/concurrent-mode-patterns.html#suspenselist)` to render the items: this will ensure that the list is rendered in order from top to bottom even if individual items in the list suspend and resolve at different times; that is, it will prevent items from rendering out of order, which prevents content from jumping around after it has been rendered.
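For context, a sketch of the shape of a `usePaginationFragment` call these bullets describe; the fragment, field, and connection key names are illustrative, and `FriendsListPaginationQuery` is the Flow type the Relay compiler would generate from the `@refetchable` directive.

```javascript
const React = require('React');
const {graphql} = require('react-relay');
const {usePaginationFragment} = require('react-relay/hooks');

// Flow type generated by the Relay compiler for the @refetchable query below.
import type {FriendsListPaginationQuery} from 'FriendsListPaginationQuery.graphql';

function FriendsList(props) {
  // The second Flow type parameter can always be inferred, so we pass `_`.
  const {data, loadNext, isLoadingNext} = usePaginationFragment<FriendsListPaginationQuery, _>(
    graphql`
      fragment FriendsList_user on User
      @argumentDefinitions(
        count: {type: "Int", defaultValue: 10}
        cursor: {type: "String"}
      )
      @refetchable(queryName: "FriendsListPaginationQuery") {
        friends(first: $count, after: $cursor)
          @connection(key: "FriendsList_user_friends") {
          edges {
            node {
              name
            }
          }
        }
      }
    `,
    props.user,
  );

  const edges = data.friends?.edges ?? [];
  return (
    <>
      {edges.map((edge, index) => (
        <div key={index}>{edge.node.name}</div>
      ))}
      <button disabled={isLoadingNext} onClick={() => loadNext(10)}>
        Load more friends
      </button>
    </>
  );
}
```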
### Pagination
@@ -2488,7 +2494,7 @@ Let's distill what's going on here:
* In our case, we need to pass the count we want to fetch as the `first` variable, and we can pass different values for our filters, like `orderBy` or `searchTerm`.
* This will re-render your component and may cause it to suspend (as explained in [Transitions And Updates That Suspend](#transitions-and-updates-that-suspend)) if it needs to send and wait for a network request. If `refetch` causes the component to suspend, you'll need to make sure that there's a `Suspense` boundary wrapping this component from above, and/or that you are using [`useTransition`](https://reactjs.org/docs/concurrent-mode-patterns.html#transitions) with a Suspense config in order to show the appropriate pending or loading state.
* Note that since `refetch` may cause the component to suspend, regardless of whether we're using a Suspense config to render a pending state, we should always use `startTransition` to schedule that update; any update that may cause a component to suspend should be scheduled using this pattern.
- * Conceptually, when we call refetch, we're fetching the connection *from scratch*. It other words, we're fetching it again from the *beginning* and ***"resetting"*** our pagination state. For example, if we fetch the connection with a different `search_term`, our pagination information for the previous `search_term` no longer makes sense, since we're essentially paginating over a new list of items.
+ * Conceptually, when we call refetch, we're fetching the connection *from scratch*. In other words, we're fetching it again from the *beginning* and ***"resetting"*** our pagination state. For example, if we fetch the connection with a different `search_term`, our pagination information for the previous `search_term` no longer makes sense, since we're essentially paginating over a new list of items.
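To make the pattern concrete, a sketch of scheduling a connection `refetch` inside a transition; the `FriendsListFragment`, `SearchBar`, and Suspense config here are illustrative assumptions building on the pagination sketch above.

```javascript
const React = require('React');
const {useTransition} = require('React');
const {usePaginationFragment} = require('react-relay/hooks');

const SUSPENSE_CONFIG = {timeoutMs: 3000};

function FriendsListSearch(props) {
  const [startTransition, isPending] = useTransition(SUSPENSE_CONFIG);
  // FriendsListFragment is assumed to be the same @refetchable / @connection
  // fragment shown in the pagination sketch above.
  const {refetch} = usePaginationFragment<FriendsListPaginationQuery, _>(
    FriendsListFragment,
    props.user,
  );

  const onSearch = searchTerm => {
    // refetch may suspend, so schedule it inside a transition; conceptually
    // this fetches the connection from scratch and resets pagination state.
    startTransition(() => {
      refetch({first: 10, search_term: searchTerm});
    });
  };

  return <SearchBar disabled={isPending} onSearch={onSearch} />;
}
```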
### Adding and Removing Items From a Connection
@@ -3376,7 +3382,7 @@ Let's see what's happening here:
* The `optimisticUpdater` has the same signature and behaves the same way as the regular `updater` function, the main difference being that it will be executed immediately, before the mutation response completes.
* If the mutation succeeds, ***the optimistic update will be rolled back,*** and the server response will be applied.
- * Note that if we used an `optimisticResponse`, we wouldn't able to statically provide a value for `like_count`, since it requires reading the current value from the store first, which we can do with an `optimisticUpdater`.
+ * Note that if we used an `optimisticResponse`, we wouldn't be able to statically provide a value for `like_count`, since it requires reading the current value from the store first, which we can do with an `optimisticUpdater`.
* Also note that when mutation completes, the value from the server might differ from the value we optimistically predicted locally. For example, if other "Likes" occurred at the same time, the final `like_count` from the server might've incremented by more than 1.
* If the mutation *fails*, ***the optimistic update will be rolled back,*** and the error will be communicated via the `onError` callback.
* Note that we're not providing an `updater` function, which is okay. If it's not provided, the default behavior will still be applied when the server response arrives (i.e. merging the new field values for `like_count` and `viewer_does_like` on the `Post` object).
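For context, a sketch of an `optimisticUpdater` that reads the current `like_count` before incrementing it; the mutation name and input shape are assumptions for illustration, while `like_count` and `viewer_does_like` come from the hunk above.

```javascript
const {commitMutation, graphql} = require('react-relay');

function likePost(environment, postID) {
  commitMutation(environment, {
    mutation: graphql`
      mutation LikePostMutation($input: LikePostInput!) {
        like_post(input: $input) {
          post {
            id
            like_count
            viewer_does_like
          }
        }
      }
    `,
    variables: {input: {id: postID}},
    optimisticUpdater: store => {
      // Runs immediately; rolled back when the server response (or an error)
      // arrives. We read the current value from the store and increment it.
      const post = store.get(postID);
      if (post != null) {
        const likeCount = post.getValue('like_count');
        post.setValue(likeCount + 1, 'like_count');
        post.setValue(true, 'viewer_does_like');
      }
    },
    onError: error => console.error(error),
  });
}
```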
@@ -3402,7 +3408,7 @@ In general, execution of the `updater` and optimistic updates will occur in the
_**Full Example**_
- This means that in more complicated scenarios you can still provide all 3 options: `optimisticResponse`, `optimisticUpdater` and `updater`. For example, the mutation to add a new comment could like something like the following (for full details on updating connections, check out our [Adding and Removing Items From a Connection](#adding-and-removing-items-from-a-connection) guide):
+ This means that in more complicated scenarios you can still provide all 3 options: `optimisticResponse`, `optimisticUpdater` and `updater`. For example, the mutation to add a new comment could be like something like the following (for full details on updating connections, check out our [Adding and Removing Items From a Connection](#adding-and-removing-items-from-a-connection) guide):
Let's distill this example, according to the execution order of the updaters:
* Given that an `optimisticResponse` was provided, it will be executed *first*. This will cause the new value of `viewer_has_commented` to be merged into the existing `Post` object, setting it to `true`.
- * Given that an `optimisticResponse` was provided, it will be executed next. Our `optimisticUpdater` will create new comment and edge records from scratch, simulating what the new edge in the server response would look like, and then add the new edge to the connection.
+ * Given that an `optimisticUpdater` was provided, it will be executed next. Our `optimisticUpdater` will create new comment and edge records from scratch, simulating what the new edge in the server response would look like, and then add the new edge to the connection.
* When the optimistic updates conclude, components subscribed to this data will be notified.
* When the mutation succeeds, all of our optimistic updates will be rolled back.
* The server response will be processed by Relay, and this will cause the new value of `viewer_has_commented` to be merged into the existing `Post` object, setting it to `true`.
@@ -3507,7 +3513,7 @@ Let's distill this example, according to the execution order of the updaters:
The recommended approach when executing a mutation is to request ***all*** the relevant data that was affected by the mutation back from the server (as part of the mutation body), so that our local Relay store is consistent with the state of the server.
- However, often times it can be unfeasible to know and specify all the possible data the possible data that would be affected for mutations that have large rippling effects (e.g. imagine “blocking a user” or “leaving a group”).
+ However, often times it can be unfeasible to know and specify all the possible data that would be affected for mutations that have large rippling effects (e.g. imagine “blocking a user” or “leaving a group”).
For these types of mutations, it’s often more straightforward to explicitly mark some data as stale (or the whole store), so that Relay knows to refetch it the next time it is rendered. In order to do so, you can use the data invalidation apis documented in our [Staleness of Data section](#staleness-of-data).
* In our specific example, we're adding a newcomment to our local store when. Specifically, we're adding a new item to a connection; for more details on the specifics of how that works, check out our [Adding and Removing Items From a Connection](#adding-and-removing-items-from-a-connection) section.
* Note that any local data updates will automatically cause components subscribed to the data to be notified of the change and re-render.
- #### CommitPayload
+ #### commitPayload
**`commitPayload`** takes an `OperationDescriptor` and the payload for the query, and writes it to the Relay Store. The payload will be resolved like a normal server response for a query.
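As an illustrative sketch of the `commitPayload` usage that description refers to; the query, variables, and payload data are assumptions, and `environment` is assumed to be an existing Relay environment.

```javascript
const {createOperationDescriptor, getRequest, graphql} = require('relay-runtime');

const query = graphql`
  query AppQuery($id: ID!) {
    user(id: $id) {
      id
      name
    }
  }
`;

// Build the OperationDescriptor for this query and set of variables.
const operation = createOperationDescriptor(getRequest(query), {id: '4'});

// Write the payload into the store as if it were a server response for the query.
environment.commitPayload(operation, {
  user: {
    id: '4',
    __typename: 'User',
    name: 'Juan',
  },
});
```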
@@ -3838,7 +3844,7 @@ extend type Item {
#### Reading Client-Only Data
- We can read client-only data be selecting it inside [fragments](#fragments) or [queries](#queries) as normal:
+ We can read client-only data by selecting it inside [fragments](#fragments) or [queries](#queries) as normal:
```javascript
const data = *useFragment*(
@@ -3932,7 +3938,7 @@ fetchQuery<AppQuery>(
```
* The returned Promise that resolves to the query data, read out from the store when the first network response is received from the server. If the request fails, the promise will reject
- * Note that we specify the `AppQuery` Flow type; this ensures that the type of the data the the promise will resolve to matches the shape of the query, and enforces that the `variables` passed as input to `fetchQuery` match the type of the variables expected by the query.
+ * Note that we specify the `AppQuery` Flow type; this ensures that the type of the data the promise will resolve to matches the shape of the query, and enforces that the `variables` passed as input to `fetchQuery` match the type of the variables expected by the query.
> See also our API Reference for [fetchQuery](api-reference.html#fetchquery).
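For context, a sketch of the `fetchQuery` call shape this hunk refers to, following the guide's description of a Promise-returning API; the query fields and `environment` are assumptions, and the exact import path and return type may differ across Relay versions.

```javascript
const {graphql} = require('react-relay');
const {fetchQuery} = require('react-relay/hooks');

// AppQuery is the Flow type generated by the Relay compiler for this query.
fetchQuery<AppQuery>(
  environment,
  graphql`
    query AppQuery($id: ID!) {
      user(id: $id) {
        name
      }
    }
  `,
  {id: '4'},
)
  .then(data => {
    // data is typed to match the shape of AppQuery's response.
    console.log(data.user.name);
  })
  .catch(error => {
    // The promise rejects if the network request fails.
    console.error(error);
  });
```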