Implement a util function to batch-upsert cache entries #4561
Conversation
Suggested change:

```diff
- ) => PayloadAction<NormalizedQueryUpsertEntryPayload>
+ ) => PayloadAction<NormalizedQueryUpsertEntryPayload[]>
```
My main critique point here is conceptual - if you do normalization, oftentimes you have a list with 50 items, and a request might only update a few of those (and also not return the unchanged ones). So to update three items in a list of 50, userland code would have to retrieve the current cache entry with 50 items from the Redux store, manually update those three items and write them back. Essentially, moving business logic out of Redux. That's why we have the callback notation in `updateQueryData`.

Could I suggest we modify this a bit so something like this would become possible?

```ts
upsertEntries([
  {
    endpointName: "item",
    arg: 1,
    value: item1,
    // this is the default
    merge: (_, value) => value,
  },
  {
    endpointName: "item",
    arg: 20,
    value: item20,
    // if someone wanted to opt out of overriding existing values, they could do something like this
    merge: (existing) => existing,
  },
  {
    endpointName: "allItems",
    arg: undefined,
    value: { 1: item1, 20: item20 },
    // and here someone could have custom merging logic
    merge(existing, upserted) {
      return { ...existing, ...upserted }
    },
  },
]);
```

The example above shows one weakness regarding usage with entityAdapter, though. The following would not be possible:

```ts
{
  endpointName: "allItems",
  arg: undefined,
  value: [ item1, item20 ],
  merge: itemsAdapter.setMany,
},
```

The problem here being the case of "nothing exists yet, so we don't call merge". In that case, I'm open for ideas here :)

PS: This could work?

```diff
 {
   endpointName: "allItems",
   arg: undefined,
   value: [ item1, item20 ],
+  initialState: itemsAdapter.getInitialState()
+  // or
+  getInitialState: itemsAdapter.getInitialState
   merge: itemsAdapter.setMany,
 },
```
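A rough TypeScript sketch of the per-entry shape this suggestion implies - the `merge` and `getInitialState` options here are hypothetical additions from this discussion, not something the PR implements:

```ts
// Hypothetical shape only - sketching the suggestion above, not this PR's actual types.
interface UpsertEntryWithMergeSketch<Arg, CacheValue, Incoming = CacheValue> {
  endpointName: string
  arg: Arg
  // the incoming data for this cache entry (may differ from the stored shape,
  // e.g. an array of items fed into an entity adapter)
  value: Incoming
  // called with the existing cache entry (or the initial state below, if nothing
  // exists yet) and `value`; the default would be `(_, value) => value`, i.e. overwrite
  merge?: (existing: CacheValue, incoming: Incoming) => CacheValue
  // supplies a starting value when no cache entry exists yet,
  // e.g. itemsAdapter.getInitialState
  getInitialState?: () => CacheValue
}
```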
Force-pushed from 08983a0 to 3e7e4a8
Lenz and I discussed this internally, and we noted that we already have a
Force-pushed from 3e7e4a8 to b71222d
Force-pushed from 9ced387 to 3358c13
This PR:

- Extracts the logic for creating `pending` and `fulfilled` cache entries into reusable helper functions
- Adds a `util.upsertEntries` action creator + reducer that accepts an array of `{endpointName, arg, value}` entries, and upserts those into the cache, overwriting any prior entries

Use Cases
One use case might be just preloading some data into the cache programmatically.
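As a rough sketch of that (assuming an existing Redux `store`, an `api` slice with a `getPost` endpoint as used later in this description, and the `upsertEntries` util from this PR):

```ts
// Sketch only: preload a couple of posts into the cache before any component asks for them.
store.dispatch(
  api.util.upsertEntries([
    { endpointName: 'getPost', arg: '1', value: { id: '1', title: 'First post' } },
    { endpointName: 'getPost', arg: '2', value: { id: '2', title: 'Second post' } },
  ]),
)
```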
The other main use case I'm thinking of is a pseudo-normalization approach, ala #4106 and #4073.
Say you have a `getPosts` endpoint and it returns the usual array. RTKQ won't deduplicate individual objects across multiple endpoints, so if you then try to access `getPost("2")`, we have to make that request separately.

It seems very reasonable to want to take the response contents from the `getPosts` query, and prepopulate all of the corresponding individual `getPost` cache entries.

We have `upsertQueryData`, which relies on the entire async thunk lifecycle, and like the rest of our utils it only does a single entry at a time. That's always felt too limiting, both in terms of possibly wanting to update many cache entries at once, and performance due to the number of actions that would have to get dispatched.
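As a sketch of what that prepopulation could look like with the util from this PR, dispatched from `onQueryStarted` (as described under Plans below) - the `Post` type, endpoint names, and URLs are illustrative:

```ts
import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react'

interface Post {
  id: string
  title: string
}

const api = createApi({
  baseQuery: fetchBaseQuery({ baseUrl: '/api' }),
  endpoints: (build) => ({
    getPost: build.query<Post, string>({
      query: (id) => `/posts/${id}`,
    }),
    getPosts: build.query<Post[], void>({
      query: () => '/posts',
      async onQueryStarted(_arg, { dispatch, queryFulfilled }) {
        const { data: posts } = await queryFulfilled
        // One dispatch upserts all of the matching getPost cache entries,
        // so the reducer (and Immer) only runs once for the whole batch.
        dispatch(
          api.util.upsertEntries(
            posts.map((post) => ({
              endpointName: 'getPost' as const,
              arg: post.id,
              value: post,
            })),
          ),
        )
      },
    }),
  }),
})
```

With that in place, a later `useGetPostQuery('2')` would find its entry already in the cache instead of triggering a separate request.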
Status

This is the first seemingly-working implementation. Thanks to @EskiMojo14 the external types seem to work (ie, `{endpointName: 'getPost'}` correctly expects to go along with `{arg: string, value: Post}`). I threw in one POC unit test.
Perf

Per #4106, doing this by calling the `apiSlice.reducer()` with a separate faked action for each item was still relatively slow - 1500ms for 1000 items (as compared to the abominably slow 6000ms for 1000 items using individual `upsertQueryData` calls).

In a quick local test, a single `dispatch(api.util.upsertEntries())` call with 1000 separate `getPost` entries was only 31ms. That's because we only called the reducer once, so we did all that work inside a single Immer call. That should reduce the perf overhead significantly.
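To make that concrete, a rough sketch (not a benchmark harness) of the two dispatch patterns being compared - it assumes a `posts: Post[]` array, a `dispatch` function, and the `api` slice from the earlier sketch:

```ts
// Slow path: one upsertQueryData thunk per item, so the cache reducer runs N times
for (const post of posts) {
  dispatch(api.util.upsertQueryData('getPost', post.id, post))
}

// Fast path (this PR): a single action, so the reducer and Immer run once for all N entries
dispatch(
  api.util.upsertEntries(
    posts.map((post) => ({ endpointName: 'getPost' as const, arg: post.id, value: post })),
  ),
)
```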
Plans

The starting point would be to just ship this util action creator as-is, and users would manually dispatch that in `onQueryStarted` after the response resolves.

I can also imagine a new `upsertCacheEntries` option to make this more of a first-class citizen, which would look roughly like `(data: T) => NormalizedQueryUpsertEntry[]`.
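Purely speculative, but an endpoint using such an option might read something like this - neither the option name nor its behavior exists yet; this just restates the signature above in context:

```ts
// Hypothetical endpoint option - not implemented in this PR or the library.
getPosts: build.query<Post[], void>({
  query: () => '/posts',
  upsertCacheEntries: (data) =>
    data.map((post) => ({
      endpointName: 'getPost' as const,
      arg: post.id,
      value: post,
    })),
}),
```

That would replace the manual `onQueryStarted` dispatch shown earlier.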
I'm sure there's edge cases here. For example, `merge` expects `baseQueryMeta`, and since we aren't even using a real request here, we don't have that. Right now I'm faking it with an empty object. I could imagine a `merge` function actually expecting a real value instead.