I am using apollo-link-rest to construct calls to a free 3rd party rest api. I think that it's a great use case.
However, as we all know, 3rd party APIs should be handled with _respect in regard to request concurrency and minimum safety gaps between successive API requests_, i.e. making sure you're not hitting the server more often than every 100ms.
As Apollo is taking over the calls, I was wondering if there is a nice way to spread them out over time? It doesn't take much to accidentally create a query that fires off additional requests to resolve the full request, for example by using the @export directive.
_Two calls at once:_
const QUERY = gql`
  query RestData($email: String!) {
    # 1) Query for the user
    users @rest(path: "/users/email?{args.email}", method: "GET", type: "User") {
      id @export(as: "id")
      firstName
      lastName
      # 2) The sub-query for friends fires immediately as well
      friends @rest(path: "/friends/{exportVariables.id}", type: "[User]") {
        firstName
        lastName
      }
    }
  }
`;
To mitigate light cases of the n+1 query problem (i.e. fetch all products, then each product's description), I'd like to have a safety net for server 429 errors (failures due to rate limiting by the server).
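One way to build such a safety net (not a feature of apollo-link-rest itself; fetchWithRetry and its parameters are my own names) is a customFetch wrapper that retries 429 responses after a delay, honoring a Retry-After header when the server sends one:

```javascript
// Hypothetical retry wrapper for 429 responses (a sketch, not part of apollo-link-rest).
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function fetchWithRetry(uri, options, { retries = 3, baseDelayMs = 500 } = {}) {
  for (let attempt = 0; ; attempt++) {
    const response = await fetch(uri, options);
    if (response.status !== 429 || attempt >= retries) {
      return response;
    }
    // Honor Retry-After (in seconds) if present; otherwise back off exponentially.
    const retryAfter = Number(response.headers.get("Retry-After"));
    const delayMs = retryAfter > 0 ? retryAfter * 1000 : baseDelayMs * 2 ** attempt;
    await sleep(delayMs);
  }
}
```

Passed as customFetch to the RestLink, this would turn hard rate-limit failures into delayed retries instead of immediately failed queries.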
_What I do right now: Wrapping fetch_
const restProviders = new RestLink({
  endpoints: {
    mySlowEndpoint: baseUrl
  },
  customFetch: (...args) =>
    new Promise((resolve, reject) => {
      // My queue logic fires off the fetch
      // once enough time has elapsed.
      resolve(fetch(...args));
    })
});
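For readers curious what that queue logic might look like, here is a minimal dependency-free sketch (spacedFetch is a hypothetical name of mine, not an apollo-link-rest API): each call is chained onto the previous one with a minimum gap before it starts.

```javascript
// Minimal fetch spacing: each call starts at least `gapMs` after the previous one finished.
function spacedFetch(gapMs) {
  const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
  let last = Promise.resolve();
  return (...args) => {
    const next = last.then(() => sleep(gapMs)).then(() => fetch(...args));
    // Keep the chain alive even if a fetch rejects.
    last = next.catch(() => {});
    return next;
  };
}
```

It can then be plugged into the RestLink above as customFetch: spacedFetch(100).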
Right now I am wrapping the fetching process in custom queue logic, but is there a more elegant way to integrate this? I am sure many people would be interested in having some control over outgoing load without failing queries right away.
Maybe I should add what I do right now for other people coming in from Google: I use p-throttle (repo).
// (...)
import pThrottle from "p-throttle";

const restProviders = new RestLink({
  endpoints: {
    mySlowEndpoint: baseUrl
  },
  // Throttle fetches to at most 1 call
  // per 500 ms interval.
  customFetch: pThrottle((...args) => {
    return fetch(...args);
  }, 1, 500)
});
// (...)
@D1no I was going to suggest customFetch, but you've already done that!
I'm not sure this should really be a directly configurable feature of apollo-link-rest, but I'd support including a snippet like this in the docs.
Can you point me to a nice place in the docs to PR this information as an example of what one can do with customFetch? I think you are right; adding that kind of configurability to apollo-link-rest would couple it to fetch methodologies, which is not very elegant.
In the PR I'd swap p-throttle out for a more self-contained example for novices reading along. Anybody seriously using apollo-link-rest as a gateway drug into GraphQL will need _some management of concurrency / debouncing_ at some point.
Hey @D1no, you can find the docs here. I'd suggest it should sit under the Options heading, above Complete options.
Thanks @tombarton. Looking closer at the issue again, it might be a good idea to have a simple throttle option for the fetch call anyway.
Though I originally said it's coupling fetch, that is actually not true. It's the call to (any arbitrary) fetch methodology that is running rampant here. To tame apollo-link-rest's blind / eager execution, a simple concurrency and millisecond limit option like the ones seen above might not be such a bad idea in the end:
const restProviders = new RestLink({
  endpoints: {
    mySlowEndpoint: baseUrl
  },
  maxConcurrency: 2, // Max concurrency
  minThrottle: 500   // Min delay between requests in ms (rate limiting)
});
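If options like these existed (they are only a proposal here), a maxConcurrency cap would presumably reduce to a fetch wrapper internally anyway. A hypothetical sketch of such a limiter (limitConcurrency is my own name, not a proposed API):

```javascript
// Hypothetical concurrency limiter: at most `max` calls to `fn` in flight at once.
function limitConcurrency(max, fn) {
  let active = 0;
  const waiting = [];
  const runNext = () => {
    if (active >= max || waiting.length === 0) return;
    active++;
    const { args, resolve, reject } = waiting.shift();
    fn(...args)
      .then(resolve, reject)
      .finally(() => {
        active--;
        runNext(); // Start the next queued call, if any.
      });
  };
  return (...args) =>
    new Promise((resolve, reject) => {
      waiting.push({ args, resolve, reject });
      runNext();
    });
}
```

Wrapping fetch with limitConcurrency(2, fetch) and passing the result as customFetch would give the same effect without a new RestLink option.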
and/or being able to specify it in the query
const QUERY = gql`
  query RestData($email: String!) {
    users @rest(path: "/users/email?{args.email}", method: "GET", type: "User") {
      id @export(as: "id")
      firstName
      lastName
      # NEW: per-call throttle & concurrency settings
      friends @rest(throttle: 500, concurrency: 2, path: "/friends/{exportVariables.id}", type: "[User]") {
        firstName
        lastName
      }
    }
  }
`;
After all, why should a fetch methodology manage how often it is called? It should not care what's going on upstairs. I'll think a little about that. In any case, customFetch should get passed some information about the current query, not just a blank request, so it can make smart decisions if people really need it to.
Maybe one of the maintainers / staff (@fbartho) can chip in? I'd be happy to take the time for a PR if its merit is approved.
@D1no I honestly liked your suggestion of including it in the docs. -- Queueing is the opposite of the way we wanted to do things in our company's app (deduplicating was one feature I built, for example).
Separately, because endpoints can be totally different, I'm hesitant to set "maxConcurrency/minThrottle" as global concerns.
ApolloLinkRest is about "plugging stuff into Apollo" -- the concerns you've been describing really feel like they belong at the network layer, aka custom fetch. -- Additionally, I find it unlikely that you want to customize these values for every query path, so embedding their settings in the GraphQL seems noisy.
Alright β I'll PR an addition to the documentation first. We should bottom out customFetch
first before creaming layers on top of it.