The application I'm working on has to fire off multiple requests to an external API. It will receive the results of those requests asynchronously, but I then need to handle each response in a synchronous way. My research tells me there are several ways of doing this, but I'm considering the following approach and wondering whether it will work:
Every time I make a call to the external API, the response is added to the queue as soon as it comes back asynchronously:
function genericFetchFunction(url, callback) {
  makeAsyncRequest(url, (result) => {
    addToHandlingQueue({
      data: result,
      callback
    });
  });
}
The queue handler pushes the response onto the queue and, if the queue was previously empty, kicks off the firing process.
const responseQueue = [];

function addToHandlingQueue(response) {
  responseQueue.push(response);
  if (responseQueue.length === 1) {
    fireQueue();
  }
}

function fireQueue() {
  const item = responseQueue.shift();
  item.callback(item.data);
  if (responseQueue.length > 0) {
    fireQueue();
  }
}
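For reference, here's a minimal sketch of how I'd drive this (the endpoint URLs and updateStoredState are placeholders I've made up for the question, not the real API client or state update):

// updateStoredState stands in for the state update described in the edit below.
function updateStoredState(result) {
  console.log('handling', result);
}

// Fire off several requests; the queue should then run their callbacks strictly one after another.
['/users', '/orders', '/invoices'].forEach((path) => {
  genericFetchFunction('https://api.example.com' + path, updateStoredState);
});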
Would this code work the way I expect? Will all the results get into the queue and have their callbacks fired in sequence? If not, why not?
EDIT: The use case for this is that the callback for these responses will itself initiate a process that must be handled synchronously rather than asynchronously (an update to a stored state).
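To make that concrete, this is roughly the shape of the update, i.e. what the updateStoredState placeholder from the sketch above would do (the fields are made up):

// Read-modify-write against the stored state: each update builds on the previous one,
// which is why the callbacks need to run one at a time, in the order they were queued.
let storedState = { updateCount: 0, lastResult: null };

function updateStoredState(result) {
  storedState = {
    updateCount: storedState.updateCount + 1,
    lastResult: result
  };
}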
Have a look at async's waterfall and parallel techniques. This has all been done before and is really simple to reproduce once you've worked with it, so there's no need to depend on a library for it, though async is quite small and stable.
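For comparison, a rough sketch of what the async.parallel shape could look like here, reusing the makeAsyncRequest and updateStoredState placeholders from above (the URLs are made up, and this assumes the caolan/async library):

const async = require('async');

// Each task wraps makeAsyncRequest in the (err, result) callback style that async expects.
async.parallel([
  (cb) => makeAsyncRequest('https://api.example.com/users', (result) => cb(null, result)),
  (cb) => makeAsyncRequest('https://api.example.com/orders', (result) => cb(null, result))
], (err, results) => {
  if (err) { return; }
  // results arrive together, in task order, so they can be handled synchronously here.
  results.forEach((result) => updateStoredState(result));
});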