
Asynchronous JavaScript: Sequential, Parallel, and Combined Processing
Control asynchronous flow in JavaScript with sequential and parallel operations using async/await and more
Tuesday, Mar 9, 2021
Event Loop (Quick Synopsis)
The browser and Node.js continuously run a single-threaded event loop to execute your JavaScript code. It first runs all of the synchronous, non-blocking code, while queueing up any asynchronous events to be executed at a later point in time.
Tasks such as timeouts, intervals, network requests, and promise resolutions are handled by the host environment, off the main thread. Once those tasks finish, their callbacks are queued: microtasks (e.g. promise callbacks) run at the end of the current task, while macrotasks (e.g. timer and network callbacks) run at the beginning of a later turn of the event loop.
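For example, a minimal sketch of that ordering, using nothing beyond setTimeout and Promise:
console.log('sync'); // synchronous code runs first
setTimeout(() => console.log('macrotask'), 0); // queued for a later turn of the event loop
Promise.resolve().then(() => console.log('microtask')); // queued to run right after the current task
// Outputs: 'sync', 'microtask', 'macrotask'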
Fantastic talk on the event loop by Jake Archibald
Sequential
Use Cases:
Sequential processing is incredibly handy for asynchronous tasks that need to run in a particular order, such as when the data required for one API request depends on the response of another.
Fetch data from your API, then use it to fetch data from another API. For example:
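A minimal sketch, assuming two hypothetical endpoints where the second request depends on the first response:
(async () => {
  // '/api/user' and '/api/posts' are made-up endpoints for illustration
  const user = await fetch('/api/user').then((res) => res.json());
  // this request can't start until the previous one has resolved
  const posts = await fetch(`/api/posts?userId=${user.id}`).then((res) => res.json());
  console.log(posts);
})();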
Delay function to mock an asynchronous API call:
const apiCall = (ms) => new Promise((res) => setTimeout(res, ms));
(async () => {
  console.log('Api Call 1');
  await apiCall(1000);
  console.log('Api Call 2');
  await apiCall(1000);
  console.log('Api Call 3');
  await apiCall(1000);
  console.log('done!');
  /* Outputs */
  // 'Api Call 1'
  // 'Api Call 2'
  // 'Api Call 3'
  // 'done!'
})();
A for...of loop sequentially processes an apiCall for each emoji in the array:
(async () => {
  const items = ['🥑', '🍅', '🍆', '🍉', '🌶'];
  for (const item of items) {
    console.log(item);
    await apiCall(2000);
    console.log(`${item} done`);
  }
})();
Outputs
🥑
🥑 done
🍅
🍅 done
🍆
🍆 done
🍉
🍉 done
🌶
🌶 done
Alternatively, you can use a reduce function to control sequential order:
(async () => {
  await ['🥑', '🍅', '🍆', '🍉', '🌶'].reduce(async (prev, cur) => {
    // await the previous promise;
    // if you remove this line, the calls run in parallel
    await prev;
    console.log(cur);
    await apiCall(2000);
    console.log(`${cur} done`);
    return;
  }, Promise.resolve());
})();
Parallel
Use Cases:
If your promises/async tasks have no dependencies on one another, and you only need to know when all of them have resolved before moving on to other asynchronous tasks.
(async () => {
  await Promise.all(['🥑', '🍅', '🍆', '🍉', '🌶'].map(async (item) => {
    console.log(item);
    await apiCall(2000);
    console.log(`${item} done`);
    return;
  }));
})();
Outputs
🥑
🍅
🍆
🍉
🌶
🥑 done
🍅 done
🍆 done
🍉 done
🌶 done
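One caveat worth knowing: Promise.all is fail-fast. If any input promise rejects, the whole Promise.all rejects immediately without waiting for the rest. A quick sketch using the apiCall mock from earlier:
(async () => {
  await Promise.all([
    apiCall(1000),
    Promise.reject(new Error('boom')),
  ]).catch((err) => console.error(err.message));
  // logs 'boom' right away, without waiting for apiCall to resolve
})();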
Alternatively, you can reuse the earlier reduce function and remove the await on the previously returned promise:
(async () => {
  await ['🥑', '🍅', '🍆', '🍉', '🌶'].reduce(async (prev, cur) => {
    // await prev;
    console.log(cur);
    await apiCall(2000);
    console.log(`${cur} done`);
    return;
  }, Promise.resolve());
})();
While the tasks run in parallel, you can iterate over the results with for await...of as each individual promise resolves. Note that for await...of consumes the promises in input order, so a slow early task delays iteration over later tasks even if they have already resolved.
(async () => {
  const tasks = ['🥑', '🍅', '🍆', '🍉', '🌶'].map(async (item) => {
    console.log(item);
    await apiCall(2000);
    console.log(`${item} done`);
    return item;
  });
  // awaits each task in input order as it resolves
  for await (const i of tasks) {
    console.log(`for of ${i}`);
  }
})();
Outputs
🥑
🍅
🍆
🍉
🌶
🥑 done
for of 🥑
🍅 done
for of 🍅
🍆 done
for of 🍆
🍉 done
for of 🍉
🌶 done
for of 🌶
Combined
Scenario:
Because of performance or API concurrency limitations, you can't afford to process n requests in parallel.
At the same time, you don't want to wait for every single async request to resolve one after the other (sequentially), which could take a very long time.
Solution: combine both sequential and parallel processing.
First Implementation
For this attempt we're going to make use of a generator function to split the collection into inner arrays no larger than the limit we pass in.
Generator Helper
function* chunkGen(collection, size = 2, i = 0) {
  for (; i < collection.length; i += size) {
    yield collection.slice(i, i + size);
  }
}
function chunk(collection, size = 1) {
  const chunked = [];
  const gen = chunkGen(collection, size);
  let c = gen.next();
  while (!c.done) {
    chunked.push(c.value);
    c = gen.next();
  }
  return chunked;
}
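A quick sanity check of the helper with a throwaway five-item array:
console.log(chunk(['a', 'b', 'c', 'd', 'e'], 2));
// [['a', 'b'], ['c', 'd'], ['e']]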
This will loop over 20 requests, limiting the total number of running async tasks to 5. That way we avoid taxing performance or hitting API rate limits.
...
// helper to create some randomness in our apiCall calls
const delay = () => Math.floor(Math.random() * 5) + 1;
(async () => {
  const requestsToProcess = ['🥑', '🍅', '🍆', '🍉', '🌶', '🍳', '🍇', '🍌', '🍔', '🍕',
                             '🦧', '🦐', '🍜', '🍲', '🥩', '🦞', '🍙', '🍵', '🍓', '🍸'];
  const batches = chunk(requestsToProcess, 5);
  for (const batch of batches) {
    // run each chunk of promises in parallel
    await Promise.all(batch.map(async (i) => {
      console.log(i);
      await apiCall(delay() * 1000);
      console.log(`done ${i}`);
    }));
  }
})();
Outputs
🥑
🍅
🍆
🍉
🌶
done 🥑
done 🍅
done 🍆
done 🌶
done 🍉
🍳
🍇
🍌
🍔
🍕
done 🍇
done 🍌
done 🍳
done 🍔
done 🍕
🦧
🦐
🍜
🍲
🥩
done 🦧
done 🦐
done 🥩
done 🍜
done 🍲
🦞
🍙
🍵
🍓
🍸
done 🍙
done 🦞
done 🍓
done 🍵
done 🍸
So this solves the scenario above: it runs all the promises within a single batch in parallel, but processes the batches themselves sequentially.
You might have noticed the issue with this approach, though.
Only once ALL of the promises within a batch resolve does it move on to the next batch. What if 90% of a batch has finished processing and one individual promise has yet to resolve? If we could queue up additional promises as soon as resources free up, we could decrease the overall time it takes to process the entire collection of async tasks.
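To make that idea concrete before reaching for a library, here is a minimal hand-rolled sketch of a concurrency pool built on Promise.race. This is only the underlying idea, not the library's actual implementation:
// runs tasks with at most `limit` in flight at any given time
async function parallelLimit(items, limit, worker) {
  const executing = new Set();
  for (const item of items) {
    // track each task and remove it from the pool once it settles
    const tracked = worker(item).finally(() => executing.delete(tracked));
    executing.add(tracked);
    if (executing.size >= limit) {
      // wait for any one in-flight task to settle before starting another
      await Promise.race(executing);
    }
  }
  // wait for the remaining stragglers
  await Promise.all(executing);
}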
Final Implementation
For the final solution we're going to take advantage of a promise task queue. I'm not going to cover how to build one from scratch, but feel free to look through the source code in the repo, async-parallel-limit.
We're going to use this very small npm module I recently published for fun.
It was inspired by asyncTimesLimit from the async project. Their docs and features are definitely worth a look.
async-parallel-limit
npm i @mjyocca/async-parallel-limit
So instead of the previous approach, where we waited for all the promises in a batch to resolve before queueing up a new batch, this queues another async task as soon as each promise resolves.
import asyncParallel from '@mjyocca/async-parallel-limit';
(async () => {
  const requestsToProcess = ['🥑', '🍅', '🍆', '🍉', '🌶', '🍳', '🍇', '🍌', '🍔', '🍕',
                             '🦧', '🦐', '🍜', '🍲', '🥩', '🦞', '🍙', '🍵', '🍓', '🍸',
                             '🪲', '🌸', '🍒', '🍋', '🍎', '🍑', '🪨', '🍹', '🍝', '🌲',
                             '☕', '🍈', '🍺', '🦀', '🦋'];
  // limiting to a total of 5 tasks in flight at any given time
  await asyncParallel(requestsToProcess, 5, async (n, emoji, next) => {
    console.log(emoji);
    await apiCall(delay() * 1000);
    console.log(`done ${emoji}`);
    next();
  });
})();
Outputs
🥑
🍅
🍆
🍉
🌶
done 🌶
🍳
done 🥑
🍇
done 🍅
🍌
done 🍆
🍔
done 🍉
🍕
done 🍇
🦧
done 🍌
🦐
done 🍔
🍜
done 🍕
🍲
done 🍳
🥩
done 🦐
🦞
done 🦧
🍙
done 🍜
🍵
done 🍲
🍓
done 🍵
🍸
done 🍙
done 🥩
done 🦞
done 🍸
done 🍓
As you can see from the output, it doesn't wait for all 5 to finish before queueing up more promises, making it a whole lot faster at completing the entire queue.
Hope this helps and cheers!