Asynchronous JavaScript: Sequential, Parallel, and Combined Processing
Control asynchronous flow with sequential and parallel operations in JavaScript with async/await and more
Tuesday, Mar 9, 2021
javascript
Event Loop (Quick Synopsis)
The browser and Node.js run a constant single-threaded event loop to execute your JavaScript code. It first runs all of the synchronous, non-blocking code, while queueing up any asynchronous events to be executed at a later point in time.
Tasks such as timeouts, intervals, network requests, and promises are handed off to the host environment. Once a task finishes, the event loop queues its callback either as a microtask (run at the end of the current macrotask) or as a macrotask (run at the beginning of a later iteration of the event loop).
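A quick way to see that ordering in action (the log labels are just illustrative):

```javascript
// Synchronous code runs first, then microtasks (promise callbacks),
// then macrotasks (timers) on the next turn of the event loop.
console.log('sync 1');

setTimeout(() => console.log('macrotask: setTimeout'), 0);

Promise.resolve().then(() => console.log('microtask: promise'));

console.log('sync 2');

// Logs in this order:
// sync 1
// sync 2
// microtask: promise
// macrotask: setTimeout
```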
Sequential
Sequential processing is incredibly handy for asynchronous tasks that need to run in a particular order. You might have an API request that depends on data returned by another API request:
Fetch data from your API, then fetch data from another API
Delay Function to mock an asynchronous API call.
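A minimal version of such a helper might look like this (the name `delay` and the default timeout are illustrative):

```javascript
// Resolves with the given value after `ms` milliseconds,
// mocking the latency of a real API call.
const delay = (value, ms = 100) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));
```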
for of loop to sequentially process each apiCall within the array of emoji.
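A sketch of that loop, reusing a `delay` helper and an illustrative array of emoji:

```javascript
const delay = (value, ms = 100) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

// Each apiCall is a function returning a promise; awaiting inside
// for...of guarantees the calls resolve one after another, in order.
const apiCalls = ['🐌', '🐢', '🐇'].map((emoji) => () => delay(emoji, 50));

const processSequentially = async () => {
  for (const apiCall of apiCalls) {
    const result = await apiCall();
    console.log(result);
  }
};

processSequentially();
```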
Outputs
Alternatively, a reduce function can be used to control the sequential order.
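A sketch of the reduce-based approach, with the same illustrative `delay` helper. Awaiting the accumulated promise chain before firing the next call keeps everything strictly one at a time:

```javascript
const delay = (value, ms = 100) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

const apiCalls = ['🐌', '🐢', '🐇'].map((emoji) => () => delay(emoji, 50));

// Each iteration awaits the previous chain before invoking the next
// apiCall, so the calls still run sequentially.
const runSequentially = (calls) =>
  calls.reduce(async (previous, apiCall) => {
    const results = await previous;
    const result = await apiCall();
    return [...results, result];
  }, Promise.resolve([]));

runSequentially(apiCalls).then((results) => console.log(results));
// → ['🐌', '🐢', '🐇']
```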
Parallel
Use Cases:
If your promises/async tasks have no dependencies on one another, and you only care that all of them resolve before moving on to other asynchronous tasks.
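The classic tool for this is `Promise.all`; a small sketch with mock calls:

```javascript
const delay = (value, ms = 100) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

// All three calls start immediately; Promise.all resolves once every
// promise has resolved (or rejects on the first failure).
const promises = ['🍎', '🍊', '🍋'].map((emoji) => delay(emoji, 50));

Promise.all(promises).then((results) => console.log(results));
// → ['🍎', '🍊', '🍋']
```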
Outputs
Alternatively, you can use the reduce function from earlier and simply stop awaiting the previously returned promise before firing the next call.
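A sketch of that variation. Because each `apiCall()` is invoked before the accumulator is awaited, every call fires in the same tick and the work happens in parallel, while the results are still collected in order:

```javascript
const delay = (value, ms = 100) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

const apiCalls = ['🍎', '🍊', '🍋'].map((emoji) => () => delay(emoji, 50));

const runInParallel = (calls) =>
  calls.reduce(async (previous, apiCall) => {
    // Fire the call right away instead of awaiting the chain first.
    const resultPromise = apiCall();
    const results = await previous;
    return [...results, await resultPromise];
  }, Promise.resolve([]));

runInParallel(apiCalls).then((results) => console.log(results));
// → ['🍎', '🍊', '🍋']
```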
While the promises run in parallel, you can iterate over the results in order with for await...of, consuming each one as soon as it has resolved.
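A sketch of consuming an array of already-started promises with for await...of:

```javascript
const delay = (value, ms = 100) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

// The promises all start in parallel; for await...of then walks the
// array in order, awaiting each entry as it comes up.
const promises = ['🍎', '🍊', '🍋'].map((emoji) => delay(emoji, 50));

const consume = async () => {
  for await (const result of promises) {
    console.log(result);
  }
};

consume();
```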
Outputs
Combined
Scenario:
For either performance reasons or API concurrency limits, you can't afford to process n requests in parallel.
At the same time, you don't want to wait for every single async request to resolve one after the other (sequentially), which could take a very long time.
Solution: Combine both sequential and parallel processing
First Implementation
For this attempt, we're going to use a generator function to create an iterable of inner arrays (batches) no larger than the limit we pass in.
Generator Helper
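A minimal version of such a helper (the name `batch` is illustrative):

```javascript
// Yields arrays (batches) of at most `limit` items from the source array.
function* batch(items, limit) {
  for (let i = 0; i < items.length; i += limit) {
    yield items.slice(i, i + limit);
  }
}

console.log([...batch([1, 2, 3, 4, 5, 6, 7], 3)]);
// → [[1, 2, 3], [4, 5, 6], [7]]
```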
This will loop over 20 requests, limiting the number of concurrently running async tasks to 5. That way we avoid taxing performance or triggering API rate limits.
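Putting the generator to work might look roughly like this, with a mock `delay` standing in for real requests:

```javascript
const delay = (value, ms = 100) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

// Yields arrays (batches) of at most `limit` items from the source array.
function* batch(items, limit) {
  for (let i = 0; i < items.length; i += limit) {
    yield items.slice(i, i + limit);
  }
}

// 20 mock requests, processed in sequential batches of 5.
const requests = Array.from({ length: 20 }, (_, i) => () => delay(i, 50));

const processBatches = async () => {
  for (const group of batch(requests, 5)) {
    // Each batch runs in parallel; the loop awaits the whole batch
    // before starting the next one.
    const results = await Promise.all(group.map((request) => request()));
    console.log(results);
  }
};

processBatches();
```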
Outputs
So this solves the scenario mentioned. It runs all the promises within a single batch in parallel but processes all of the batches sequentially.
You might have noticed the issue with this approach.
Only once ALL of the promises within a batch resolve will it move on to the next batch. What if 90% of a batch is done processing and one individual promise has yet to resolve? We could queue up additional promises as soon as resources free up, decreasing the overall time it takes to process the entire collection of async tasks.
Final Implementation
For the final solution, we're going to take advantage of a promise task queue. I'm not going to cover how to build one from scratch, but feel free to look through the source code in the repo, async-parallel-limit.
We're going to use this very small npm module I recently published for fun.
This was inspired by asyncTimesLimit from the async project. Their docs and features are definitely worth a look.
async-parallel-limit
So instead of the previous approach of waiting for all the promises in a batch to resolve before queueing up a new batch, this will queue another async task as soon as each promise resolves.
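For the module's actual API, see its repo; purely as an illustration of the underlying idea, a limiter like this can be hand-rolled roughly as follows (this is NOT async-parallel-limit's interface, just a sketch):

```javascript
const delay = (value, ms = 100) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

// Runs up to `limit` tasks at once; whenever one resolves, the next
// queued task starts immediately instead of waiting for a full batch.
const parallelLimit = async (tasks, limit) => {
  const results = new Array(tasks.length);
  let next = 0;

  // Each worker keeps pulling the next task until the queue is drained.
  const worker = async () => {
    while (next < tasks.length) {
      const index = next++;
      results[index] = await tasks[index]();
    }
  };

  await Promise.all(
    Array.from({ length: Math.min(limit, tasks.length) }, () => worker())
  );
  return results;
};

const tasks = Array.from({ length: 20 }, (_, i) => () => delay(i, 20));
parallelLimit(tasks, 5).then((results) => console.log(results));
```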
Outputs
As you can see from the output, it doesn't wait for all 5 tasks to finish before queueing up more promises, making it a whole lot faster to complete the entire queue.