I have a heavy data-processing operation that needs to run for each of 10-12 simultaneous requests. I have read that Node.js is a good platform for high concurrency, and that it achieves this with a non-blocking event loop.
What I understand so far: for things like querying a database, the actual work happens in a separate process (like mongod or mysqld), and Node just registers a callback that handles the result when it arrives. Fair enough.
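For example, I believe core modules like `fs` follow this same pattern; a minimal sketch of what I mean (the file path is just a placeholder):

    var fs = require('fs');

    // The read is handed off to the OS / libuv; the event loop stays
    // free to handle other requests while it is in flight.
    fs.readFile('/path/to/some/file', function (err, data) {
        if (err) throw err;
        console.log('read ' + data.length + ' bytes');
    });

    console.log('this line runs before the file has been read');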
But what if I want to do a heavy piece of computation inside a callback? Won't it block other requests until the code in that callback has finished executing? For example, I want to process a high-resolution image, and the code I have is in JavaScript itself (there is no separate process doing the image processing).
The way I am thinking of implementing it is:
    get_image_from_db(image_id, function (imageBitMap) {
        heavy_operation(imageBitMap); // Can take 5 seconds.
    });
Will that `heavy_operation` stop Node from accepting any new requests for those 5 seconds? Or am I thinking about this task the wrong way? Please guide me, I am a JS newbie.
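To make the question concrete, here is a minimal server I imagine could demonstrate the problem (the busy loop is just a stand-in for my image processing):

    var http = require('http');

    http.createServer(function (req, res) {
        // Stand-in for heavy_operation: burn CPU synchronously for ~5 seconds.
        var end = Date.now() + 5000;
        while (Date.now() < end) { /* spin */ }
        res.end('done\n');
    }).listen(3000);

If I send two requests at the same time, will the second one have to wait roughly 10 seconds for its response?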
UPDATE
Or could I process part of the image, yield back to the event loop so it can run other callbacks, and then come back and process the next part of the image? (something like prioritizing events)
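Roughly what I am imagining, using `setImmediate` to yield (here `processRow`, the `height` property, and the chunk size of 100 rows are all made up for illustration):

    function processImage(imageBitMap, done) {
        var row = 0;
        var totalRows = imageBitMap.height; // assuming the bitmap exposes its height

        function processChunk() {
            // Process a small batch of rows, then yield back to the event loop.
            var stop = Math.min(row + 100, totalRows);
            for (; row < stop; row++) {
                processRow(imageBitMap, row); // hypothetical per-row work
            }
            if (row < totalRows) {
                setImmediate(processChunk); // let queued callbacks run before continuing
            } else {
                done(imageBitMap);
            }
        }

        processChunk();
    }

Is chunking the work like this a valid approach in Node?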