Speed Up Node.js and Prevent Blocking with worker_threads (Up to 5x Faster)

Hello!

In this article, I will explain how the worker_threads module in Node.js can significantly speed up your application and prevent route blocking during heavy computations.

The node:worker_threads module enables the use of threads that execute JavaScript in parallel.

Workers are useful for performing CPU-intensive operations, such as heavy calculations.
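Before we write a real server, here is a minimal, self-contained sketch of the core API (my own illustration, not code from the rest of this article): a single file that spawns itself as a worker and receives a message back. It assumes the project is set up for ES modules, like the rest of the code below.

// hello-worker.js - a minimal illustration of the worker_threads API
import { Worker, isMainThread, parentPort } from "node:worker_threads";

if (isMainThread) {
    // Main thread: spawn this same file as a worker thread
    const worker = new Worker(new URL(import.meta.url));
    worker.on("message", (msg) => console.log("Main thread received:", msg));
} else {
    // Worker thread: send a message back to the main thread
    parentPort.postMessage("Hello from another thread!");
}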

Let's give it a try!

Practical part

Let's create a simple Hono API to showcase the issue without using worker threads.
It doesn't really matter which framework we use. I chose Hono because its code is clean and looks a lot like Express.

// main.js
import { Hono } from "hono";
import { serve } from "@hono/node-server";

const app = new Hono();
const PORT = 4200;

// Simulate a heavy CPU-bound task: generate `amount` random numbers and sum the even ones
function calculate(amount) {
    return Array.from({ length: amount }, () => Math.floor(Math.random() * 1_000_000)).reduce(
        (acc, num) => (num % 2 === 0 ? acc + num : acc),
        0
    );
}

app.get("/", (c) => {
    return c.text("Hello, World!");
});

app.get("/compute", (c) => {
    return c.text(`Result: ${calculate(100_000_000)}`);
});

serve({
    fetch: app.fetch,
    port: PORT
});

console.log(`Server started on http://localhost:${PORT}`);

Let me explain. In this app, there are two routes:

  • / - an example of a simple route without any computations
  • /compute - an example of a route with heavy computations

calculate(amount: number): number - a function that simulates heavy computations. It generates an array of amount random numbers and then sums the even ones.

That code has one huge issue.

To find it, let's send a request to / while a request to /compute is still being processed.

Screenshot before worker_threads

As we can see, we cannot access the / route while a request to /compute is being handled and the calculations have not yet completed.

However, we can resolve this issue by using the worker_threads module.

Theoretical break

First, we should understand what the worker_threads module is and how it allows us to solve that problem.

Some tasks, such as complex calculations, can block the event loop, which is why we can't access the / route after calculations start on /compute.
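To see what "blocking the event loop" means in isolation, here is a tiny standalone illustration (my own example, separate from our API):

// blocking.js - the timer callback cannot run until the synchronous loop finishes
setTimeout(() => console.log("timer fired"), 0);

let sum = 0;
for (let i = 0; i < 1_000_000_000; i++) {
    sum += i;
}
console.log("loop finished");
// Prints "loop finished" first, and only then "timer fired"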

Worker threads enable multithreading in Node.js: each thread executes code independently, with its own event loop. This means we can solve the problem by performing the calculations in another thread while the main thread handles the routes, keeping them responsive.

Later, we will also split the task into smaller pieces and give each worker its own part. Because the workers run in parallel, this noticeably reduces execution time for CPU-intensive tasks.

First, though, let's focus on unblocking our main thread while heavy calculations are being performed.

Continuing our practice

First, let's create an additional file called "worker.js" next to our main file to hold the calculation logic:

// worker.js
import { parentPort, workerData } from "worker_threads";

// Generate `workerData` random numbers...
const numbers = Array.from({ length: workerData }, () => Math.floor(Math.random() * 1_000_000));

// ...sum the even ones and send the result back to the main thread
const result = numbers.reduce((acc, num) => (num % 2 === 0 ? acc + num : acc), 0);
parentPort.postMessage(result);
  • Previously, we got the amount of numbers to generate from the function argument. Now we get it from workerData - the data you pass to the worker when creating it.
  • Then we use parentPort.postMessage to send the result back to the main thread (parentPort is the communication channel that lets the worker send messages to its parent).
// main.js
import { Hono } from "hono";
import { serve } from "@hono/node-server";
import { Worker } from "worker_threads";

const app = new Hono();
const PORT = 4200;

function calculate(amount) {
    return new Promise((resolve, reject) => {
        const worker = new Worker("./worker.js", { workerData: amount });

        worker.on("message", (data) => {
            resolve(data);
        });

        worker.on("error", (error) => {
            reject(error);
        });
    });
}

app.get("/", (c) => {
    return c.text("Hello, World!");
});

app.get("/compute", async (c) => {
    return c.text(`Result: ${await calculate(100_000_000)}`);
});

serve({
    fetch: app.fetch,
    port: PORT
});

console.log(`Server started on http://localhost:${PORT}`);
  • We rewrote the calculate function to return a promise that creates a new worker and passes it the amount of numbers via workerData. The promise resolves when the worker posts its result and rejects if the worker emits an error.
  • We turned the /compute route handler into an async function and added await to the calculate invocation, so the handler waits for the promise to settle.
  • You can also wrap the calculate invocation in a try/catch block to handle promise rejection if you want your code to be more reliable, as sketched below.
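Here is how that error handling could look (a sketch of my own, not part of the original example; Hono's c.text accepts a status code as its second argument):

app.get("/compute", async (c) => {
    try {
        const result = await calculate(100_000_000);
        return c.text(`Result: ${result}`);
    } catch (error) {
        // The promise rejects if the worker emits an "error" event
        console.error("Worker failed:", error);
        return c.text("Computation failed", 500);
    }
});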

Now, let's run our code!

Screenshot after worker_threads

As we can see, we can still access the / route while the calculations are running.

By the way, the calculation takes 7.63 seconds to complete.
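If you want to reproduce that measurement yourself, one simple option (my own addition, not from the original code) is to wrap the call with console.time:

app.get("/compute", async (c) => {
    console.time("compute");
    const result = await calculate(100_000_000);
    console.timeEnd("compute"); // prints the elapsed time for the computation
    return c.text(`Result: ${result}`);
});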

To improve performance, we can now split our tasks into smaller pieces and distribute them among a few workers just by replacing this:

app.get("/compute", async (c) => {
    return c.text(`Result: ${await calculate(100_000_000)}`);
});

with this:

app.get("/compute", async (c) => {
    const result = await Promise.all([
        calculate(25_000_000),
        calculate(25_000_000),
        calculate(25_000_000),
        calculate(25_000_000)
    ]);
    return c.text(`Result: ${result.reduce((acc, num) => acc + num, 0)}`);
});

Now, we are splitting our big task into smaller pieces, giving them to four workers, getting the results, and then summing them.

Screenshot with four workers

Since each worker runs on its own thread, distributing the load across multiple workers allows us to take advantage of multi-core processors, reducing overall execution time.

Yes, we just reduced the execution time from 7.63 seconds to 1.56 seconds. Easy, right?

Using worker_threads, we solved the route-blocking problem and significantly improved performance by parallelizing the work across multiple threads.
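If you want to take this one step further, you can size the number of workers to the machine instead of hard-coding four. Here is a sketch of my own (not from the original example), assuming the calculate wrapper from above and the built-in os module:

import os from "node:os";

const TOTAL = 100_000_000;

app.get("/compute", async (c) => {
    const workerCount = os.cpus().length;             // number of logical CPU cores
    const chunkSize = Math.ceil(TOTAL / workerCount); // each worker's share of the work
    const results = await Promise.all(
        Array.from({ length: workerCount }, () => calculate(chunkSize))
    );
    return c.text(`Result: ${results.reduce((acc, num) => acc + num, 0)}`);
});

Keep in mind that spawning a fresh worker for every request has its own startup cost, so for a production API a worker pool that reuses threads is usually a better fit.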

Conclusion

Using worker_threads in Node.js can drastically improve performance in CPU-bound tasks by:

  • Unblocking the main thread, so your API stays responsive.
  • Parallelizing heavy computations, reducing execution time.

This technique is simple to implement and provides real-world benefits in performance-critical applications.

Thanks for reading - and goodbye!
