Looking at the progress Node.js has made in the past few years makes me think that many popular and widely used npm packages might soon be unnecessary or even deprecated. Most Node.js developers are familiar with packages like dotenv for working with .env
files, node-fetch for making HTTP requests, and jest for testing.
These packages are awesome, but installing any new npm
package in a project always adds some maintenance overhead. In some cases, that maintenance cost can become a real problem for organizations - especially with heavily distributed microservices. Building a backend or fullstack app with just pure Node.js might help with that, but is it really worth it?
Is it worth building a REST API in pure Node.js without any dependencies?
The whole code mentioned in the article is here.
We’re going to build an API that returns bitcoin rates. For the data source, we’ll use blockchain.com.
Endpoints:
/v1/bitcoin/rates - returns a list of bitcoin rates
/v1/bitcoin/rates/:currency - returns the exchange rate for a given currency
Repo setup with TypeScript ❌
The Node.js team announced support for TypeScript files, also known as type stripping. This lets us work with .ts files natively, so you keep the benefits of static types in your editor and build tooling. But you should keep in mind that this is just a soft layer: Node.js replaces TypeScript syntax with whitespace at runtime and doesn't actually do any type checking. Also, .ts files can only use erasable TypeScript syntax, which means things like enums aren't allowed (which might be for the best).
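To make the "erasable syntax" restriction concrete, here's a rough illustration (the CurrencyRate shape just mirrors the rates this API deals with):

// Erasable syntax: the interface and annotation are simply stripped, so this runs as-is
interface CurrencyRate {
  last: number;
  symbol: string;
}

const rate: CurrencyRate = { last: 10000, symbol: "$" };

// Non-erasable syntax: an enum generates runtime code, so type stripping rejects it
// enum Currency {
//   USD,
//   EUR,
// }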
Route handling ❌
To support different REST endpoints, you need to write a custom router that can trigger different route handlers for /v1/bitcoin/rates and /v1/bitcoin/rates/:currency. This takes a bit of playing around with regex:
get(path: string, handler: RouteHandler): void {
  // Convert path parameters like :currency to named regex capture groups
  const patternString = path
    .replace(/\/:([^/]+)/g, "/(?<$1>[^/]+)")
    .replace(/\//g, "\\/");
  const pattern = new RegExp(`^${patternString}$`);
  routes.push({ method: "GET", pattern, handler, path });
},
So that's the first big downside of building a REST API without packages like express.js - you have to build all the route handling yourself, plus add some support for middleware.
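For a sense of what the rest of that router involves, here's a minimal dispatch sketch. The routes array is the one from the get() snippet above, and the handler is assumed to take the request, response, and extracted params; treat this as an illustration, not the exact code from the repo:

import { IncomingMessage, ServerResponse } from "node:http";

// Assumes the `routes` array registered via get() above
async function handle(req: IncomingMessage, res: ServerResponse): Promise<void> {
  const url = new URL(req.url ?? "/", `http://${req.headers.host}`);

  for (const route of routes) {
    if (route.method !== req.method) continue;

    const match = url.pathname.match(route.pattern);
    if (!match) continue;

    // Named capture groups become path params, e.g. { currency: "USD" }
    const params = match.groups ?? {};
    return route.handler(req, res, params);
  }

  res.statusCode = 404;
  res.end(JSON.stringify({ error: "Not found" }));
}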
HTTP communication with third-party services ✅
Sending HTTP requests to different APIs is much easier now, since you don’t need to install any extra packages. Node.js includes a browser-compatible version of fetch().
export async function fetchBitcoinRates(): Promise<BitcoinRates> {
  const response = await fetch(BITCOIN_TICKER_URL, {
    method: "GET",
    headers: {
      Accept: "application/json",
    },
  });

  if (!response.ok) {
    throw new Error(`HTTP error! Status: ${response.status}`);
  }

  return (await response.json()) as BitcoinRates;
}
Propagating request context ✅
Passing some kind of context down the request chain can be a pain. Your functions always need to take that extra parameter, which was probably set up at request time in some middleware and added to the req object. For example:
export function createContextMiddleware() {
  return async (
    req: IncomingMessage,
    res: ServerResponse,
    next: () => Promise<void> | void,
  ) => {
    const user = await findUser(req.headers);

    const context: RequestContext = {
      traceId: generateTraceId(),
      timestamp: Date.now(),
      user,
    };

    res.setHeader("X-Trace-ID", context.traceId);
    req.context = context;

    next();
  };
}
And then every function in your controller, service, or repository will have to take that context or request as an argument to get metadata about the incoming request.
Luckily, this can be handled with Asynchronous context tracking. This lets you store data throughout the lifetime of an incoming request. You just need to call getStore() on the local storage object you created:
const myContext = asyncLocalStorage.getStore()
With this, you can easily build middleware that attaches caller metadata, and then reuse it anywhere else in your codebase.
import { AsyncLocalStorage } from "node:async_hooks";
import { IncomingMessage, ServerResponse } from "node:http";
import crypto from "node:crypto";

export interface RequestContext {
  traceId: string;
  timestamp: number;
}

export const asyncLocalStorage = new AsyncLocalStorage<RequestContext>();

function generateTraceId(): string {
  return crypto.randomUUID();
}

export function createContextMiddleware() {
  return async (
    req: IncomingMessage,
    res: ServerResponse,
    next: () => Promise<void> | void,
  ) => {
    const context: RequestContext = {
      traceId: generateTraceId(),
      timestamp: Date.now(),
    };

    res.setHeader("X-Trace-ID", context.traceId);

    await asyncLocalStorage.run(context, async () => {
      await next();
    });
  };
}

export function getContext(): RequestContext {
  return asyncLocalStorage.getStore()!;
}
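With that in place, any function deeper in the codebase can read the context without it being threaded through every call. As a quick sketch, a hypothetical logging helper in a repository layer might look like this (the import path and function name are just illustrative):

import { getContext } from "./context.ts"; // wherever the middleware module lives

export function logWithTrace(message: string): void {
  // No req or context parameter needed - the store is bound to the current request
  const { traceId, timestamp } = getContext();
  console.log(`[${traceId}] (+${Date.now() - timestamp}ms) ${message}`);
}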
Tests with Node test runner ✅
Since version 20, the Node test runner has been available as a stable module, which means you can safely use it for testing. The syntax Node provides is pretty similar to Jest: you check your function results with assert, and test suites are organized with describe, test, beforeEach, and similar aliases. Here's a sample test for bitcoinService.ts:
describe("fetchBitcoinRates", () => {
test("should return Bitcoin rates", async (ctx) => {
const mockResponse = {
USD: {
"15m": 10000,
last: 10000,
buy: 10000,
sell: 10000,
symbol: "$",
},
};
ctx.mock.method(global, "fetch", () => {
return Promise.resolve({
ok: true,
status: 200,
json: () => Promise.resolve(mockResponse),
});
});
const rates = await fetchBitcoinRates();
assert.deepEqual(rates, mockResponse);
});
});
The only problem with the current Node test runner is that mocking a function or object from a different module is still an experimental feature. Module mocking was added to Node in mid-2024 by cjihrig, so hopefully it will become the official way to mock things soon. Until then, this can be an issue if you want to test your code in isolation from the rest of your codebase.
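For reference, the experimental module mocking API looks roughly like this. It currently sits behind the --experimental-test-module-mocks flag, and the module path and exports below are just placeholders:

import { test } from "node:test";

test("uses a mocked module", async (t) => {
  // Replace a whole module before the code under test resolves it
  t.mock.module("./bitcoinRepository.ts", {
    namedExports: {
      findAllBitcoinRates: () => [],
    },
  });

  // Dynamic import so the mock is in place before module resolution
  const { findAllBitcoinRates } = await import("./bitcoinRepository.ts");
  console.log(findAllBitcoinRates()); // []
});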
Env file support ✅
This is a simple feature that makes life a lot easier for developers. Node now has native support for loading environment variables from a .env
file. Just add the --env-file option to your node command, and Node will automatically load environment variables from the file you specify.
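A minimal sketch, assuming a .env file with a single PORT entry and a hypothetical src/server.ts entry point:

// .env
// PORT=3000

// Run with: node --env-file=.env src/server.ts
const port = Number(process.env.PORT ?? 3000);

console.log(`Listening on port ${port}`);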
In-memory cache with SQLite ✅
Node supports using an in-memory SQL database via the built-in node:sqlite module. This might seem kind of useless at first, but there are actually a few good use cases for it. Here's a sample of how I cached bitcoin rates in memory:
export function findAllBitcoinRates(): CurrencyRate[] {
  const context = getContext();
  const currentTimestamp = context.timestamp;
  const expirationTimestamp = currentTimestamp - BITCOIN_RATE_TTL_MS;

  const query = database.prepare(
    `SELECT symbol, last, buy, sell, "15m", timestamp FROM ${BITCOIN_RATES_TABLE} WHERE timestamp >= ?`,
  );

  // Can't pass the expected type to the query
  const result = query.all(expirationTimestamp);

  if (result.length === 0) {
    return [];
  }

  return result.map((row) => row as unknown as CurrencyRate);
}
The only thing missing is being able to pass a generic type to query.all, so I don't have to force TypeScript types with as unknown as CurrencyRate.
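For context, here's roughly how the database and table behind that snippet could be set up with node:sqlite. The table name and columns mirror the query above, but treat this as a sketch rather than the exact code from the repo:

import { DatabaseSync } from "node:sqlite";

export const BITCOIN_RATES_TABLE = "bitcoin_rates";

// ":memory:" keeps the whole database in process memory - handy for a small cache
export const database = new DatabaseSync(":memory:");

database.exec(`
  CREATE TABLE IF NOT EXISTS ${BITCOIN_RATES_TABLE} (
    symbol TEXT PRIMARY KEY,
    last REAL,
    buy REAL,
    sell REAL,
    "15m" REAL,
    timestamp INTEGER
  )
`);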
Would I build a production-grade API in pure Node.js without any npm packages?
The need to build custom route handling and the loose TypeScript support are the main reasons not to skip libraries completely.
Express.js already comes with a bunch of great features and is supported by the OpenJS Foundation, so using it with TypeScript is a really good way to get started. It’s simple, and will be easy to maintain in the future.
Also, it would be tough to avoid libraries that are clients or drivers, for example the MongoDB driver. So I'd always try to keep my package.json dependencies as minimal as possible, and only add new packages when really needed - someone will have to update them in the future!