I don’t know how many times I’ve written some version of this:
```js
for (let i = 0; i < data.length; i += chunkSize) {
  chunks.push(data.slice(i, i + chunkSize));
}
```
It works. But after the 10th time — especially when working with buffers, emoji-heavy strings, or even async streams — I decided it was time to stop repeating myself.
So I built chonkify.
## Why?
Because I needed a chunking function that:
✅ Works with:
- Arrays
- Strings
- Buffers
- Typed arrays
- Sets & Maps
- Array-likes
- Even `AsyncIterable` objects
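Supporting all of these usually comes down to normalizing the sync inputs into a plain array before slicing. Here's a rough sketch of that idea (my own illustration, not chonkify's actual source; the helper names `toArray` and `chunk` are hypothetical):

```js
// Sketch only: normalize the supported sync inputs to a plain array,
// then slice. (Not chonkify's actual implementation.)
function toArray(input) {
  if (Array.isArray(input)) return input;
  if (typeof input === 'string') return Array.from(input); // code points (grapheme-aware splitting is a separate concern)
  if (input instanceof Map || input instanceof Set) return [...input]; // Map yields [key, value] pairs
  if (ArrayBuffer.isView(input)) return Array.from(input); // Buffers and typed arrays
  if (input && typeof input.length === 'number') return Array.from(input); // array-likes
  throw new TypeError('Unsupported input');
}

function* chunk(input, size) {
  const arr = toArray(input);
  for (let i = 0; i < arr.length; i += size) {
    yield arr.slice(i, i + size);
  }
}
```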
😅 Handles:
- Complex Unicode (grapheme clusters like 🏳️‍🌈, 👨‍👩‍👧‍👦)
- Multi-codepoint emoji without slicing them in half
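The grapheme handling is exactly what the naïve `slice()` loop gets wrong: a family emoji is several code points joined with zero-width joiners, and slicing through the middle produces broken glyphs. One way to do it safely is the built-in `Intl.Segmenter` (Node 16+ / modern browsers); this is a sketch of the approach, not necessarily how chonkify does it:

```js
// Split a string into user-perceived characters (grapheme clusters)
// with Intl.Segmenter, then chunk those instead of raw code units.
function* chunkGraphemes(str, size) {
  const seg = new Intl.Segmenter(undefined, { granularity: 'grapheme' });
  const graphemes = [...seg.segment(str)].map(s => s.segment);
  for (let i = 0; i < graphemes.length; i += size) {
    yield graphemes.slice(i, i + size);
  }
}
```

With this, a ZWJ sequence like 👨‍👩‍👧‍👦 stays intact as a single element of a chunk instead of being cut into its component code points.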
💡 Is:
- Zero dependencies
- ~870 bytes (core)
- ESM-first and TypeScript-ready
## Examples
```js
import { chonkify } from 'chonkify'

for (const chunk of chonkify("👨‍👩‍👧‍👦🎉🎊🍕", 2)) {
  console.log(chunk)
}
// → ["👨‍👩‍👧‍👦", "🎉"], ["🎊", "🍕"]
```
Or with an async iterable:
```js
for await (const chunk of chonkify(streamOfItems, 100)) {
  await sendBatch(chunk)
}
```
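Batching an async iterable is the same idea expressed as an async generator: buffer items until a batch fills, then yield it. Roughly (again a sketch under my own assumptions, not the library's code):

```js
// Buffer items from any async iterable and yield fixed-size batches,
// flushing whatever remains when the source ends.
async function* chunkAsync(source, size) {
  let batch = [];
  for await (const item of source) {
    batch.push(item);
    if (batch.length === size) {
      yield batch;
      batch = [];
    }
  }
  if (batch.length > 0) yield batch; // final partial batch
}
```

The flush step matters: without it, a stream whose length isn't a multiple of the batch size silently drops its tail.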
## Try it
```sh
npm i chonkify
```
I mostly built this for myself, but figured someone else might find it useful too.