Struggling with Cloudflare 1020 errors in your Node.js scrapers? This walkthrough shows how to combine residential proxies with a stealth plugin for Playwright to stay under the radar.
Table of Contents
- Step 1: Basic Playwright Setup
- Step 2: Add Stealth Plugin
- Step 3: Integrate Residential Proxies
- Step 4: Handle Headers and User-Agent Rotation
- Step 5: Add Random Delays and Navigation Tweaks
- Step 6: Final Complete Script
- Conclusion
Step 1: Basic Playwright Setup
Let’s start with plain Playwright. First, install the package and download the browser binaries:
npm install playwright
npx playwright install chromium
We’re not scraping anything yet, just trying to load a full page.
const { chromium } = require('playwright');

(async () => {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com');
  console.log(await page.content());
  await browser.close();
})();
Now, if the site has anti-bot protection, you’ll likely get a CAPTCHA or a JavaScript challenge, and maybe even a 1020 error after a few tries.
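Rather than guessing, you can detect the block from the response itself. A minimal sketch – the 'error code: 1020' marker is an assumption about Cloudflare's current block page, so adjust it if your target returns something different:

const { chromium } = require('playwright');

(async () => {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  const response = await page.goto('https://example.com');
  const body = await page.content();

  // Cloudflare typically serves 1020 blocks with a 403 status and an
  // "error code: 1020" marker in the HTML (assumption: the marker text may vary).
  if (response.status() === 403 && body.includes('error code: 1020')) {
    console.log('Blocked: Cloudflare Error 1020');
  }

  await browser.close();
})();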
If you want to dig deeper into scraping with Playwright, we’ve got full guides for both Node.js and Python.
Step 2: Add Stealth Plugin
Now let’s try again, but this time with a stealth plugin. It helps by hiding the fact that you're running an automated browser: it tweaks things like navigator.webdriver, spoofs plugins and fonts, and patches other properties that give away automation.
npm install playwright-extra puppeteer-extra-plugin-stealth
Note that the stealth plugin comes from the puppeteer-extra family – playwright-extra is designed to reuse it.
Here’s how we modify the earlier script to include those:
const { chromium } = require('playwright-extra');
const stealth = require('puppeteer-extra-plugin-stealth');

chromium.use(stealth());

(async () => {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com');
  console.log(await page.title());
  await browser.close();
})();
Just this step alone already lowers your chances of hitting a 1020 error.
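Before moving on, it's worth verifying the plugin actually changed something. A minimal sanity check (not a full fingerprint audit) that inspects a few of the properties stealth patches:

const { chromium } = require('playwright-extra');
const stealth = require('puppeteer-extra-plugin-stealth');

chromium.use(stealth());

(async () => {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com');

  // With stealth applied, navigator.webdriver should be false/undefined
  // and the plugins list should no longer be empty.
  const fingerprint = await page.evaluate(() => ({
    webdriver: navigator.webdriver,
    plugins: navigator.plugins.length,
    languages: navigator.languages,
  }));
  console.log(fingerprint);

  await browser.close();
})();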
Step 3: Integrate Residential Proxies
Still getting blocked? Then you’ll need proxies, and residential proxies work best. We’ve listed some trusted providers in another post. Free proxies might work, but they’re risky – your traffic is no longer private.
Let’s plug proxies into the script:
const { chromium } = require('playwright-extra');
const stealth = require('puppeteer-extra-plugin-stealth');

chromium.use(stealth());

const residentialProxies = [
  'http://user:[email protected]:8000',
  'http://user:[email protected]:8000',
  'http://user:[email protected]:8000'
];

const proxy = new URL(residentialProxies[Math.floor(Math.random() * residentialProxies.length)]);

(async () => {
  // Playwright expects proxy credentials as separate fields –
  // embedding them in the server URL doesn't work reliably in Chromium.
  const browser = await chromium.launch({
    proxy: {
      server: proxy.origin,
      username: proxy.username,
      password: proxy.password
    }
  });
  const page = await browser.newPage();
  await page.goto('https://example.com');
  console.log(page.url());
  await browser.close();
})();
We added rotation so a random proxy is picked on each run of the script (not per request). Spreading traffic across the pool like this helps the proxies last longer and reduces the chance of getting blocked.
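For heavier jobs you can rotate within a single run too: Playwright supports a proxy per browser context. Here's a sketch of that approach, following the Playwright docs' note that Chromium wants a placeholder global proxy when every context overrides it:

const { chromium } = require('playwright-extra');
const stealth = require('puppeteer-extra-plugin-stealth');

chromium.use(stealth());

const proxies = [
  { server: 'http://123.45.67.89:8000', username: 'user', password: 'pass' },
  { server: 'http://123.45.67.90:8000', username: 'user', password: 'pass' }
];

(async () => {
  // Chromium requires a global proxy placeholder when every
  // context overrides it (see the Playwright proxy docs).
  const browser = await chromium.launch({
    proxy: { server: 'http://per-context' }
  });

  for (const proxy of proxies) {
    const context = await browser.newContext({ proxy });
    const page = await context.newPage();
    await page.goto('https://example.com');
    console.log(page.url());
    await context.close();
  }

  await browser.close();
})();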
Step 4: Handle Headers and User-Agent Rotation
Next up – headers. One of the most important is the User-Agent. We’ve already shared a list of the latest User-Agents in another article.
Let’s add them to the code:
const { chromium } = require('playwright-extra');
const stealth = require('puppeteer-extra-plugin-stealth');

chromium.use(stealth());

const userAgents = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36',
  'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36'
];

const ua = userAgents[Math.floor(Math.random() * userAgents.length)];

(async () => {
  const browser = await chromium.launch();
  const context = await browser.newContext({
    userAgent: ua,
    extraHTTPHeaders: {
      'Accept-Language': 'en-US,en;q=0.9',
      'Accept': 'text/html,application/xhtml+xml'
    }
  });
  const page = await context.newPage();
  await page.goto('https://example.com');
  console.log(await page.title());
  await browser.close();
})();
But User-Agent alone won’t cut it. Other headers matter too: Accept, Accept-Language, Accept-Encoding, Referer, and the Sec-CH-UA client-hint family that a real Chrome sends. If you’re building a scraper for a site protected by Cloudflare, make sure these headers stay consistent with your User-Agent – a Chrome UA with missing client hints is an easy tell.
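Here's the same Step 4 script with a fuller header set. The exact client-hint values are an assumption modeled on desktop Chrome 135 – keep them in sync with whichever User-Agent you rotate in:

const { chromium } = require('playwright-extra');
const stealth = require('puppeteer-extra-plugin-stealth');

chromium.use(stealth());

(async () => {
  const browser = await chromium.launch();
  const context = await browser.newContext({
    userAgent: 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36',
    extraHTTPHeaders: {
      'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
      'Accept-Language': 'en-US,en;q=0.9',
      'Referer': 'https://www.google.com/',
      // Client hints should agree with the User-Agent string above.
      'Sec-CH-UA': '"Chromium";v="135", "Google Chrome";v="135", "Not-A.Brand";v="8"',
      'Sec-CH-UA-Mobile': '?0',
      'Sec-CH-UA-Platform': '"Windows"'
    }
  });
  const page = await context.newPage();
  await page.goto('https://example.com');
  console.log(await page.title());
  await browser.close();
})();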
Step 5: Add Random Delays and Navigation Tweaks
Add some random delays between requests to look more human. To go a step further, simulate some mouse movements, scroll around, maybe click a few elements. Some sites track that.
const { chromium } = require('playwright-extra');
const stealth = require('puppeteer-extra-plugin-stealth');

chromium.use(stealth());

// Returns a random delay in [min, max) milliseconds.
function randomDelay(min = 1000, max = 3000) {
  return Math.floor(Math.random() * (max - min)) + min;
}

(async () => {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com', { waitUntil: 'domcontentloaded' });
  await page.waitForTimeout(randomDelay());
  await page.mouse.move(200, 300);
  await page.evaluate(() => window.scrollBy(0, 250));
  await page.waitForTimeout(randomDelay());
  console.log(await page.content());
  await browser.close();
})();
These small tricks are easy to add and can really help. Don’t skip them.
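One more cheap upgrade: Playwright's mouse.move takes a steps option that fires intermediate mouse events instead of one instant jump. A tiny helper, assuming you already have a page from one of the scripts above:

// Moves the mouse through several intermediate positions so the
// pointer trail looks less robotic than a single straight jump.
async function humanMove(page, x, y) {
  const steps = 15 + Math.floor(Math.random() * 20); // 15-34 steps
  await page.mouse.move(x, y, { steps });
}

// Usage inside the script above:
// await humanMove(page, 200, 300);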
Step 6: Final Complete Script
Here’s the full script with everything we’ve added:
const { chromium } = require('playwright-extra');
const stealth = require('puppeteer-extra-plugin-stealth');

chromium.use(stealth());

const proxies = [
  'http://user:[email protected]:8000',
  'http://user:[email protected]:8000'
];

const userAgents = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36',
  'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36'
];

// Returns a random delay in [min, max) milliseconds.
function randomDelay(min = 1000, max = 3000) {
  return Math.floor(Math.random() * (max - min)) + min;
}

(async () => {
  const proxy = new URL(proxies[Math.floor(Math.random() * proxies.length)]);
  const ua = userAgents[Math.floor(Math.random() * userAgents.length)];

  // Credentials go in separate fields – Chromium ignores them in the URL.
  const browser = await chromium.launch({
    proxy: {
      server: proxy.origin,
      username: proxy.username,
      password: proxy.password
    }
  });
  const context = await browser.newContext({
    userAgent: ua,
    extraHTTPHeaders: {
      'Accept-Language': 'en-US,en;q=0.9',
      'Accept': 'text/html,application/xhtml+xml'
    }
  });
  const page = await context.newPage();
  await page.goto('https://example.com', { waitUntil: 'domcontentloaded' });
  await page.waitForTimeout(randomDelay());
  await page.mouse.move(200, 300);
  await page.evaluate(() => window.scrollBy(0, 250));
  await page.waitForTimeout(randomDelay());
  console.log('Page title:', await page.title());
  await browser.close();
})();
Think this is the end? Not really – it’s just the beginning. Cloudflare is constantly evolving, and even if you get past the 1020, you might still face CAPTCHA or JS challenges.
One of the most useful upgrades at that point is CAPTCHA-solving integration. Cloudflare rarely throws a 1020 right away – usually it asks you to prove you're human first.
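Until you wire in a solver, you can at least detect the challenge and retry through another proxy. In this sketch, launchWithProxy is a hypothetical wrapper around the chromium.launch call from the final script, and the 'Just a moment' title check is an assumption based on Cloudflare's current interstitial:

// Detects Cloudflare's interstitial and retries with another proxy.
// launchWithProxy() is a hypothetical helper wrapping the chromium.launch()
// call from the final script above – not a library API.
async function fetchWithRetry(url, maxAttempts = 3) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const browser = await launchWithProxy();
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'domcontentloaded' });

    const title = await page.title();
    if (!title.includes('Just a moment')) {
      return { browser, page }; // caller closes the browser
    }

    console.log(`Challenge hit on attempt ${attempt}, rotating proxy...`);
    await browser.close();
  }
  throw new Error(`Still challenged after ${maxAttempts} attempts`);
}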
Conclusion
Instead of a long wrap-up, here are some helpful links:
- For Python fans, check our GitHub repo. Node.js examples are there too. If it helped, give it a star.
- We also have a full write-up on the HasData blog with actual metrics comparing these methods.
- And if you’re not there yet – come join our Discord.