yao tang
πŸ˜΅β€πŸ’« Integrating Alibaba DeepSeek AI in a Frontend Project β€” Why `netlify dev` Breaks Streaming and How to Fix It

🧾 Background

While integrating the Alibaba Cloud DeepSeek Assistant API into a frontend project, I wanted real-time AI responses via fetch + stream (ReadableStream). The backend used Express to call DashScope's streaming endpoint with the `X-DashScope-SSE: enable` header, and the frontend expected to receive incremental AI tokens, like a chat assistant.
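On the frontend, that incremental read loop looks roughly like this (a sketch; the helper name `readStream` is mine, everything else is the standard fetch/ReadableStream API):

```typescript
// Sketch: drain a streaming response body and hand decoded text to a callback.
// Works with any fetch Response whose body is a ReadableStream of bytes.
async function readStream(
  body: ReadableStream<Uint8Array>,
  onToken: (text: string) => void
): Promise<void> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break; // Upstream called res.end()
    onToken(decoder.decode(value, { stream: true })); // Incremental token text
  }
}
```

Usage would be something like `const res = await fetch('/chat'); await readStream(res.body!, appendToChat);`, where `appendToChat` is whatever updates your UI.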

This setup worked perfectly in production.

But in local development with netlify dev?

❌ Streaming failed completely: the frontend received nothing, and the connection eventually dropped with:

 net::ERR_EMPTY_RESPONSE

🧨 Problem Summary

Even though the backend successfully received and printed streaming chunks like:

data: {"output":{"text":"Hello"}}

The frontend never got them. Instead:

  • fetch().body.getReader().read() was never triggered
  • The browser hung waiting for data
  • Connection dropped with ERR_EMPTY_RESPONSE

πŸ” Root Cause

| Suspected issue | Explanation |
| --- | --- |
| Backend forgot `res.end()` | ❌ Nope — it's there |
| Streaming from DashScope failed | ❌ Nope — chunks are printed in the backend logs |
| `netlify dev` local proxy strips streaming | ✅ Yes — this is the real issue |

The Netlify CLI (netlify dev) emulates serverless locally, but its internal proxy does not support chunked transfer encoding, which is required for streaming with res.write().
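To see what the proxy has to preserve, here is a minimal sketch of a chunked streaming response in plain Node `http` (no Express; all names are mine). With no `Content-Length` set, Node switches to `Transfer-Encoding: chunked`, and every `res.write()` becomes a frame the client should receive as soon as it is sent:

```typescript
import { createServer } from 'node:http';

// Minimal chunked-streaming server: Node uses Transfer-Encoding: chunked
// automatically when no Content-Length is given, so each res.write()
// below is a frame that any proxy in between must forward immediately.
const server = createServer((_req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  const tokens = ['Hel', 'lo'];
  let i = 0;
  const timer = setInterval(() => {
    if (i < tokens.length) {
      res.write(tokens[i++]); // One chunk per tick, like one AI token
    } else {
      clearInterval(timer);
      res.end(); // Terminates the chunked stream
    }
  }, 20);
});
```

A proxy that buffers the whole response before forwarding it (as the `netlify dev` proxy does) collapses those frames into one delayed payload, which is exactly what breaks the streaming UI.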


✅ Solution: Bypass Netlify CLI, Run Express Locally

✅ Step 1: Run your backend manually

ts-node src/server.ts

Make sure your Express server listens on a custom port (e.g. 3001):

app.listen(3001, '0.0.0.0', () => {
  console.log('Running at http://localhost:3001');
});

✅ Step 2: Configure frontend proxy (Vite)

// vite.config.ts
import { defineConfig } from 'vite';

export default defineConfig({
  server: {
    proxy: {
      '/chat': {
        target: 'http://localhost:3001',
        changeOrigin: true,
      },
    },
  },
});

Now your frontend code still calls /chat, but under the hood it hits your local Express server instead of the broken netlify dev proxy.

✅ Step 3: Make sure your backend properly streams

externalRes.on('data', (chunk) => {
  res.write(chunk); // Forward each chunk to the frontend as it arrives
});

externalRes.on('end', () => {
  res.end(); // Always close the stream when the upstream finishes
});

externalRes.on('error', () => {
  res.end(); // Also close on upstream errors so the client doesn't hang
});

💡 Use res.setHeader('Content-Type', 'text/event-stream') for best results if you're mimicking SSE.
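If you do emit SSE-style frames, the client still has to turn each `data:` line back into token text. A sketch of such a parser (the helper name is mine; the `output.text` path matches the DashScope chunk shown earlier):

```typescript
// Extract the incremental text from one SSE frame such as:
//   data: {"output":{"text":"Hello"}}
// Returns null for non-data lines (comments, blank keep-alives) and
// for malformed or partial JSON payloads.
function parseSseLine(line: string): string | null {
  if (!line.startsWith('data:')) return null;
  try {
    const payload = JSON.parse(line.slice(5).trim());
    return payload?.output?.text ?? null;
  } catch {
    return null; // Incomplete chunk boundary or non-JSON data line: skip it
  }
}
```

In a real client you would buffer incoming bytes, split on newlines, and feed each complete line through this function.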

✅ Alternative: Deploy to Netlify

Deployed Netlify Functions do support streaming!

netlify deploy --prod

But don't rely on local dev for anything stream-based.

🧠 Summary

| Method | Streaming supported? | Use case |
| --- | --- | --- |
| `netlify dev` | ❌ No | ❌ Never use for streaming/debugging SSE |
| `ts-node` + Express | ✅ Yes | ✅ Local dev (DeepSeek, OpenAI, etc.) |
| Deployed Netlify | ✅ Yes | ✅ Production use with DeepSeek or DashScope |

πŸ“ Takeaway

If you're building real-time chat UIs, AI copilots, or stream-based assistants using Alibaba DeepSeek, OpenAI, or similar APIs:

⚠️ Don't use `netlify dev` during development.

✅ Use Express locally + Vite proxy for a reliable streaming experience.
