
I did some searching on this question, but it seems like people only emphasize non-blocking IO.

Let's say I have a very simple application that just responds with "Hello World" text to the client; it still needs time to finish executing, no matter how quick it is. What if two requests come in at exactly the same time? How does Node.js make sure both requests get processed with one thread?

I read the blog Understanding the node.js event loop, which says "Of course, on the backend, there are threads and processes for DB access and process execution". That statement is about IO, but I also wonder whether there is a separate thread to handle the request queue. If that's the case, can I say that the Node.js single-thread concept only applies to developers who build applications on Node.js, while Node.js actually runs on multiple threads behind the scenes?
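
For concreteness, here is roughly the kind of application I mean (a minimal sketch using Node's built-in http module; the port number is an arbitrary choice):

const http = require('http');

// A trivial handler: it still takes a nonzero amount of time to run,
// even though it does almost nothing.
const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello World');
});

server.listen(3000);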

  • Please note that "In node.js, everything runs in parallel except your code". Your code runs in a single thread, but events are processed by multiple threads behind the scenes (see the sketch after these comments). Commented Mar 24, 2015 at 5:56
  • @BibekSubedi, I googled "In node.js, everything runs in parallel except your code"; it helps clarify some of the confusion, thanks. Commented Mar 24, 2015 at 16:57
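
A small sketch of what that comment means (the file path is just a placeholder): the two reads below may be carried out in parallel by libuv's worker threads, but their callbacks always run one at a time on the single JavaScript thread.

const fs = require('fs');

// Both reads are handed off to libuv and may run in parallel on
// background threads ("everything runs in parallel...").
fs.readFile('/etc/hostname', 'utf8', (err, data) => {
  // ...but every callback runs on the single JavaScript thread
  // ("...except your code").
  if (err) throw err;
  console.log('first read done:', data.trim());
});

fs.readFile('/etc/hostname', 'utf8', (err, data) => {
  if (err) throw err;
  console.log('second read done:', data.trim());
});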

1 Answer


The operating system gives each socket connection a send and receive queue. That is where the bytes sit until something at the application layer handles them. If a socket's receive queue fills up, the client on that connection cannot send any more data until space becomes available in the queue. This is why an application should handle requests as fast as possible.
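
A related, but distinct, queue is visible from Node itself: the optional backlog passed to server.listen() is a hint to the OS for how many completed connections may wait to be accepted while the application is busy. It is separate from the per-socket byte buffers described above, which are kernel settings. A minimal sketch (port and backlog values are arbitrary):

const http = require('http');

const server = http.createServer((req, res) => {
  res.end('Hello World');
});

// backlog is only a hint to the OS: it caps how many completed
// connections may sit in the kernel's accept queue before the
// event loop gets around to accepting them.
server.listen({ port: 3000, backlog: 511 });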

If you are on a *nix system you can use netstat to view the current number of bytes in the send and receive queues. In this example, there are 0 bytes in the receive queue and 240 bytes in the send queue (waiting to be sent out by the OS).

Proto Recv-Q Send-Q Local Address               Foreign Address             State
tcp        0      240 x.x.x.x:22                 x.x.x.x:*                   LISTEN

On Linux you can check the default size and max allowed size of the send/receive queues with the proc file system:

Receive:

cat /proc/sys/net/core/rmem_default
cat /proc/sys/net/core/rmem_max

Send:

cat /proc/sys/net/core/wmem_default
cat /proc/sys/net/core/wmem_max

Comments

So you are saying Node.js relies on the OS to queue the requests, and does not have its own implementation to handle the queue, is that correct?
Yes, there is nothing magical about Node.js; like every other application, it depends on the OS to handle the network layer.
I am still confused. I just read "This is in contrast to today's more common concurrency model where OS threads are employed" on Node's about page. It seems like Node.js is not using OS threads; I'm not sure I understand it correctly.
They are talking about single-thread vs thread-per-connection there.
If you are worried about multiple requests coming in at the exact same time then you are moving down to the network layer. Let the OS worry about that.
