I've been working on some abstractions of setTimeout and setInterval in order to process large sets of data without blocking the event loop in the browser.
In the process, I discovered that browsers "clamp" the delay you specify for a timeout or an interval.
For example, if you write setTimeout('do something', 0), the browser will actually run it as setTimeout('do something', 4). In other words, the browser enforces a minimum delay, somewhere between 2 and 10 milliseconds depending on which browser and version your code is running on.
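You can observe the clamp by chaining zero-delay timeouts and measuring the spacing between callbacks. This is just a rough sketch; the exact gaps will vary by browser:

```javascript
// Chain zero-delay timeouts and log the gap between consecutive callbacks.
let last = performance.now();
let count = 0;

function tick() {
  const now = performance.now();
  console.log(`gap: ${(now - last).toFixed(2)} ms`);
  last = now;
  if (++count < 10) setTimeout(tick, 0);
}

setTimeout(tick, 0);
// After a few iterations the gaps settle around the browser's minimum
// (commonly ~4 ms once timeouts are nested more than 5 levels deep).
```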
This can significantly increase how long a callback/promise has to wait. There are ways around it (for timeouts, anyway), such as using postMessage or MessageChannel; these strike me as poor solutions, but they can provide the same effect as a timeout with effectively zero wait time.
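For illustration, a MessageChannel-based workaround might look something like this (setZeroTimeout is just a name I'm using here, not a standard API):

```javascript
// Queue callbacks and wake them via a MessageChannel instead of a timer.
// Messages are delivered asynchronously but without the minimum-delay clamp.
const channel = new MessageChannel();
const queue = [];

channel.port1.onmessage = () => {
  const cb = queue.shift();
  if (cb) cb();
};

function setZeroTimeout(callback) {
  queue.push(callback);
  channel.port2.postMessage(null); // schedules a task with no clamped delay
}

setZeroTimeout(() => console.log('ran without waiting out the clamp'));
```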
Why do browsers clamp timeouts and intervals this way? I can find no reasoning for this, official or otherwise. My only hypothesis is that the clamp somehow leaves room for other tasks to be added to the call stack, but that doesn't seem to be an issue with postMessage as far as I can tell, so beyond that I can't think of any reason why this helps anyone.
I do understand that the delay passed to setTimeout means the minimum amount of time to wait; in practice, there's no guarantee that your callback will be called in that amount of time, or at all.
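A quick sketch of that point: blocking the main thread pushes the callback well past its scheduled time, since the delay is only a lower bound.

```javascript
const start = performance.now();

setTimeout(() => {
  console.log(`fired after ${Math.round(performance.now() - start)} ms`);
}, 0);

// Busy-wait for ~100 ms; the timer can't fire until the thread is free.
while (performance.now() - start < 100) {}
// Logs roughly "fired after 100 ms", not 0 (or 4).
```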