I wrote this code:
if (10 < prompt("Enter a number from 10 to 30") < 30) {
alert("Ok");
}
The problem is that it always evaluates to true, no matter what number I enter. Could anybody explain why?
JavaScript does not support chained comparisons like a < b < c. You must write a < b && b < c:

var num = Number(prompt("Enter a number from 10 to 30")); // prompt returns a string, so convert it
if (num > 10 && num < 30) alert("Ok");
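To see the difference, compare the chained form against the && form for a few sample values (the test values here are just for illustration):

```javascript
// The chained form is always true; the && form actually checks the range.
var samples = [5, 15, 35];
for (var i = 0; i < samples.length; i++) {
  var n = samples[i];
  console.log(n, 10 < n < 30, 10 < n && n < 30);
}
// 5  → chained: true,  correct: false
// 15 → chained: true,  correct: true
// 35 → chained: true,  correct: false
```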
This is not the right way to test if a variable is between two values.
10 < prompt("Enter a number from 10 to 30") < 30

parses as

(10 < prompt("Enter a number from 10 to 30")) < 30
The problem is that the first part, (10 < ...), evaluates to either true or false. When a Boolean is compared to a number, true is coerced to 1 and false to 0. So the second comparison becomes either 0 < 30 or 1 < 30, both of which are true.
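You can watch this coercion happen in the console (the sample values are just for illustration):

```javascript
// Boolean-to-number coercion:
console.log(Number(true));    // 1
console.log(Number(false));   // 0
// Either way, the chained comparison's left side is 0 or 1, and both are < 30:
console.log((10 < 50) < 30);  // true: (10 < 50) is true → 1, and 1 < 30
console.log((10 < 5) < 30);   // true: (10 < 5) is false → 0, and 0 < 30
```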
Try this:
var value = Number(prompt("Enter a number from 10 to 30")); // convert the string prompt returns
if (value > 10 && value < 30) { alert("Ok"); }