Subtyping is about behavior.
For type B to be a subtype of type A, it must support every operation that type A supports, with the same semantics (fancy talk for "behavior"). The rationale that every B is an A does not work on its own; behavioral compatibility has the final say. Most of the time "B is a kind of A" overlaps with "B behaves like A", but not always.
An example:
Consider the set of real numbers. In any language, we can expect them to support the operations +, -, *, and /. Now consider the set of positive integers ({1, 2, 3, ...}). Clearly, every positive integer is also a real number. But is the type of positive integers a subtype of the type of real numbers? Let's look at the four operations and see if positive integers behave the same way as real numbers:
+: We can add positive integers without problems.
-: Not all subtractions of positive integers result in positive integers, e.g. 3 - 5.
*: We can multiply positive integers without problems.
/: We can't always divide positive integers and get a positive integer, e.g. 5 / 3.
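The closure check above can be sketched in Python. The `is_positive_int` helper is mine, purely for illustration, assuming "positive integer" means membership in {1, 2, 3, ...}:

```python
def is_positive_int(x):
    """True when x is in {1, 2, 3, ...}."""
    return isinstance(x, int) and x > 0

a, b = 3, 5
print(is_positive_int(a + b))  # addition stays in the set
print(is_positive_int(a * b))  # so does multiplication
print(is_positive_int(a - b))  # 3 - 5 = -2: escapes the set
print(is_positive_int(b / a))  # 5 / 3 is ~1.67: escapes the set
```

Two of the four operations can carry you out of the set, which is exactly why the subset relation alone doesn't give you a subtype.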
So despite positive integers being a subset of real numbers, they're not a subtype. A similar argument can be made for integers of finite size. Clearly every 32-bit integer is also a 64-bit integer, but 32_BIT_MAX + 1 will give you different results for each type. So if I gave you some program and you changed the type of every 32-bit integer variable to 64-bit integers, there's a good chance the program will behave differently (which almost always means wrongly).
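Python's own integers are arbitrary precision, so here is a sketch that simulates 32-bit two's-complement wraparound to show the divergence; `add32` is a hypothetical helper I wrote for this, not a standard function:

```python
INT32_MAX = 2**31 - 1  # the 32_BIT_MAX from the text

def add32(a, b):
    """Add with 32-bit two's-complement wraparound."""
    r = (a + b) & 0xFFFFFFFF
    return r - 2**32 if r >= 2**31 else r

print(add32(INT32_MAX, 1))  # wraps to a large negative number
print(INT32_MAX + 1)        # a 64-bit (or wider) addition just keeps going
```

Same inputs, same operation name, different results: the two types do not share semantics for +.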
Of course, you could define + for 32-bit ints so that the result is a 64-bit integer, but now you'll have to reserve 64 bits of space every time you add two 32-bit numbers. That may or may not be acceptable to you depending on your memory needs.
Why does this matter?
Correctness is arguably the most important property a program can have. If a program is correct when written against some type A, the only way to guarantee it stays correct when handed some subtype B is if B behaves like A in every way.
Say you have a Rectangle type whose specification says its sides can be changed independently. You write some programs that use Rectangles and assume the implementation follows the specification. Then you introduce a subtype called Square whose sides can't be resized independently. As a result, most programs that resize rectangles are now wrong.
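The Rectangle/Square break can be sketched in Python; the class and method names here are my own, since the text doesn't give code:

```python
class Rectangle:
    def __init__(self, width, height):
        self.width = width
        self.height = height

    # Specification: each side can be changed independently.
    def set_width(self, width):
        self.width = width

    def area(self):
        return self.width * self.height


class Square(Rectangle):
    def __init__(self, side):
        super().__init__(side, side)

    # Breaks the spec: resizing one side silently resizes the other.
    def set_width(self, width):
        self.width = width
        self.height = width


def stretch(rect):
    """Written against Rectangle's spec: widen to 10, leave height alone."""
    rect.set_width(10)
    return rect.area()

print(stretch(Rectangle(2, 3)))  # 30: height stayed 3, as the spec promises
print(stretch(Square(3)))        # 100: the height changed out from under us
```

`stretch` was correct for every Rectangle; handing it a Square makes it wrong without touching a line of its code, which is the whole point.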