I'm sure the designers of languages like Java and C# were aware of the problems caused by null references (see "Are null references really a bad thing?", http://programmers.stackexchange.com/questions/12777/are-null-references-really-a-bad-thing). Moreover, implementing an option type isn't really much more complex than supporting null references.
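To illustrate that claim, here's a minimal sketch of what such a type could look like in Java (using java.util.function.Function from Java 8 for brevity); the names Option, Some, None, orElse, and map are my own illustration, not any existing library's API:

```java
import java.util.function.Function;

// A bare-bones option type: a value is either present (Some) or absent (None),
// and the type forces callers to handle both cases explicitly.
abstract class Option<T> {
    // Return the contained value, or the fallback if none is present.
    abstract T orElse(T fallback);
    // Apply f to the contained value, propagating absence.
    abstract <R> Option<R> map(Function<T, R> f);

    static <T> Option<T> some(T value) { return new Some<>(value); }
    static <T> Option<T> none() { return new None<>(); }
}

final class Some<T> extends Option<T> {
    private final T value;
    Some(T value) { this.value = value; }
    T orElse(T fallback) { return value; }
    <R> Option<R> map(Function<T, R> f) { return new Some<>(f.apply(value)); }
}

final class None<T> extends Option<T> {
    T orElse(T fallback) { return fallback; }
    <R> Option<R> map(Function<T, R> f) { return new None<>(); }
}
```

Usage would look like `Option.some("hello").map(String::length).orElse(0)`, which yields 5, while the same chain starting from `Option.<String>none()` yields 0; either way there is no path to a runtime NullPointerException, because the value can only be reached through orElse or map.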
Why did they decide to include null references anyway? I'm sure the absence of null references would encourage (or even force) better-quality code (especially better library design) from both language creators and users.
Is it simply conservatism: "other languages have it, so we have to have it too"?