If null is evil (and it is) why do modern languages implement it?

I'm sure the designers of languages like Java or C# were aware of the issues surrounding null references (see http://programmers.stackexchange.com/questions/12777/are-null-references-really-a-bad-thing). Implementing an option type isn't really much more complex than implementing null references, either, as sketched below.
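
To illustrate what I mean, here's a minimal sketch of what such a type could look like in Java. The `Option`/`Some`/`None` names here are just illustrative (Java 8 later shipped `java.util.Optional` along similar lines):

```java
import java.util.NoSuchElementException;

// A minimal, hypothetical option type: a value is either Some(value) or None,
// and the absence of a value is visible in the type instead of hiding behind null.
abstract class Option<T> {
    // true if a value is present
    public abstract boolean isPresent();
    // returns the value, or throws if absent
    public abstract T get();
    // returns the value, or a fallback if absent
    public abstract T orElse(T fallback);

    public static <T> Option<T> some(T value) {
        return new Some<>(value);
    }

    @SuppressWarnings("unchecked") // safe: None carries no value of type T
    public static <T> Option<T> none() {
        return (Option<T>) None.INSTANCE;
    }
}

final class Some<T> extends Option<T> {
    private final T value;
    Some(T value) { this.value = value; }
    public boolean isPresent() { return true; }
    public T get() { return value; }
    public T orElse(T fallback) { return value; }
}

final class None<T> extends Option<T> {
    static final None<Object> INSTANCE = new None<>();
    public boolean isPresent() { return false; }
    public T get() { throw new NoSuchElementException("no value present"); }
    public T orElse(T fallback) { return fallback; }
}

class OptionDemo {
    public static void main(String[] args) {
        Option<String> found = Option.some("value");
        Option<String> missing = Option.none();
        System.out.println(found.orElse("default"));   // prints "value"
        System.out.println(missing.orElse("default")); // prints "default"
    }
}
```

The point is that callers of an `Option`-returning API are forced to handle the "no value" case explicitly, whereas a nullable reference lets them forget until a `NullPointerException` fires at runtime.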

Why did they decide to include null anyway? I'm sure the lack of null references would encourage (or even force) better-quality code, especially better library design, from both language implementers and users.

Is it simply conservatism: "other languages have it, so we have to have it too"?

zduny