If null is bad, why do modern languages implement it?

I'm sure the designers of languages like Java or C# knew about the issues that come with null references (see Are null references really a bad thing?: https://softwareengineering.stackexchange.com/questions/12777/are-null-references-really-a-bad-thing). Also, implementing an option type isn't really much more complex than null references.
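
To make that claim concrete, here is a rough sketch of what a minimal option type could look like in Java. The names Option, some, none and match are made up for illustration and are not any existing library's API (Java itself only added java.util.Optional much later, in Java 8):

```java
import java.util.function.Function;
import java.util.function.Supplier;

// A minimal, hypothetical option type: a value is either Some (present)
// or None (absent), and the only way to reach the value is to handle
// both cases.
abstract class Option<T> {

    // Wrap a value that is definitely present.
    static <T> Option<T> some(T value) {
        return new Some<>(value);
    }

    // Shared "no value" instance; the unchecked cast is safe because
    // None holds no value of type T.
    @SuppressWarnings("unchecked")
    static <T> Option<T> none() {
        return (Option<T>) None.INSTANCE;
    }

    // Callers must say what to do in both the present and the absent
    // case; there is no way to "dereference" an absent value.
    abstract <R> R match(Function<? super T, ? extends R> ifSome,
                         Supplier<? extends R> ifNone);

    private static final class Some<T> extends Option<T> {
        private final T value;

        Some(T value) {
            this.value = value;
        }

        @Override
        <R> R match(Function<? super T, ? extends R> ifSome,
                    Supplier<? extends R> ifNone) {
            return ifSome.apply(value);
        }
    }

    private static final class None<T> extends Option<T> {
        static final None<?> INSTANCE = new None<>();

        @Override
        <R> R match(Function<? super T, ? extends R> ifSome,
                    Supplier<? extends R> ifNone) {
            return ifNone.get();
        }
    }
}
```

With a type like this, a caller can only write something along the lines of name.match(n -> "Hello, " + n, () -> "Hello, stranger") - the compiler never lets them forget the absent case, which is exactly what plain null references fail to enforce.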

Why did they decide to include null anyway? I'm sure the lack of null references would encourage (or even force) better-quality code (especially better library design) from both language creators and users.

Is it simply conservatism - "other languages have it, so we have to have it too..."?
