51 events
when toggle format what by license comment
Jul 2, 2023 at 13:45 comment added JacquesB Actual answer: Null references are unavoidable in Java-style OO, because object fields may be assigned in the constructor. This means at any point during the execution of the constructor, some fields might not be assigned yet. Since other methods can be called in the constructor, it is just not possible for the compiler to ensure a field always has been assigned a value before it is accessed. Even the advanced nullability analysis in C# or TypeScript cannot ensure this. But typing fields as optional would be wrong if the field is always assigned after the execution of the constructor.
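The initialization-order hazard JacquesB describes can be reproduced in a few lines of Kotlin. This is a minimal sketch (class and property names are illustrative): a base-class constructor calls an open method that is overridden to read a subclass field, and that field has not been assigned yet, so a non-null `String` is observed as null at runtime despite the compiler's nullability analysis.

```kotlin
open class Widget {
    init { report() }          // runs before any subclass fields are assigned
    open fun report() {}
}

class Button : Widget() {
    val label: String = "OK"   // non-null type, but unassigned while Widget's init runs
    override fun report() {
        // At this point label is still null at runtime, despite its non-null static type.
        println(label.length)  // throws NullPointerException during construction
    }
}
```

Constructing a `Button` throws a `NullPointerException` from inside the superclass constructor; the Kotlin compiler only warns about calling a non-final function in a constructor, it cannot reject the program, which is exactly the gap the comment points out.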
Jun 4, 2023 at 15:13 review Reopen votes
Jul 4, 2023 at 15:20
Oct 16, 2018 at 5:35 comment added Andy Using null isn't a sin. The real sin is programming languages that don't produce compile errors when the user tries to dereference a nullable variable without checking that it's not null. Having used flow to type check my JavaScript programs for the past few years, I can tell you that I rarely see null pointer errors anymore. The Java compiler is a crime against humanity for not warning/erroring on code that may produce a NullPointerException.
Sep 7, 2018 at 0:00 comment added B T The existence and use of null isn't actually the problem. The problem with null in languages like Java is that ANY VARIABLE can be set to null. Even though Java is statically typed and tries so hard to catch type-related errors, it provides absolutely no way of preventing a variable from holding null during static analysis. This is a huge gaping hole in an otherwise very tight, very restrictive type system. It's really an inconsistency in Java, and languages don't have to have this inconsistency. In Kotlin, for example, `val a: String? = null` is fine while `val b: String = null` is a compiler error.
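The Kotlin distinction the comment cites can be sketched directly (the `describe` function is an illustrative example, not from the discussion): a `String?` admits null and the compiler forces a check before dereferencing it, while assigning null to a plain `String` does not compile.

```kotlin
val a: String? = null      // fine: the type String? admits null
// val b: String = null    // compile-time error: null cannot be a value of type String

fun describe(s: String?): Int =
    if (s != null) s.length  // smart cast: s is String inside this branch
    else 0                   // the null case must be handled explicitly
```

Removing the `if` and writing `s.length` directly is rejected by the compiler, which is precisely the static guarantee the comment says Java lacks.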
Apr 12, 2017 at 7:31 history edited CommunityBot
replaced http://programmers.stackexchange.com/ with https://softwareengineering.stackexchange.com/
Apr 27, 2015 at 1:18 comment added Jim Balter "WTF is up with you guys?" -- Intelligence and knowledge. "Of all the things that can and do go wrong with software, trying to dereference a null is no problem at all." -- false. " This is like asking why mathematics has zero, when it causes so many problems for division." -- No, it's nothing like that. " It refers to "no object" in exactly the same way that "zero cows" refers to no cows." -- false. "using Options - does not in fact get rid of null references. Instead it allows you to specify which variables are allowed to hold null, and which aren't." -- false.
May 14, 2014 at 3:04 review Reopen votes
May 14, 2014 at 9:06
May 8, 2014 at 19:33 review Reopen votes
May 8, 2014 at 20:50
May 8, 2014 at 19:23 history edited user39685 CC BY-SA 3.0
deleted 11 characters in body; edited title
May 5, 2014 at 6:32 history closed gnat
GrandmasterB
CommunityBot
Bart van Ingen Schenau
jwenting
Opinion-based
May 4, 2014 at 23:59 comment added Doval @MartinJames That's what a foreign function interface is for.
May 4, 2014 at 23:21 comment added Jonathan. I will take objective-c's nil over Java's null (and NullPointerExceptions) and Haskell's Maybe anyday.
May 4, 2014 at 23:09 comment added Martin James Also embedded, also drivers. Those are gonna need pointers, which can be null or unassigned.
May 4, 2014 at 23:03 comment added Martin James @Doval well that's good, me too, right up to the point where I want to do I/O and need to interact with the OS API which, in most cases, requires C-style vars and calls. That means pointers, and they can be unassigned or null.
May 4, 2014 at 17:55 comment added Doval @MartinJames Neither. I want to write applications in a safe language with a decent type system.
May 4, 2014 at 16:12 comment added Martin James 'myStruct *thing;' - what do you want it pointing at? Some random shit on the stack, which may or may not point to a valid object, or something that will immediately raise a segfault/AV if dereferenced? Your choice...
May 4, 2014 at 3:30 answer added B T timeline score: 6
May 4, 2014 at 3:24 comment added B T Null is not evil. If you watch his misleadingly named famous speech "The Billion Dollar Mistake", Tony Hoare talks about how allowing any variable to be able to hold null was a huge mistake. The alternative - using Options - does not in fact get rid of null references. Instead it allows you to specify which variables are allowed to hold null, and which aren't.
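The Option alternative debated here (and in Doval's comments below about Maybe/Option types) can be sketched as a minimal sealed type in Kotlin. All names (`Option`, `Some`, `None`, `findUser`) are illustrative; the point is that absence becomes an ordinary value of the type, so callers must handle both cases rather than risking an unchecked dereference.

```kotlin
// Absence encoded in the type: a value is either Some(x) or None.
sealed class Option<out T>
data class Some<out T>(val value: T) : Option<T>()
object None : Option<Nothing>()

// Callers must dispatch on both cases; there is no way to "forget" the None branch.
fun <T> Option<T>.getOrElse(default: T): T = when (this) {
    is Some -> value
    None    -> default
}

// A lookup that may fail returns Option<String>, not a silently nullable String.
fun findUser(id: Int): Option<String> =
    if (id == 1) Some("alice") else None
```

This matches the comment's framing: the Option style does not eliminate the concept of "no value", it just makes the variables that can carry it visible in their types.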
May 3, 2014 at 20:28 comment added Voo @MartinJames Considering that the inventor himself claims the introduction of null pointers to be his biggest mistake, and the fact that your claim that it's not a big deal is completely wrong (e.g. the Linux guys had to use special command-line flags in gcc while they tracked down all those - possibly security-relevant - bugs due to null dereferencing before checking - no idea if they still do), do you want to reconsider your statement? ;)
May 3, 2014 at 15:19 answer added Matthieu M. timeline score: 9
May 3, 2014 at 12:21 comment added JVE999 The reason I use it is that it's not a character, leaving me free to use whatever character I want without it being interpreted as null.
May 3, 2014 at 10:31 comment added Dawood ibn Kareem No, "null" is a valid member of its type (references). It refers to "no object" in exactly the same way that "zero cows" refers to no cows.
May 3, 2014 at 10:23 comment added user541686 @MartinJames: The fewer invalid states your program has, the fewer ways it can be incorrect. Simple as that.
May 3, 2014 at 10:17 comment added zduny @DavidWallace 0 is not like null at all. 0 is a valid member of its type (integers, rationals, real numbers etc.) while null has no type at all - it's a special marker.
May 3, 2014 at 9:52 comment added Dawood ibn Kareem -1. This is like asking why mathematics has zero, when it causes so many problems for division.
May 3, 2014 at 8:54 vote accept zduny
May 3, 2014 at 8:46 comment added detly @MartinJames "It ALWAYS generates an AV/segfault and so gets fixed" - no, no it doesn't.
May 3, 2014 at 3:22 answer added supercat timeline score: 3
May 2, 2014 at 22:48 comment added user7043 @MartinJames It's true that there are much worse bugs, but this style of crash is annoying and can be eradicated very easily. Option types provide good bang for the buck; they aren't more complex than pervasive nullability, just different. In contrast, it's not really clear how a language might help prevent more insidious bugs, and such features would probably be more complicated.
May 2, 2014 at 21:22 comment added Eric Lippert @GrandmasterB: The questions in here are reasonable questions. The editorial comments are off-topic, and I would encourage the OP to delete them and focus on the question being asked rather than on expressing an opinion about a counterfactual world.
May 2, 2014 at 21:13 answer added Eric Lippert timeline score: 127
May 2, 2014 at 20:55 comment added Martin James WTF is up with you guys? Of all the things that can and do go wrong with software, trying to dereference a null is no problem at all. It ALWAYS generates an AV/segfault and so gets fixed. Is there so much of a bug shortage that you have to worry about this? If so, I have plenty to spare, and none of them involves problems with null references/pointers.
May 2, 2014 at 19:42 answer added Eric J Fisher timeline score: 30
May 2, 2014 at 19:40 comment added Doval @GrandmasterB He doesn't have to make his own language - there's already Standard ML, OCaml, and Haskell. There's also Scala and F#, which only have null for interoperability with Java and .NET respectively. Considering null offers no advantages and big problems compared to Maybe/Option types, "evil" is a fitting term for it.
May 2, 2014 at 18:54 vote accept zduny
May 2, 2014 at 21:25
May 2, 2014 at 18:39 answer added CodesInChaos timeline score: 4
May 2, 2014 at 18:34 review Close votes
May 5, 2014 at 6:32
S May 2, 2014 at 18:06 history suggested Cloudy CC BY-SA 3.0
Grammar fix
May 2, 2014 at 18:03 review Suggested edits
S May 2, 2014 at 18:06
May 2, 2014 at 18:03 comment added FrustratedWithFormsDesigner Have you checked the King James programming bible to see if NULL is truly evil? kingjamesprogramming.tumblr.com ;)
May 2, 2014 at 17:43 answer added Maxthon Chan timeline score: 4
May 2, 2014 at 17:27 answer added anon timeline score: 4
May 2, 2014 at 13:09 comment added user7043 @PieterB But when the majority should not be nullable, wouldn't it make sense to make nullability the exception rather than the default? Note that while the usual design of option types is to force explicit checking for absence and unpacking, one can also have the well-known Java/C#/... semantics for opt-in nullable references (use as if not nullable, blow up if null). It would at least prevent some bugs, and make a static analysis that complains about missing null checks much more practical.
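The "use as if not nullable, blow up if null" semantics this comment describes exists in Kotlin as the `!!` operator, which asserts a nullable value is non-null and throws if it isn't. A minimal sketch (the `shout` function is illustrative):

```kotlin
// s is opt-in nullable; !! treats it as non-null and
// throws NullPointerException at this exact point if s is null.
fun shout(s: String?): String = s!!.uppercase()
```

The difference from pervasive nullability is that every such blow-up point is marked in the source, so a static analysis - or a reviewer - can find them all.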
May 2, 2014 at 13:06 comment added Doval @PieterB Unfortunately, it also makes bugs trivially easy.
May 2, 2014 at 13:04 comment added Pieter B @delnan I don't want it for the majority of my references. But when I use it, it makes things trivially easy. Like with lists with objects.
May 2, 2014 at 12:54 comment added user7043 @PieterB But do you use it for the majority of references, or do you want most references not to be null? The argument is not that there shouldn't be nullable data, only that it should be explicit and opt-in.
May 2, 2014 at 12:38 answer added Doval timeline score: 100
May 2, 2014 at 12:36 history tweeted twitter.com/#!/StackProgrammer/status/462209055482978304
May 2, 2014 at 12:33 answer added Telastyn timeline score: 10
May 2, 2014 at 12:22 comment added Pieter B null is great. I love it and use it every day.
May 2, 2014 at 12:19 history asked zduny CC BY-SA 3.0