  • 10
    Why is this important to you? Should it be defined, and if so, why? There's not much point in assigning x to itself, and if you want to increment x you can just say x++; -- no need for the assignment. I'd say it should not be defined just because it'd be hard to remember what's supposed to happen. Commented Jun 19, 2012 at 7:43
  • 4
    In my mind, this is a good question ("Some men see things as they are and ask why, I dream things that never were and ask why not"). It's (in my opinion) a question purely on language design, using C syntax as an example, not a question on C syntax. I, personally, think that the reason we don't have defined behaviour for expressions such as x++ + ++x or x=x++ is simply because there is a possibility of them being misread. Commented Jun 19, 2012 at 8:06
  • 5
    @ugoren: Why do you need to predict the result? Nobody in their right mind would write code like that (as has been mentioned several times), and even if you did, it would be rejected at the first code review. So there is no need to define the behavior; leaving it undefined gives the optimizer the best chance to optimize. In every example you propose, I would shoot somebody if they added that to the code base. Commented Jun 19, 2012 at 8:13
  • 3
    I would find a more interesting question to be: why isn't it an error to write this? Surely a compiler could detect that it's undefined behaviour and therefore can't possibly be what the user actually wanted, so why isn't it an error? I understand some instances of undefined behaviour are hard to detect, but this isn't one of them. Commented Jun 19, 2012 at 12:27
  • 3
    "the rule forbidding changing a variable twice between sequence points is certainly a rule most programmers don't understand." -- Do you have any evidence for this? Most questions I've seen were asked because the programmers didn't know about the rule. Is there any evidence that most of them still didn't understand it after it was explained? Commented Jun 19, 2012 at 15:44