
A lot has been said on this topic; however, I could not find an exact answer to my question.

JavaScript cannot accurately represent decimal numbers such as 0.1, and this is understandable.

For example, this is true because of the rounding error that occurs during the multiplication:

0.1 * 3 === 0.30000000000000004

This is fine - all according to the IEEE Standard for Floating-Point Arithmetic (IEEE 754).
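
The mismatch is already visible in how 0.1 itself is stored; printing more significant digits in a JS console shows the values actually being compared:

(0.1).toPrecision(20)     // "0.10000000000000000555"
(0.1 * 3).toPrecision(20) // "0.30000000000000004441"
(0.3).toPrecision(20)     // "0.29999999999999998890"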

What I cannot understand is why other languages that also use the standard appear to give more accurate results:

0.1 * 3 === 0.3

Is this because of different rounding rules (https://en.wikipedia.org/wiki/IEEE_floating_point#Rounding_rules) that they use, or am I missing something?

  • What other language(s) are you talking about specifically? Can you give examples? Commented Feb 13, 2016 at 13:56
  • Possible duplicate of Is floating point math broken? Commented Jul 25, 2017 at 6:44

1 Answer


Any language whose math uses IEEE 754 floating point will have the same rounding issues that you see in JavaScript.
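
Languages that appear to print 0.3 are usually just rounding the value when formatting it for output; JavaScript gives the same result if you round before printing or comparing (a quick sketch):

(0.1 * 3).toFixed(1)              // "0.3"
Number((0.1 * 3).toPrecision(15)) // 0.3
0.1 * 3 === 0.3                   // still false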

Some languages, such as C#, provide a Decimal type with more accurate decimal handling, trading range for precision.

In JavaScript there are decimal/arbitrary-precision libraries, such as BigDecimal, but since they aren't baked into the language, you can't use the regular math operators such as + and * with them.
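
A minimal sketch using decimal.js as one such library (the exact API varies between libraries, so treat the method names as an example rather than a prescription):

const Decimal = require('decimal.js');

const result = new Decimal('0.1').times(3); // .times() instead of *
result.toString();                          // "0.3"
result.equals('0.3');                       // true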
