I'm currently doing distance calculations between coordinates and I'm getting slightly different results depending on the language used.
Part of the calculation is taking the cosine of an angle in radians. I get the following results:
// cos(0.8941658257446736)
// 0.6261694290123146 node
// 0.6261694290123146 rust
// 0.6261694290123148 go
// 0.6261694290123148 python
// 0.6261694290123148 swift
// 0.6261694290123146 c++
// 0.6261694290123146 java
// 0.6261694290123147 c
I would like to try to understand why. If you look past 16 decimal places, C is the only "correct" answer in terms of rounding. What surprises me is Python having a different result.
This small difference is currently being amplified, and over thousands of positions it adds up to a not-insignificant distance.
Not really sure how this is a duplicate. Also, I am asking for a holistic answer rather than a language-specific one; I don't have a computer science degree.
UPDATE
I accept that maybe this is too broad a question; I suppose I was curious as to why, as my background isn't CS. I appreciate the links to the blog posts that were shared in the comments.
UPDATE 2
This question arose from porting a service from Node.js to Go. Go is even stranger: I am now unable to run my tests because the summation of distances varies between runs over the same values.
Given a list of coordinates, calculating the distances and adding them together gives me different results. I'm not asking a new question here, but it seems crazy that Go produces different results:
9605.795975874069
9605.795975874067
9605.79597587407
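Floating-point addition is not associative, so the order in which the per-segment distances are summed changes the rounded result; if anything in the pipeline (goroutines, map iteration order) reorders them, the total moves in the last digits. A minimal Go sketch of order sensitivity (the values are artificial, chosen to make the effect obvious rather than taken from real distances):

```go
package main

import "fmt"

func main() {
	// 1e16 is large enough that adding 1.0 to it is lost to rounding,
	// but adding 2.0 is exact (the spacing between float64s there is 2).
	vals := []float64{1e16, 1, 1}

	forward := 0.0
	for _, v := range vals {
		forward += v // 1e16 + 1 rounds back to 1e16, twice
	}

	backward := 0.0
	for i := len(vals) - 1; i >= 0; i-- {
		backward += vals[i] // 1 + 1 = 2 first, then 1e16 + 2 is exact
	}

	fmt.Println(forward)             // 1e+16
	fmt.Println(backward)            // 1.0000000000000002e+16
	fmt.Println(forward == backward) // false
}
```

The three sums above most likely come from exactly this: the same distances added in different orders.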
For completeness, here is the Distance calculation I am using:
// Distance returns the great-circle distance in meters between two
// coordinates, using the haversine formula.
func Distance(pointA Coordinate, pointB Coordinate) float64 {
	const R = 6371000 // Earth radius in meters

	// Convert latitudes and longitudes from degrees to radians.
	phi1 := pointA.Lat * math.Pi / 180
	phi2 := pointB.Lat * math.Pi / 180
	lambda1 := pointA.Lon * math.Pi / 180
	lambda2 := pointB.Lon * math.Pi / 180

	deltaPhi := phi2 - phi1
	deltaLambda := lambda2 - lambda1

	// Haversine formula: a is the square of half the chord length,
	// c is the angular distance in radians.
	a := math.Sin(deltaPhi/2)*math.Sin(deltaPhi/2) +
		math.Cos(phi1)*math.Cos(phi2)*math.Sin(deltaLambda/2)*math.Sin(deltaLambda/2)
	c := 2 * math.Atan2(math.Sqrt(a), math.Sqrt(1-a))

	d := R * c
	return d
}
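If the goal is a total that is less sensitive to rounding (and to small per-term differences like the 1-ULP cosine variation above), compensated (Kahan) summation is a common fix when adding many distances. This is a sketch of the standard technique, not something from the original service:

```go
package main

import "fmt"

// kahanSum adds values while carrying a compensation term that captures
// the low-order bits lost by each naive addition.
func kahanSum(vals []float64) float64 {
	sum, comp := 0.0, 0.0
	for _, v := range vals {
		y := v - comp        // re-inject previously lost low-order bits
		t := sum + y         // big + small: low bits of y may be lost...
		comp = (t - sum) - y // ...and are recovered here for the next step
		sum = t
	}
	return sum
}

func main() {
	vals := []float64{1e16, 1, 1, 1, 1}

	naive := 0.0
	for _, v := range vals {
		naive += v
	}
	fmt.Println(naive)          // the four 1s are lost: 1e+16
	fmt.Println(kahanSum(vals)) // exact: 1.0000000000000004e+16
}
```

It does not make a reordered sum bit-identical, but it keeps the accumulated error near 1 ULP of the total instead of growing with the number of terms.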
COMMENTS
Comment (fragment): "[…] 0.6261694290123148, but if you use "higher precision" printing like fmt.Printf("%.25f", v), you'll get 0.6261694290123147599302911. The 16th fraction digit becomes 7, not 8, compared to when printed using the default format."
Comment (fragment): "[…] how cos is implemented, and if 80-bit floating point is used for the intermediate results."
My reply: "[…] cosine — you need the exact syntax for every language? This uses the built-in cosine in every language, so I don't understand what you would like. I'm asking a more holistic question, as @icza has done."
Comment: "x = 0.8941658257446736; assert(cos(x) == cos(x + argc/1000)); fires (one might be computed at compile time, the other at run time, possibly using different methods). See the "Transcendentals" section at randomascii.wordpress.com/2013/07/16/floating-point-determinism. So yes, the exact syntax (and compiler, and flags, and floating-point rounding mode, etc.) used may be relevant."