JavaScript seems to be doing floating point wrong (compared to C)
From everything I've been able to find online, JavaScript allegedly uses IEEE 754 doubles for its numbers, but I have come across numbers that work in a C double yet apparently not in JavaScript. For example,
#include <stdio.h>

int main(void) {
    double x = 131621703842267136.;
    printf("%lf\n", x);  /* prints the value with 6 digits after the decimal point */
    return 0;
}
prints 131621703842267136.000000. (Note: in an earlier version of the question I copied the wrong number for C.) But in JavaScript,
console.log(131621703842267136)
outputs 131621703842267140. From everything I've read online, both C doubles and JavaScript numbers are 64-bit floating point, so I am very confused about why they produce different output. Any ideas?
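One way to narrow this down on the C side (a minimal diagnostic sketch, assuming a platform where double is a 64-bit IEEE 754 value and uint64_t is available) is to print the value with 17 significant digits, which is enough to round-trip any double, and to dump the raw bit pattern so the stored value can be compared directly with what JavaScript holds:

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void) {
    double x = 131621703842267136.;

    /* 17 significant digits are enough to uniquely identify any IEEE 754 double. */
    printf("%.17g\n", x);

    /* Copy the raw 64-bit pattern of the double into an integer and print it in hex. */
    uint64_t bits;
    memcpy(&bits, &x, sizeof bits);
    printf("0x%016llx\n", (unsigned long long)bits);

    return 0;
}

If the bit pattern and the 17-digit form match on both sides, the stored value is the same and the difference would lie only in how each environment formats the number for output.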
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
