Why is Math.floor() preferred over Math.round()? [JavaScript]

I found on FreeCodeCamp that the way to get a random whole number within a range is to use Math.floor. That surprised me, because Math.floor() doesn't round to the nearest integer; it always returns the largest integer equal to or less than its argument.

This is the given formula: Math.floor(Math.random() * (max - min + 1)) + min

Does anyone know why it is preferred here over Math.round(), which rounds to the nearest whole number?

Thanks in advance!



Solution 1:[1]

Summary: because with Math.round() the values min and max are underrepresented.


Let's work through an example and compare the results of Math.round() and Math.floor().

Just for clarity, here are the two formulas we're comparing:

min = 0;
max = 3;

result = Math.round(Math.random() * (max - min)) + min;
result = Math.floor(Math.random() * (max - min + 1)) + min;

| result | Math.round() | Math.floor() |
|:------:|:------------:|:------------:|
|    0   |  0.0 - 0.499 |   0 - 0.999  |
|    1   |  0.5 - 1.499 |   1 - 1.999  |
|    2   |  1.5 - 2.499 |   2 - 2.999  |
|    3   |  2.5 - 2.999 |   3 - 3.999  |

You can see that with Math.round() the ranges that produce 0 and 3 are only half as wide as every other range in our example, so the endpoints come up only half as often. With Math.floor() every value gets an equally wide range.
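The underrepresentation is easy to verify empirically. Here is a minimal sketch (my addition, not part of the original answer) that samples both formulas a million times and tallies how often each value appears:

```javascript
// Monte Carlo check of the table above (min = 0, max = 3):
// tally how often each result 0..3 comes up under both formulas.
const min = 0, max = 3, N = 1_000_000;
const roundCounts = [0, 0, 0, 0];
const floorCounts = [0, 0, 0, 0];

for (let i = 0; i < N; i++) {
  roundCounts[Math.round(Math.random() * (max - min)) + min]++;
  floorCounts[Math.floor(Math.random() * (max - min + 1)) + min]++;
}

console.log('round:', roundCounts); // 0 and 3 appear about half as often as 1 and 2
console.log('floor:', floorCounts); // all four values appear about equally often
```

With Math.round() you should see roughly 167k / 333k / 333k / 167k, while Math.floor() gives roughly 250k for each value.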

Solution 2:[2]

This is the real reason, at least the way I see it. Math.random() returns a floating-point number between 0 (included) and 1 (excluded), something like 0 to 0.9999999999999999. Looking at this, it's tempting to think: OK, this is a percentage, so all we have to do is something like:

    let max = 10;

    console.log(Math.round(Math.random() * max))

Right? Well, let's take a look at what our Math.random() number should really represent in our range.

Let's say our range is min = 0 and max = 5. That creates six different possibilities: 0, 1, 2, 3, 4, 5. If 0.9999999999999999 (or 1, for simplicity) is the maximum number you can get from Math.random(), we can divide it by the number of possibilities in our range, something like 1 / (max + 1); the + 1 makes the range inclusive and covers all six possibilities, giving roughly 0.16666666666666666 as the result. Each possibility then owns a slice of that width, so Math.random() values in the following ranges would map to each possibility:

|        From        |         To         | Expected Result |
|:------------------:|:------------------:|:---------------:|
| 0                  | 0.1666666666666665 |        0        |
| 0.1666666666666666 | 0.3333333333333332 |        1        |
| 0.3333333333333333 | 0.4999999999999999 |        2        |
| 0.5                | 0.6666666666666665 |        3        |
| 0.6666666666666666 | 0.8333333333333332 |        4        |
| 0.8333333333333333 | 0.9999999999999999 |        5        |
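The slice boundaries above can be computed directly rather than written out by hand. A small sketch (my addition, using the same min = 0 and max = 5):

```javascript
// Each of the six possible values owns an equally wide slice of [0, 1).
const min = 0, max = 5;
const slice = 1 / (max - min + 1); // roughly 0.16666666666666666

for (let v = min; v <= max; v++) {
  // value v is produced by Math.random() draws in [v * slice, (v + 1) * slice)
  console.log(`value ${v}: from ${v * slice} up to (but not including) ${(v + 1) * slice}`);
}
```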

Now let's look at what Math.round does:

    console.log(0.31181510804047763 * 5, 'rounded to:', Math.round(0.31181510804047763 * 5));
    //Prints 1.5590755402023881 'rounded to:' 2

    console.log(0.1666666666666661 * 5, 'rounded to:', Math.round(0.1666666666666661 * 5));
    //Prints 0.8333333333333305 'rounded to:' 1

As you can see, in some situations Math.round won't return the result we might be expecting; in the second example we were expecting 0. For Math.round to return 0, the value it receives needs to be lower than 0.5, like this:

    console.log(0.0999999999999999 * 5, 'rounded to:', Math.round(0.0999999999999999 * 5));
    //Prints 0.49999999999999944 'rounded to:' 0

While this result might be expected, every float from 0.1 to 0.1666666666666665 will produce an unexpected result, and the same thing happens at other points along the range. This is why a Math.floor solution is preferred.

How about we divide our randomly generated float by our fixed possibility float?

    console.log(0.1666666666666661 / (1 / (5 + 1)));//Prints 0.9999999999999967
    console.log(0.1666666666666666 / (1 / (5 + 1)));//Prints 0.9999999999999997
    console.log(0.1666666666666667 / (1 / (5 + 1)));//Prints 1.0000000000000004

If you look at these results, they all need to be rounded down to match the expected results, so now our formula looks like this:

    let min = 0;
    let max = 5;
    let r = Math.random();
    let f = (1 / (max + 1));
    let x = Math.floor(r / f);

    console.log(r, x);

Now we have another problem: if we change our min value to any number other than 0, we need to adjust our formula to reduce the number of possibilities and account for the offset created.

    let min = 2;
    let max = 5;
    let r = Math.random();
    let f = (1 / (max - min + 1));
    let x = Math.floor(r / f + min);

    console.log(r, x);

Our final function would look like this:

    const randomNumber = (min, max) => {
        return Math.floor(Math.random() / (1 / (max - min + 1)) + min);
    }
    
    console.log(randomNumber(2, 5));
    
    //If Math.random ever returns 0.999999999999999
    console.log(Math.floor(0.999999999999999 / (1 / (5 - 2 + 1)) + 2));//Prints 5

    //Hope this never happens
    console.log(Math.floor(0.9999999999999999 / (1 / (5 - 2 + 1)) + 2));//Prints 6

Which is the same as:

    const randomNumber = (min, max) => {
        return Math.floor(Math.random() * (max - min + 1) + min);
    }
    
    console.log(randomNumber(2, 5));
    
    //If Math.random ever returns 0.999999999999999
    console.log(Math.floor(0.999999999999999 * (5 - 2 + 1) + 2));//Prints 5

    //Hope this never happens
    console.log(Math.floor(0.9999999999999999 * (5 - 2 + 1) + 2));//Prints 6
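One side note (my observation, not part of the original answer): the FreeCodeCamp formula from the question applies Math.floor before adding min, and that ordering sidesteps the floating-point edge case shown above:

```javascript
// Flooring first, then adding min: the fractional product is truncated
// before the addition has a chance to round the sum up to 6.
console.log(Math.floor(0.9999999999999999 * (5 - 2 + 1)) + 2); // 5, stays in range
```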

Solution 3:[3]

Math.floor(Math.random() * (max - min + 1)) + min

will give you a random number in the range [min, max] because Math.random() gives you [0, 1). Suppose we used Math.round instead of Math.floor. Math.random() gives you [0, 1), and if you multiply it by 10, you get a floating-point number in [0, 10). If you round it to the nearest integer with Math.round, you get integers in [0, 10]; if you round it down with Math.floor, you get integers in [0, 10).

In most random functions, the norm is to return [min, max).

To answer your question: the author uses Math.floor so that the random number will be in the range [min, max], instead of the [min, max + 1] you would get with Math.round.
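A minimal sketch of that difference, using the max = 10 example from above and a draw of 0.99 standing in for a large Math.random() result:

```javascript
const draw = 0.99; // pretend Math.random() returned this
console.log(Math.floor(draw * 10)); // 9  -> still inside [0, 10)
console.log(Math.round(draw * 10)); // 10 -> escapes the intended range
```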

FROM WIKIPEDIA

Both parentheses, ( ), and square brackets, [ ], can be used to denote an interval. The notation [a, c) indicates an interval from a to c that is inclusive of a but exclusive of c. That is, [5, 12) would be the set of all real numbers between 5 and 12, including 5 but not 12. The numbers may come as close as they like to 12, including 11.999 and so forth (with any finite number of 9s), but 12.0 is not included. In some European countries, the notation [5, 12[ is also used for this.

Solution 4:[4]

Math.floor(Math.random()) will always return 0, while Math.round(Math.random()) will return either 0 or 1. More generally, when Math.round() is used in the range formula, the resulting random numbers follow a non-uniform distribution in which the endpoints are underrepresented, which may not be acceptable for your needs.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

|  Solution  |   Source    |
|:----------:|:-----------:|
| Solution 1 |             |
| Solution 2 |             |
| Solution 3 | Bill Cheng  |
| Solution 4 | Ameen Ahsan |