Newbie here. Is using Math.ceil(Math.random()) a bad idea?
As stated in the MDN reference for Math.random():

Returns a floating-point, pseudo-random number in the range [0, 1); that is, from 0 (inclusive) up to but not including 1 (exclusive), which you can then scale to your desired range.

Since Math.random() can return 0, Math.ceil(Math.random() * 10) can also return 0, and that value is outside your [1..10] range.
As for your second question, see "Most efficient way to create a zero filled JavaScript array?"
Math.floor() is preferred here because of the range of Math.random().

For instance, Math.random() * 10 yields a value in the range [0, 10). With Math.floor() you can never get 10, so adding 1 gives exactly the range 1..10, whereas Math.ceil() can give 0.
Hi, I've been learning JS for a few weeks now, and it's coming along pretty well. Anyway, both the books I'm reading and all the sites I use for tutorials tell me to generate random numbers using the form Math.floor(Math.random()*n+1). Wouldn't it make more sense to just use Math.ceil(Math.random()*n)? I'm only asking because everything teaches the former, but it seems less efficient. Thanks!