#28

Originally posted by self biased
what if i'm only looking for certain values? for example, what if i'm looking for a 3+ on a d6 on 15 dice? odds say that ten of those fifteen will be a three or better. dropping some dice now i get: 2, 3, 1, 1, 2, 2, 1, 4, 6, 2, 4, 6, 6, 6, and 1. more than half my values are less than my target of 3+. is there any way of shortcutting this? more often than not, we're looking for 3+ or 4+ on anywhere from four to thirty-six dice.

(15 choose 10) divided by 2^15 for exactly 10 4+'s. You'll need a calculator. In general, if you're making n rolls and you want to know the probability of getting exactly k of some result (with a probability p of getting that result on any single roll, e.g. p = 1/2 for a 4+), use the formula (n choose k) * p^k * (1-p)^(n-k).

edit: (n choose k) = n! / ((n-k)! * k!), where ! = factorial.
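The binomial formula above is easy to check with a few lines of Python — `math.comb` is in the standard library, and `binomial_pmf` is just an illustrative name for this sketch:

```python
from math import comb

def binomial_pmf(n, k, p):
    """Probability of exactly k successes in n independent rolls,
    each succeeding with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Exactly 10 results of 4+ (p = 1/2) on 15d6: C(15,10) / 2^15 ≈ 0.0916
print(binomial_pmf(15, 10, 1/2))

# Exactly 10 results of 3+ (p = 2/3) on 15d6: ≈ 0.214
print(binomial_pmf(15, 10, 2/3))
```

Note that "exactly 10" is much less likely than "10 or more"; summing the PMF over k = 10..15 gives the cumulative odds.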
#29

Originally posted by Kuciwalker
That said, Standard Deviation doesn't approach zero ever [unless it is of a set of identical items] iirc. It's not supposed to be a measure of the total variance of the set's mean from the true mean, but rather a predictor of how far from the mean any given single item in the set is likely to be.

So close, and yet so far. The standard deviation of the set's mean does approach zero.

Yes, that would be the variance of the set's mean from the true mean... the standard deviation itself does not approach zero [unless the set is identical items].
#34

Originally posted by Kuciwalker
Btw, you should use Google for your calculator. Then, if you want to find the odds of getting 10 3+'s in 15 rolls, just type in "(15 choose 10) * (2/3)^10 * (1 - 2/3)^5". No need to do the factorial stuff.

i think a point is being missed that i'm not making clear enough. i know how to figure out the odds of rolling a particular number or higher on a d6. 3+ is 2/3 (or near 67%) on 1d6. the odds say that on fifteen dice, ten will be 3+ and five will not. if i roll 15d6 and get say twelve results at 3+ (which is 4/5 or 80%), is this within a standard deviation? i realize i may not be using the proper terminology for this. all i know is that i'd kill for a set of dice that always rolled a straight statistical average.
#37

Originally posted by self biased
i think a point is being missed that i'm not making clear enough. i know how to figure out the odds of rolling a particular number or higher on a d6. 3+ is 2/3 (or near 67%) on 1d6. the odds say that on fifteen dice, ten will be 3+ and five will not. if i roll 15d6 and get say twelve results at 3+ (which is 4/5 or 80%), is this within a standard deviation? i realize i may not be using the proper terminology for this. all i know is that i'd kill for a set of dice that always rolled a straight statistical average.

Then you use the formula for standard deviation I gave: sqrt(n * p * (1 - p)). In the case of 15 dice at 3+, that's sqrt(15 * 2/3 * 1/3) = sqrt(10/3) ~= 1.8. So you're likely to get 8-12 successes.
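Putting numbers on the specific question (are 12 successes out of 15 within a standard deviation?) — a short sketch using only the formulas from the thread:

```python
from math import comb, sqrt

n, p = 15, 2/3               # 15 dice, 3+ on a d6
mean = n * p                 # expected successes: 10.0
sd = sqrt(n * p * (1 - p))   # ≈ 1.83

# 12 successes sits (12 - 10) / 1.83 ≈ 1.1 standard deviations above
# the mean, so it's just outside the one-SD band (roughly 8.2 to 11.8).
z = (12 - mean) / sd
print(mean, round(sd, 2), round(z, 2))

# Exact chance of rolling 12 or more successes, from the binomial formula:
p_12_plus = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(12, n + 1))
print(round(p_12_plus, 3))
```

So a 12-of-15 roll is a touch above average but hardly rare; it happens about a fifth of the time.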