Roulette
Scooped by Ryan Schroeder
Scoop.it!

Martingale (betting system) - Wikipedia, the free encyclopedia

A martingale is any of a class of betting strategies that originated from and were popular in 18th century France. The simplest of these strategies was designed for a game in which the gambler wins his stake if a coin comes up heads and loses it if the coin comes up tails. The strategy had the gambler double his bet after every loss, so that the first win would recover all previous losses plus win a profit equal to the original stake. The martingale strategy has been applied to roulette as well, as the probability of hitting either red or black is close to 50%.

Since a gambler with infinite wealth will, almost surely, eventually flip heads, the martingale betting strategy was seen as a sure thing by those who advocated it. Of course, none of the gamblers in fact possessed infinite wealth, and the exponential growth of the bets would eventually bankrupt "unlucky" gamblers who chose to use the martingale. It is therefore a good example of a Taleb distribution: the gambler usually wins a small net reward, thus appearing to have a sound strategy. However, the gambler's expected value remains zero (or negative, in a casino, due to the house's edge) because the small probability of a catastrophic loss exactly balances his expected gain. Nor is the likelihood of catastrophic loss necessarily small: the bet size grows exponentially, and strings of consecutive losses occur more often than common intuition suggests, which together can bankrupt a gambler quickly.

In practice, casino betting limits eliminate the effectiveness of the martingale strategy.[1]
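As a sketch of why the doubling scheme fails in practice, the simulation below plays many martingale sessions on an American-wheel even-money bet (win probability 18/38). The bankroll, base bet, table limit, and session length are all arbitrary assumptions chosen for illustration, not figures from the article.

```python
import random

def martingale_session(bankroll=1000, base_bet=1, table_limit=500,
                       win_prob=18/38, max_spins=50, seed=None):
    """One session of the martingale on an even-money roulette bet.

    The stake doubles after each loss and resets to base_bet after a win.
    The session ends when funds run out, the table limit blocks the next
    doubled bet (breaking the recovery chain), or max_spins is reached.
    Returns the net profit or loss.
    """
    rng = random.Random(seed)
    funds, bet = bankroll, base_bet
    for _ in range(max_spins):
        if bet > funds or bet > table_limit:
            break                  # ruin or table limit: cannot keep doubling
        if rng.random() < win_prob:
            funds += bet
            bet = base_bet         # win recovers the chain plus one base bet
        else:
            funds -= bet
            bet *= 2               # chase the loss
    return funds - bankroll

results = [martingale_session(seed=i) for i in range(2000)]
winners = sum(r > 0 for r in results) / len(results)
average = sum(results) / len(results)
# Most sessions end with a small profit, but a run of nine straight losses
# pushes the next stake past the table limit and costs far more than the
# small wins accumulate, so the average across sessions is negative.
```

Runs of nine losses (probability about 0.3% per attempted chain) are rare per spin, yet common enough over thousands of sessions to dominate the long-run average, which is the Taleb-distribution shape described above.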


Probability theory - Wikipedia, the free encyclopedia

Probability theory is the branch of mathematics concerned with probability, the analysis of random phenomena.[1] The central objects of probability theory are random variables, stochastic processes, and events: mathematical abstractions of non-deterministic events or measured quantities that may either be single occurrences or evolve over time in an apparently random fashion. Although an individual coin toss or die roll is a random event, when repeated many times the sequence of outcomes exhibits certain patterns, which can be studied and predicted. Two representative mathematical results describing such patterns are the law of large numbers and the central limit theorem.
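A minimal sketch of the law of large numbers mentioned above: the running fraction of heads in repeated fair-coin tosses settles toward the true probability 0.5 as the number of tosses grows. The toss count and seed below are arbitrary.

```python
import random

# Simulate a long sequence of fair-coin tosses and watch the running
# fraction of heads converge toward 0.5 (the law of large numbers).
rng = random.Random(42)
tosses = [rng.random() < 0.5 for _ in range(100_000)]

for n in (100, 1_000, 100_000):
    frac = sum(tosses[:n]) / n
    print(f"after {n:>7,} tosses: fraction of heads = {frac:.4f}")
```

The early fractions wander noticeably; by 100,000 tosses the deviation from 0.5 is typically a fraction of a percent, on the order of 1/sqrt(n).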

As a mathematical foundation for statistics, probability theory is essential to many human activities that involve quantitative analysis of large sets of data. Methods of probability theory also apply to descriptions of complex systems given only partial knowledge of their state, as in statistical mechanics. A great discovery of twentieth century physics was the probabilistic nature of physical phenomena at atomic scales, described in quantum mechanics.

The mathematical theory of probability has its roots in attempts to analyze games of chance by Gerolamo Cardano in the sixteenth century, and by Pierre de Fermat and Blaise Pascal in the seventeenth century (for example the "problem of points"). Christiaan Huygens published a book on the subject in 1657,[2] and in the 19th century Pierre-Simon Laplace completed what is today considered the classic interpretation.[3]


Variance - Wikipedia, the free encyclopedia

In probability theory and statistics, the variance is a measure of how far a set of numbers is spread out. A variance of zero indicates that all the values are identical. A non-zero variance is always positive: a small variance indicates that the data points tend to be very close to the mean (expected value) and hence to each other, while a high variance indicates that the data points are very spread out from the mean and from each other.

The square root of variance is called the standard deviation.

The variance is one of several descriptors of a probability distribution. In particular, the variance is one of the moments of a distribution. In that context, it forms part of a systematic approach to distinguishing between probability distributions. While other such approaches have been developed, those based on moments are advantageous in terms of mathematical and computational simplicity.
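The definitions above translate directly into code. The dataset below is an arbitrary example; the function computes the population variance (divide by n) by default, with the unbiased sample variance (divide by n - 1) as an option.

```python
import math

def variance(xs, sample=False):
    """Population variance by default; set sample=True for the
    unbiased sample variance (divide by n - 1 instead of n)."""
    n = len(xs)
    mean = sum(xs) / n
    ss = sum((x - mean) ** 2 for x in xs)   # sum of squared deviations
    return ss / (n - 1 if sample else n)

data = [2, 4, 4, 4, 5, 5, 7, 9]   # mean is 5
var = variance(data)              # 32 / 8 = 4.0
sd = math.sqrt(var)               # standard deviation = 2.0
```

As the text notes, a variance of zero means all values are identical: variance([3, 3, 3]) returns 0.0, and any spread at all makes the result strictly positive.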


How to Beat Roulette with A Simple 3 Point System - Casino-Gambling


Compound probability distribution - Wikipedia, the free encyclopedia

In probability and statistics, a compound probability distribution is the probability distribution that results from assuming that a random variable is distributed according to some parametrized distribution, with the parameters of that distribution being assumed to be themselves random variables. The compound distribution is the result of marginalizing over the intermediate random variables that represent the parameters of the initial distribution.

An important type of compound distribution occurs when the parameter being marginalized over represents the number of random variables in a summation of random variables.

A compound probability distribution is the probability distribution that results from assuming that a random variable is distributed according to some parametrized distribution F with an unknown parameter θ (or parameter vector θ) that is itself distributed according to some other distribution G with hyperparameter α, and then determining the distribution that results from marginalizing over G (i.e. integrating the unknown parameter(s) out). The resulting distribution H is said to result from compounding F with G. Expressed mathematically for a scalar data point with scalar parameter and hyperparameter:

H(x | α) = ∫ F(x | θ) G(θ | α) dθ
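As a sketch of this construction, a classic example compounds a Poisson distribution F (with rate λ) with a Gamma distribution G over λ, which marginally yields a negative binomial distribution. The Gamma hyperparameters and sample size below are arbitrary, and the Poisson sampler uses Knuth's multiplication method since Python's standard library has none.

```python
import math
import random

def poisson(rng, lam):
    """Sample a Poisson(lam) variate via Knuth's multiplication method
    (adequate for the moderate rates drawn here)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(0)
shape, scale = 3.0, 2.0   # hyperparameters of G = Gamma(shape, scale)

def compound_draw():
    lam = rng.gammavariate(shape, scale)  # theta ~ G(alpha): draw the rate
    return poisson(rng, lam)              # x ~ F(theta): draw the data point

draws = [compound_draw() for _ in range(50_000)]
mean = sum(draws) / len(draws)
var = sum((x - mean) ** 2 for x in draws) / len(draws)
# Marginally the draws follow a negative binomial with mean
# shape*scale = 6 and variance shape*scale*(1 + scale) = 18: the
# variance exceeds the mean, the overdispersion that compounding adds.
```

Each draw performs the marginalization by simulation: sampling the parameter from G and then the data point from F is equivalent to sampling from the compound distribution H directly.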
