Papabravo said:
I'm getting bored with the .99...~=1 thread so I thought I'd start another one with the potential for controversy. Here goes.
Code:
What is the difference between a constant k, and a random variable K
with mean k and variance zero?
First you need to understand the definition of a random variable. I'll state it and then answer your question.
A random variable is a variable K whose value is determined by the outcome of a random experiment, together with an associated probability distribution. Generally speaking, a continuous random variable K can take on a continuum of values, e.g. -inf < K < inf for some distributions. A discrete random variable, on the other hand, can take on only a finite (or countable) set of values, each with an associated probability. Drawing a card K from a deck of cards would be a discrete random variable.
Now on to your question: you're asking about a continuous random variable K with mean k and variance 0. A variance of 0 with mean k means all of the probability is concentrated at a single point, i.e. the density is an impulse at location k with an associated total probability of 1. Which means there is no difference between a constant k and a random variable K with mean k and variance 0.
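To make the zero-variance case concrete, here's a quick Python sketch (the function name and the value of k are my own, not from the thread): a "random" variable whose every draw equals k is statistically indistinguishable from the constant k.

```python
import statistics

def draw_degenerate(k):
    """Sample from a random variable with mean k and variance 0.
    All of the probability mass sits at k, so every draw is k."""
    return k

k = 2.5
samples = [draw_degenerate(k) for _ in range(10_000)]

# The sample mean is exactly k and the sample variance is exactly 0:
print(statistics.mean(samples))       # 2.5
print(statistics.pvariance(samples))  # 0.0
```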
However, I'm not convinced you actually meant variance 0, so I'll assume that you meant variance V and answer the question that way. I'd like to point out that constraining the variance to 0 takes all the "randomness" out of the problem. The impulse isn't technically defined as an ordinary function, but physically that's what it corresponds to.
Let's assume variance V and mean k, and lay out the 3 axioms of probability theory. First, quickly define the entire sample space (the set that contains all possible values K can take on) as S:
1. Pr(A) >= 0 for any event A
2. Pr(S) = 1
3. For disjoint events A1, A2, ..., An with A1 U A2 U ... U An = S, the sum of Pr(Ai) over all i = 1
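As a sanity check on the axioms, here's a minimal Python sketch (the fair-die example is my own illustration, not from the thread), using exact fractions so the sums come out exactly:

```python
from fractions import Fraction

# Sample space S for one roll of a fair die: Pr(outcome) = 1/6 each.
S = {outcome: Fraction(1, 6) for outcome in range(1, 7)}

def Pr(event):
    """Probability of an event (a subset of the sample space)."""
    return sum(S[o] for o in event)

# Axiom 1: every probability is nonnegative.
assert all(p >= 0 for p in S.values())

# Axiom 2: the whole sample space has probability 1.
assert sum(S.values()) == 1

# Axiom 3: disjoint events covering S have probabilities summing to 1.
A1 = {1, 2}      # "roll is 1 or 2"
A2 = {3, 4, 5}   # "roll is 3, 4, or 5"
A3 = {6}         # "roll is 6"
assert Pr(A1) + Pr(A2) + Pr(A3) == 1
```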
Now the equivalent statements for a continuous random variable K, with density fK and cumulative distribution function F:
1. Pr(K = k) = 0 for any single value k (the density fK(k) can be nonzero, but the probability of hitting exactly k is 0)
2. integral of fK(k) over all k = 1
3. F(inf) = 1
4. F(-inf) = 0
5. Pr(k1 < K < k2) = F(k2) - F(k1), where F is called the cumulative distribution function.
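Here's a small Python sketch of property 5 (the exponential distribution and the interval endpoints are my own choice of example). Its CDF is F(x) = 1 - exp(-x) for x >= 0, and a Monte Carlo estimate of Pr(k1 < K < k2) should agree with F(k2) - F(k1):

```python
import math
import random

def F(x):
    """CDF of the standard exponential distribution."""
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

k1, k2 = 0.5, 2.0
exact = F(k2) - F(k1)  # Pr(k1 < K < k2) straight from the CDF

# Monte Carlo estimate: the fraction of samples landing in (k1, k2).
random.seed(0)
n = 200_000
hits = sum(1 for _ in range(n) if k1 < random.expovariate(1.0) < k2)
estimate = hits / n

print(exact, estimate)  # the two should agree to a couple of decimals
```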
So, what's the difference between a constant k and a random variable K with mean k and variance V? It's that Pr(K = k) = 0,
which means it's probabilistically impossible for a random number generator (or anything else, for that matter) to generate EXACTLY k. But if you average more and more independent samples of K together, the sample mean converges to the expectation, which is your constant k.
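A quick simulation of that last point (the values k = 3 and V = 4 are my own picks): draws from a normal distribution with mean k essentially never land exactly on k, yet their average converges to the constant k.

```python
import random
import statistics

random.seed(1)
k, V = 3.0, 4.0  # mean and variance of K
samples = [random.gauss(k, V ** 0.5) for _ in range(100_000)]

# Pr(K == k) = 0: no sample lands exactly on the mean.
exact_hits = sum(1 for s in samples if s == k)

# But the average of many samples approaches the constant k.
sample_mean = statistics.mean(samples)

print(exact_hits)   # almost surely 0
print(sample_mean)  # close to 3.0
```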