Red wrote:
Wayne Stollings wrote:
That defines the range but not the IQ. The IQ range defines the mean, the median, the mode, the upper limit and the lower limit, but the individual IQ is separate from this. Of course, this depends on the way you look at the situation and the semantics used.
That defines the distribution, not the range. The range is the upper limit minus the lower limit (max - min). The distribution is what johnny was referring to. A single IQ measurement is a point estimate of a variable that is defined by its mean (100) and its distribution (Gaussian/normal). In a normal distribution the mean = median = mode.
You are correct that, in my haste to reply, I should have included the distribution along with the range. But you must assume there is an actual Gaussian/normal distribution of intelligence as measured by IQ tests, which we know is not the case, even though that is the desired effect.
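For what it is worth, here is a minimal sketch of the statistics quoted above. It is my own example, and it assumes the common convention of mean 100 with a 15-point standard deviation; only the mean is actually stated in this thread.

Code:
# Minimal sketch (not from the thread): simulate IQ-like scores under the
# common convention of mean 100 and standard deviation 15 (the SD is an
# assumption here) and check that mean, median, and binned mode coincide,
# as they must for a symmetric unimodal (Gaussian) distribution.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.normal(loc=100, scale=15, size=100_000)

print("mean  :", round(float(scores.mean()), 1))        # ~100
print("median:", round(float(np.median(scores)), 1))    # ~100

# Crude mode for a continuous sample: centre of the most populated 1-point bin.
counts, edges = np.histogram(scores, bins=np.arange(40, 161))
print("mode  :", edges[counts.argmax()] + 0.5)          # ~100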
Quote:
To say you have an IQ of 90 means nothing without knowing the mean and distribution of the variable in a population. If you have a valid IQ test and measure the IQ of 1000 people, by definition ~500 should have an IQ below 100 and ~500 should have an IQ above 100. It's not semantics, it's simple statistics.
Thus, you have exhibited the flaw in your position: if, by definition, an equal number must fall above and below the mean, then the outcome would have to be known in advance in order to construct the scale. Also, by that definition the current IQ scale would be limited to an upper limit of 150 to maintain the curve. However, my first IQ test was several points above that level.
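As a quick illustration of the "half above, half below" point, and of how rare a score above 150 would be if the normal model held exactly. This is my own sketch, again assuming a 15-point standard deviation, which the thread does not state.

Code:
# Rough sketch, my own example: with a median of 100, roughly half of 1000
# simulated test-takers fall on each side, and the normal model itself puts
# a tiny but non-zero fraction above 150 (the 15-point SD is assumed).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
sample = rng.normal(100, 15, size=1000)

print("below 100:", int((sample < 100).sum()))  # roughly 500
print("above 100:", int((sample > 100).sum()))  # roughly 500

# Tail probability above 150 if scores were exactly N(100, 15):
print("P(IQ > 150):", 1 - norm.cdf(150, loc=100, scale=15))  # ~0.00043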
Quote:
Yes, and this is the statistical target but not the true situation.
http://en.wikipedia.org/wiki/IQ
IQ tests are designed to give approximately this Gaussian distribution. Colors [in the article's bell-curve figure] delineate one standard deviation. But the true frequency of low and high IQs is greater than that given by the Gaussian curve.
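To make the "greater than that given by the Gaussian curve" remark concrete, here is a small comparison of the normal model with an arbitrarily chosen heavier-tailed distribution. The Student-t choice is mine, not the article's; it only illustrates what fatter tails mean.

Code:
# Sketch of what "heavier tails than the Gaussian" means; the Student-t
# comparison is my own choice, not something the Wikipedia article uses.
from scipy.stats import norm, t

lo, hi = 55, 145  # three 15-point standard deviations either side of 100
models = [("normal", norm(loc=100, scale=15)),
          ("Student-t, df=3", t(df=3, loc=100, scale=15))]

for name, dist in models:
    tail = dist.cdf(lo) + dist.sf(hi)  # probability of a score outside [55, 145]
    print(f"{name:>15}: {tail:.4f}")
# The heavy-tailed model assigns far more probability to extreme scores
# even though both models are centred at 100.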