Standard Deviation
September 1st, 2011
math
Standard deviation measures, roughly, the average distance between each data point and the mean of the data. It's defined as:

$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(x_i - \mu)^2}$$
Why isn't it:

$$\frac{1}{N}\sum_{i=1}^{N}|x_i - \mu|$$
That is, why is it the square root of the average of $(x-\mu)^2$ instead of just the average of $|x-\mu|$?
How is the squared version more useful?
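To make the two definitions concrete, here's a small Python sketch (not from the original post; the data are just made-up example numbers) that computes both quantities for the same list:

```python
import math

# Made-up example data.
xs = [2, 4, 4, 4, 5, 5, 7, 9]
mu = sum(xs) / len(xs)  # mean = 5.0

# Standard deviation: square root of the average squared distance from the mean.
std_dev = math.sqrt(sum((x - mu) ** 2 for x in xs) / len(xs))

# The alternative: average absolute distance from the mean.
mean_abs_dev = sum(abs(x - mu) for x in xs) / len(xs)

print(std_dev)       # 2.0
print(mean_abs_dev)  # 1.5
```

Even for this small set the two numbers differ (2.0 vs 1.5), so the choice between them isn't just cosmetic.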
Update 2011-09-01: I had left out the square root on the standard deviation.