In the assumed mean method of calculating the mean of grouped data, you first *assume a mean* $ a $, then compute the deviations $ d_i = x_i - a $ of the class marks $ x_i $, and use the formula $$ \bar x = a + \frac {\sum f_i d_i}{\sum f_i} $$

to get the mean. In the **step-deviation method,** you do something similar, but you also scale down the values by a factor *h,* so for example, you get a variable $$ u_i = \frac {x_i - a} {h} $$

and then use the formula $$ \bar x = a + h \bar u $$

My question is: why do these methods work? I'm guessing that we're manipulating the numbers by using some properties of the mean, but what are they, and why exactly do these methods make sense?
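For what it's worth, a quick numerical check (with made-up grouped data, an assumed mean of 25, and a class width of 10 of my own choosing) shows all three formulas agree:

```python
# Hypothetical grouped data: class marks x_i and frequencies f_i.
x = [5, 15, 25, 35, 45]
f = [4, 8, 12, 10, 6]
a, h = 25, 10  # assumed mean and class width (illustrative choices)

n = sum(f)

# Direct method: weighted mean of the class marks.
direct = sum(fi * xi for fi, xi in zip(f, x)) / n

# Assumed-mean method: d_i = x_i - a.
d = [xi - a for xi in x]
assumed = a + sum(fi * di for fi, di in zip(f, d)) / n

# Step-deviation method: u_i = (x_i - a) / h.
u = [(xi - a) / h for xi in x]
step = a + h * sum(fi * ui for fi, ui in zip(f, u)) / n

print(direct, assumed, step)
```

All three print the same value, which is what the question is asking to have explained: subtracting $a$ shifts every $x_i$ (and hence the mean) by $a$, and dividing by $h$ scales everything by $h$, so adding $a$ and multiplying by $h$ at the end undoes both transformations.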
