Is there a way to calculate an average based on an existing average plus a new set of values?
Let's say we're calculating averages of test scores:
Starting Test Scores: 75, 80, 92, 64, 83, 99, 79
Average = 572 / 7 = 81.714...
Now given 81.714, is there a way to add a new set of test scores to "extend" this average if you don't know the initial test scores?
New Test Scores: 66, 89, 71
Average = 226 / 3 = 75.333...
Normal Average would be: 798 / 10 = 79.8
I've tried:
Avg = (OldAvg + sumOfNewScores) / (numOfNewScores + 1)
(81.714 + 226) / (3 + 1) = 76.9285
Avg = (OldAvg + NewAvg) / 2
(81.714 + 75.333) / 2 = 78.52
Neither comes up with the exact average it "should" be. Is it mathematically possible to do this, considering you don't know the initial values?
You have to know the number of test scores in the original set and the old average:
newAve = ((oldAve*oldNumPoints) + x)/(oldNumPoints+1)
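A minimal Python sketch of this, applying the answer's formula once per incoming score (the function name extend_average and the loop are mine, for illustration):

```python
def extend_average(old_ave, old_num_points, x):
    """Fold a single new score x into an existing average of old_num_points scores."""
    return (old_ave * old_num_points + x) / (old_num_points + 1)

# Worked example with the question's numbers.
ave, n = 572 / 7, 7            # 81.714..., from the original 7 scores
for score in (66, 89, 71):     # new scores, folded in one at a time
    ave = extend_average(ave, n, score)
    n += 1

print(ave)  # ~79.8, matching the "normal" average 798 / 10 (up to float rounding)
```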
The standard approach is to store the count and the sum of the values.
They can be updated easily, and from them the average can be computed without loss of precision.
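A sketch of that approach (the class name RunningAverage is mine, not part of the answer):

```python
class RunningAverage:
    """Keeps only the count and the sum; the average is derived on demand."""

    def __init__(self):
        self.count = 0
        self.total = 0

    def add(self, *scores):
        for s in scores:
            self.count += 1
            self.total += s

    @property
    def average(self):
        return self.total / self.count


avg = RunningAverage()
avg.add(75, 80, 92, 64, 83, 99, 79)   # original scores
print(avg.average)                    # 81.714...
avg.add(66, 89, 71)                   # new scores
print(avg.average)                    # 79.8, i.e. 798 / 10
```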
Say you have two blocks of scores:
1st: n scores with average = a1
2nd: m scores with average = a2
Then the average of all the scores equals:
a1*(1.0*n/(m+n))+a2*(1.0*m/(m+n))
If you just want to add one score (a2) to the existing set, the formula becomes:
a1*(n/(n+1))+ a2/(n+1)
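A short Python sketch of combining the two blocks from the question this way (the function name combine_averages is illustrative):

```python
def combine_averages(a1, n, a2, m):
    """Weighted average of two blocks: n scores averaging a1, m scores averaging a2."""
    return a1 * (n / (m + n)) + a2 * (m / (m + n))

a1, n = 572 / 7, 7   # first block:  average 81.714..., 7 scores
a2, m = 226 / 3, 3   # second block: average 75.333..., 3 scores

print(combine_averages(a1, n, a2, m))  # ~79.8, the same as 798 / 10
```

Note that in Python 3 the `1.0*` factor from the formula isn't needed, since `/` already performs float division; it matters in languages with integer division, such as Java or C.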