Okay, that's one way of looking at the scoring. However, even this method is very difficult and ambiguous. You can't simply count the number of things a person did well on his or her AMV against the bad things. Except for the very worst and very best, videos have countless things done well and countless things done wrong. I could not count every good and bad quality of the typical AMV and say "for every 7 good aspects, I found 3 bad aspects, so I'll give this a 7 out of 10." This is especially true for completely qualitative standards such as originality or reviewability (and I don't find Phade's quantitative standards of reviewability too reliable, because I don't watch most of my favorite videos every week, yet I would never give them anything under a 7 for reviewability - and this is why I think the reviewability average is way below any other category's average).

Arigatomyna wrote:
As for the average vid deserving a 5, it depends on what a 5 means. If it's put to academic scales, a 5 means it fails. If you look at it as the midway point between having everything perfect and everything perfectly wrong, that's fine. You'd be saying it deserves a 5 because there is as much wrong as there is good. But you can't say it deserves a 5 because it's 'average' and isn't better than any other video. The only way you can do that is if every other video has as much wrong as there is good. I think the average video has at least a few more things right than the number of things that are wrong. So I can't give a video a 5 just because it doesn't seem to stand out over the average video. Until the average video here is one with as much wrong as there is right, I can't give 5s as the average - to me a 5 will *always* be the halfway point between perfect and absolutely bad (no good at all).
The mean is pulled away from the peak of the distribution curve when there are outliers (data points very far from the rest), and it is pulled in the direction of those outliers. Since the average here is around 8, only the lowest scores (1s and 2s) could be considered outliers. In this case, the median is probably higher than the mean (probably mid-to-high 8s).

trythril wrote:
Mathematically, the arithmetic average isn't even a very good tool to figure out what the middle of the pack is.
If you want to find the score that better denotes what a more middle-of-the-pack video is like, ask Phade to add a module to calculate and display the median score. Throw in a standard deviation calculation as well. Both data points would work better as a numerical indication of the "average" video, and are trivial to calculate.
The median has nothing to do with the standard deviation, which is calculated from the mean. You could instead look at quartiles, deciles, or, as koronoru mentioned, percentiles.
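To make the mean/median distinction concrete, here is a small sketch using Python's standard `statistics` module. The score list is invented for illustration: mostly high scores with two low outliers, mirroring the situation described above.

```python
# Low-score outliers pull the mean down while leaving the median alone.
# The scores below are made up for the demo.
import statistics

# Mostly high scores, with two very low "outlier" reviews.
scores = [8, 8, 9, 8, 7, 9, 8, 9, 8, 2, 1]

mean = statistics.mean(scores)      # dragged down by the 1 and 2
median = statistics.median(scores)  # unaffected by the outliers
stdev = statistics.stdev(scores)    # sample standard deviation, computed from the mean

# Quartiles - a resistant alternative to the standard deviation:
q1, q2, q3 = statistics.quantiles(scores, n=4)

print(f"mean={mean:.2f}, median={median}, stdev={stdev:.2f}")
print(f"quartiles: {q1}, {q2}, {q3}")
```

Here the mean works out to 7.0 while the median stays at 8, which is exactly the "mean skewed toward the outliers" effect: the single 1 and 2 drag the average down a full point, but the middle-ranked video is untouched.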
I'm not as concerned about the mean or median of the scores given on the .org as I am about the score reviewers consider average for a video. I've already dismissed the high mean score as the result of a lot of people voting 10s on their favorite videos and deleting the crappy ones.
------------------------------------------------------------------------------
Thanks to everyone who replied for your helpful comments.
I'll just tell you what I think of the rating system.
Existing anime music videos available to the public fall under a roughly normal distribution in regards to their skill, merit, and enjoyability. The rating system is basically a coordinate system where we align the average videos (in terms of skill, merit, and enjoyability) to a score of 5.5, the worst videos to a 1, and the best videos to a 10. The standard deviation is up to the reviewer, but I choose to put it somewhere around 2 (so about 70% of videos fall between 3.5 and 7.5, and about 95% fall between 1.5 and 9.5).
This sounds way too rigid and mathematical, and in fact it is. There is no way I can watch every AMV out there to determine which videos fall under which score of the population distribution. Additionally, opinions are subjective and qualitative, so there's no consistent guideline to scoring them. So my scoring is basically: if it's average in a category, give it a 5 or a 6, and only give it a 10 or a 1 if it really stands out from the pack. (I don't actually keep track of videos I've watched to fit the normal distribution.)
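For what it's worth, the coverage figures in that model can be checked numerically: with a mean of 5.5 and a standard deviation of 2 (the values chosen above), the exact fractions are about 68% within one standard deviation and about 95% within two. A minimal sketch using only the standard library:

```python
# Sanity-checking the normal-curve scoring model described above:
# scores modeled as Normal(mean=5.5, sd=2) on the 1-10 rating scale.
from math import erf, sqrt

MU, SIGMA = 5.5, 2.0

def normal_cdf(x, mu=MU, sigma=SIGMA):
    """P(score <= x) under the assumed normal model."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Fraction of videos expected within one and two standard deviations:
within_1sd = normal_cdf(MU + SIGMA) - normal_cdf(MU - SIGMA)        # 3.5 .. 7.5
within_2sd = normal_cdf(MU + 2 * SIGMA) - normal_cdf(MU - 2 * SIGMA)  # 1.5 .. 9.5

print(f"between 3.5 and 7.5: {within_1sd:.1%}")  # about 68%
print(f"between 1.5 and 9.5: {within_2sd:.1%}")  # about 95%
```

The "about 70%" quoted earlier is just the usual rounding of the 68% one-sigma rule; the two-sigma figure of 95% matches exactly.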
I think koronoru got the gist of my argument:

koronoru wrote:
The org's opinions-scoring system would be most useful if everyone used percentile scoring (adjust things so that the bottom 10% of videos score 0.0 to 1.0, the next 10% score 1.0 to 2.0, and so on), but I don't think there's any hope of getting people to do that. It would mean that half of the videos you review must score less than 5.0 - and in the current environment, a 5.0 is considered to be a pretty strong "this is a bad video" message, reserved for really bad videos. Like it or not, I think we're stuck with something like the school grade curve, and we've even got the same problem that we see in schools - grade inflation. No individual can change things, either; if I started reviewing a lot of videos and giving scores less than 5.0 to exactly half of them, all that'd happen is I'd get a reputation as being that guy who shits on everyone's work.

Seeing as I am one of the few who consider a 5.5 to be an appropriate score for an average video, I guess I should just warn you that if I give you a 6, it doesn't mean I think it's bad. Of course, now no one is going to want me to score their videos.