Before Euphoria, the #1 video was...

General discussion of Anime Music Videos
Locked
downwithpants
BIG PICTURE person
Joined: Tue Dec 03, 2002 1:28 am
Status: out of service
Location: storrs, ct

Post by downwithpants » Sun Jun 20, 2004 11:12 pm

if you don't like how the top 10% is set up, don't view it?

the purpose of the bayesian average isn't to reward popularity, the top 10% does that without the average factored in anyways.
maskandlayer() | My Guide to WMM 2.x
a-m-v.org Last.fm | Animemusicvideos.org Frappr: http://www.frappr.com/animemusicvideosdotorg | Editors and fans against the misattribution of AMVs: http://tinyurl.com/2lryta

Arigatomina
Joined: Thu Apr 03, 2003 3:04 am

Post by Arigatomina » Mon Jun 21, 2004 12:54 am

downwithpants wrote:if you don't like how the top 10% is set up, don't view it?
I like it fine, I just wish I hadn't wasted months writing reviews under the impression that they mattered. And I miss the old system, since I like movement.
the purpose of the bayesian average isn't to reward popularity, the top 10% does that without the average factored in anyways.
Then what was the point of using this new system? The only thing that has changed is that the vids listed are now ones with more reviews (aka, the ones more people watch, aka the popular ones). Most of those were there before, the only thing gotten rid of are the moving vids - the ones 'working their way' up and down the list. They wouldn't have done that if they hadn't wanted the perma vids filling the list.

That is why they're considering a temp list, after all - so they'll have a 'review-based' list that moves and shows videos with under 30 ops. I don't think it was a bad idea to make a list of permanent favorites, I just think they could have done it without demeaning the reviews people give to videos, and without permanently altering the old top 10 set-up.

dwchang
Sad Boy on Site
Joined: Mon Mar 04, 2002 12:22 am
Location: Madison, WI

Post by dwchang » Mon Jun 21, 2004 1:08 am

Arigatomyna wrote:
downwithpants wrote:if you don't like how the top 10% is set up, don't view it?
I like it fine, I just wish I hadn't wasted months writing reviews under the impression that they mattered. And I miss the old system, since I like movement.
It doesn't matter? Reviews have always mattered. Your review is worth the same as anyone else's. You're one person validating or disagreeing with a score.
the purpose of the bayesian average isn't to reward popularity, the top 10% does that without the average factored in anyways.
Then what was the point of using this new system? The only thing that has changed is that the vids listed are now ones with more reviews (aka, the ones more people watch, aka the popular ones). Most of those were there before, the only thing gotten rid of are the moving vids - the ones 'working their way' up and down the list. They wouldn't have done that if they hadn't wanted the perma vids filling the list.

Uhm, I think you're not grasping the mathematics of a Bayesian average. Videos *have to* work their way up, as opposed to popping up near the top once they hit X reviews and making the list partially invalid. That is exactly the movement you're describing: a video has to 'work its way up' rather than pop up near the top and trickle down.

Phade got rid of the old system since he realized it wasn't serving its purpose. He wanted a list that members could visit to find the "top" videos on the site. "Top" obviously varies from person to person, but with a Bayesian system, the more people that agree on something, the more "valid" that score (good or bad) becomes. It makes sense and seems fair to me. If something is good, people will continually say so and it will work its way up.

It's simple statistics. If more people say something is X, then more than likely it's X. If two (or in this case 8) people said something is a 10 while 100 say something is a 9.5, it's only logical that the one with 100 people validating it is more proven "in the field" and thus receives a larger % of its average.
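The 8-versus-100 comparison can be checked numerically against the Bayesian average formula quoted later in the thread. This is only an illustrative sketch: m = 7 is the minimum-votes value stated on the top 10% page, but the site-wide mean C = 8.0 is a made-up number, not the org's actual average.

```python
# Sketch of the quoted Bayesian average formula.
# Assumptions: m = 7 (stated minimum votes); C = 8.0 is a hypothetical
# site-wide mean, not the real one.
def bayesian_average(R, v, m=7, C=8.0):
    """Blend a video's own mean score R (from v votes) with the site mean C."""
    return (v / (v + m)) * R + (m / (v + m)) * C

few_votes = bayesian_average(R=10.0, v=8)     # 8 reviewers, all giving 10s
many_votes = bayesian_average(R=9.5, v=100)   # 100 reviewers averaging 9.5

print(round(few_votes, 2), round(many_votes, 2))   # 9.07 9.4
```

Under these assumptions the video with 100 reviews at 9.5 outranks the one with 8 perfect 10s, which is exactly the behavior being described.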

If anything, I will agree that the Bayesian average is harsh in that it doesn't let things "in" without proving themselves first, but that's the point of the list. "Top" generally isn't a label you hand out freely. I'd rather have such a list be too restrictive than too forgiving.

Don't get me wrong, I do see the point with lack of movement and do like the idea of a separate list to highlight newer videos, but I still don't see the point in arguing that a "top" list is too harsh and lacks movement. It's not like a top 10% video is made every 5 minutes or something. It makes sense that the list would stay static.

In any case, assuming another list is implemented, this is all moot. ;)
-Daniel
Newest Video: Through the Years and Far Away aka Sad Girl in Space


Post by Arigatomina » Mon Jun 21, 2004 1:12 am

downwithpants wrote:if you don't like how the top 10% is set up, don't view it?
Okay, I'm going to go ahead and explain the biggest problem with this idea: the people I've given reviews to. They are being punished because *I* wasn't basing my reviews on the average the way the Bayesian average does. By doing what people kept saying to do - use 5 as the middle number - I've inadvertently underscored every single video I've reviewed. That isn't just a matter of me not looking at the top 10 list now; it's a matter of me pretending I didn't just screw over a whole crapload of videos.

If I accept the new system - the new comparison of everyone's reviews to the ridiculously high average - then I'm in trouble. I'm going to have to go back and redo 491 reviews so that I'm using 8 as the 'average' number and not 5. And what happens when I do that? The average goes up even more so people have to give even *higher* reviews to incorporate the '8' = average score. It's just running headfirst into a scoring system where your average video has to get 9s or else it's worse than the 'okay' vids. I still can't believe they'd do this intentionally.

I just don't know what to do aside from argue and hope they realize this is screwing hundreds over - and makes those reviewers with 'low' averages the 'bad guys' for not buying the 'site average' scores as being standard. And really, if I go back and try to incorporate the average, I'll have to change most of my reviews to 10s. After all, if I gave a video 8s thinking that 5 was the 'okay' number, then I was giving 3 points above 'average.' I can't give 3 points above average when the average is already 8 - the best I can do is give straight 10s and apologize for 'underscoring' in the first place.

Rozard
Joined: Wed Oct 31, 2001 10:39 pm

Post by Rozard » Mon Jun 21, 2004 1:18 am

You're just mad your videos aren't up there anymore.
RichLather: We are guests of this forum, and as such we do not make the rules.
BishounenStalker: The freedom to suck is what makes the Internet rock.


Post by Arigatomina » Mon Jun 21, 2004 1:18 am

dwchang wrote:It's simple statistics. If more people say something is X, then more than likely it's X. If two (or in this case 8) people said something is a 10 while 100 say something is a 9.5, it's only logical that the one with 100 people validating it is more proven "in the field" and thus receives a larger % of its average.
You know, you keep saying this and it sounds like you're saying this new system counts how many people gave a 10 to the video, and adding weight to that because more people gave the same score. I believed you when you first said this, and started thinking of all the problems with that. But looking at the little equation on the top 10 page, that isn't how it's explained at all. It says:
C = the mean score across all videos (average all vids)
It doesn't say anything about how many reviews gave the same score to the same video. So either it does both, or it does the 'average of all videos' as a means of comparison. And as I posted above, that's a horrible thing to do for the future of reviewers (and the people putting weight on the review system).


Post by Arigatomina » Mon Jun 21, 2004 1:20 am

Rozard wrote:You're just mad your videos aren't up there anymore.
My vids were never up there for more than a day, they're not that good. :roll:

And the ones that *are* up there now (in the Romance section) don't deserve to be up there. My best vids only have 9 or so reviews, so the crappy old ones are being counted as better. Of course I dislike that.


Post by dwchang » Mon Jun 21, 2004 1:27 am

Arigatomyna wrote:
dwchang wrote:It's simple statistics. If more people say something is X, then more than likely it's X. If two (or in this case 8) people said something is a 10 while 100 say something is a 9.5, it's only logical that the one with 100 people validating it is more proven "in the field" and thus receives a larger % of its average.
You know, you keep saying this and it sounds like you're saying this new system counts how many people gave a 10 to the video, and adding weight to that because more people gave the same score. I believed you when you first said this, and started thinking of all the problems with that. But looking at the little equation on the top 10 page, that isn't how it's explained at all. It says:
C = the mean score across all videos (average all vids)
It doesn't say anything about how many reviews gave the same score to the same video. So either it does both, or it does the 'average of all videos' as a means of comparison. And as I posted above, that's a horrible thing to do for the future of reviewers (and the people putting weight on the review system).
Not exactly. When I say "agreeing" with a score, it doesn't have to be exact. If someone gives something a 10 and another a 9, they obviously agreed 90% yes? Let me try and explain:

Bayesian Average = (v ÷ (v + m)) × R + (m ÷ (v + m)) × C

where:
R = average score for the video (mean) = (old score)
v = number of votes for the video = (votes)
m = minimum votes required to be listed in the top 10% (7)
C = the mean score across all videos (average of all vids)

You see the "(v / (v + m))" portion? That comes out to a fraction below 1.00. The more "v" you get (opinions), the higher the fraction becomes. It's a limit that approaches 1.

This value is multiplied by "R" which is your "old score." This is that equation with the (Overall + Reviewability + Rest/X) / 3.

So effectively, the more opinions you get, the more % (that fraction again) of your "old score" you receive. Therefore, the more people who say something "similar" to your score, the more your "old score" stays the same while you continually "get" a larger % of it.

If they disagree, the old score will obviously go down (like the old system), but the video will still receive a slight increase in the fraction (b/c it received a review and v goes up by 1).

It's really just "proving yourself on the field" prior to giving you all of your score. I disliked a system where once you hit "X," you got 100% of your score. 8 is not really a lot, and to say that this video is X.YZ b/c 8 people said so is hardly statistically accurate.

Again, the Bayesian system is designed to guard against such follies, and it's not a coincidence that IMDB.com uses a similar system and has over 20,000 votes across movies. Go look at their list and see if you find any "flukes" in there.

As for your argument that your reviews are "screwing people over": since the "old score" still exists and is multiplied by the fraction, you effectively "screwed people over" just the same under the old system. You're talking about an inherent flaw in a review system where the averages are high. Such a flaw existed prior to the implementation of this new system.
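The "limit that approaches 1" can be seen by tabulating the weight v/(v + m) for increasing review counts. A small sketch, using the m = 7 from the quoted formula; whatever weight a video doesn't get for its own score goes to the site-wide mean C instead:

```python
# Weight given to a video's own score as reviews accumulate (m = 7, as
# quoted from the top 10% page). The remainder of the weight falls on
# the site-wide mean C.
m = 7
weights = {v: v / (v + m) for v in (1, 7, 30, 100, 1000)}
for v, w in weights.items():
    print(f"{v:4d} reviews: {w:.3f} of own score, {1 - w:.3f} of site mean")
```

With one review, a video's own score only counts for an eighth of its listed value; by 100 reviews it counts for over 93%, which is the "proving yourself on the field" effect being described.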


Post by Arigatomina » Mon Jun 21, 2004 1:35 am

dwchang wrote:You're talking about an inherent flaw in a review system where the averages are high. Such a flaw has existed prior to the implementation of this new system.
But the old system didn't use the 'average of all reviews given' when factoring the 'score' of videos. Yes, the average was high, but the scores weren't punished for not matching that overly high average. Now they are - because now the score for the top list is determined by looking at that overly high average.

Or am I misunderstanding the '× C' part of the new system? Is it, or is it not, multiplying that number by the average of all videos? If they did that in the old system (which I don't think they did, since R is defined as the old system score), then sure, it would be the same flaw in both. Either way, it certainly is a flaw, new or old, to give credit to those high scores in a way that encourages overly high reviews.

But you're right. ^_^;; It probably would be simpler to just get started on upping my reviews rather than complaining about the dependency on those high averages. Easier to join them and all that.


Post by dwchang » Mon Jun 21, 2004 1:53 am

Arigatomyna wrote:
dwchang wrote:You're talking about an inherent flaw in a review system where the averages are high. Such a flaw has existed prior to the implementation of this new system.
But the old system didn't use the 'average of all reviews given' when factoring the 'score' of videos. Yes, the average was high, but the scores weren't punished for not matching that overly high average. Now they are - because now the score for the top list is determined by looking at that overly high average.

Or am I misunderstanding the '× C' part of the new system? Is it, or is it not, multiplying that number by the average of all videos? If they did that in the old system (which I don't think they did, since R is defined as the old system score), then sure, it would be the same flaw in both. Either way, it certainly is a flaw, new or old, to give credit to those high scores in a way that encourages overly high reviews.
No it's not. Even if you "disagree" with a high average, that disagreeing score is still averaged into the "old score" and then multiplied by the fraction. You're still effectively "punishing" to the same degree with each system since the "old score" is still used to calculate and that old score is what you are averaged into.

With the new system, that old score is multiplied by that fraction (v/(v+m)) and the video gets a certain % of the score. I guess I should clarify that when I say "agree" with score, it doesn't mean your score is negated. I meant more of a, more people reviewing = more valid the "old score" becomes. It's a fraction and thus you get a certain % of your "old score." More people reviewing = higher % received.

No matter what score is given, the fraction part will "give you" a certain % of your score regardless of if ppl gave all 10's or all 1's. That part is more based on how many people give a score. The "agreeing" part is the fraction, not the score itself. It's statistics in that, the more people who vote on something, the more valid it becomes.

Make sense?
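The point about disagreeing reviews can be made concrete: a single low score both drags down the video's own mean R and bumps v up by one, so the weight fraction rises even as the score falls. Again a sketch only, with m = 7 from the quoted formula and a hypothetical site mean C = 8.0:

```python
# Effect of one disagreeing review: R drops, v rises by one.
# m = 7 as quoted; C = 8.0 is a hypothetical site-wide mean.
m, C = 7, 8.0

def bayes(R, v):
    return (v / (v + m)) * R + (m / (v + m)) * C

v, R = 10, 9.0                      # ten reviews averaging 9.0
before = bayes(R, v)

R = (R * v + 4.0) / (v + 1)         # an eleventh reviewer gives a 4
v += 1
after = bayes(R, v)

print(round(before, 3), round(after, 3))   # 8.588 8.333
```

The listed score still goes down, just as it would have under the old averaging, so the disagreeing reviewer is not silenced; the video simply earns a slightly larger share of its (now lower) own score.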
Arigatomyna wrote:But you're right. ^_^;; It probably would be simpler to just get started on upping my reviews rather than complaining about the dependency on those high averages. Easier to join them and all that.
I disagree. I like seeing different ranges of scores....ok scratch that. I guess I rather prefer comments than scores, but moving along...

Review how you like; since you've reviewed so many videos, effectively all of them have been "rewarded" or "punished" equally. That's really all a reviewer can do: be consistent across videos. Sure, the ideal situation would be for a reviewer to review ALL videos so ALL are punished/rewarded the same, but well...that's impossible.

If you wanna be the defeatist though, go ahead. I was just getting at that your scoring is more or less the same as before. It's just that how the final score is calculated has changed, and the reviewer doesn't influence that. You just give your score, it's used in the "old score," and then the video gets v+1 (one more opinion, one step closer to being "validated").

Anyway, I'm going to bed ;)
