I suspect that part of the problem of inflated scores is that people rarely leave ops for videos that they think truly stink.
Think about the *lowest* score you've ever given out in an op. (My lowest three are 6.17, 5.83, and 4.83.) I've never given solid numbers across the board, even for some of my favorite videos (my top score is 9.86).
If you look at another thread, the star ratings distribution (which should realistically look more like a bell curve), you'll see that most of the videos trend above average. You would think that star ratings, since they are 100% anonymous, would be immune from that kind of slanting.
This leads me to think that the scores are unbalanced not because of widespread inflation, but because people on the whole actually think the videos are better than average. A big part of the opinion problem is that it's entirely subjective. Defining things like sound quality and video quality might seem simple and quantifiable enough, but think about it: what exactly are the criteria? Essentially, 'does it look good to you?'
I watch videos on a 23" widescreen with THX-rated sound gear. Very few videos can hold up to that at distro quality, so I have to make accommodations for it. I suspect everyone has some personal factor like that which they apply.
So... essentially, I think the concept of 'honest op week' is a backwards way to do things. Having more consistent methods of grading and judging videos is more likely to produce better and more reliable feedback. Instead of 'honest op week', something like an 'op exchange for newbies week', where you help them write one or two ops that are functionally workable, would be a better idea.