All screenplays on the simplyscripts.com and simplyscripts.net domain are copyrighted to their respective authors. All rights reserved. This screenplay may not be used or reproduced for any purpose, including educational purposes, without the expressed written permission of the author.
Regarding reviews of good scripts: I'm always looking to give the writer something. So, sometimes, the better the script, the finer my notes. And, sometimes that can look like I'm a reviewer just looking to knock something. For example: The Stowaway. It was the only script this round that I gave a 5. But, that didn't stop me from saying that the final 4 pages felt a bit disconnected from the first 8.
(Are people really as competitive as Dustin says? I'm pretty naive, I'll admit. But, I just don't look at the OWC in that way. I want to win, sure. But, knocking others down to advance my chances? Weird.)
Agreed.
I may "knock" scripts left and right, but it has absolutely nothing to do with advancing my chances for success.
In fact, if you pay attention to my feedback, you'll find that I almost always "champion" a script, and it's not mine that I'm doing that on.
The best deserve to be recognized, but it's extremely rare that even the best are without issues.
The number of reviews is fine, IMO, though I accept it's often buoyed by non-participants.
The harshness is, when looked at in the cold light of day, intimidating and way over the top for scripts that are usually a few hours work at best. That same harshness has undoubtedly seen an enormous improvement in the quality of the writing over the years, however. Including my own.
I was hesitant to enter this time around because (1) I'm terrible at writing suspense and (2) knew I'd be busy during the review week and wouldn't have had time to review many scripts.
Decided against entering, but didn't announce that to help with the anonymizing feature of the OWC. My busy-level at work is still too high to give these scripts the attention they deserve.
About deadweight (non-reviewers), there doesn't seem to be any solution that doesn't create more work for someone, distort the incentives to leave honest feedback, or both. However, if we accept that it's worth some work to experiment, I would err on the side of the positive rather than the negative.
At academic conferences, researchers submit their papers to a complicated double-blind review process. I'm glossing over the details, but suffice to say that it's not at all a good fit for SS.
The point is that the conference will typically have a "Best Reviewer" award. It's a cheap plaque, but newbies like getting them.
One SS-able possibility is to have voting explicitly for "Best Reviewer" on the ballot. Include the non-participant reviewers as choices, too. It's obviously subjective, but if you disallow voting for oneself then I hope people would be honest. (Worst case: everyone votes for a bad reviewer "who couldn't possibly win" to up their own chances of winning Best Reviewer... and the crappy reviewer makes off with the title!)
A more labor-intensive possibility is to score each reviewer's participation. Someone would need to make a decision on what counts as a minimal contribution (voting?), mediocre contribution (brief comments?), and good contribution (detailed review?) and assign point values for each (say, 0.25, 0.50 and 1.00). Sum up each person's score, then either do something similar to writer's choice (post winner and close runners-up) or split the list into quintiles and list the names in each bucket (one star for lowest, five stars for highest).
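The scoring scheme above could be sketched roughly like this. The point values and the three contribution tiers are the ones proposed; the reviewer names, the data shape, and the star mapping are invented purely for illustration.

```python
# Hypothetical sketch of the reviewer-scoring idea. Only the point
# values (0.25 / 0.50 / 1.00) come from the proposal; everything
# else is made up for illustration.

POINTS = {"vote": 0.25, "brief": 0.50, "detailed": 1.00}

# Each reviewer's logged contributions this round (sample data).
contributions = {
    "Alice": ["detailed", "detailed", "vote"],
    "Bob":   ["brief", "vote"],
    "Cara":  ["vote"],
}

# Sum up each person's score.
scores = {name: sum(POINTS[c] for c in acts)
          for name, acts in contributions.items()}

# Bucket reviewers by rank into 1-5 stars (lowest fifth = 1 star).
ranked = sorted(scores, key=scores.get)

def stars(name):
    rank = ranked.index(name)             # 0-based position by score
    return 1 + (5 * rank) // len(ranked)  # map position to 1..5 stars
```

With only a handful of reviewers the quintile buckets get coarse, so a real version would probably want tie handling and a minimum pool size before splitting into fifths.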
That star rating could be factored in like an additional "voter," but I think that would encourage gaming the system.
I was going to say that those that deny it first are usually the biggest offenders... but from my observations both Jeff and Dave are very honest.
Has it been long enough yet to deny it without arousing suspicion?
Maybe I'm naive, but would people really do that in a friendly competition with no financial gain? And are some reviewers swayed by those reviews that have come before? - times like this I wish I had studied psychology.
I second your comment about Dave - very selfless. I've seen him give countless pieces of advice without ever asking for anything in return.
Jeff on the other hand... Kidding! His comments may cause controversy, but you can't deny the honesty.
It's just gamesmanship and a little bit does no harm. In fact, for me, it enhances the whole thing.
You only have to look at the reviews to see they have been swayed by the previous ones. Sometimes, it looks like they've outright copied certain parts from other reviews and not bothered to read the script at all. Imagine how much time that will save.
Wherever humans can manipulate things to their own ends they will inevitably do so... some will go even further than gamesmanship and resort to outright dishonesty.