Tuesday, September 23, 2008

To Moderate Or Not To Moderate?

Via TechCrunch comes word of the NotCot network, which wants to combine user-generated content with editorial control:

The NotCot Network: A Study in Structured User Generated Content

Their model is simple enough: get users to upload content, but funnel that content through editors and moderators who look it over, assess its quality (and legality), and, in their words, "ensure the quality of the content." This is roughly equivalent to what we were doing at my previous gig, where we paid moderators to keep an eye on user actions. So I speak from no little experience when I say I know what will happen here.
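
To make the workflow concrete, here's a minimal sketch of the kind of pre-moderation queue they're describing: nothing a user uploads becomes visible until a human approves it. The code is my own illustration in Python; the names (Submission, ModerationQueue, and so on) are mine, and I have no idea what NotCot's actual system looks like.

    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class Submission:
        user: str
        title: str
        status: str = "pending"   # pending -> approved | rejected
        reason: str = ""

    class ModerationQueue:
        def __init__(self):
            self._pending = deque()   # uploads waiting on a human
            self.published = []       # the only thing visitors ever see

        def upload(self, sub: Submission):
            # The uploader sees nothing yet; the item just waits in line.
            self._pending.append(sub)

        def review(self, approve: bool, reason: str = "") -> Submission:
            # A paid moderator works the queue one item at a time.
            sub = self._pending.popleft()
            if approve:
                sub.status = "approved"
                self.published.append(sub)
            else:
                sub.status = "rejected"
                sub.reason = reason
            return sub

    queue = ModerationQueue()
    queue.upload(Submission(user="alice", title="street art timelapse"))
    queue.review(approve=True)        # hours may pass before this line runs
    print([s.title for s in queue.published])

Everything that follows turns on that review step: every single item waits for a human, and that human costs money and works particular hours.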

First, this model is not going to scale easily. The NotCot network will fail unless they have a very active community, with lots of people uploading stuff; but the more uploads they get, the more moderation they'll need. Moderating a forum post is pretty easy; moderating a video submission is far more time-consuming, particularly if the moderator is responsible for checking for copyright violations. NotCot is going to need to keep a lot of people on staff to handle the content, and those people will need to be paid for their time.
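
To put rough numbers on "a lot of people," here's a back-of-envelope staffing calculation. Every figure below is an assumption I've made up purely for illustration; plug in your own and the shape of the problem stays the same.

    # Back-of-envelope moderator staffing. All numbers are illustrative guesses.
    submissions_per_day = 500      # assumed upload volume
    minutes_per_video   = 6        # watch it, check the music, check the source
    productive_hours    = 6        # realistic review time per 8-hour shift

    review_hours_per_day = submissions_per_day * minutes_per_video / 60
    moderators_needed    = review_hours_per_day / productive_hours

    print(f"{review_hours_per_day:.0f} review-hours/day -> "
          f"{moderators_needed:.1f} full-time moderators")
    # 50 review-hours/day -> 8.3 full-time moderators

And that's before weekends, vacations, sick days, or any growth in submissions.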

Second, they're going to run into serious challenges with international traffic. For one thing, foreign-language submissions pose a new set of problems (is that song in the background under copyright? If so, who owns it, and how do you check the rights?). Beyond that, there's a timing problem: if your moderators work in North America on weekdays during normal business hours, what happens when someone posts a video from Europe just as everyone in the States is heading home? One thing I learned for certain: users hate uploading content and then having to wait to see it posted. If the wait is predictably several hours long, they get really upset. NotCot will probably need to hire second-shift workers, and might even need to find someone to police foreign-language markets (see above about not scaling easily).
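
Here's a toy calculation of that gap, assuming (again, purely for illustration) a moderation desk staffed 9:00-17:00 US Eastern, weekdays only:

    # Toy wait-time calculation for a pre-moderated upload. The 9-to-5
    # Eastern, weekday-only schedule is an assumption, not NotCot's policy.
    from datetime import datetime, timedelta
    from zoneinfo import ZoneInfo

    ET = ZoneInfo("America/New_York")

    def wait_until_reviewed(upload_time: datetime) -> timedelta:
        t = upload_time.astimezone(ET)
        # Step forward until the desk is staffed (crude, but good enough here).
        while t.weekday() >= 5 or not (9 <= t.hour < 17):
            t += timedelta(minutes=30)
        return t - upload_time.astimezone(ET)

    # 22:00 UTC on a Tuesday is midnight in Paris and 18:00 in New York,
    # an hour after the assumed desk has gone home.
    upload = datetime(2008, 9, 23, 22, 0, tzinfo=ZoneInfo("UTC"))
    print(wait_until_reviewed(upload))   # 15:00:00

Fifteen hours is a long time to stare at a "pending review" page, and a Friday-evening upload would sit all weekend.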

Third, it's easy to say that you will "ensure the quality of the content"; it's much harder, once you're actually out there in the trenches, to decide what counts as quality. The only objective standard of "quality" is popularity among users, and that standard can't be applied to content that hasn't been approved yet. Ultimately all you'll end up doing is ensuring that the content isn't illegal, obscene, or pornographic. The "quality" bar will be tossed out the window almost immediately, just by the demands of the job.

The biggest risk here for NotCot? That their editorial review will prove too intrusive. You're already asking a lot of your users when you tell them to capture video, edit it, and upload it to your servers. Asking them to also sit through a submission process that offers them no benefit at all is a dangerous second step. If that second step isn't almost perfectly painless, what's to stop them from going to YouTube and uploading the video there instead?

My first job on the Internet was with Britannica.com. Our business plan, back in those days, was to improve on Yahoo: we too would provide an Internet guide, but ours would be composed exclusively of editor-vetted, quality sites. In the end, our visitors didn't really care. Yahoo made billions by including everybody; we ensured quality and went out of business, because the quality we provided wasn't of sufficient value. NotCot will need to find a way to do what we did, only better.