A content provider in a wide area network such as the Internet can permit registered users to upload media content items for distribution to other registered users accessing the content provider. Example content providers can include social network websites (e.g., YouTube™, Facebook™, etc.), or bulletin board websites that enable registered users to post online messages. Content providers, however, can establish policies that prohibit posting “inappropriate content”, defined in the policies for example based on copyright permissions, age-appropriate content (e.g., no pornographic material), no hateful or offensive material, etc. Hence, any media content item that is deemed inappropriate by the content provider can be deleted by one or more persons assigned as “moderators” by the content provider. Inappropriate content can be detected via different techniques. Manual inspection of each media content item by a moderator prior to distribution has the disadvantage of not being scalable, because the amount of uploaded content can quickly overwhelm moderator capacity. Collaborative moderation, in contrast, enables users of a website to collectively determine whether content is inappropriate, based on a prescribed minimum number of users rating or “flagging” the content as inappropriate.
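The collaborative-moderation technique described above can be sketched as a simple flag counter: an item is deemed inappropriate once a prescribed minimum number of distinct users have flagged it. The class name, threshold value, and user identifiers below are illustrative assumptions only, not part of any particular content provider's implementation.

```python
class FlaggedItem:
    """Minimal sketch of collaborative moderation: an uploaded media
    content item is deemed inappropriate once a prescribed minimum
    number of distinct users flag it. Names and threshold values are
    hypothetical, chosen for illustration only."""

    def __init__(self, item_id, flag_threshold=3):
        self.item_id = item_id
        self.flag_threshold = flag_threshold  # prescribed minimum number of flags
        self._flaggers = set()                # distinct users who flagged the item

    def flag(self, user_id):
        """Record a flag from a user; repeat flags by the same user are ignored."""
        self._flaggers.add(user_id)

    @property
    def is_inappropriate(self):
        return len(self._flaggers) >= self.flag_threshold


item = FlaggedItem("video-42", flag_threshold=3)
item.flag("alice")
item.flag("alice")   # duplicate flag by the same user counts only once
item.flag("bob")
print(item.is_inappropriate)   # False: only 2 distinct flaggers so far
item.flag("carol")
print(item.is_inappropriate)   # True: threshold of 3 distinct flaggers reached
```

Counting distinct users (a set rather than a raw counter) prevents a single user from unilaterally forcing removal by flagging repeatedly, which is the point of requiring a minimum number of users in the collaborative scheme.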
Content providers also can delegate their moderation responsibilities to a third-party moderation firm, which then assumes sole and exclusive responsibility for moderating content on behalf of the content provider. Example moderation firms include Caleris, Inc. (available at the website address “caleris.com”) and eModeration, Inc. (available at the website address “emoderation.com”).