Wikipedia has been hailed as one of the most successful collaborative Web projects of all time. Hosting over 3 million articles in English alone, the site is one of the largest repositories of freely available information; printing its contents would fill about 1,000 volumes. The philosophy of the Wikimedia Foundation, which runs Wikipedia as well as several other sites, is that anyone can contribute and anyone can access the result.

However, the site is not without its critics. Accusations include systemic biases, subjective points of view, undemocratic treatment of some editors and, most of all, inaccurate information. And while even traditional encyclopedias, such as Encyclopedia Britannica, are subject to errors, the open nature of Wikipedia makes it particularly prone to erroneous information.

To combat vandalism and inaccuracy, Wikipedia recently announced a new tool called WikiTrust. An optional plug-in available to registered users, WikiTrust color-codes text: a bright orange background marks text added by new or unknown editors, while a white background marks text that comes from trusted editors or has remained uncontested for a long time.
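The display side of such a plug-in can be pictured as a simple rule that maps a trust score to a background color. The sketch below is only an illustration of that idea; the 0-to-1 score and the 0.8 cutoff are assumptions, not values WikiTrust is known to use.

```python
# A minimal sketch of trust-based highlighting, NOT WikiTrust's actual code.
# The 0-to-1 trust score and the 0.8 cutoff are illustrative assumptions.

def background_color(trust: float) -> str:
    """Map a passage's trust score to a display color.

    Low-trust text (new or unknown editors) gets a bright orange
    background; high-trust or long-uncontested text stays white.
    """
    if trust >= 0.8:                # assumed threshold for "trusted"
        return "white"
    # Scale the orange so the least-trusted text stands out the most.
    intensity = 1.0 - trust / 0.8
    return f"orange({intensity:.2f})"


print(background_color(0.1))   # fresh edit from an unknown editor -> strong orange
print(background_color(0.95))  # long-standing, uncontested text -> white
```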

WikiTrust doesn’t actually verify the information’s factual content; it only measures how much other editors have agreed with it. Even when a page has been vandalized and has yet to be corrected, the vandal’s edits will be flagged as untrustworthy (so there is no wondering whether Barack Obama really is a terrorist, since that text will be highlighted).
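One way to picture "measuring agreement" is a score that rises each time later revisions leave a passage untouched and drops sharply when it is reverted. The update rule below is a hypothetical sketch of that idea with made-up step sizes, not the actual WikiTrust system.

```python
# Hypothetical agreement-based scoring: every revision that keeps a passage
# nudges its trust up; a revert drops it back down. Step sizes are arbitrary
# illustrative choices, not WikiTrust's real parameters.

def update_trust(trust: float, kept: bool) -> float:
    """Return an updated trust score after one later revision."""
    if kept:
        return min(1.0, trust + 0.1)   # other editors implicitly agreed
    return max(0.0, trust * 0.3)       # reverted: sharp loss of trust


# A vandal's insertion starts near zero and stays flagged until other
# editors repeatedly let it stand -- which, for obvious vandalism, they won't.
score = 0.05
for kept in (True, False, True):
    score = update_trust(score, kept)
    print(round(score, 3))
```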

But the system seems to conflict with the democratic nature of the site. Even if correct, a new editor’s information will be flagged as untrustworthy. This favors older editors, and this kind of favoritism has plagued Wikipedia for some time.

The WikiTrust tool would only exacerbate the problem, making it more difficult for new editors to contribute to the site while allowing more “elite” editors to retain control over pages. Statistics show that approximately 1 percent of Wikipedia editors are responsible for nearly half of the changes on the site, which is certainly not a model of democracy in action. Some of this is likely attributable to a higher level of devotion on the part of those editors, which makes them more active than casual users. However, some of the imbalance is due to elite editors chaperoning certain pages.

Although the idea of implementing an algorithm to help improve Wikipedia’s trustworthiness is not a bad one, the method WikiTrust uses seems to widen the gap between casual editors and older, more devoted ones. The specifics of WikiTrust haven’t been revealed. If it measures an editor’s trustworthiness simply by the number of other articles they have edited, rather than by the content they have created, then a trusted editor of a page about a video game could make edits to a page on a totally unrelated topic and have them flagged as trustworthy, even if they were wrong. If, on the other hand, the system rates trustworthiness only on individual pages, it could let some editors gain essentially total control over the information on a page.
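The worry here can be made concrete with a toy comparison of the two scoring scopes imagined above: a single global score that follows an editor onto unrelated pages, versus a per-page score that concentrates authority on one article. Both models are hypothetical; neither is claimed to be what WikiTrust actually does.

```python
# Toy comparison of the two hypothetical trust scopes discussed above.
# Neither model is WikiTrust's real algorithm; the numbers are illustrative.

from collections import defaultdict


class GlobalTrust:
    """One score per editor, carried onto every page they touch."""
    def __init__(self):
        self.score = defaultdict(float)

    def record_accepted_edit(self, editor: str) -> None:
        self.score[editor] += 1.0

    def trust_on(self, editor: str, page: str) -> float:
        return self.score[editor]        # the page is ignored entirely


class PerPageTrust:
    """A separate score for each (editor, page) pair."""
    def __init__(self):
        self.score = defaultdict(float)

    def record_accepted_edit(self, editor: str, page: str) -> None:
        self.score[(editor, page)] += 1.0

    def trust_on(self, editor: str, page: str) -> float:
        return self.score[(editor, page)]


g, p = GlobalTrust(), PerPageTrust()
for _ in range(50):                       # 50 accepted edits to a video-game article
    g.record_accepted_edit("gamer")
    p.record_accepted_edit("gamer", "Some video game")

# Global scope: the reputation follows the editor onto an unrelated article.
print(g.trust_on("gamer", "Some unrelated topic"))   # 50.0
# Per-page scope: no authority elsewhere, but near-total authority on "their" page.
print(p.trust_on("gamer", "Some unrelated topic"))   # 0.0
print(p.trust_on("gamer", "Some video game"))        # 50.0
```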

Hopefully the WikiTrust algorithm will help address some of the problems of incorrect information and vandalism on Wikipedia, but not at the expense of the site’s continuing as an open, democratic platform for sharing knowledge.