Wikipedia and Citizendium
Wikipedia is notorious for its editorial ambiguity. Not only is the positivist norm – the neutral presentation of facts, accountability and consensual writing – not always complied with (forgeries, disputes, etc.), but, more importantly, its influence within the project is structurally challenged by other philosophical approaches (liberalism, communitarianism, relativism) [1]. The debate surrounding the online encyclopaedia Citizendium may not have received sufficient attention in this regard. In the light of Wikipedia’s success, Larry Sanger’s rival project might, at first glance, appear somewhat childish and vain. Yet, personal quarrels aside (Sanger is, in fact, a cofounder of Wikipedia, even though Jimmy Wales still contests this [2]), the weakness it points out (quality) and the solution it proposes (expertise) legitimately call for reconsideration.
As Larry Sanger himself points out [3], his project is, in many respects, completely different from Jimmy Wales’ brainchild. Admittedly, both are wikis, that is, websites worked on by a variety of people, with reading and writing modes that grant more or less access [4]. In addition, Citizendium uses the same wiki engine as Wikipedia (MediaWiki, open source), which accounts for the resemblance in appearance. However, Sanger restricted the “open” access of the wiki’s writing mode [5], opted, right from the start, for English as the only language [6] and instituted a limited form of peer review [7]. Though this is not its main goal, Citizendium strives to recycle Wikipedia, improving the spontaneous contributions through subsequent “gentle expert oversight”, fully aware that this is not real expert control [8]. Its editorial policy could be described as lying somewhere between a “bazaar” (a spontaneous and chaotic bottom-up build-up) and a “cathedral” (a controlled and planned top-down construction) [9]. It also entails legal responsibility [10]: every contributor is required to sign their articles and have their identity authenticated. With only 5,600 articles, a single language and a certain feeling of déjà vu, Citizendium might seem ridiculous. And yet this distanced position could prove most effective in the future.
It is, in fact, no coincidence that Citizendium picks up where Nupedia left off. Nupedia was the online encyclopaedia that was supposed to use its wiki space (which, soon afterwards, was to become Wikipedia) as a training ground in preparation for the real thing, a makeshift storage space for unfinished articles still awaiting approval by the team of experts. That training ground then ran off to become the untamed hybrid we all know. On the face of it, this novel and surprising autonomy appeared to have completely changed the relationship between the two projects, Nupedia seeming to be some sort of stillbirth, irrevocably supplanted by Wikipedia’s exponential growth. In fact, this supremacy is neither as absolute nor as definitive as one might at first think. Nupedia’s scientific aspirations – a resolute editing policy, decided on by a team of experts – did not simply disappear: they can still be traced, almost in their original form, in Citizendium’s methods, the only difference being that access for experts has been made much easier. Having fully accepted Wikipedia’s precedence, this new rival plays second fiddle quite aptly, taking on a more reflective character. All in all, Larry Sanger’s only real mistake is that, in keeping a universal range of topics and holding on to an ideal of neutrality [11], he is pursuing the very same ambitions as Wikipedia; for it is, as we are about to see, precisely here that Wikipedia (Version 1.0) is likely to become once again a fierce competitor. The idea of “remixing” relatively crude material under expert control, while following a responsible editing policy, on the other hand, seems destined for a bright future.
Two criteria and one problem
That Larry Sanger’s intuition was not far off becomes evident when one considers the ongoing debate inside Wikipedia on selecting the most reliable articles after authentication by a review team. The “Wikipedia 1.0 project” is to be finalised in the medium term (probably by 2010). The French Wikipedia Version 0.5 should see the light of day in 2008, with approximately 2,000 articles (in March there were about 800 of them [12]). The English Version 0.5, on the other hand, has been out since 2007, with Version 0.7 (30,000 articles) due in 2008. The question of article selection came up when the release of an offline or print version of the encyclopaedia was being considered, leading to the idea of dividing the encyclopaedia into an unstable and a so-called stabilised version.
The two stabilisation criteria that have been adopted are an article’s quality and its importance score, the two being, however, completely independent of one another. An article’s importance ranges from “low” through “mid” and “high” to “top”. The importance, or priority, is determined relative to a specific WikiProject and can thus vary from one project to the next for one and the same article. But if that is the case, why insist that “the article on unemployment is more important than the one on Paris Hilton” [13]? There really seems to be a contradiction here between an absolute but implicit guideline (seriousness) and the fundamental principle of freedom (relevance depending on a project and its thematic focus).
The following categories have been established for rating an article’s quality: Start-Class, C-Class, B-Class, GA (Good Article), A-Class and FL (Featured List, close to “FA, Featured Article”); the same scale applies to the “portals”. Yet these quality criteria are also affected by the aforementioned editorial vagueness: apart from formal criteria (spelling, grammar, layout, pictures) and respect for documents’ licences, “encyclopaedic quality” demands that entries be clear, exhaustive, neutral and relevant, and that their sources be cited, etc. It is noteworthy that the quality score is established by vote; if the article is rated of (high) quality (marked by a yellow star in the upper right corner), this is indicated at the bottom of the page. This is how the French article on the 1920 Jerusalem riots was elected in the first round of voting (13 in favour), whereas the one on Adam Smith had to go into a second round (12 votes to one). Every user with more than 50 contributions has the right to vote, and the status is only awarded if a quorum and a qualified majority are obtained (80% in favour of granting the FL status).
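To make the award rule just described concrete – voting rights after 50 contributions, a quorum, and an 80% qualified majority – here is a minimal sketch in Python. Only the 50-contribution threshold and the 80% majority come from the procedure described above; the quorum size is a hypothetical placeholder, as the text does not specify it.

    # Minimal sketch of the quality-label vote described above.
    # The quorum of 10 voters is an assumed placeholder; the text
    # specifies only the eligibility threshold (more than 50
    # contributions) and the 80% qualified majority.

    def may_vote(contribution_count: int) -> bool:
        """A user gains the right to vote after more than 50 contributions."""
        return contribution_count > 50

    def label_awarded(votes_for: int, votes_against: int, quorum: int = 10) -> bool:
        """Award the label only if the quorum is met and at least
        80% of the votes cast are in favour."""
        total = votes_for + votes_against
        if total < quorum:
            return False  # not enough voters: no decision is taken
        return votes_for / total >= 0.80

    # The first example above passed in the first round (13 in favour, none against):
    print(label_awarded(13, 0))  # True
    print(label_awarded(12, 1))  # True: about 92% in favour, quorum met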
Despite being an eligibility criterion, this project-internal evaluation does not, however, guarantee validation. This subtlety proves quite revealing of Wikipedia’s ambiguous attitude towards expertise. In a very engaging article, Christian Vandendorpe takes a somewhat prudent stance on the crucial matter of “peer review” [14]: he points to Wikipedia’s “General Disclaimers” to recall the absence of formal peer review, only to go on to describe passionately the case of a university teacher encouraging his students to write an entry on Wikipedia in order to learn to face up to the challenge of peer review (notes 2 and 11). It is exactly this sort of ambiguity that is so typical of Wikipedia. First of all, the wording is unclear: the English version does indeed speak of “peer review”, while at the same time emphasising that it is not an academic peer review. The French version, on the other hand, speaks of an informal “reader committee” (comité de lecture) to which one may refer an article (voluntarily) by way of a simple request. Ambiguity is furthered by the fact that the encyclopaedia effectively outsources peer review by insisting, as it does, on providing articles with preferably peer-reviewed notes and sources. An article of category A should, for example, “include a sufficient number of external written references, coming preferably from print sources (wherever possible from reader-committee-approved publications) rather than online sources”. Some pages of the English version openly subscribe to the objective of Wikipedia becoming more and more expert-controlled, even using the term “validation” – all while being perfectly aware that this will lead to a break with the founding philosophy of the project [15].
Leakage paths
As we can see, there are essentially two limitations in Wikipedia that make a “remix” (à la Citizendium) inevitable. Truth be told, they are inherent in any encyclopaedic project, but as far as Jimmy Wales’ venture is concerned they constitute outright digital “leakage paths”. The first of these limitations can be called clinical, the second critical.
Aside from its medical significance, “clinical” points to the problem of the knowledge of singularity, its modalities and, most of all, its scale (or its “pixel”, in photographic terms). The encyclopaedia’s virtuality leads in the long run to its universality, as it is in a continuous state of expansion. Then again, completeness is by definition impossible, for knowledge always succumbs to the details of reality. Take, for example, the incredibly detailed entry on “bike” (vélo) on the French Wikipedia: why does it not mention the bicycle races held in Quimper in October 2002 and link them to a page on the weather back then, the colour of the leaves, the shape of the clouds, etc.? Anyone who cares to give it a try will find their article deleted for being “too detailed” to be really “relevant” [16]. In the end, any kind of knowledge is based on selecting information from the chaotic multitude that reigns in this world, rendering any hope of exhaustiveness as complete and ludicrous an illusion as Borges’ fictional 1:1-scale map. Wikipedia therefore draws the line at a certain level of generality that one might, however, legitimately wish to exceed in pursuit of clinical precision – as a fan of cycle racing in Brittany might, for example.
The second leakage path is the critical one [17]. The so-called “neutral point of view” has often been denounced as ambiguous, especially by commentators from the online “wiki” galaxy [18]. One critic on Meatball Wiki, for example, decries the alleged objectivity of such an ideal, the false symmetries it induces and the apathy of a forced consensus (reached by eliminating controversies). In fact, this kind of “neutral” editing policy is only possible where existing knowledge is being summarised, and impossible where research is concerned that involves a strong possibility of controversy [19] – research which, of course, has a certain tendency to stir up a variety of points of view [20].
Remix
These two difficulties will eventually cause the fragmentation of this common-knowledge platform – Wikipedia – and the subsequent emergence of new, more specialised versions with stricter editing policies [21]. These “micro-remixes” offer a space completely different from what has existed so far, not so much a substitute for, as a perfect complement to, today’s encyclopaedia: a universe of small interlinked groups of scientists who face each other in an arena of conflict, their struggle over interpretations constituting the way critical knowledge is formed (rather than being merely a remnant of it [22]). Such a “battle” of different viewpoints, as envisioned by Jimmy Wales and Larry Sanger, can never take place on Wikipedia (not even on the so-called “discussion” pages), nor, in all likelihood, inside one of the small groups that a common concern or interest has brought together; eventually, though, it will come into existence between these groups, born out of their mutual opposition. Dissent, especially in the social sciences and humanities, does indeed require the kind of critical editorial choices that Wikipedia’s “neutrality” makes impossible. This is why, apart from knowledge transfer, Wikipedia should limit itself, in the future, to mapping out arenas of conflict and simply stating existing positions; further critical argument should be left to other micro-wikis. Incidentally, each new Wikipedia “remix” is likely to increase its dependency on this matrix, eventually turning the latter into a new koinè of knowledge, a necessary starting point for opinion building [23].
Obviously, this disaggregating-reassembling movement does not in the least conflict with the unifying tendency described above. Far from being an exception, such a symbiosis matches the very evolutionary process of this new web that has so inaptly been dubbed “web 2.0” or “web 3.0” and is, in fact, essentially socio-semantic in nature [24]. This becomes obvious not only in the way online knowledge is built, but also, for example, in the search for information. Indeed, the absolute reign of the big generalist search engines seems to be drawing to an end, as a multitude of specialised, more or less “social” engines (trained by users who grade the search results [25]) make their appearance; these are in turn “remixed” by meta-engines applying ranking scales that are not simply automatic (as opposed to Google’s “PageRank” [26]). Each time the engine is turned into a human-assisted machine rather than a self-sufficient robot, the relevance of the search results can be increased: the computer is “assisting” in a search conducted by a human [27]. In this way, micro-engines recycle search results that generalist engines have come up with, and meta-engines (like ari@ne or Ixquick) reassemble results found by micro-engines and standard engines… The formation of social networks could be analysed in very much the same way [28].
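As a rough illustration of this remix logic, here is a minimal sketch in Python of a meta-engine of the kind just described: it merges the ranked lists returned by several underlying engines and lets user grades weigh on the purely automatic ordering. The engine names, grades and scoring scheme are hypothetical illustrations, not descriptions of any real engine’s API.

    # Minimal sketch of a "meta-engine" remixing results from several
    # underlying engines. All engine names, grades and the scoring
    # scheme are illustrative assumptions.
    from collections import defaultdict

    def remix(results_by_engine, user_grades):
        """Merge ranked result lists, then re-rank them with user grades."""
        scores = defaultdict(float)
        for ranking in results_by_engine.values():
            for position, url in enumerate(ranking):
                # Automatic part: a simple Borda-style count, where a
                # higher rank in any engine earns more points.
                scores[url] += len(ranking) - position
        # Human part: a user grade between 0 and 1 scales the automatic
        # score; ungraded results keep a neutral weight of 0.5.
        return sorted(scores,
                      key=lambda url: scores[url] * user_grades.get(url, 0.5),
                      reverse=True)

    # Two hypothetical micro-engines and some user-supplied grades:
    results = {
        "cycling-engine": ["site-a", "site-b", "site-c"],
        "weather-engine": ["site-b", "site-d"],
    }
    grades = {"site-c": 0.9, "site-b": 0.2}
    # Human grading demotes site-b and promotes site-c relative to the
    # purely automatic order:
    print(remix(results, grades))  # ['site-a', 'site-c', 'site-b', 'site-d']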
One could therefore say that Wikipedia is in itself less important than the principles it stands for: collaboration, writing empowerment, openness to modification and completion. These principles will undoubtedly be transformed along the lines of the aforementioned critical and clinical leakage paths. The more mobile the content of the web becomes (technically [29] and legally speaking) – indefinitely deformable and apt to join the new “organs of the soul” – the more applications there will be. If one can overlook the pseudo-argument of the enormous proportion of poor quality produced by spontaneous contributions [30], one can see an original new intellectual coming into existence before our eyes: at once critical and committed (Sartre), specific (Foucault), collective and virtual – a veritable DJ of digital knowledge.
Translated from French by Marie Reetz