Feel free to contact me about articles, websites, etc. that you think I may find of interest. I'm also available for consulting work and speaking engagements. Email: ernest.miller 8T gmail.com
Last Wednesday, the Syracuse Post-Standard published an article lambasting the authority of Wikipedia because it is user-edited and anyone can make a change to its content (Librarian: Don't use Wikipedia as source).
There's just something about Wikipedia that seems to freak people out: they can't fathom the idea that "the masses" could produce something of value simply by being able to correct each other, building something far more beneficial and useful than an expensive encyclopedia edited by just a few people. The columnist ends his piece by stating that "you need to be careful about trusting what you read," while taking an email from a random librarian completely at face value.

Techdirt then contacted the author of the offending newspaper article with more information about how projects like Wikipedia work and why they can be authoritative. However, that exercise apparently collapsed into sheer invective on the part of the newspaper writer. This seems odd, since the author of the original piece, Al Fasoldt, is a long-time tech reporter. In any case, see Techdirt's version of the exchange (Who Do You Trust, The Wiki Or The Reporter?).
The fact that anyone can edit the pages appears to be why people like Mr. Fasoldt question Wikipedia's authority, but that is the exact reason it has authority. Comments that are extreme or untrue simply do not survive on Wikipedia. In fact, on very heated topics, you can watch the back-and-forth negotiation of wordings by people with different views until, in many cases, a neutral and mutually agreeable wording is put in place and all parties are satisfied. Traditional authority is gained through a combination of talent, hard work and politics. Wikipedia and many open source projects gain their authority through the collective scrutiny of thousands of people. Although it depends a bit on the field, the question is whether something is more likely to be true coming from a source whose resume sounds authoritative, or from a source that has been viewed by hundreds of thousands of people (with the ability to comment) and has survived.

Speaking of which, Techdirt challenged Fasoldt to make some factual changes to Wikipedia and see how long the untruths could survive. Fasoldt has not taken the challenge, but Alex Halavais, an Assistant Professor of Communication and the Director of the Masters in Informatics program within the School of Informatics at the University at Buffalo, has (The Isuzu Experiment):
No matter which side of the debate you find yourself on, this sounds like an interesting experiment. So, I have made not one, but 13 changes to the wikipedia site. I will leave them there for a bit (probably two weeks) to see how quickly they get cleaned up. I'll report the results here, and repair any damage I've done after the period is complete. My hypothesis is that most of the errors will remain intact.

Nope. According to Halavais, "all [the changes] were identified and removed within a couple of hours. I could have been a bit trickier in how I made changes; nonetheless, I am impressed." There are also some great comments on the ethics of the experiment, as well as suggestions for future experiments.
One place this debate has been discussed with great insight is Corante's own Many 2 Many, which also provides a wealth of linkage (Wikipedia Reputation and the Wemedia Project). In addition to the insight, there is an announcement of a cool new project for journalism schools and media centers:
Which brings me to a lingering thought — that explicitly codifying reputation introduces a cost which can constrain commons-based peer production. Wikipedia was never supposed to work, somehow does because of good club theory and transaction costs, and has gained a reputation as a resource. Introducing reputation for contributors or articles is the greatest risk to the Wikipedia community. Getting a base study on factual accuracy can help inform this decision, as well as educate the public on how to use and participate with this commons resource.

Read the whole thing.
I've been quietly forming a group of journalism schools, media centers and experts to engage in the Wemedia Project, which begins with a formal Wikipedia article fact-checking exercise and publishing the findings. The USC Annenberg Center has already announced their support, and next month we will begin the collaborative research process within a Socialtext Workspace. Without getting into defining truth, you can separate issues of fact, value or policy. The approach is to apply a formal fact-checking process to a sample of articles to gain a baseline measure of factual accuracy and explore issues of reputation. [links in original]
One aspect of this that is interesting to me is the distinction between the authority of a relatively anonymous collective and the authority of named bloggers. For example, Dana Blankenhorn argues that transparency is a key element of the authority of bloggers (Transparency Makes Blogs Believable):
This transparent relationship is at the heart of blogging credibility. J.D. Lasica tried to explain this to the "media industry" in a recent OJR piece...

Absolutely. Transparency is also critical in Wikipedia, but the emphasis is different. Process and mistakes (I would call them "corrections") are emphasized, rather than motives or expertise.
- Transparency of motives
- Transparency of process
- Transparency of expertise, and
- Transparency on mistakes

are all keys to success, he writes.
Finally, Mary Hodder, who is now working like a demon for Technorati, has an intriguing post that unintentionally ties these two concepts (blog authority and wikipedia authority) together (Digital Ethics II.. and the New Commodity In Online Media). Her thread regards a debate about digital ethics, which is worth following as well.
Fascinating reading, all of it.
Not to toot my own horn, but I wrote about this very topic back in July...
see Andrew Lih: Wikipedia as Participatory Journalism: Reliable Sources? Metrics for evaluating collaborative media as a news resource
Tracked on August 31, 2004 03:56 AM

Excellent discussion of Wikipedia authority from Applied Abstractions
Corante.com has a great summary of The Great Wikipedia Authority Debate. I have used the Wikipedia as a student exercise and foun... [Read More]
Tracked on August 31, 2004 03:32 PM

Wikipedia Authority from Anders Jacobsen's sideblog
The Great Wikipedia Authority Debate (via Espen)... [Read More]
Tracked on August 31, 2004 07:38 PM

The neverending story from The Aardvark Speaks
Alex Halavais proves that Wikipedia is not trustworthy and curiously interprets his experiment as a proof why Wikipedia works. In the meantime, a number of Wikipedia advocates attack a journalist because he quoted the Wikipedia disclaimer and was disap... [Read More]
Tracked on August 31, 2004 08:37 PM

More on authority and Wikipedia from lbr.library-blogs.net
There has been intense discussion on blogs far and wide this week about Wikipedia. The recent attention has been driven by Hiawatha Bray's July critique of Wikipedia in the Boston Globe, and even more by a more recent column by Al Fasoldt of the [Read More]
Tracked on September 1, 2004 04:06 PM

Diacritical from Jonathon Delacour
The first sentence of Seymour Hersh's current New Yorker article, [The Coming Wars], caught me by surprise:

George W. Bush's reëlection was not his only victory last fall.

Not that I believed tha... [Read More]
Tracked on January 19, 2005 08:28 AM