Filters

The Britannica is doing something very daring: they are starting a discussion about web 2.0 (such as the relation between Wikipedia and the Britannica) on a web 2.0 platform (blogs). Sun Tzu, who is no doubt also in the Britannica, would have taught never to fight on the enemy's terrain. Like I said: daring.
The opening piece is from Michael Gorman, where he more or less attacks collective intelligence and defends the traditional way of finding and selecting information. The reactions from the blogging crowd are as you would expect.
However, I think there is one point overlooked by the people reacting (like Clay Shirky) to what Gorman says. Of course unlimited information flow is good. People can express themselves, and lots of different opinions are available. However, filtering and rating information is important, in science as well as in everyday news. We have to know how far we can trust the information and the source. In the “old days” our filters were based on the production side. Production was costly because of printing and distribution. To make these decisions we employed professionals hired by the firms that printed the books and magazines. We as users could vote with our feet by buying the magazine or not. Maybe not the best model overall, but the best of all the inadequate models available to us at that time. The nice thing about this model is that the information presented to us by trusted sources is usually fairly good. We know that Nature (almost always) uses a rigorous process before publishing.
Then came the Internet and Web 2.0. With Web 2.0 our filters on the production side have been removed. It has been said often: unlimited copies, free distribution. So now we are flooded with information in which it is hard to distinguish objective quality. One of the scientific risks is that this leads to the use of information that fits a priori with your thinking, without any check on the validity of the conclusions. A bit of web surfing always leads to articles that support your suspicions. For example, I very much like the writings of Clay Shirky, but most of the information I find is from his blogs. He is eloquent, he is convincing, but in many cases also totally unsubstantiated. Maybe it is based on facts, but I have no way of knowing.
Science does need a thorough process of checks to determine quality. The work I do builds upon the work of others. If I cannot trust my sources, how can I trust my results? And again, peer review is the best of all the inadequate models we have for this. I agree with Gorman that, in the end, science needs facts, not hearsay. I wonder how many of the facts in Wikipedia are based on (checked) information people found in one of those bulky paper encyclopedias.
We have to understand why and how our quality mechanisms work in physical production and distribution in order to translate them to how we deal with quality in the digital world. The goals remain the same (quality, trustworthy information), but the mechanisms will be fundamentally different because of the new possibilities web 2.0 gives us. These are exciting new possibilities, and maybe even better ones than we had in the physical domain. There is the possibility of more transparency in the peer review process. The use of scientific papers is seriously hampered by the fact that commercial organisations run the publications. It would be better if this peer review process were an open one (open science?) and the scientific community were responsible for it.
However, I think that even in that case we will need all kinds of governance structures. More open, more democratic and more transparent. It will not be “like the mind of god”, nor will it be like the Hyves mind. It will just work, but a bit less inadequately than it does now. Small steps.
That’s how progress works.
