Mirko Tobias Schaefer is a junior researcher at Utrecht University (Media and Culture Studies). He studied theatre, film and media studies and communication studies at the University of Vienna, and Digital Culture at Utrecht University. Schaefer is currently working on his dissertation "Bastard Culture! Competent Users, Networks and Cultural Industries".
Schaefer opposes the dichotomy that casts the ordinary user either as the hero of the information age or as an amateur who merely produces mediocre content.
What were your first thoughts on Wikipedia?
a) What a handy tool: built on the wiki software MediaWiki, its interface is easy to use and invites users to participate.
b) What a great resource: Wikipedia is a huge encyclopaedia and very easy to access. Aside from a great number of well-written articles, the format provides valuable links to sources and further reading. But of course there are also many poor articles that need editing and improvement.
Is Wikipedia in fact an encyclopaedia, comparable to, for example, the Encyclopaedia Britannica?
No, it differs in many respects from established encyclopaedias; as indicated above, Wikipedia is a plurality of encyclopaedias in many different languages, with manifold regional differences with respect to contributors and cultural references. The easy-to-use interface stimulates people to participate, even if only at the level of correcting spelling mistakes. It does not rely on a distinguished expert forum, but on collective production strategies that developed online.
But it would be very superficial to think that no reviewing process takes place, that anybody could publish any article. Wikipedia has developed guidelines for the production and modification of articles. These articles are not written by a community. There are communities within the different Wikipedia platforms, but many articles are written by "pseudo-teams", and most often the majority of the work is accomplished by only a few individuals.
Wikipedia and the Encyclopaedia Britannica are in fact very different formats, and it is short-sighted to compare them. The challenge for Britannica lies in producing and distributing knowledge: it needs a sustainable business model for the digital age.
There are other encyclopaedias, used especially in scholarly work, that are constructed solely by experts in the field in question. Those formats should not be mistaken for Wikipedia or Britannica. Unfortunately, journalists pay little attention to the various formats of knowledge organisation.
Should encyclopedia writers and editors be professionals? Should they be paid for their work?
That depends on the kind of encyclopaedia. For many formats a paid expert group makes sense. But Wikipedia could probably not have developed as it has if it had been based on the work of a group of paid authors.
Should they be experts? And what makes one an expert?
An expert is someone who possesses comprehensive knowledge of a subject. Experts do contribute to Wikipedia. Despite the fact that there are many superficial, incomplete or even inaccurate articles on Wikipedia, the encyclopaedia provides an astonishing resource and valuable information. And Wikipedia generally offers the possibility to improve poor articles.
The qualitative filtering in Wikipedia takes place after the initial publication of an article, while expert-driven encyclopaedias use an academic selection process: those individuals recognized by the academic community as experts are invited to contribute texts.
How about the users/readers? What does one need to know as a Wikipedia visitor?
For many users Wikipedia is just an easy-to-access and very informative source. But in fact, Wikipedia requires more intelligent users. Users should be aware that a wiki is not a stable publication format but a dynamic one, easy to change.
On the other hand, Wikipedia makes the process of constructing and reconstructing an article explicitly visible. By clicking the history button, a user can review each stage and every change an article has undergone during its production. Furthermore, users should pay attention to the further-reading suggestions that articles provide.
In general, reading more, and reading critically, is good advice for all users/citizens.
Could Web 2.0, and specifically Wikipedia, contribute to digital citizenship and to democratization in the world?
Wikipedia is a good example of a user interface that stimulates users to participate easily in cultural production. Its graphical user interface requires almost no technical skills, and each page can be easily changed or extended. In this sense Wikipedia contributes to 'digital citizenship' by
a) providing the platform for knowledge production and
b) constructing a growing cultural resource for retrieving information. It should not be forgotten that the maintenance of this knowledge is also a substantial part of the daily labour performed by Wikipedia users.
Is truth democratic? Is it possible for truth to be democratically determined, do you think?
Wikipedia does not construct truth; Wikipedia represents knowledge. I do not argue that knowledge is completely relative, but it has to be acknowledged that knowledge is a construction. Many cases of deliberate misinformation in Wikipedia have been revealed and corrected. Debate, transparency, the reviewing of sources and the verification of facts are always necessary to establish credibility.
An often-voiced criticism of Wikipedia concerns the inaccuracy, or the lack of control, of information provided by persons who are not officially recognized as experts. This should actually raise less suspicion about Wikipedia than it should point to the status of knowledge in our information society. Why do these new technologies cause such insecurity, such doubts, and create a need for "secured" or "guaranteed" knowledge?
Where do the ideals of wikipedia originate, do you think?
The era of Enlightenment formulated knowledge as a key resource for free citizens. Two pioneers in constituting broadly accessible knowledge were Denis Diderot and Jean le Rond d'Alembert, who in collaboration with others produced the 'Encyclopédie'. (Its makers had commercial interests too, and the Encyclopédie was a business as well.)
The legacy of the Enlightenment, of access to information, the right to a free press and the right to publish knowledge, is very much represented in new media. Fred Turner, assistant professor at Stanford University, has shown how the counterculture of the 1960s later expected computers and computer-based information technologies to conform to these ideals. Many computer and Internet pioneers even incorporated them into the actual technological design (see Fred Turner, From Counterculture to Cyberculture).
How do you see the future of the internet? What will Web 3.0 be like, or Web 4.0, 5.0? More participation, the return of the expert?
Web 3.0 will be another buzzword. For now it would be important to get away from the binary discussion that sees Web 2.0 either as an enabling technology turning users into the heroes of the information age, or as a platform where a cult of amateurs produces mediocre artefacts. It is necessary to analyze to what extent power structures are established in the back ends of the information systems we currently still perceive as user-led developments.
Participation has to be evaluated with respect to skills, invested time, and the individual's position within a power structure. It makes a difference whether a user community member is also part of a Microsoft development team, or a hacker whose programming might interfere with the vendor's business model, or an administrator of a commercially funded user board, or merely a "newbie" without any social network, credits or credibility.
What do you think the future of information, of knowledge looks like?
A large part of information will be gathered, evaluated and represented by 'machines', that is, software agents. Knowledge is becoming increasingly modular and dynamic, and no longer presents itself in the rather monolithic format of the printed encyclopaedia. Two things are necessary to focus on:
- socio-technical ecosystems: the collective production of large user groups and information systems generating and structuring information and contributing to its representation. It will be necessary to analyse the working mechanisms and the 'invisible hands' inside these complex systems. Until now these have been described with blurry metaphors like 'collective intelligence', 'wisdom of crowds' or the 'hive mind'. We still lack the analytical tools and the terminology to describe and examine these formations properly.
- search algorithms: intelligent software agents examining, filtering and representing information. Our world view and our decision making are increasingly influenced by information machines which generate and represent the information our decisions are based on. Google represents the Internet, but we do not know how this representation is accomplished; we do not know what is left out and what is a constructed interpretation. Search results are not neutral but the effect of a dynamic interplay of many actors, most of them unknown to the users. For example, to what extent can we analyse how Yahoo search results are affected by Flickr meta tags, or how the Google engine takes advantage of YouTube click ratings?
The question would be, how can the construction of information and knowledge be turned into a transparent process?
How do you think we, in free societies, should ideally approach knowledge and truth?
(Knowledge does not equal truth)
With reference to Bruno Latour, it is crucial for a democracy to "make things public". The construction of facts, and the many actors contributing to the construction of knowledge, have to be mapped and represented. If we can understand how knowledge is constructed, it becomes possible to criticize it.