Open access and epistemology

In a recent Nature piece, Paul Ginsparg recounts the evolution of arXiv, the online repository for physics preprints that he created two decades ago at the dawn of the Internet, and reflects on the compartmentalized nature of the academic publishing system.

arXiv has become a hugely popular source of freely accessible, full-text preprints in various fields of physics, including physicists' newer ventures into quantitative finance and biology. But that popularity coexists with a system of academic publishing that rewards intellectual precedence and prestige through rigorous peer review for quality control and through restricted access to vetted ("quality") content.

David Mermin memorably wrote in Physics Today in 1991: "The time is overdue to abolish journals and reorganize the way we do business." By the mid-1990s, it seemed unthinkable that free, unfettered access to non-refereed papers on arXiv could coexist indefinitely with quality-controlled but subscription-based publications. Fifteen years on, researchers continue to use both, successfully compartmentalizing their different roles in scholarly communication and in reward structures.

Today's academics embody the tension between the culture of expertise reinforced by subscription-based publication systems and the move toward democratized knowledge represented by open-access, Internet-based publishing. Requiring payment for access to academic content can certainly complicate the research process, especially for students at smaller universities that cannot afford subscriptions to large numbers of journals. But shifting to more inclusive open-access repositories or Internet-based publication hosting comes with its own share of problems. As Ginsparg observes:

Navigating increasing quantities of data inevitably raises concerns of information overload… The superficial response is a call for better filters, but an imperfect filter can be more harmful than none. For example, commonly used recommender systems based on passive measures of global popularity can broaden individual reading choices, but effectively broaden everyone in the same direction, thereby leading to less overall community diversity. (The cynic would say reinforcing faddishness in already faddish fields.)… On arXiv, we have seen some of the unintended effects of an entire global research community ingesting the same information from the same interface on a daily basis. The order in which new preprint submissions are displayed in the daily alert, if only for a single day, strongly affects the readership on that day and leaves a measurable trace in the citation record fully six years later.
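
Ginsparg's point about shared rankings is easy to see in a toy simulation. The sketch below is not his model, and every parameter in it is invented for illustration; it simply compares readers who browse new papers uniformly at random with readers who all sample from the same position-biased daily list, then measures how concentrated the resulting readership is.

# A toy simulation (not Ginsparg's model) of the homogenizing effect
# he describes: when every reader sees the same ranked list, attention
# concentrates on the top positions. All parameters are invented.
import math
import random

def simulate(num_readers=10_000, num_papers=50, reads_per_reader=3,
             shared_ranking=True, seed=0):
    """Return the number of reads each paper receives in one day.

    shared_ranking=True: every reader samples from the same
    position-biased list (weight ~ 1/rank, a rough stand-in for
    "first in the daily alert gets read most").
    shared_ranking=False: each reader samples papers uniformly.
    """
    rng = random.Random(seed)
    reads = [0] * num_papers
    if shared_ranking:
        weights = [1.0 / (rank + 1) for rank in range(num_papers)]
    else:
        weights = [1.0] * num_papers
    for _ in range(num_readers):
        for paper in rng.choices(range(num_papers), weights=weights,
                                 k=reads_per_reader):
            reads[paper] += 1
    return reads

def entropy(counts):
    """Shannon entropy (bits) of the readership distribution;
    lower entropy means attention is concentrated on fewer papers."""
    total = sum(counts)
    probs = [c / total for c in counts if c]
    return -sum(p * math.log2(p) for p in probs)

if __name__ == "__main__":
    uniform = simulate(shared_ranking=False)
    ranked = simulate(shared_ranking=True)
    print(f"uniform browsing : entropy = {entropy(uniform):.2f} bits")
    print(f"shared ranking   : entropy = {entropy(ranked):.2f} bits")

Under the shared ranking, the entropy of the readership distribution comes out noticeably lower than under uniform browsing: everyone's "broadened" reading points in the same direction, which is the loss of community diversity the quote warns about.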

The trade-off here raises a question about the nature of knowledge itself: under what conditions does information count as knowledge? When it proves most useful, or when there is consensus on its validity? What kind of consensus suffices: a few experts, or many diverse perspectives? Does democratizing information sacrifice quality for quantity? Perhaps the most salient question about the structure of the academic publishing industry is whether there is a true dichotomy at work (democratic versus aristocratic certification of knowledge) or a continuum along which information is judged to be knowledge, and within which a balance between the two modes of quality control can be struck.
