Open Access to scientific results - from publications to data - is high on the agenda of science policy drivers. It has long been clear that making one's own research available on the web is the best way to gain visibility and impact (Lawrence 2001). The Open Access movement started with calls for self-archiving (Harnad 2001), led to new technological solutions for scholarly communication (Macgregor et al. 2014), conquered the science policy arena (Budapest Open Access Initiative 2002), and has now reached the stage of being implemented in evaluation schemes. For the latter, Current Research Information Systems (CRIS) play an increasingly important role. In Europe, OpenAIRE2020 and euroCRIS have joined forces to promote CERIF - an XML-based data model - as a standard for exchanging research information, and commercial parties (such as PURE and CONVERIS) have in principle agreed to make their systems CERIF compliant. This poster discusses the use of CRIS for monitoring OA. In particular, we discuss different definitions of Open Access as implemented in technical systems, and the consequences of applying those different measures when determining the percentage of OA publications in a country. More specifically, we present an empirical analysis of the growth and the institutional and disciplinary distribution of Open Access publications based on NARCIS, the research information portal for the Dutch research landscape. We discuss limitations of, and differences in, OA counts that arise from different technical solutions. Based on these empirical explorations, we see a need for further methodological research on the metrics of Open Access before such metrics are implemented in evaluation schemes. At the same time, we demonstrate the potential of standardized and harmonized research information, and argue for further collaboration to create networked observatories of scholarly activities, in which Open Access is one important indicator.
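The abstract's central point - that the measured percentage of OA publications depends on which definition of Open Access a system implements - can be illustrated with a small sketch. The publication records and access categories below are invented for illustration only; real data would come from a CRIS portal such as NARCIS, and real access typologies are more fine-grained.

```python
# Hypothetical illustration: the OA share of the same publication set
# differs depending on which access categories the chosen definition
# counts as "Open Access". All records here are invented sample data.
records = [
    {"id": 1, "access": "gold"},
    {"id": 2, "access": "green"},
    {"id": 3, "access": "closed"},
    {"id": 4, "access": "hybrid"},
    {"id": 5, "access": "green"},
]

def oa_share(records, oa_categories):
    """Percentage of records whose access category counts as OA."""
    oa = sum(1 for r in records if r["access"] in oa_categories)
    return 100 * oa / len(records)

# A strict (gold-only) definition versus a broader one: same data,
# very different "percentage of OA publications".
strict = oa_share(records, {"gold"})                     # 20.0
broad = oa_share(records, {"gold", "green", "hybrid"})   # 80.0
```

The gap between the two figures (20% versus 80% on identical records) is exactly the kind of definition-dependent divergence that, as the abstract argues, calls for methodological clarification before OA metrics enter evaluation schemes.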
Publication status: Published, January 2016