Over the past two decades, global rankings of universities have become a dominant feature of the international higher education and research landscape. The launch of the Academic Ranking of World Universities in 2003 was followed by an explosion in efforts to measure, rank and compare the performance of universities at national and international scales. However, quantitatively focused efforts to measure university performance are, by their nature, problematic (Chan, Fung, and Chang 2016). The commercially owned bibliometric databases that dominate international rankings focus on a narrow range of research outputs, provide poor coverage of research published in languages other than English, and offer only limited coverage of the humanities and social sciences (Aksnes & Sivertsen 2019). In spite of these limitations, rankings have had profound consequences for the ways in which universities are funded, for their status and ability to attract students, and for the ways in which resource allocation, promotion and tenure decisions are made (Stack 2020).
In this paper we argue that a race to compete within a global landscape defined by incomplete metrics has narrowed the focus of many university administrators, and researchers themselves. Rather than supporting greater transparency and rewarding ‘excellence’, global rankings are reinforcing privilege, deepening inequality and creating structural disincentives for the cross-boundary collaborations needed to move knowledge beyond discipline and credential-based silos. The result is a growing misalignment between the values-based missions of many universities and the pragmatic actions that must be taken by the institution in order to ensure performance in international league tables. However, a change in direction is possible.
It is possible to change the questions being asked of metrics systems. It is also possible to address gaps in the data sets that metrics depend on; and to build new tools that can help research communities to understand and explore their effects on the world in new ways. In order to achieve this, research communities must develop new forms of digital literacy capable of supporting critical interactions with the tools used to measure and narrate their performance; as well as new conversations about what universities are for, and how they can best deliver on their mission.
The Curtin Open Knowledge Initiative (COKI) is an example of an effort to develop the data sets, tools and critical perspectives needed to support such conversations (COKI n.d.). COKI is a major, strategically funded research project located within the Centre for Culture and Technology at Curtin University. It is combining critical humanities perspectives with cloud computing and big data capabilities, as well as cutting-edge data science. A key goal of the COKI project is the development of new approaches for capturing university performance that are capable of recognising and supporting the value of the knowledge created when different communities collaborate, explore, and compete to solve problems. This includes transparency in data gathering; sharing data, methodologies and tools openly; and enabling institutions and individuals to independently examine their own research performance data and to choose the indicators that they wish to be measured by.
The higher education and research landscape is peppered with a growing number of examples of grass-roots efforts to transform universities into institutions that both represent and include the diversity of the communities they serve; and to find better ways to mobilize the knowledge that universities create in order to maximize its positive impact. As digital technologies create new possibilities for the ways in which data can be gathered and shared, the scope for communities to become active partners in the research process is growing. Researchers and universities are experimenting with new approaches to working with communities to help shape research directions and questions, as well as the scholarship and education that research enables. Citizen science projects such as the University of Adelaide’s Echidna CSI initiative, which invites the public into the data-collection process, are one example (University of Adelaide, 2020). The ‘Noongarpedia’ project, a university-led, community-engaged effort to create Australia’s first Indigenous-language Wikipedia, is another (Tan, 2016). Attempts are also being made by universities, research funders, libraries, and researchers themselves to widen access to university research outputs through open access (OA) (Moore 2017); and to ensure that the benefits of research work are maximized through open data (Mangul et al. 2019).
However, the open knowledge aspirations and practices of research communities exist in tension with the ways in which university performance is measured and rewarded (Benneworth et al. 2019). In spite of a growing list of genuine, thoughtful, and often creative efforts to ensure that knowledge generated within universities benefits the widest possible communities, universities themselves continue to be assessed, ranked and evaluated according to narrow measures of performance that depend on limited data sets and ignore the value of open knowledge. The absence of evaluation frameworks capable of recognising or capturing open knowledge efforts within universities makes it difficult for university administrators to systematically support open knowledge across an institution; or to engage effectively with the challenges and opportunities that universities face in the context of rapid social and technological change.
The dominant role that international rankings now play in university policy makes it easy to imagine that rankings have always been with us. In fact, they are a relatively recent phenomenon. The three rankings that currently dominate the global rankings market emerged in 2003 and 2004. In 2003 Shanghai Jiao Tong University released a league table of world university performance, focusing on citations data. The release of the Shanghai ranking was quickly followed by the launch of the Times Higher Education-Quacquarelli Symonds (THE-QS) ranking in 2004, and by the widespread incorporation of international rankings as a touchstone in university strategy (Stergiou and Lessenich 2014). In 2009 the THE and QS rankings split into two separate rankings, and publication of the Shanghai ranking, the Academic Ranking of World Universities (ARWU), passed to an independent publisher.
The success of THE, QS, and ARWU rankings reflects the ability of these products to fill a genuine information gap: providing performance proxies that can help university management, policy-makers, students and communities to navigate a deluge of information in order to understand complex institutions – and to support decisions relating to national higher education and research policies, the investment of research dollars, and (in the case of students) the investment of the time and money needed to undertake a degree. Tools that help stakeholders to understand relative performance and change over time are especially needed. Universities have used rankings to satisfy public and government demands for transparency, and to drive strategy (Marginson 2014; Elliott 2017). Increasingly they have a strong impact on the decisions of prospective students, the major income source for most universities, in turn strengthening their hold on the imaginations of university (Gadd 2020) and national leadership (Saisana, d’Hombres & Saltelli 2011; Billaut, Bouyssou & Vincke 2010).
The extent to which stakeholders in universities have come to depend on rankings highlights a key problem with narrowly defined performance metrics. Rankings are not just a tool for the evaluation of university performance: rankings performance has real consequences for a university’s reputation and access to resources, including research and student income. As a result, rankings drive behaviour at both the level of the institution and the level of the individual researcher. In highly competitive landscapes characterized by scarcity of funding, jobs, and places within the top tier of league tables, the indicators that are measured become the focus, not just of university strategic plans, but also of the day-to-day research and knowledge-sharing practices of academic communities (Niles et al. 2020). The end result is a race to produce the narrow indicators of ‘performance’ and ‘excellence’ that rankings claim to measure, rather than to deliver the qualities that these indicators hope to capture.
Moore et al. (2017) make the ‘argument that “excellence” is not just unhelpful to realizing the goals of research and research communities but actively pernicious’, in part because ‘an emphasis on the performance of “excellence” as the criterion for the distribution of resources and opportunity will always be backwards looking’. Nowhere is this clearer than with respect to university rankings, where circular reinforcement effects mean that rankings are perceived as both a measure of how excellent a university is, but also as defining which universities are excellent.
Not only do rankings fail to measure the qualities of universities that actually matter (Billaut, Bouyssou & Vincke 2010; Huang 2012; Selten, Neylon, Huang & Groth 2020) but they also create their own narrative which drives behaviour. They influence the results of surveys that determine future rankings, driving future funding decisions and university conformity (Espeland & Sauder 2007; Marginson 2007). The rankings that currently dominate our higher education and research landscapes do not even attempt to measure qualities that we identify as important to open knowledge institutions. They tend to drive behaviour that reduces diversity (Stack 2020), can limit collaboration, and coerce institutions into protecting power, preventing access, and excluding those who might share in success.
Perhaps one of the greatest dangers of over-reliance on international rankings is the incomplete data that these rankings rely on. THE, QS and ARWU each depend on just a single source of bibliographic data to capture the publications produced by a university; and the reach and impact of these publications. THE and QS use Elsevier’s Scopus database as the source of this information; while ARWU uses the Clarivate Analytics-owned competitor database Web of Science (WoS). The Scopus and WoS databases each have limitations – perhaps most notably the poor coverage provided for the research outputs of scholars from Africa, Latin America and parts of Asia (Tennant 2020); as well as limited coverage of outputs in the humanities and social sciences. Scopus and WoS both focus on a select set of publication sources that favour English-language outputs and well-established publishers (Aksnes & Sivertsen 2019). In order to boost their rankings performance, many institutions actively encourage researchers to publish in the journals that are indexed by Scopus and WoS, perpetuating the dominance of such publications and creating structural disincentives for diversity and experimentation in scholarly communication practices (Tian, Su & Ru 2016; Vuong 2019).
Scholarship on digital literacies has largely focused on teaching and learning, and on the use of technology, digital and social media, and digital materials by young people and students (McDougall, Readman & Wilkinson 2018). Noting the divergence of digital literacy definitions according to context, media and discipline, Luciana Pangrazio (2016) argues for a critical digital literacy that engages with social, cultural and political issues, including visualisation and critical self-reflection (p. 171). Similarly, Nascimbeni (2018) identifies a critical dimension of digital literacy extending beyond functional and technological skills to question the “non-neutral nature” of digital tools, environments and data (p. 3). Increasing levels of cynicism and frustration among teaching and research staff with the tools and processes used to measure their performance, as well as a growing body of scholarship engaging with the problems of rankings and their consequences (Shahjahan, Blanco Ramirez & Andreotti 2017), suggest that this form of critical literacy is emerging, at least at some levels within universities.
In spite of this, many university staff lack a detailed understanding of what dominant rankings do and don’t measure. The need for transparency and accountability, and to demonstrate value for money in research and higher education, has increased the dependence of university administrators, research funders and researchers themselves on both rankings, and commercial products that draw on a narrow set of commercially operated bibliometric data sources. There is limited awareness of the prejudices and blind spots that these measurement tools create and perpetuate. Even when stakeholders are aware of the limitations of widely used commercial sources of data relating to research performance, the range of alternative tools or services capable of supporting strategic decision-making is limited.
However, it is possible to change the questions being asked of metrics systems. It is also possible to create new tools for understanding the performance of universities and to address the gaps in data that exclude whole areas of value or knowledge-making populations. Shifting this conversation demands new critical literacies that can help researchers to better understand the tools, data sets, and methodologies used to measure the performance of universities; as well as the agency that institutions and individuals have within this process. Doing so has the potential to help universities to create a stronger alignment between their own values and the pragmatic realities that drive behaviour across an institution. As organizations that employ critical humanities scholars, data scientists, and specialists in the scholarship of evaluation and strategic decision-making, universities are well positioned to support this change.
Curtin University’s decision to make a strategic research investment in a major interdisciplinary project focusing on the development of new theoretical frameworks and tools to support open knowledge is an example of a constructive intervention that is supporting the new critical literacies that are needed in this space. It is also a practical effort to shift conversations about the evaluation of universities in a more productive direction. The next section of this paper takes a deeper dive into the practical work that the Curtin Open Knowledge Initiative (COKI) is doing to support new approaches to understanding the performance of universities and to enable open knowledge institutions.
In 2018 Curtin University made a major strategic investment in the development of new research, tools and datasets capable of supporting the university’s vision for itself as ‘a beacon for positive change, embracing the challenges and opportunities of our times to advance understanding and change lives for the better’ (Curtin University 2020). COKI, the Curtin Open Knowledge Initiative, is a collaborative project located in the Centre for Culture and Technology, and supported by the Faculty of Humanities, the School of Media, Culture and Social Inquiry, and the Curtin Institute for Computation. With initial university funding of $2 million, COKI is combining critical humanities perspectives with data science and cloud-based computing approaches in order to capture and explore the data needed to help universities to maximize the positive impacts of the work that they do.
COKI is underpinned by the assertion that universities have the potential to act with principles of openness at their centre; and that doing so will be vital to their ability to engage effectively with the opportunities that new technologies present for making and sharing knowledge; as well as the capacity of universities to advance understanding and change lives for the better (Montgomery et al. 2020; in press). As the project’s first major output, Open Knowledge Institutions: Reinventing Universities (2020; in press), observes, the context in which universities operate is changing. Open and networked digital technologies, networked practices of knowledge production, and new possibilities for knowledge certification and sharing are challenging the privileged position of universities as sources of ‘expert’ knowledge.
In spite of this, universities remain powerful institutions, and important actors in the processes that shape global knowledge landscapes. In order to ensure their activities contribute to the common stock of human knowledge and create benefits for all of humanity, rather than a privileged few, universities must turn their attention to their performance as open knowledge institutions. Open knowledge institutions don’t just benefit from diversity and openness, they also have a responsibility to ensure that the principles of openness, transparency, and fair governance of shared resources are at the heart of all that they do.
COKI grew out of a need to critically address the methods of evaluation and judgments of university research output and performance using measures that are perceived as ‘objective’. This requires a multidisciplinary approach to critique and understand the underlying institutional pressures and constraints that world rankings have reinforced. While research may develop and expand in methods, material and scope, the academic structures and external evaluation bodies that encompass it, including governments, remain narrow and do not reward or encourage open research practices. Achieving institutional openness and diversity involves the coordination and communication of policies and programs, and collaboration within institutions.
Our initial goal within the Curtin Open Knowledge Initiative (COKI) was to identify what data and evidence were available that were relevant to the evaluation of open knowledge institutions. In Open Knowledge Institutions: Reinventing Universities (2020; in press), we identified three areas of activity and capacity that were important: communication, diversity and coordination (which incorporates aspects of public and societal impact). While there are limitations in the data that can be gathered across these areas, and many kinds of data differ across countries and institutions, the ultimate goal was to make an effort to touch on all three aspects.
Existing rankings and the data on which they are based cannot provide this information. New data sets, forms of analysis and tools are needed. COKI is developing these resources through intensive collaboration between data scientists, data analysts, humanities and social science scholars with knowledge and experience in statistics, networks, publishing and scholarly communication.
Starting with an examination of the data available on formal publications and open access (i.e. one aspect of communication) we are gradually expanding a data warehouse that is focused on the outputs of universities. This database includes trillions of data points on more than 100 million outputs from more than 20,000 institutions. While we have reason to believe that this may be among the largest of similar datasets it still has limitations in the coverage of geographies, types of outputs, and relevant indicators. In order to address these limitations, we are working to expand the scope of the COKI database. For instance, we are actively working towards better coverage of Latin American and African research outputs, of scholarly books, and of a wider range of indicators of usage and impact.
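The kind of aggregation this involves can be illustrated with a minimal sketch. The records, field names and open-access categories below are invented for illustration and do not reflect the actual COKI schema; the real pipeline operates over cloud-scale bibliographic data rather than in-memory lists.

```python
# Illustrative sketch only: the institutions, field names and OA
# categories here are hypothetical, not the COKI data model.
from collections import defaultdict

# Each record stands in for one research output, as it might look
# after merging bibliographic sources.
records = [
    {"institution": "University A", "oa_status": "gold"},
    {"institution": "University A", "oa_status": "closed"},
    {"institution": "University A", "oa_status": "green"},
    {"institution": "University B", "oa_status": "closed"},
]

def open_access_share(records):
    """Return, per institution, the proportion of outputs available
    in any open access form."""
    totals = defaultdict(int)
    open_counts = defaultdict(int)
    for r in records:
        totals[r["institution"]] += 1
        if r["oa_status"] != "closed":
            open_counts[r["institution"]] += 1
    return {inst: open_counts[inst] / totals[inst] for inst in totals}

print(open_access_share(records))
```

Even this toy version makes the dependence on coverage visible: an institution whose outputs are missing from the underlying records simply has no share to report, which is why expanding geographic and output-type coverage matters.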
Alongside this data we have critiqued and collected staff demographic data for institutions in the UK, Australia, Aotearoa New Zealand and South Africa, examining both the differences in the data available and means of combining it (Wilson et al. submitted 2020). This provides one window onto diversity. We have also collected information on the websites of 10,000 universities and are building a corpus of policy documents, mission statements and annual reports as a means of interrogating the narratives that universities generate about themselves. Finally, we are investigating a range of possible indicators of the openness of campuses, and their usage, including accessibility to public transport, walkability, and the degree to which unaffiliated members of the public can use library resources (Wilson et al. 2019).
This data is making it possible for the COKI team to identify relationships between national policy interventions, the demands of funders, and the extent to which university research outputs are available in open access (Huang et al. 2020). The research building on this data is shedding new light on the pathways that universities in different regions, or with different funding profiles, take towards more open approaches to scholarly communication.
In addition to scholarly research outputs and analysis, the Curtin Open Knowledge Initiative (COKI) is also developing practical tools that can help universities and other stakeholders to better understand their performance as open knowledge institutions. Interactive dashboards and automated progress reports that make it possible to explore data relating to open scholarship are an important first step in this process. The interactive dashboard tools present visual narratives and analysis of university research output performance and staff gender data (see Figure 1).
Critical to building useful tools is a user-centred design approach being applied across all of the reporting tools that COKI is developing. One example of this process is the development of a dashboard focused on the open-access reporting needs of Australian and New Zealand university libraries. In 2020, COKI undertook a review of dashboards developed for universities in Aotearoa New Zealand and Australia (Curtin University Human Ethics approval HRE2020-0086). We invited members of CAUL (Council of Australian University Librarians) and CONZUL (Council of New Zealand University Librarians) via email to register for access to the dashboard, to review their institution’s data, and to participate in an anonymous online survey with an optional follow up telephone interview. Survey participants provided feedback on the COKI data and the dashboard functionality for their institutions. Respondents noted the usefulness of the dashboard’s ability to provide a new perspective on existing data, as well as access to new or previously unavailable data in a visual format.
The Curtin Open Knowledge Initiative (COKI) identifies a need for institutional cultural change. But the project also recognizes that this change is complex. It necessarily involves engagement from university leadership, in the context of a wide range of demands on their time and attention. It therefore needs the support and expertise of functional leads, such as heads of libraries, or of digital services, as well as that of academic and other staff. All of these differing groups have different information needs, expertise and motivations.
Key to the approach COKI is developing is a coordinated platform that provides information in different forms, with different narratives appropriate to various stakeholders, in a way that aims for connection and coordination. At the level of university leadership this might be represented by a single number, bullet point or graph that can grab their attention and identify options for action. That single point needs to be supported by connected data, evidence and tools that will guide those who become responsible for that action at the level of planning, implementing and evaluating progress. If we are to achieve an institutional literacy, then we must support the growth of that literacy across the set of relevant institutional stakeholders and at all the relevant levels of granularity.
Providing relevant and usable information to those stakeholders who are already well informed about these issues is an important first step. The dashboards and reports described above, intended for those within libraries and digital services who are engaged with issues of open access, are one example. In these cases, information and evidence need to be well summarized and organized, but can also be detailed and nuanced. The detail and the caveats are important to these audiences.
There are potentially new audiences within the university at these levels in areas such as technology transfer, community and government relations, and human resources. Many of these groups already collect data, some of it detailed and sophisticated. Here our goal is to connect their existing information-gathering and strategy activities to the broader body of information and evidence. These groups are users of information and information products, but also potential sources of information. One example of this is our inclusion of the staff demographics data within the open access dashboards. While we do not currently have evidence of direct links between open access and staff diversity, our goal is to raise the question of what linkages there might be, and to signal that these are both part of a bigger narrative around open knowledge and the place of the university in society (see Figure 2).
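Placing independently collected indicators side by side, as the dashboards do, can be sketched as follows. The institutions, figures and function names are invented for illustration; the point is that juxtaposing indicators invites questions about linkage without asserting any causal relationship.

```python
# Illustrative sketch with invented figures and a hypothetical helper.
# Each dictionary is an independently collected indicator set,
# keyed by institution name.
oa_share = {"University A": 0.67, "University B": 0.25}
women_staff_share = {"University A": 0.48, "University B": 0.41}

def combine_indicators(*named_indicators):
    """Join (name, values) indicator pairs on institution name,
    keeping only institutions present in every set."""
    common = set.intersection(*(set(d) for _, d in named_indicators))
    return {
        inst: {name: d[inst] for name, d in named_indicators}
        for inst in sorted(common)
    }

table = combine_indicators(("oa_share", oa_share),
                           ("women_staff_share", women_staff_share))
for inst, row in table.items():
    print(inst, row)
```

Restricting the join to institutions present in every set mirrors a real constraint: demographic data is only available for some countries, so combined views necessarily cover a subset of the publication data.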
While these information products can support the existing advocates of open approaches within universities, we have also identified a clear need for ‘permission to act’ at all levels. In the case of heads of function this often means that the leadership they report to needs to see open practice as a strategic priority. In other cases, it may be the need to justify resource allocation. In both cases this requires gaining the attention of leadership with limited time. Doing this successfully requires a mix of messages, on areas of strength and opportunity as well as challenges and risks. It also requires succinct, targeted and clear messaging with defined points of action. Perhaps most importantly, it requires information of a type and form that is immediately comprehensible and digestible. That is, it must match the expectations of leadership for format and context.
Ultimately, this returns us to rankings. Rankings and summarized performance metrics are among the core information resources that university leadership work with. Comparisons, dashboards and trend analysis are key. A ranking on which a university performs well is a cause for celebration; one on which a university performs less well is an area of concern or an opportunity for improvement. The issue here is focusing attention on aspects of performance as an open knowledge institution. Summaries will be imperfect, and indeed are at odds with the centrality of diversity as a core component of an open knowledge platform. However, we will not achieve an in-depth literacy on these issues before the platform gains attention.
The goal of the Curtin Open Knowledge Initiative (COKI) is to build a flexible information architecture that can support the growth of a collective and institutional literacy on open knowledge practices. Strong narratives, supported by transparent and reproducible evidence, can be shaped into differing products for different audiences. If these products are connected and built effectively, then one can offer an entry point to another. If one simply garners attention, then another can be used to focus that attention on relevant actions. Some of these products will be simplified and lacking the full depth of nuance in the underlying evidence. Ensuring that simplified perspectives on complex data are grounded in evidence, that connections between headline information and underlying data are visible, and that underlying data is accessible, will be vital to supporting critical data literacies and more nuanced conversations about what data means.
All of these resources are products that may be read. But digital literacy and agency are more than consumption. At the core of our idea of an Open Knowledge Institution is action by diverse communities, coordinated to create knowledge that can be effectively communicated. At the core of our information products is the data warehouse itself, composed (as far as is possible) of open data in an environment where it can be directly interrogated by expert users. This enables the generation of new information products by community members and experts in specific contexts, allowing data to be translated into formats specific to the focus and needs of diverse audiences and communities.
Literacy in the sense that we mean it, as with culture, is necessarily a characteristic of communities, not individuals (Hartley & Potts, 2014). It is not simply a capacity to absorb but the agency to interact and make meaning collectively, again a concept at the core of our idea of open knowledge. This requires shared constraints on information structures and meanings, but also the flexibility to build anew. A data warehouse without constraints or structure is a meaningless jumble. But all structure constrains the meanings that can be constructed. Our goal is therefore a grammar, not a vocabulary, but we must take care in our work that the grammar constrains what can be said as little as possible.
There is, therefore, a separation between the data warehouse and the information resources that COKI is creating. The latter are entry points, structured to enable connection between them, and as a result constrained in the narratives that they support. The data warehouse is much more flexible, supporting a wider range of narratives and analyses. The challenge lies in maintaining both a strong connection between them and the separation of concerns that provides that flexibility. This, in the end, mirrors the challenges of the university: providing a flexible platform for communities to make many forms of knowledge, while simultaneously supporting and coordinating efforts to make those knowledge products mutually comprehensible.
Our existing frameworks, built on exclusive and proprietary data that cannot be reshaped and filtered in new ways, and delivered through third-party rankings that drive behaviour and tactics at odds with the societal missions of universities, set a strikingly low bar to improve on. They neither support literacy (because literacy would reveal their shortcomings) nor allow the construction of new narratives (because that is at odds with their business models). We can do substantially better.
The Curtin Open Knowledge Initiative (COKI) has identified the need for open and critical digital literacies among university researchers and administrators whose institutional research output is measured and judged by world university rankings. It critiques the production and communication of data relating to university research performance and the impact of ranking evaluations. The project investigates data inclusions and exclusions, and builds open datasets from multiple publicly available data sources. This process requires examining the inequalities, knowledge gaps and skill gaps within the higher education rankings and metrics datascape. We are developing tools to enable researchers, university executives and administrators to interrogate, evaluate and understand their research performance and their open knowledge achievements, individually and collectively. Using the COKI tools and comprehensive datasets, universities can build the digital literacies and capabilities needed to make independent assessments, ask questions about rankings, and contribute to knowledge-making and sharing processes.
Our goal is to build tools and systems that enable the creation of new narratives and new modes of evaluation. However, we also need to meet time-poor and resource-challenged leadership where they are, and to make the case for the strategic importance of open knowledge within higher education and research institutions. We are tackling this through the creation of new data resources and coherent information products that speak to these different audiences. The opportunity is before us to reshape the way we think about the role of the university in society and to build a new literacy and culture throughout institutions that supports that change.
The authors acknowledge the financial assistance of Tencent Research in the preparation of this article.
Aksnes, DW and Sivertsen, G. 2019. A criteria-based assessment of the coverage of Scopus and Web of Science. Journal of Data and Information Science, 4(1): 1–21. DOI: https://doi.org/10.2478/jdis-2019-0001
Benneworth, P, Olmos Peñuela, J, Montgomery, L, Neylon, C, Hartley, J and Wilson, K. 2019. The ‘open’ university as a transformer of public service ideals. In: Humanities and Higher Education: Generating synergies between Science, Technology and Humanities, Higher Education in the World 7. Girona, Spain: Global University Network for Innovation, pp. 453–461.
Billaut, J-C, Bouyssou, D and Vincke, P. 2010. Should you believe in the Shanghai Ranking? Scientometrics, 84(1): 237–263. DOI: https://doi.org/10.1007/s11192-009-0115-x
Chan, T, Fung, M and Chang, N. 2016. The role of universities, the rise of rankings, and internationalization. In: Liu, NC, Cheng, Y and Wang, Q (eds.), Matching visibility and performance: A standing challenge for world-class universities. Global perspectives on higher education. Rotterdam: Sense Publishers. DOI: https://doi.org/10.1007/978-94-6300-773-3_14
COKI. n.d. COKI – Home. Curtin Open Knowledge Initiative. Available at http://openknowledge.community [Last accessed 25 May 2020].
Curtin University. 2020. Values, Vision, Strategy. Available at https://about.curtin.edu.au/values-vision-strategy [Last accessed 5 February 2020].
Department of Education and Training. 2019. Higher education statistics: Staff data. Canberra: Commonwealth of Australia. Available at https://www.education.gov.au/staff-data.
Elliott, JE. 2017. Prestige auditing and the market for academic esteem: A framework and an appeal. Prometheus, 35(1): 57–73. DOI: https://doi.org/10.1080/08109028.2017.1366018
Espeland, WN and Sauder, M. 2007. Rankings and reactivity: How public measures recreate social worlds. American Journal of Sociology, 113(1): 1–40. DOI: https://doi.org/10.1086/517897
Gadd, E. 2020. Wrong question? The Bibliomagician, 16 January 2020. Available at https://thebibliomagician.wordpress.com/2020/01/16/wrong-question/.
Huang, C-K, Neylon, C, Hosking, R, Montgomery, L, Wilson, KS, Ozaygen, A and Brookes-Kenworthy, C. 2020. Meta-Research: Evaluating the impact of open access policies on research institutions. eLife, 9: e57067. DOI: https://doi.org/10.7554/eLife.57067
Huang, M-H. 2012. Opening the black box of QS world university rankings. Research Evaluation, 21(1): 71–78. DOI: https://doi.org/10.1093/reseval/rvr003
Mangul, S, Martin, LS, Langmead, B, Sanchez-Galan, JE, Toma, I, Hormozdiari, F, Pevzner, P and Eskin, E. 2019. How bioinformatics and open data can boost basic science in countries and universities with limited resources. Nature Biotechnology, 37: 324–326. DOI: https://doi.org/10.1038/s41587-019-0053-y
Marginson, S. 2007. Global university rankings: Implications in general and for Australia. Journal of Higher Education Policy and Management, 29(2): 131–42. DOI: https://doi.org/10.1080/13600800701351660
Marginson, S. 2014. University rankings and social science. European Journal of Education, 49(1): 45–59. DOI: https://doi.org/10.1111/ejed.12061
McDougall, J, Readman, M and Wilkinson, P. 2018. The uses of (digital) literacy. Learning, Media and Technology, 43(3): 263–279. DOI: https://doi.org/10.1080/17439884.2018.1462206
Moore, SA. 2017. A genealogy of open access: Negotiations between openness and access to research. Revue Française des Sciences de l’information et de la Communication, 11. DOI: https://doi.org/10.4000/rfsic.3220
Moore, S, Neylon, C, Eve, MP, O’Donnell, DP and Pattinson, D. 2017. ‘Excellence R Us’: University research and the fetishisation of excellence. Palgrave Communications, 3(1): 1–13. DOI: https://doi.org/10.1057/palcomms.2016.105
Nascimbeni, F. 2018. Rethinking digital literacy for teachers in open and participatory societies. International Journal of Digital Literacy and Digital Competence, 9(3): 1–11. DOI: https://doi.org/10.4018/IJDLDC.2018070101
Niles, MT, Schimanski, LA, McKiernan, EC and Alperin, JP. 2020. Why we publish where we do: Faculty publishing values and their relationship to review, promotion and tenure expectations. PLoS ONE, 15(3): e0228914. DOI: https://doi.org/10.1371/journal.pone.0228914
Pangrazio, L. 2016. Reconceptualising critical digital literacy. Discourse: Studies in the Cultural Politics of Education, 37(2): 163–74. DOI: https://doi.org/10.1080/01596306.2014.942836
Saisana, M, d’Hombres, B and Saltelli, A. 2011. Rickety numbers: Volatility of university rankings and policy implications. Research Policy, 40(1): 165–177. DOI: https://doi.org/10.1016/j.respol.2010.09.003
Selten, F, Neylon, C, Huang, C-K and Groth, P. 2020. A longitudinal analysis of university rankings. Quantitative Science Studies, 1–28. DOI: https://doi.org/10.1162/qss_a_00052
Shahjahan, RA, Blanco Ramirez, G and de Oliveira Andreotti, V. 2017. Attempting to imagine the unimaginable: A decolonial reading of global university rankings. Comparative Education Review, 61(S1): S51–S73. DOI: https://doi.org/10.1086/690457
Stack, M. 2020. Academic stars and university rankings in higher education: impacts on policy and practice. Policy Reviews in Higher Education, 4(1): 4–24. DOI: https://doi.org/10.1080/23322969.2019.1667859
Stergiou, KI and Lessenich, S. 2014. On impact factors and university rankings: From birth to boycott. Ethics in Science and Environmental Politics, 13(2): 101–111. DOI: https://doi.org/10.3354/esep00141
Tan, M. 2016. Introducing “Noongarpedia” – Australia’s first Indigenous Wikipedia. Our Languages, 2 September 2016. Available at http://ourlanguages.org.au/introducing-noongarpedia-australias-first-indigenous-wikipedia [Last accessed 18 July 2020].
Tennant, J. 2020. Web of Science and Scopus are not global databases of knowledge. SocArXiv. DOI: https://doi.org/10.31235/osf.io/qhvgr
Tian, M, Su, Y and Ru, X. 2016. Perish or publish in China: Pressures on young Chinese scholars to publish in internationally indexed journals. Publications, 4(2): 9. DOI: https://doi.org/10.3390/publications4020009
University of Adelaide. 2020. Support research: Can you help Echidna CSI? Environment Institute. Available at http://www.adelaide.edu.au/environment/give/echidna-csi.
Vuong, Q-H. 2019. The harsh world of publishing in emerging regions and implications for editors and publishers: The case of Vietnam. Learned Publishing, 32(4): 314–324. DOI: https://doi.org/10.1002/leap.1255
Wilson, K, Neylon, C, Montgomery, L and Huang, C-K. 2019. Access to academic libraries: An indicator of openness? Information Research, 24(1): paper 809. Available at http://InformationR.net/ir/24-1/paper809.html [Last accessed 18 July 2020]. Archived by WebCite at http://www.webcitation.org/76tOSpfrn.