
ber of researchers, and a low relative number of registered publications can be identified. A further interpretation of these gaps will not be possible at this point, because it could mean either a) that these researchers publish little, or b) that WOS covers their publications badly. The picture that evolves from my study derives from the same data that almost any bibliometric analysis is based on, so it makes a great difference for the assessment of the soundness of mainstream bibliometrics to what extent b) is true. To that end, the results from this mainstream-scientometric study will be revisited in Chapter 6.

3.6.1 Scientometrics of SSH

Analyses of the SSH can to some extent be considered a peripheral part of scientometrics, since they are not part of the scientometric core identity. Scientometric indicators were originally developed to analyse publications in STEM. Therefore, mainstream scientometric methods and data sources are problematic for analysing SSH publications (see e. g. Aksnes and Sivertsen 2019). They should not be adopted without considering the effects of different publication and citation behaviour, both between the SSH and STEM and between different SSH disciplines (Hicks 1999; Moed, Luwel et al. 2002; Hicks 2004; Nederhof 2006; Hammarfelt 2016). For instance, STEM bibliometrics almost never include monographs and book chapters, while for the SSH, they cannot be ignored. Hammarfelt (2012, p. 31) conducted a meta-study on SSH citation data from 1995–2005 and found that the share of book publications in reference lists ranges from 45% in information research to 88% in studies of religion. Engels, Istenič Starčič et al. (2018) could not find any clear trend, at least for European countries, that the status of book publishing is decreasing.

For older citation analysis studies in the humanities (1951–2010), Ardanuy (2013) established that only 22% made use of mainstream bibliometric databases, while the remaining ones mostly collected data through citation chaining.⁹⁶ More recently, the collection of full publication records on

96 For whatever reason, there is more methodological discussion about the scientometrics of the humanities than of the social sciences.

institutional or even national level in CRIS became an ideal data source for studies limited to those levels (see e. g. Sivertsen 2016). In general, because scientometric research interests in the SSH lean more towards the mapping of disciplines or topics than towards assessing quantities of production, few studies with a broad scope are available (cf. Franssen and Wouters 2019). One exception is the global study on SSCI (WOS Social Sciences Citation Index) data by Mosbah-Natanson and Gingras (2014), who found that over the period 1980–2009, North America and Europe were the unchallenged ‘centres’ of the social sciences, and that even though more authors from other regions appear in the index, they tend to cite ‘central’ research.

Zincke (2014), whose study of Chilean social sciences I will introduce in Section 4.1, even found that ‘using the SSCI database, the citations from Latin American authors to other Latin Americans have been declining’.

However, a centralisation effect naturally appears with larger national research communities, because their communication is more inward-looking than the communication of smaller research communities. Large structures also attract more attention globally (Danell 2013). Yet Danell also found that this tendency is somewhat levelling out internationally. A cartogram visualising all publications indexed in Scopus, comparing 2007 with 2017, also shows that global indexing proportions are stable.⁹⁷

One of the major problems with the use of mainstream databases for scientometrics of the SSH is the clear language bias found there, most evidently in WOS (van Leeuwen, Moed et al. 2001; Archambault, Vignola-Gagné et al. 2006; van Raan, van Leeuwen et al. 2011; van Leeuwen 2013). Since the SSH frequently deal with phenomena that are strongly connected to local cultures, local languages are often used to describe them. Yet a drift towards publishing in English and in WOS-indexed journals is recognisable in the publication behaviour of European researchers (Kulczycki, Engels et al. 2018; Guns, Eykens et al. 2019). At least for those social sciences authors


97 Alperin, Juan Pablo, and Rodrigo Costas, World Scaled by Number of Documents, http://scholcommlab.ca/cartogram, visited on 29 June 2020. The cartogram for the number of publications as a proportion of the population in 2017 is displayed on the cover of this book.

who publish in WOS-indexed journals, in different ‘Global North’ countries, publication and citation patterns are similar (van Leeuwen 2006).

Since my study of SSH WOS indexing on a global scale does not intend to analyse the SSH, but rather WOS indexing, this short overview solely served to emphasise that the SSCI and the AHCI (Arts & Humanities Citation Index), both included in WOS, are not accepted uncritically by the scientometric community. However, the indexes are still used a lot, and therefore feed back on the SSH, and on the reproduction of splits and trenches that follow geopolitical lines.

3.6.2 Global Basic SSH in the Web of Science

The data for the following analysis was derived from the UNESCO Institute for Statistics⁹⁸ and from WOS (see the query in Appendix A), and limited to the publication years 2007 to 2016. I queried the SSCI and AHCI as well as the Conference Paper, Book, and Emerging Sources Citation Indexes to get the broadest coverage of basic SSH indexed in WOS. Beyond the WOS collections I queried, there are Regional Citation Indexes available, ingesting data from SciELO as well as from indexes compiled in China, South Korea, and Russia. However, this data is not as rich as data from the Core Collection, and the filters I applied to my query are not available for it, so I decided to set those collections aside. Since I focus on basic research, I excluded all WOS ‘category terms’ which, to me, denoted rather applied or natural sciences. The search result of 1,366,280 records was not edited in any way, so there might be duplicates and errors in this data. After the critical analysis of the WOS indexes by Tüür-Fröhlich (2016), it can be taken as a given that each search in WOS returns highly defective results. I am therefore convinced that publication counts derived from this data can only be read as a vague relative indicator of publication activity directed towards WOS-indexed journals, while the absolute numbers mean very little. To make that visible in the text, I never state the absolute publication record count for any country, and my indicator for the relative numbers will be rounded to two decimal places.

98 See http://data.uis.unesco.org, visited on 11 April 2017.

Further, co-authored publications count as one full publication for every country represented among the authors. For some countries, the combination of researcher counts and publication data seems implausible, but I decided not to look deeply into these single cases because I wanted to treat every record equally. It is simply not feasible to review and correct the data.
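
The counting rule described above, where each country appearing among a record’s author affiliations receives one full credit, can be sketched as follows. This is a minimal illustration with made-up country codes, not the WOS export format or the study’s actual pipeline:

```python
from collections import Counter

def full_counting(records: list[list[str]]) -> Counter:
    """Whole counting on the country level: every country appearing among a
    record's author affiliations receives one full publication credit."""
    counts: Counter = Counter()
    for author_countries in records:
        # A country is credited once per record, however many
        # of its authors are involved.
        for country in set(author_countries):
            counts[country] += 1
    return counts

# One single-authored and one internationally co-authored record:
records = [["DE"], ["DE", "FR", "FR"]]
print(full_counting(records))  # Counter({'DE': 2, 'FR': 1})
```

Note that this ‘whole counting’ inflates world totals for internationally co-authored papers, which is one reason the absolute counts should not be over-interpreted.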

Recent demographic data was not available for every country, so I decided to prefer data from 2012, or the year closest to it, with more recent data being preferred if there was a tie. 2012 lies exactly between 2007 and 2016, the year range chosen for the WOS publication data collection. The UNESCO Institute for Statistics does not provide detailed descriptions of the data, and it can be expected that countries deliver data which correspond to divergent phenomena in reality (also see Keim 2008).
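
The year-selection rule can be made explicit in a small sketch (the function name and interface are mine, not from the study):

```python
def pick_reference_year(available_years: list[int], target: int = 2012) -> int:
    """Choose the year closest to the target; on a tie in distance,
    prefer the more recent year."""
    return min(available_years, key=lambda y: (abs(y - target), -y))

print(pick_reference_year([2009, 2014]))        # 2014 (distance 2 beats 3)
print(pick_reference_year([2010, 2014]))        # 2014 (tie broken towards recency)
print(pick_reference_year([2011, 2012, 2013]))  # 2012
```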

In the UN Statistical Yearbook (2017, p. 475), the following description is given for the researcher counts:

Data for certain countries are provided to UNESCO by OECD, Eurostat and the Latin-American Network on Science and Technology Indicators (RICYT). The definitions and classifications applied by UNESCO in the table are based on those set out in the Frascati Manual (OECD 2002).

Further, professionals are considered researchers if they are ‘engaged in the conception or creation of new knowledge, products, processes, methods and systems and also in the management of the projects concerned’ (UN 2017, p. 475). This includes postgraduate students at PhD level.

Inconsistencies in the UNESCO data became apparent when I discovered that the year chosen is very relevant for the results: for Russia, the SSH HEI head count (HC) doubled within four years (2009–2012); for Malaysia, it doubled within three years (2010–2013); and for Thailand, it increased from 16,000 in 2011 to 24,000 in 2014. According to the data, six times as many researchers were employed in Bosnia in 2014 compared to the previous year, and in Ethiopia, eight times as many in 2007 compared to 2013. For Iran, the number almost doubled every other year between 2004 and 2008, to more than 21,000. For Kazakhstan, the growth rate was 160% between 2011 and 2013; and for Kyrgyzstan, almost four times as many researchers were counted in 2014 as in 2011. However, global demographic data

always has to be taken with a grain of salt, since there is no single agency which creates it. Therefore, lacking any alternatives, caution is demanded when interpreting the results.

For a number of countries, UNESCO did not provide head counts for SSH researchers working at HEI, which is the indicator for which UNESCO provides researcher counts for most countries. It is also my preferred indicator, because for a part-time SSH researcher at HEI, it is rather atypical to spend working hours outside of this position on something that is totally unrelated to research, except when caring for relatives; publications are not necessarily written during paid hours.⁹⁹ If there were no head counts for HEI SSH, I collected data for these countries with the following preferences:

1. full-time equivalents (FTE) at HEI in the SSH,
2. head counts in the SSH,
3. FTE in the SSH,
4. head counts at HEI, no disciplinary limitation,
5. FTE at HEI, no disciplinary limitation,
6. FTE, limited to neither discipline nor institution type.

As long as the data were distinguished by disciplinary fields, I included counts of researchers working in institutions other than HEI, and FTE instead of head counts, simply because in most countries, SSH researchers typically work at HEI. I accept that comparability is not fully given because of this decision. Furthermore, if information about the discipline was missing, I had to estimate the SSH share. Since for 98 countries the head counts for SSH at HEI as well as the head counts for science, technology, engineering and medicine (STEM) at HEI were available, I could calculate the mean ratio with the help of 476 data sets from the years 2005–2014. In conclusion, globally, there are 1.88 STEM researchers for every SSH researcher. This ratio was then used to estimate the head counts for SSH at

99 In Africa, many researchers are, besides their position at a university, consultants; see e. g. Cloete, Maassen et al. 2015; Kell and Czerniewicz 2016; Maassen 2012b; Mouton 2010; Zeleza 2002. However, for Africa, the literature does not indicate that this type of research leads to a publication output.

HEI for 21 countries (see Appendix B, Tables 18 and 19). For the Southeast African countries Rwanda, Tanzania and Zambia, I estimated the numbers according to the head counts for SSH at HEI of other Southeast African countries, because higher education development roughly follows similar lines in the whole region; I will continue to work with these estimations in Chapter 4. As with the WOS publication data, the initial data as well as my estimations have to be taken with a grain of salt, so I decided to round the counts to hundreds, or to tens if there were fewer than one hundred researchers, when I mention them in the text.
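
One plausible reading of this ratio-based estimation can be sketched as follows; the figures are illustrative, and the function names are mine. The mean STEM/SSH ratio is computed over country-year data sets where both head counts are known, and a head count without disciplinary limitation is then split into an SSH share:

```python
def mean_stem_ssh_ratio(pairs: list[tuple[float, float]]) -> float:
    """Mean ratio of STEM to SSH researchers over (stem, ssh) head-count
    pairs from country-year data sets where both are known."""
    ratios = [stem / ssh for stem, ssh in pairs]
    return sum(ratios) / len(ratios)

def estimate_ssh_share(total_researchers: float, ratio: float) -> float:
    """Split a head count without disciplinary limitation into an SSH share,
    assuming the global STEM:SSH ratio holds for the country."""
    return total_researchers / (1 + ratio)

# Illustrative: with the study's global ratio of 1.88 STEM researchers
# per SSH researcher, a total head count of 28,800 yields:
print(round(estimate_ssh_share(28_800, 1.88)))  # 10000
```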

For 30 countries with a population larger than one million, UNESCO did not provide researcher counts at all; in some cases, a note said ‘magnitude nil or negligible’. Countries with smaller populations and unavailable data were excluded. For eight of these countries, the WOS SSH publication count 2007–2016 was higher than one hundred (see Appendix B, Table 16). Since it is unlikely that this could be achieved without any researchers present in the country, I estimated the number of SSH researchers with the help of the UN Human Development Index 2016 (HDI). I decided to use the HDI, and not e. g. only the education index, which is one of the three sub-indexes from which the HDI is calculated, because health and a decent living standard (the other two indexes) are also relevant preconditions for doing research.

For each of the 30 countries lacking researcher counts, the HDI was consulted to identify the neighbouring countries in the index. Some ranks are shared by several countries, and some ranks are left unoccupied to compensate for the shared ones. I took into consideration all countries positioned closest to the country without researcher counts, in both directions, including those it shares a rank with, but excluding those for which I do not possess the necessary data. I made sure to base my estimation on at least two other countries. The eight countries for which the SSH researcher count and related indicators have been estimated according to this procedure are flagged with a double asterisk in the charts.
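
The neighbour-selection step can be sketched as follows. This is a simplification that ignores shared ranks; `hdi_ranked` is a hypothetical list of country codes sorted by HDI rank, and `has_data` marks countries with known researcher counts:

```python
def hdi_neighbours(country: str, hdi_ranked: list[str],
                   has_data: set[str], minimum: int = 2) -> list[str]:
    """Collect the countries positioned closest to `country` in the HDI
    ranking, in both directions, skipping those without researcher data,
    until at least `minimum` reference countries are found."""
    i = hdi_ranked.index(country)
    neighbours: list[str] = []
    offset = 1
    while len(neighbours) < minimum and (i - offset >= 0
                                         or i + offset < len(hdi_ranked)):
        for j in (i - offset, i + offset):
            if 0 <= j < len(hdi_ranked) and hdi_ranked[j] in has_data:
                neighbours.append(hdi_ranked[j])
        offset += 1
    return neighbours

ranked = ["AA", "BB", "CC", "DD", "EE"]
# "DD" lacks data, so the search widens until two references are found:
print(hdi_neighbours("CC", ranked, has_data={"AA", "BB", "EE"}))
# ['BB', 'AA', 'EE']
```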

For the remaining 22 countries, the ratio x of one SSH researcher to n inhabitants has been set so that n equals the population count (see Appendix B, Table 17). The large majority of these countries are low-income countries and/or are involved in conflicts. This serves the purpose of representing the inhabitants of these countries, and their non-representation in research, in my analyses. To make visible that these are estimations without serious grounds, these countries are not described in the text for the following analysis, and are starred in the charts.

In 2012, all 161 included countries together had a total population of approx. 7,030,510,000, so roughly 27,000,000 world inhabitants, presumably living in states with a population lower than one million, are not represented by the researchers of their home country in my study. The total estimated number of SSH researchers working at HEI in all included countries is 1,290,000.

3.6.3 The Size of the SSH Workforce vs. Web of Science Publications

In Figures 2 to 7, the country with the most SSH researchers relative to the population ‘ranks’ first, reflected by a low ratio x of one researcher per n inhabitants. The top-ranked country then is Iceland.

The first 26 countries listed in Figure 2 include mostly European countries, but also Oceania, Canada, Tunisia, and Japan. The bars show the number of publications in basic SSH research 2007–2016 registered in WOS, relative to the number of inhabitants. In the description, I will also mention total WOS SSH publication counts, which will again play a role in Figures 8 to 10, but for the ranking diagrams, relative publication counts are in focus. To attain an indicator that is manageable on a diagram’s scale, for each country, I first multiplied the total number of WOS publications by 100,000 and then divided the result by the number of inhabitants.

For example, the four countries with the highest relative number of SSH publications in WOS are Australia, Iceland, New Zealand, and Norway. The indicator yields values between 280 and 244 for these countries. For Australia, an indicator of 280 means 280 WOS-registered SSH publications per 100,000 Australians, or, to make it graspable, roughly one SSH publication registered in WOS per 360 Australians within ten years.
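
The indicator calculation can be sketched as follows; the input figures are illustrative (chosen so that the indicator comes out at 280), not the study’s actual counts:

```python
def wos_indicator(publications: int, population: int) -> float:
    """Relative publication indicator: WOS publications per 100,000
    inhabitants, with publications counted over the whole 2007-2016 window."""
    return publications * 100_000 / population

def inhabitants_per_publication(publications: int, population: int) -> float:
    """Reciprocal reading: how many inhabitants 'share' one WOS publication."""
    return population / publications

# Illustrative figures: a country of 23 million inhabitants
# with 64,400 indexed SSH publications over ten years.
print(round(wos_indicator(64_400, 23_000_000), 2))             # 280.0
print(round(inhabitants_per_publication(64_400, 23_000_000)))  # 357
```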

Figure 2. The first in a series of six diagrams, each displaying a group of 26–28 countries in a ranking according to the ratio of one SSH researcher per n inhabitants (the line, right scale). The bars (left scale) display the number of publications in WOS, 2007–2016, relative to the population.

For four countries in the first group, Tunisia, Japan, Poland, and Montenegro, the publication bar is quite low, although their relative number of SSH researchers is comparable with the other countries in this group, and also with the countries mentioned before as examples with the highest relative WOS publication output. The examples of Tunisia, Japan, Poland, and Montenegro make apparent that the two indicators displayed here only somewhat correlate. However, compared to the countries well represented in WOS, which are complemented by other countries predominantly found in Northern and Western Europe, it is worth mentioning that countries with a very high total number of publications registered in WOS, such as the UK and Germany, are in sync with many Eastern and Southern European countries.

Naturally, the size of a country, in terms of inhabitants, at least for the first, but, with limitations, also for the second group, makes a huge difference: in relative terms, local research facilities and staff have a similar extent,

Figure 3. Second group of countries, see Figure 2 for a description.

and publication output is often correlated to that. However, this results in ‘large countries’ having a large total number of publications in WOS, and therefore they dominate search results, while the relative productivity of ‘small’ countries is similar in many cases, but less noticed.

It becomes more visible in Figure 3 that the four countries with rather low relative publication counts despite high researcher numbers in group one are not exceptions. In this second group, 16 countries have a WOS indicator lower than 50, amongst them, from low to high, Morocco, Egypt, Bulgaria, South Korea and Turkey. Turkey is an interesting case, because it has a high total number of WOS publications, but a very large population.

In relative numbers, it compares well with Hungary, on the higher end, and South Korea, on the lower end.

In this second group of countries, all continents are represented: besides many European countries, Argentina is the top-ranked Latin American country, and Senegal ‘leads’ sub-Saharan Africa in terms of the researchers/inhabitants ratio. I would also like to point out the position of the United States: while having, by far, the largest total number of publications in

Figure 4. For the third group, shown here, and the fourth group, a scale ten times less detailed than for the first two groups is suitable to display the ratio between researchers and inhabitants.

WOS, its relative number of researchers (estimated, see Appendix B, Tables 18 and 19) is, compared to the first group, rather low, and its rank by relative publication count is only 18, just before Singapore.

For groups three and four (see Figures 4 and 5), the scale of researcher counts was zoomed in by a factor of ten, and the same magnification is used once more for the publication scale from group four on, resulting in a zoom factor of 100. Despite the different scales, jumps in the data visualisation are relatively smooth. In group three, some countries stand out with a high relative number of SSH basic research publications: first of all, as mentioned before, Singapore, but also Romania, Chile, Qatar and South Africa. Russia is, in terms of relative WOS publications, positioned exactly between Ecuador and Pakistan.

In the fourth group, four Southeast African countries appear: Zimbabwe has the most researchers per inhabitant in the region, and a respectable publication indicator of three, just a little lower than those of Thailand,

Figure 5. For group four, here, and the following groups, the scale for displaying the relative number of publications had to be zoomed in by a factor of ten.

Figure 6. In the fifth group, the scale for the researchers/inhabitants ratio was again zoomed in by a factor of ten.

Figure 7. In the sixth and last group, many publication bars are almost invisible, but some stick out. The researcher ratio is mostly based on estimations, and the scale was adapted again, by a factor of 23.

Moldova, or Venezuela in group three. Mauritius, another Southeast African country, has, after Jordan, the second-highest publication indicator of this group. Kenya and Madagascar are also found in this group. While having almost the same relative share of researchers as Mauritius, China has considerably fewer publications registered, in relative terms. However, according to the total number of WOS publications, China would rank fourth, between Canada and Australia, both in group one. China has only one SSH researcher working at HEI per 19,400 inhabitants, a lower relative number than Sudan’s, but 70,200 SSH researchers at HEI in total, while Sudan has only 2,100. Puerto Rico, also found in group four, is the only country on the entire list that does not feature even one SSH basic research publication in WOS.

In the fifth group (see Figure 6), with a researcher/inhabitants scale ten times less detailed, Botswana dominates the picture. In groups four to six, most African countries can be found, side by side with many South American and Asian countries which are also comparable in terms of relative WOS publications.