SCIENCE PUBLICATION INDICATORS FOR INDIA: QUESTIONS OF INTERPRETATION

APARNA BASU

National Institute of Science, Technology & Development Studies (NISTADS), Dr. K. S. Krishnan Road, Pusa, New Delhi 110012 (India)

Scientometrics, Vol. 44, No. 3 (1999) 347-360
Published by Elsevier Science Ltd, Oxford and Akadémiai Kiadó, Budapest
0138-9130/99/US $15.00 Copyright © 1999 Akadémiai Kiadó, Budapest. All rights reserved.

(Received January 13, 1999)

We comment on a letter to Nature in 1996 on the long term decline of Indian science, pointing out methodological reasons why the (SCI) data used by the authors do not unambiguously lead to their stated conclusions. Our arguments are based on the contention that no valid statement on change in a country's output may be made for a period in which the journal coverage from that country in SCI has changed significantly. We have suggested that for longitudinal comparisons of country level performance, it should be verified that the journals from that country in SCI remained constant within the period. This could be ensured if the country of publication of journals were included as a field in the SCI database. We define a Visibility Index as the cumulated impact and derive a relation to estimate change in visibility combining changes in output and average impact. For the period during which Indian journal coverage remained unchanged, a detailed analysis of output for two years (1990 and 1994) leads us to conclude that, with the exception of Agriculture, there has been an increase in publication in virtually every field, with a significant increase in the overall mean Impact Factor. At least 25 subfields have been identified with statistically significant increases in mean Impact Factor and Visibility. The impact of foreign collaboration on visibility has also been considered. In conclusion we touch upon the question of citation as a performance indicator for Third World countries, as high citation and relevance may be in conflict as objectives.

Introduction

In an article on Indian science that received widespread attention in India 1 it was stated that papers from India in the SCI showed a decline of 15% between the years 1981-1995. World share of publications from India was shown to be falling and, in particular, the percentage fall in world share of 32 percent placed it alongside Sub-Saharan Africa and the CIS countries. This latter bit of information was culled from the World Science Report, 1996, which also showed the US share to have declined. It was lamented that citations were poor and long term monitoring of Indian science was absent. The authors concluded that Indian science was declining and proceeded to



analyse some reasons for the decline, going on to suggest that Indian authors were likely facing more rejections in SCI journals.

We have tried to show methodological reasons why the conclusions derived in the above paper could be misleading. A precaution in making comparisons over time using SCI data involves the use of a constant journal set.2,3 If that is not possible, it may at least be ensured that journal coverage from the country in question remains constant while judging country performance, since a country's publications are concentrated in its own journals. In the case of India, the number of Indian journals included in SCI fell from 35 to 12 in this period. This has been noted by Raghuram et al., but its significance was missed. By some simple estimates we show that a corresponding decline in publications is expected, and is neither an indicator of falling productivity nor, indeed, of increased rejections of Indian papers by SCI journals. It should be clear that rejections can never be inferred from publication data. Over the last 10 years the number of Indian journals has been constant and the publications show a slowly increasing trend. The rate is, however, less than the growth in world publications as seen from SCI.

In country comparisons using ratio scales such as world share of a country, we need to ask if the world share is falling because coverage of the SCI has increased or because the actual publications have fallen. The problem is further compounded when one considers percentage change in world share that involves taking ratios of ratios. A small change in a small producer can get unduly amplified. This distortion may lead to errors of judgement for policy makers for whom science indicators are of some importance. Change in world share should be quite adequate as an indicator. We advocate the use of some basic data alongside the indicators to ensure a proper perspective.
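A toy numeric example (hypothetical figures, not data from this study) illustrates how taking ratios of ratios amplifies change for a small producer:

```python
# Hypothetical figures illustrating the ratio-of-ratios effect described above:
# a small producer whose absolute output is unchanged still shows a large
# percentage fall in world share when world output grows.
def world_share(country_papers, world_papers):
    return country_papers / world_papers

share_t1 = world_share(1_000, 500_000)   # 0.20% of world output
share_t2 = world_share(1_000, 625_000)   # world output grew 25%; share is now 0.16%

absolute_change = share_t2 - share_t1                       # a tiny shift in share
percent_change_in_share = 100 * absolute_change / share_t1  # but a -20% "decline"

print(f"absolute change in share: {absolute_change:+.4%} points")
print(f"percent change in world share: {percent_change_in_share:+.1f}%")
```

The country's output did not fall at all, yet its world share shows a 20 percent "decline" purely because the denominator grew; this is why the basic counts should accompany the indicator.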

In the case of citations as indicators, it is well known that citation data are highly skewed and vary with citation practices within disciplines, age of publication, etc. Whether a paper is actually cited or not can depend on a host of sociological factors. For a paper from a Third World country, citation depends primarily on its recognition by authors in the developed countries. 4 Highly cited papers are often written in collaboration with authors from developed countries. High citations as an objective may conflict with questions of the relevance of the research for the country concerned. 5

For Third World countries the choice of representative indicators may not necessarily be the same as that of the developed countries. With limited resources, developing countries need to distinguish between research that is internationally recognised and research addressed to its own needs.


On the other hand, visibility is an important criterion for the research of Third World countries, where 'lost science' can be a genuine problem. 6 Questions of national relevance can also assume international dimensions, for example, the resurgence of diseases such as malaria may require the retrieval and re-examination of studies which may not have received much attention at the time of publication. The inclusion of research in standard journals gives a higher probability of retrieval at the appropriate time. For national purposes indicators of national performance need to be constructed at a more detailed level than are available from international analyses.

Methodology

In this paper we have estimated the effect of journal coverage on country output using the Pearson correlation coefficient between the data on India's output and the coverage of Indian journals in SCI. We have also described a possible scenario that may explain the high rate of journal deletion.

Next we report on a national level study of the level and change in India's output 7 using the SCI CD-ROMs for 1990 and 1994. The data included all papers with at least one author from India, and were subjected to cleaning, standardisation and classification. Integer counting has been adopted in assigning a paper to a country. We construct a visibility index based on the impact factors listed in the 1994 Journal Citation Reports (JCR). We have assumed that the cumulated Impact Factor constitutes an adequate index of 'visibility' of a set of publications. It may be mentioned that, due to the rather short time window used in the definition of the ISI Impact Factor, the figures obtained here may not be stable over long periods of time.

Visibility is defined here as the cumulative impact, i.e., V = P·I, where P is the number of papers and I the mean impact factor. The relative change in visibility (dV/V) is then given by the relation

(dV/V) = (dP/P) + (dI/I)

Using this index, changes due to papers and mean IF over time may be combined. For both years, the Visibility and mean Impact have been calculated based on the 1994 IF values. Any changes seen will therefore be due to actual change in the number of papers in journals of differing impact, rather than to a change in the IF of the journals themselves.
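As a minimal sketch, the index and its relative change can be computed directly; the figures used here are the all-fields values reported later in Table 3.

```python
# Minimal sketch of the Visibility Index V = P * I and its relative change
# dV/V = dP/P + dI/I. Figures are the all-fields values from Table 3.
def visibility(papers, mean_impact):
    return papers * mean_impact

def relative_visibility_change(p0, p1, i0, i1):
    """First-order estimate combining relative changes in output and mean IF."""
    return (p1 - p0) / p0 + (i1 - i0) / i0

p_1990, p_1994 = 10103, 11315   # papers, all fields
i_1990, i_1994 = 1.22, 1.39     # mean Impact Factor, all fields

dv = relative_visibility_change(p_1990, p_1994, i_1990, i_1994)
print(f"dV/V = {dv:.1%}")       # close to the 25.9% reported for all fields
```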

We have tested for statistically significant changes in the mean Impact Factor (IF) in the main fields and subfields of science, and have indicated the changes in output and visibility over time. For classification of output into fields and subfields we used a scheme developed at NISTADS based on the CHI and SCI classifications. The effect of foreign collaboration on visibility has also been considered. All statistical calculations have been performed using the SPSS statistical package.

It should be mentioned that our analysis of data for two years cannot take year to year fluctuations into account, and our conclusions are subject to this limitation.
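The kind of significance test used above can be sketched as follows; the study itself used SPSS, and the Impact Factor samples below are synthetic stand-ins, not the study's data.

```python
# Sketch of the kind of significance test described above: comparing per-paper
# journal Impact Factors in 1990 vs 1994 for one field via Welch's t statistic.
# The study used SPSS; the IF samples below are synthetic stand-ins.
from math import sqrt
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two samples with possibly unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    return (mean(sample_b) - mean(sample_a)) / sqrt(va / na + vb / nb)

ifs_1990 = [0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2, 0.9]   # hypothetical
ifs_1994 = [1.2, 1.5, 1.1, 1.7, 1.4, 1.0, 1.6, 1.3]   # hypothetical
t = welch_t(ifs_1990, ifs_1994)
print(f"t = {t:.2f}")   # a large positive t suggests a real rise in mean IF
```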

Effect of deletion of journals on SCI output

The number of Indian journals in the SCI increased rapidly in the 1970s to a high of 40 journals. Between 1980 and 1987 the number fell progressively to its present value of 12 journals (Fig. 1), and has remained constant thereafter. This has to be considered against a background of a 50 percent increase in the overall coverage of journals in SCI in almost the same period. Since no other country has suffered such a sharp decline, cognisance should be taken of this factor.

Even though not more than 12 percent of India's output in the SCI appears in Indian journals, the estimated correlation between total output and the number of Indian journals was found to be high. For the period 1981-97 the correlation was 0.86, increasing to 0.91 for the reduced period 1981-86, i.e., when Indian journal coverage decreased by 23 journals (Table 1). The correlation is no longer significant after 1987, during which the number of Indian journals has remained constant. This implies that change in Indian journal coverage is a highly significant factor that cannot be ignored while estimating India's output.

The reason for the high correlation stems from the likelihood of concentration of output from a country in its own journals. In the case of India we found that an Indian journal in SCI publishes on average more than a hundred Indian articles, as compared to five articles in any other SCI journal publishing papers from India (Table 2).
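The correlation computation behind Table 1 can be sketched as below; the yearly series are illustrative, not the actual SCI counts.

```python
# Sketch of the correlation computation behind Table 1: Pearson r between
# annual SCI output from India and the number of Indian journals covered.
# The yearly series here are illustrative, not the actual SCI counts.
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

indian_journals = [35, 33, 30, 26, 22, 18, 14, 12]   # falling coverage
indian_papers = [13000, 12600, 12100, 11400, 11000, 10700, 10400, 10300]
r = pearson_r(indian_journals, indian_papers)
print(f"r = {r:.2f}")   # strongly positive: output tracks journal coverage
```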

Table 1. Statistical correlation of Indian papers to Indian journals in the SCI

Period       Correlation   Significance level
1980-1996       0.862             1%
1980-1987       0.907             1%
1988-1996       0.463         not significant

Fig. 1. Papers from India & Indian journals in the SCI

Thus the loss in perceived SCI output from India due, on average, to the deletion of a single Indian journal cannot be compensated for by the addition of as many as 20 journals from other countries. This last figure is likely to be much higher considering that India publishes in only 2000 of the nearly 4000 journals in SCI, and increased journal coverage in SCI would include many journals in which papers from India have not appeared.

While we have only considered the specific case of India, the concentration of a country's output in its own journals may be a general feature. This implies that in order to get a correct picture of the change in country output over a period of time, one should at least ensure that the journals from the country have remained constant during the period. This may be considered as a simpler alternative to using a constant journal set, which requires one to identify many more journals.


Table 2. Average annual papers in journals from India and other countries

Country of publication   No. of journals   Papers 1990   Papers 1994   Change   Papers/journal (average)
India                          12              1413          1382        -31          116.5
Other                        2133              8690          9932      +1242            4.4
Total                        2145             10103         11314      +1211            4.7

Indian papers in international journals

Papers in Indian journals accounted for 35 percent of India's output in SCI around 1980, and only 12 percent in 1994, suggesting a likely drop of 23 percent due to the fall in the coverage of Indian journals alone. Since the observed decline is only about 15 percent, there must have been an 8 percent increase in Indian papers in other SCI journals. There is therefore no reason to conclude, as in Ref. 1, that Indian papers are facing more rejections in the SCI. We also note from Table 2 that from 1990 to 1994 the number of papers in Indian journals decreased slightly, while the number in other international journals increased by over one thousand. The largest increase, of more than 800 papers, is in journals published from the USA, followed by Switzerland and Australia. The largest fall has occurred in journals from Europe, specifically the Netherlands, Germany, the UK and Hungary (Fig. 2).
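The back-of-envelope argument above can be checked directly from the paper's own percentages:

```python
# Back-of-envelope check of the coverage argument above, using the paper's
# own figures: the share of India's output in Indian journals fell from 35%
# to 12%, while total output fell only about 15%.
share_1980 = 0.35            # of India's SCI output in Indian journals, c. 1980
share_1994 = 0.12            # the same share in 1994
observed_decline = 0.15      # reported fall in total Indian output

expected_drop_from_coverage = share_1980 - share_1994     # about 23%
implied_rise_elsewhere = expected_drop_from_coverage - observed_decline

print(f"expected drop from lost coverage: {expected_drop_from_coverage:.0%}")
print(f"implied rise in other SCI journals: {implied_rise_elsewhere:.0%}")
```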

Cascading effect of journal deletion

One may ask why such a large number of journals were progressively deleted over a short span of time. One reason for journal deletion could be a low level of citation. If a journal is deleted, say, due to its low citation, this may have a lowering effect on the citation of other journals cited in it. If, as a result, the citation of these journals now falls below the threshold for inclusion in SCI, they may also be deleted. This could have a further deleterious effect on the remaining journals, setting up a cascading effect. Such a cascade could possibly affect the journal coverage from a country adversely, although we have no way of knowing if such was the case for India. We do know, however, that in 1973, out of the 14 Third World journals with a citation level higher than 1.0, eight were Indian journals that together received 94% of the citations from the Third World.
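The cascade scenario can be made concrete with a toy model, in which the journals, citation rates and inclusion threshold are all entirely hypothetical: deleting one poorly cited journal removes the citations it gave, which can push further journals below the threshold.

```python
# Toy model of the cascading-deletion scenario sketched above. Journals and
# citation rates are entirely hypothetical. gives[j] lists the citations per
# year that journal j's articles give to other covered journals.
gives = {
    "J1": {"J2": 0.6, "J3": 0.4},
    "J2": {"J1": 0.3, "J3": 0.7},
    "J3": {"J1": 0.2, "J2": 0.5},
    "J4": {"J1": 0.9},            # J4 itself receives no citations
}
THRESHOLD = 1.0   # hypothetical citation level required to stay in the index

covered = set(gives)
rounds = []
while True:
    received = {j: 0.0 for j in covered}
    for giver in covered:
        for receiver, c in gives[giver].items():
            if receiver in covered:
                received[receiver] += c
    dropped = {j for j in covered if received[j] < THRESHOLD}
    if not dropped:
        break
    rounds.append(sorted(dropped))
    covered -= dropped

print(rounds)   # deleting J4 cascades: [['J4'], ['J1'], ['J2', 'J3']]
```

In this toy network the deletion of the one uncited journal (J4) eventually removes every journal, which is the qualitative point: local deletions can propagate.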


Fig. 2. Indian publication in international journals (by country of publication)


Mapping of Indian science: A bibliometric analysis

The mapping of Indian science based on bibliometric analysis was begun as a project funded by the National Information System on Science & Technology (NISSAT) in the Department of Scientific and Industrial Research, India. The objective was to extract information on India's scientific activity through an analysis of its publications, from data for two years (with an interval of four years) from the Science Citation Index, and to assess change in the quantity of Indian scientific publications and in the choice of journals as measured by the Impact Factor. A longer term objective was to build a database of Indian publications from which long and short term trends could be analysed.

We note here some results arising from our study 7 with respect to the major fields and subfields of science, and papers written in collaboration with scientists from other countries. Since bibliometric techniques are statistical, their validity relies upon the size of the data set. This could not be ensured in all cases, especially for subfields where numbers were small.

Change in Output: As far as possible, we have focused only on statistically significant changes. Table 3 indicates the output, mean impact and change from 1990 to 1994 in all fields of science, showing an overall increase of 12 percent in output and an increase of 0.17 in the mean impact factor, taking it to 1.39. With the exception of a small reduction in Biology (which may fall within the limits of year to year fluctuation) and a larger fall in Agriculture, we see an increase in output in every field. In Agriculture there is not only a fall in output but also a fall in impact factor, resulting in a net fall in visibility of 26 percent.

Change in Impact and Visibility: There has been a significant increase in mean impact factors in 7 out of 12 fields, namely Biology, Medicine, Biomedical Research and Chemistry (1% level of significance), and Physics and Materials Science (5% level of significance), and a decline only in articles in Multidisciplinary journals (5% level of significance). Although the impact factor fell in the three remaining fields (not statistically significantly), the fall was compensated by an increase in output, resulting in increased visibility for all fields except Agriculture.

Subfields within major science fields: A large number of subfields were identified with increased impact factor. Statistically significant changes are indicated in Tables 4 and 5. Subfields from several main disciplines were represented, e.g., Medicine, Chemistry, Physics, Biomedical Research, Computers, Material Sciences and Engineering & Technology. An increase of more than 1.0 in impact is found in General Biology and several subfields of Medicine, namely General and Internal Medicine, Haematology, Neurology and Neurosurgery, and Gastroenterology. Of these, two show a simultaneous fall in output of more than 10 papers, viz. General and Internal Medicine and Gastroenterology. In addition, areas of Physics, viz. Astronomy and Astrophysics and Solid State Physics, and of Computer Science (Theoretical Computer Science and Artificial Intelligence), show an increase in impact factor, as do three areas of Chemistry (Analytical, Organic and Physical). Increased impact was associated in every case with an increase in visibility, with the sole exception of Inorganic and Nuclear Chemistry (Table 5).

The areas with a decline in impact factor (Table 5) are Textiles, Fluids and Plasmas, Pharmacology and Oceanography. Interestingly, all these areas except the last show a fall in publications as well as reduced visibility.

Foreign Collaboration: Foreign collaboration has almost doubled from '90 to '94. Collaboration appears to play a significant role in the visibility of papers (Table 6). The mean impact factor of papers written in collaboration with foreign scientists is significantly higher for both years. At the same time, their impact has not increased significantly. On the other hand, the mean impact of papers written without foreign collaboration has increased at the 1% level of significance, with an increase in visibility of 18%.

Table 3. Change in productivity and average Impact Factor in Indian scientific publications (all fields, 1990-1994)

Field                        Papers '90  Papers '94  Mean IF '90  Mean IF '94    dP    dP %     dI     dV/V %+
Computers & Communications        51        113         0.73         0.82         62   121.6   0.09    133.9
Biology                          566        556         0.85         1.51        -10    -1.8   0.66**   75.9
Biomedical Research              879       1151         1.45         1.65        272    30.9   0.20**   44.7
Earth & Space                    388        504         0.92         0.90        116    29.9  -0.02     27.7
Engineering                      707        912         0.63         0.62        205    29.0  -0.01     27.4
Medicine                        1676       1761         1.71         2.09         85     5.1   0.38**   27.3
Physics                         2189       2438         1.47         1.64        249    11.4   0.17*    22.9
Chemistry                       2359       2480         1.09         1.26        121     5.1   0.17**   20.7
Materials Science                340        376         0.73         0.80         36    10.6   0.07*    20.2
Mathematics                      168        189         0.51         0.54         21    12.5   0.03     18.4
Multidisciplinary                417        551         1.18         0.95        134    32.1  -0.23*    12.6
Agriculture                      363        284         0.72         0.69        -79   -21.8  -0.03    -25.9
All fields                     10103      11315         1.22         1.39       1212    12.0   0.17**   25.9

Values significant at ** the 1% level of significance; * the 5% level of significance (t-test)
+ Change in visibility (dV/V) = (dP/P) + (dI/I)
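The visibility column of Table 3 can be reproduced from its own output and impact columns; for instance, for the Biology row:

```python
# Cross-check of Table 3 using the relation dV/V = dP/P + dI/I,
# illustrated with the Biology row (papers 566 -> 556, mean IF 0.85 -> 1.51).
p0, p1 = 566, 556
i0, i1 = 0.85, 1.51
dv_over_v = (p1 - p0) / p0 + (i1 - i0) / i0
print(f"dV/V = {dv_over_v:.1%}")   # matches the 75.9% in the table
```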


Table 4. Subfields in science with increase in Visibility and significant change in Impact Factor (India: 1990-1994)

Field                                  Papers '90  Papers '94  Mean IF '90  Mean IF '94    dP   dP %     dI     dV/V %+
General Biology                             34          74        0.38         4.95         40    118   4.57**   1320
Nephrology                                   4          17        1.49         2.33         13    325   0.84*     381
Hematology                                   9          25        1.46         3.51         16    178   2.05*     318
Ophthalmology                               25          58        0.73         1.70         33    132   0.97**    265
Theoretical Computer Science                 3           9        0.42         0.66          6    200   0.24*     257
Neurology and Neurosurgery                  61         172        2.05         3.56        111    182   1.51**    256
Artificial Intelligence                      8          20        0.69         1.18         12    150   0.49**    221
Cell Biology, Cytology and Histology        27          26        0.72         1.47         -1     -4   0.75*     100
General & Internal Medicine                272         116        3.62         7.98       -156    -57   4.36**     63
General Biomedical Research                 64         127        0.96         0.55         63     98  -0.41**     56
Polymers                                   217         313        0.89         0.98         96     44   0.09*      54
Biochemistry and Molecular Biology         321         417        2.01         2.41         96     30   0.40**     50
Physical Chemistry                         320         372        1.22         1.60         52     16   0.38**     47
Fertility                                   34          31        1.12         1.66         -3     -9   0.54**     39
Gastroenterology                            67          51        2.14         3.45        -16    -24   1.31**     37
General Materials Science                  251         301        0.68         0.79         50     20   0.11**     36
Analytical Chemistry                       149         172        1.18         1.41         23     15   0.23**     35
Misc. Engineering and Technology            57          60        0.39         0.49          3      5   0.10**     31
Solid State Physics                        421         440        1.44         1.75         19      5   0.31**     26
Organic Chemistry                          591         629        1.05         1.17         38      6   0.12*      18
Astronomy and Astrophysics                 213         161        1.45         1.96        -52    -24   0.51**     11

Values significant at ** the 1% level of significance; * the 5% level of significance (t-test)
+ Change in visibility (dV/V) = (dP/P) + (dI/I)


Table 5. Subfields in science with decrease in Visibility and significant change in Impact Factor (India: 1990-1994)

Field                             Papers '90  Papers '94  Mean IF '90  Mean IF '94    dP   dP %     dI     dV/V %+
Textiles, Fibres, Leather              25           7        0.47         0.28        -18    -72  -0.19**   -112
Fluids and Plasmas                     36          24        1.22         0.76        -12    -33  -0.46**    -71
Pharmacology                          247         184        1.40         0.98        -63    -26  -0.42**    -56
Oceanography and Limnology             20          22        1.07         0.71          2     10  -0.36*     -24
Inorganic and Nuclear Chemistry       607         490        0.64         0.74       -117    -19   0.10*

Values significant at ** the 1% level of significance; * the 5% level of significance (t-test)
+ Change in visibility (dV/V) = (dP/P) + (dI/I)

Table 6. Foreign collaboration in Indian scientific publications (1990-1994)

Papers                      Papers '90  Papers '94  Mean IF '90  Mean IF '94    dP   dP %     dI     dV/V %+
Foreign collaboration           1025        1525       1.81         1.99        500   48.8   0.18     58.7
No foreign collaboration        9077        9596       1.15         1.29        519    5.7   0.14**   17.9
Difference                      8052        8071       0.65**       0.70**

Values significant at ** the 1% level of significance; * the 5% level of significance (t-test)
+ Change in visibility (dV/V) = (dP/P) + (dI/I)

Relevance vs. citation

A clustering of co-cited articles done more than 10 years ago 4 indicated that, unlike the rest of the Third World, where research is focused on clinical and biomedical sciences reflecting immediate concerns, India contributes substantially from the Third World to research publications in areas such as physics, chemistry and biology. This reflects a long tradition of education in the pure sciences. A misplaced emphasis on high citation could shift the balance between areas of pure research and research that is relevant to the country's needs. Since highly cited articles are typically those authored with and cited by western scientists, they usually relate to western scientific concerns. An indicator of relevance will need to be developed in order to maintain parity between these factors.

Conclusions

Through this paper we have tried to show that scientometric data can be misinterpreted even when used in good faith, that certain caveats are required, and that it is not always self-evident what these may be. In the case studied, the caveat was the falling coverage of Indian journals in SCI. Once this fact was taken into account, the purported 'decline' in Indian science was easily explained as an effect of altered coverage rather than falling productivity.

This underlines the importance of using a 'constant journal set' for longitudinal comparisons of country performance over long intervals of time. It also leads us to suggest a simpler alternative, i.e., to ensure a constant journal coverage from the country in question. We deduce this from the fact that a country's papers are likely to be concentrated in its own journals (in the Bradford sense). This point will be discussed in detail in another paper. 8

As demonstrated by our simple example, analysis based solely on SCI data may lead to erroneous conclusions unless controlled for journal coverage. The difficulty is that the country of origin of the journal is not available as a field in SCI data. ISI could consider including the country of origin of the journal in their data (country of origin is included in Journal Citation Reports (JCR) but does not appear as a column in the database output). This inclusion would make the data particularly useful for certain types of analysis and lead to a better understanding of the reasons for changes in observed country output.

In our data on Indian science for the period when Indian journal coverage was unchanged, contrary to the suggestion by Raghuram et al. that Indian papers were possibly 'facing more rejections in international journals', we found a 10% increase in papers in other international journals from '90 to '94, and a slight fall in papers in Indian journals. We agree with Raghuram et al., however, that there has been an absence of continued long term assessment and monitoring of Indian science.

For purposes of national planning we need indicators at a microscopic level of disciplines, sub-disciplines, sectors and institutions, and we need to examine them in detail for a succession of years before we can have a meaningful debate on the decline of Indian science, rather than arriving at conclusions from aggregate figures. The process has been started in the form of the National Mapping of Science Project, funded by the Department of Scientific and Industrial Research, India. As part of this project, a detailed analysis of Indian science output using SCI data of 1990 and 1994 was conducted at the National Institute of Science Technology and Development Studies, New Delhi. 7

Based on this study, we have derived here a Visibility Index to monitor change in relative visibility (cumulated impact) resulting from changes in both output and average journal impact. For example, we found that Agriculture was the only major discipline with a decrease in both output and average impact, resulting in a fall in visibility for India of more than 25%. This should be a point of concern for an agro-based developing country which has achieved self-sufficiency in food through the use of hybrid seed varieties and a Green Revolution supported by extensive agricultural research. Since a study covering two years is subject to year to year fluctuations, the study is now being extended to other years to monitor trends.

We have used statistical significance to distinguish between apparently large changes in Impact or Visibility, and those changes that should be taken cognisance of. This addition to the usual scientometric methodology enables us to pick out only a small subset (25) of the hundred subfields which show changes in impact. There were subfields in almost every major field of science that showed significant increases in output and impact.

We also found that papers written with co-authors from other countries increased in number from 6% to 10% of total papers in the period, and had significantly higher average impact than the average impact of all papers from India. This average impact, however, has not increased significantly with time. This is contrasted with the significant though small increase in IF for papers where authors were based solely in India.

In brief, there appears to be little ground for the despondency reflected in Reference 1. This point may be noted by those seriously interested in India's scientific output and in science policy and management in India who are often assailed by criticism of Indian science from within their own fold, both in national 9 and international 10 publications.

In conclusion we note that while a more detailed analysis does give us some idea of changing emphasis in different areas of science in India, these indicators alone will not suffice for a developing country. The centre-periphery relation in science does not necessarily imply, and in fact may be in conflict with the question of relevance. For a developing country with limited resources this factor needs to be kept in mind. Some thought therefore needs to be devoted to defining an indicator for policy requirements and decision-making that could incorporate relevance.


Funding from NISSAT (Research Grant No: JSF/1496/96) is acknowledged. We thank Dr. Ashok Jain and A. Lahiri for their support, and B. S. Vinu Kumar and Ritu Aggarwal for computational assistance.

References

1. N. Raghuram, Y. Madhavi, India's declining ranking, Nature, 383 (1996) 572.
2. B. R. Martin, J. Irvine, F. Narin, C. Sterritt, The continuing decline of British science, Nature, 330 (1987) 123-126.
3. J. Irvine, B. R. Martin, T. Peacock, T. Roy, Charting the decline in British science, Nature, 316 (1985) 587-590.
4. E. Garfield, Mapping science in the Third World, Science and Public Policy, (1983) 112-127.
5. S. Arunachalam, Research as if relevance mattered: Medical research in India as reflected by SCI 1981-85, Current Science, 72 (1997) 912-922.
6. W. Wayt Gibbs, Lost science in the Third World, Scientific American, (1995) 92-99.
7. A. Basu, P. S. Nagpaul, National Mapping of Science, Report No. REP 248/98, National Institute of Science Technology and Development Studies, (1998).
8. A. Basu, The effect of journal coverage in ISI on country performance: Another use of Bradford's Law, manuscript in preparation.
9. D. M. Salwi, Nonsense in Indian Science, Konark Publishers, New Delhi (1998).
10. P. M. Bhargava, The cowboys in Indian science, Book Review, Nature, 395 (1998).
