International Journal of Educational Development 26 (2006) 532–544 www.elsevier.com/locate/ijedudev
Restructuring UNESCO's statistical services—The "sad story" of UNESCO's education statistics: 4 years later*

Roser Cussó**, Department of Sociology, University Paris 8-Saint Denis, France

* This refers to the period 1999–2003, although part of the analysis may also apply to the biennium 2004–2005.
** The author worked at UNESCO's statistical services from 1994 to 2001. Ideas and opinions contained in this article are the author's entire responsibility. Correspondence: 3, rue Rochebrune, 75011 Paris, France; e-mail: [email protected].
Abstract

Criticism directed at the quality of UNESCO's education statistics led to the recent restructuring of the Organization's statistical services. This criticism, primarily supported by the World Bank and subsequently confirmed by consultants engaged by UNESCO, does not prove to be completely justified. In fact, a change in the political orientation of the statistical program appears to have been the main goal of the reform. Providing few significant (or new) recommendations to improve the other dimensions of data quality, the consultants' reports essentially concentrated on the need for measuring and comparing educational systems' performances in a strongly competitive world economy. While UNESCO's General Conference did not discuss the political aspects of the reform, the restructuring can be tied to UNESCO's loss of leadership to other international agencies, which have come to produce their own statistics and recommendations on education.
© 2006 Elsevier Ltd. All rights reserved.

Keywords: UNESCO; Education statistics; Restructuring; Data quality; Statistical program; International decision-making
1. International education statistics: a "data quality" or a political issue?

Defining, collecting and disseminating international education statistics are important activities that carry a great deal of responsibility and considerable impact through the reports and statistical studies that are made available to the public. A great deal is at stake in the definition of a statistical program, insofar as it has a significant influence on
political, social and economic representation and evolution (Seers, 1983). The United Nations Educational, Scientific and Cultural Organization (UNESCO) had long been the only international institution to produce and disseminate comparative statistics on education covering all the countries of the world. This mission, inherent to the Organization's mandate, was challenged in the 1990s, when other international organizations—notably the Organization for Economic Cooperation and Development (OECD)—began to collect and publish their own educational statistics. This new statistical output supported these organizations' interpretations of the social and political role of education: "A quantitative description of the functioning of education systems can allow countries to see themselves in the light of
other countries' performance" (OECD, 2000, p. 5). In fact, since the 1980s, while the idea that education statistics should help measure and rank national capacities for economic competition had been strongly upheld by other international agencies[2], UNESCO's Division of Statistics continued to measure and compare the spread of mass education and literacy. This comparability—mainly in relation to plan-oriented "development policies"[3]—was finally challenged by publications such as Education at a Glance (OECD, 1992) or Knowledge and Skills for Life (OECD, 2001), where indicators were not only used to compare characteristics of national education systems but also to compare underlying political decisions (e.g. through the efficiency of the use of educational budgets) (Cussó and D'Amico, 2005). UNESCO's statistical services were, in this context, eventually restructured, the founding of the UNESCO Institute for Statistics (UIS) in 1999 marking a decisive step in the process. Criticism concerning the quality of the Organization's statistics was central to the reform. The lack of relevance, reliability and accessibility of the Organization's statistics was regularly underlined by the World Bank, especially beginning in the 1980s, when the Bank proposed, without success, the setting up of an International Fund for the Improvement of Education Research. A study prepared jointly by the United Nations International Children's Fund (UNICEF) and the World Bank, and presented at a meeting of the Board of International Comparative Studies in Education (BICSE)[4], concluded that UNESCO statistics were narrow, and neither reliable nor easily accessible (Puryear, 1995).

[2] "It was possible [for the OECD] to explore (in the early 1970s) the option of producing indicators on the performance of education, an option that was termed to be premature at the time. […] Pressure [increased] in the late 1980s in favor of an intergovernmental initiative for the production of such indicators […]. This pressure came mainly from the USA whose Ministry of Education was to make a contribution, however small, to the launching of the operation" (Papadopoulos, 1994, p. 209).
[3] Which could also explain the lack of attention to any serious measurement of non-formal education, although Member States were quite free to include data on the educational programs they considered relevant.
[4] The BICSE was formed by the American National Research Council in 1988 in order to study the international comparability of statistics on education. The National Research Council was created in 1916 by the National Academy of Science, with the objective of linking the scientific and technological community with the aims of the Academy: promoting knowledge and advising the government of the United States.

In 1999, Heyneman, a former World Bank specialist in education, told of the "sad story" of UNESCO's education statistics (Heyneman, 1999). He insisted on the urgent need for reform and modernization of the statistical services. He strongly criticized, among other things, UNESCO's reticence in revising the role—in his opinion marginal—which education statistics were given, especially in terms of measurement and cross-national comparability of learning achievement. Considering that UNESCO tended to be used "as a battleground for cold war politics", Heyneman suggested that the influence of "Marxist academics" and the intergovernmental and political nature of the Organization could explain this reticence (Heyneman, 1999, pp. 68, 71–72). During the reform of UNESCO's statistical services, the reports commissioned by the Organization (Guthrie and Hansen, 1995; Thompson, 1998) pursued this criticism of the quality of data, as well as of the management of the statistical unit. This vision of the reform met with a mixed reaction from the UNESCO staff union and several Member States. They noted efforts to get around personnel regulations, and the lack of a genuine feasibility study on the relocation of statistical services, which would also have provided better defined terms of reference for tender bids. However, the incongruities between the restructuring objectives, as regards the quality of statistics, and the actual results of this reform have never been addressed. In fact, the limited enhancement of statistical output, at least during the first 4 years of the UIS, shows to what extent priority was given to political and institutional ends: new orientation, autonomy, and the transfer of the Institute from France to Canada. Finally, apart from specific institutional changes, this restructuring brought about three general developments. First, the weakening of the critical and decision-making functions of UNESCO's General Conference[5]. The Member States did not directly discuss the essential question of the social and political significance that the Organization wished to give education statistics. Secondly, the establishment and reinforcement of a directing body mainly concerned with management and financial issues.

[5] The General Conference calls together all of the Member States (190 in 2003) every other year. The Conference discusses and adopts programs and resolutions—including the statistical program—prepared by UNESCO's Secretariat. The Executive Board, in which 58 Member States are represented, meets twice a year. It is responsible for the execution of Conference decisions and prepares the work of the next session.
And lastly, the development of an increasing consensus between UNESCO's program and the programs of other international agencies. This last aspect is, in large part, the consequence of the preceding two factors, but is also witness to the growing competition between international organizations in the field of education. Despite being the only agency ever officially mandated, at the international level, to make recommendations on education—a mission overseen by a representative assembly—UNESCO has ultimately lost this prerogative.

2. UNESCO's statistical services from the 1950s until the 1990s

UNESCO's Division of Statistics was, up until the beginning of the 1990s, the only international entity to plan and implement a program of international education statistics. This program included a continuous activity of collection, standardization, analysis and dissemination of data on education[6]: (i) design and distribution of various statistical questionnaires and manuals since the 1950s; (ii) design, adoption (in 1978) and updating of the International Standard Classification of Education (ISCED); (iii) definition of statistical methods and some 100 indicators on access, enrollment, dropout, expenditures on education, gender disparities…; (iv) development and maintenance of a computerized database; (v) maintenance of regular contact with national statistical officials; (vi) publication of the UNESCO Statistical Yearbook from 1963 on, and other specialized studies, as well as provision of different sets of data in response to specific users' requests—including externally funded studies; (vii) technical co-operation with Member States, including assistance to individual countries, and organization of international seminars and workshops.

[6] Along with data on science and technology, culture and communication, which are not analyzed in this paper.

However, as the statistical services were long considered an instrument to support the implementation of other sectors' programs, it became rather clear that UNESCO did not give statistics the same importance and media visibility that other international organizations gave their own data. Furthermore, the style and orientation of statistical studies did not seem to be adapted to what were deemed to be the new needs of globalization, notably the need for comparative assessment of students' skills.
The intellectual origins of UNESCO, influenced by the structuralist approach (Levi-Strauss, 1952), go some way to explaining why the Organization never published classifications, or indices, explicitly comparing the performances of countries, especially in the area of students' learning achievement. The latter was not taken into consideration in the yearly statistical surveys of UNESCO[7].

[7] In contrast, from the time when it was founded in 1958, the International Association for the Evaluation of Educational Achievement (IEA) has launched more than 20 studies on students' learning achievement.

In the 1980s, UNESCO was to experience an unprecedented institutional and political crisis. Within the context of the polemical proposal of a new world order in the area of information and communication, several critical voices were raised attacking UNESCO for its incapacity to adapt, its bureaucratic nature, and its "third-world" orientation. Leading the criticism was the United States. In 1983, under the neo-liberal orientation of the American administration, the Secretary of State declared the withdrawal of the United States from UNESCO, citing reasons "relating to ideological orientation, budget and management" (Conil Lacoste, 1993, p. 208). The United States officially left the Organization in December 1984, followed by the United Kingdom and Singapore. One of the direct consequences of the withdrawal of these countries was a decrease of the regular budget by 30%, while almost a further 20% of extra-budgetary funds were cut off between 1983 and 1987. The United States directed its educational aid towards other international and bi-lateral agencies. A new era of multilateralism had started (Mundy, 1999, pp. 39–40)[8].

[8] The United Kingdom re-entered UNESCO in 1997, and the United States in 2003.

Beyond economic restrictions, the departure of the United States exacerbated the loss of UNESCO's intellectual preeminence in the area of education. That loss was already appreciable over the preceding 10 years: "In its sector report of 1971, the [World] Bank did not hesitate to level criticism at mass education programs judged too expensive" (Laïdi, 1989, p. 57). UNESCO was no longer to be the only major international organization making recommendations on education. Currently, the World Bank, the United Nations Development Program (UNDP), the OECD, and UNICEF all lead projects, make recommendations, and in
certain cases, even produce figures on education. Since 1998, UNICEF has organized its own statistical surveys on education within its Multiple Indicator Cluster Surveys (MICS). Since the 1980s, the OECD, still working together with UNESCO, has collected, processed and disseminated statistical data on its Member States; some 30 countries in 2003 (OECD, 2003). Since 1997, educational statistics on 11 other countries—19 in 2002 (OECD-UIS, 2003a)—have also been collected by the OECD within a project undertaken with UNESCO and financed by the World Bank, the World Education Indicators (WEI). Countries participating in this project are the most populated non-OECD nations and/or countries with well-developed statistical systems. Educational data on another group of countries have been collected and processed by the Statistical Office of the European Communities (Eurostat) as part of the project Poland and Hungary: Action for the Restructuring of the Economy (PHARE). As part of the strategy of pre-accession to the European Union, this program was launched in 1989 for Poland and Hungary, and subsequently widened to other Eastern and Central European countries. Furthermore, the OECD greatly influenced the process of revision of the ISCED adopted by UNESCO in November 1997. This influence can be seen, for instance, in the introduction of level 4 (post-secondary non-tertiary education) and in the definition and classification of educational programs according to their orientation: towards the labor market or towards more advanced studies.

Financial and political difficulties were eventually to provoke a whole series of reforms in the Organization. In 1988, an international consulting group was established with the task of advising the Director-General on improving the management of personnel and the Secretariat's efficiency. Its suggestions included "decentralization, delegation of authority to managers over resources allocated to their programs, a clearer distinction between the cost of programs and administrative lay-out, with the aim of more tightly limiting the latter […]" (Conil Lacoste, 1993, p. 321). The staff at headquarters and in the field were reduced from 3457 members in 1975 to around 2150 in 2003. Together with the reduction of permanent positions and incentive measures favoring early retirement, the proportion of temporary employees (supernumeraries and consultants) increased steadily during this period, as did sub-contracted activities,
often financed through extra-budgetary funds[9]. It is within this context that the permanent staff of the Division of Statistics suffered a severe cut: of the 51 members in 1984, only 33 remained in 1994, when the number of permanent positions was finally frozen. This year also marked the beginning of the restructuring process in the statistical services.

[9] For the biennium 2000–2001, the regular budget was estimated at 544 million dollars and extra-budgetary funds at 250 million, more than one-third from the United Nations. For the biennium 2002–2003, the regular budget was estimated at 555 million dollars and extra-budgetary funds at 334 million (UNESCO, 1999, 2001).

3. Data quality and new orientation of the statistical program: the BICSE report

As noted earlier, the World Bank had already proposed, in 1983, a fund for the improvement of education research. Later on, a study—prepared by UNICEF and the World Bank for the meeting of the BICSE in 1993—was to strongly attack the quality of UNESCO statistics (Puryear, 1995). In Puryear's opinion, the Organization was not able to adapt to the demand for new statistical information, due, to a great extent, to the influence of certain countries which were reticent to compare the efficiency of their educational systems (Puryear, 1995, p. 89). In response to this criticism, the Director-General invited, in late 1994, the BICSE to prepare a plan of action aiming to improve the quality of international education statistics. The BICSE, entirely made up of American experts and/or experts coming from American universities, published a report entitled Worldwide Education Statistics: Enhancing UNESCO's Role, known thereafter as the BICSE Report. Published in 1995, this report was financed by UNESCO and the World Bank, as well as by the National Center for Education Statistics and the National Science Foundation (United States). The recommendations of the BICSE report related to several fields: mission of the program, institutional structure, main statistical activities… In spite of the importance granted to the quality of statistics, the report made few new recommendations in this respect. Thus, if "data quality" commonly concerns relevance, reliability, completeness, coherence, comparability, timeliness, accuracy and accessibility (Carson and Liuksila, 2000; Eurostat, 2002), the BICSE report rarely included
new operational proposals directly addressing these issues[10], with the exceptions of data relevance and accessibility (Guthrie and Hansen, 1995, pp. 31–35, 53–55). The following point, for example, illustrates the complex nature of statistical activity rather than the purported lack of tools for checking: "[…] there is no use of software-driven 'edit checks' that could be used for data set consistency. (For example, no algorithms are used to verify that the number of students in one grade is less than or equal to the number of students in the previous grade in the prior year. Although immigration might explain an apparently anomalous result, the statistical algorithm would be a valuable way to signal what data need further verification.)" (Guthrie and Hansen, 1995, p. 21). In fact, this statistical consistency check was already included in the calculation of UNESCO's indicators of access, drop-out or repetition. It was the analysis of these indicators, reinforced by the study of time series, that could occasionally indicate discrepancies in the data. In addition, the significance of the difference between the number of students enrolled over two consecutive years is not only a reflection of the number of repeaters, drop-outs or of movements of population, but also of changes in educational policy which may have a bearing on these factors. It is precisely the existence of different students' grade transitions which enables statisticians to analyze the internal efficiency of national educational systems over time.
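To make the preceding point concrete, the following minimal sketch (in Python, with hypothetical field names and invented enrolment figures) illustrates the kind of grade-transition "edit check" quoted above: it flags cases where enrolment in a grade exceeds enrolment in the previous grade one year earlier. As argued here, such a flag marks data for verification rather than an error, since repetition, migration or policy change can produce the same pattern.

```python
# Minimal sketch of a grade-transition "edit check" (illustrative data only).
# enrolment[year][grade] -> number of pupils enrolled.
enrolment = {
    1997: {1: 120_000, 2: 110_000, 3: 100_000},
    1998: {1: 125_000, 2: 122_000, 3: 104_000},
}

def grade_transition_flags(enrolment, year):
    """Return (grade, pupils, previous_cohort) triples that need verification:
    cases where grade g in `year` exceeds grade g-1 in `year - 1`."""
    flags = []
    previous_year = enrolment.get(year - 1, {})
    for grade, pupils in sorted(enrolment[year].items()):
        cohort = previous_year.get(grade - 1)
        if cohort is not None and pupils > cohort:
            flags.append((grade, pupils, cohort))
    return flags

for grade, pupils, cohort in grade_transition_flags(enrolment, 1998):
    print(f"Grade {grade} in 1998: {pupils} enrolled, but only {cohort} "
          f"were in grade {grade - 1} in 1997: verify these data.")
```

The same arithmetic underlies the access, repetition and drop-out indicators mentioned above, which is why the Division of Statistics could detect such discrepancies through indicator and time-series analysis rather than through a separate software routine.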
Only minor differences were found between the statistical activities proposed by the BICSE and those already realized within the framework of the existing statistical program of the Division of Statistics: "establishing common definitions and data standards; regularly collecting and disseminating a core set of education statistics and indicators; maintaining and documenting the underlying data base; planning and coordinating a strategic research and development effort; carrying out analytical activities; and playing the role of catalyst in spurring the development of statistical capacity and systems in member states […]" (Guthrie and Hansen, 1995, p. 5).

[10] Proposals were rather general and often well-known, for example: "a successful international statistical agency will need to establish and maintain a regular and dependable schedule of data collection and distribution" (Guthrie and Hansen, 1995, p. 39); "Information regarding the standards and the manner in which they are applied should be broadly distributed and made clear to data users and providers alike"; "The types and frequency of agency reports should be predetermined and routinely followed" (Guthrie and Hansen, 1995, p. 40).

When the BICSE recommended the development of externally financed projects, that was, to some extent, because they already existed at the Division of Statistics (e.g. UNICEF's financing for specific studies), as did programs aimed at statistical capacity-building in the Member States, notably the program National Education Statistical Information Systems (NESIS), financed since 1991 by the Swedish International Development Agency (SIDA). The report's stress on the latter—national statistical capacity-building[11]—corresponded to the increasing role given to actions aimed at compensating national budgetary cuts, such as international assistance or national management reform[12]. Less consideration was given to the effect, on national education systems and statistical services, of the institutional and economic upheaval due, for example, to the structural adjustment programs (Stewart, 1995).

[11] The BICSE joined Puryear's statement: "The principal problems afflicting global education statistics are at the national, rather than the international level" (Puryear, 1995, p. 80).
[12] It was the opinion of Puryear that "Better monitoring of educational systems is less a problem of funding than of priorities" (Puryear, 1995, p. 86).

The report also underlined the need for modernized data processing methods. It was suggested, for instance, that clerks' knowledge on national data be stored in a formal comments-recording system instead of being kept on paper. The enhancement of computer equipment and the reinforcement of technical staff were also discussed. Some of these measures had already begun to be implemented, though not without difficulty: "The full potential of the new computer platform will not be realized without more powerful hardware" (Guthrie and Hansen, 1995, p. 52). In fact, a section of the report described the Division of Statistics' well-known budgetary restrictions, and subsequent delays in technological innovation, which might have had an influence on efficiency (in the past, cross-data checking could be time-consuming).

Most of the BICSE report was devoted to defining with precision the expected changes in the orientation of the statistical program (a new mission) and the organization of the activities (new management, autonomy). After remarking that the statistical services had not benefited from American or British expertise as well as from "the world's leading statistical professionals and
agencies." (Guthrie and Hansen, 1995, p. 19)[13], the authors stressed the need to take the new international context into account when defining the mission of UNESCO's statistical program.

[13] Nevertheless, American and British experts were not completely excluded from collaborating with UNESCO, the joint work with the OECD being an example.

They described the evolution of requests for information in the international context, requests relating, in particular, to the performance of education systems: "[…] modern-day statistics users are interested in more and more accurate measures of student performance" (Guthrie and Hansen, 1995, p. 35). Governments are no longer in control of national economic policy, and countries now compete with each other to attract the investments of international firms: "Modern technology, both its existence and pursuit of its development, has contributed to the formation of massive amounts of private-sector investment capital, often (or even usually) outside the immediate control of governments." In this context "[…] there is […] an intense reliance on human capital formation to sustain a nation's global competitive status and internal civic structure". "International comparative data […] display the capacity of the other nations that may be trade and investment competitors." (Guthrie and Hansen, 1995, pp. 33, 30 and 32)[14].

[14] Which was also the advice of Puryear: "Employers compelled to compete in an international economy take a sharper interest in the skills their employees possess" (Puryear, 1995, p. 85).

For the BICSE, the relationship between education and international economic competition is not a topic of debate, but a reality, which must be taken into consideration when designing statistical programs. This is, in fact, one of the most important aims of the report: to encourage a change in the function and policy orientation of international education statistics for the sake of "the altered human capital needs of member states, growth in internationalism among private-sector companies, and the emergence of major third-party agencies concerned with social infrastructure planning and development throughout the world" (Guthrie and Hansen, 1995, p. 47). The link between education and the worldwide expansion of liberal democracy's values (i.e. defending individual liberties against what is interpreted as "encroachment" by governments or institutions) was also noted when defining the new statistical program's mission (Guthrie and Hansen, 1995, pp. 30–32). Lastly, one message comes through quite clearly in the report: "Unless external agencies perceive a
meaningful commitment by UNESCO to its statistics program, they are unlikely to view the organization as the most appropriate vehicle through which to fund and implement their own ideas for improving education statistics and indicators" (Guthrie and Hansen, 1995, p. 6).

As mentioned earlier, emphasis was also placed on the need for changing the institutional framework. The BICSE proposed six possible scenarios for the future institutional structure of UNESCO's statistical services: (i) elevating the statistical function in the UNESCO organizational hierarchy, (ii) subsuming statistics within UNESCO's Education Sector, (iii) alignment with an existing functionally autonomous agency, (iv) creation of an entirely new functionally autonomous agency, (v) fusion with another United Nations agency, and (vi) privatization of statistical services under the guise of a commercial enterprise. Presented as recommendations aimed at improving the quality of statistics, a major part of the conclusions of the BICSE report, together with the fourth scenario (creation of a semi-autonomous institute), was submitted by the Secretariat to the Member States for approval. This took place at the General Conference of 1997 (UNESCO, 1997a).

4. New functions and management for the Institute: the Thompson report

Following the recommendations of the BICSE, the mission and function of the new Institute had to be further defined. Preliminary or parallel changes had also to be made in management as well as in the make-up and profile of personnel. In order to put in motion the transformation of the Division of Statistics into the new Institute, a Steering Committee was created during the meeting of the UNESCO Executive Board in May 1998. Of its nine members, the president had major responsibilities within the World Bank. The Committee was to have, for the interim, the same functions as the Institute's Governing Board. The latter was to be formed by the General Conference of 1999. These functions included the approval of the general policy of the new Institute, the definition of priorities, the supervision of the appropriate use of the budget and the task of making recommendations to the Director-General in the nomination of the Director of the Institute. The Steering Committee approved the Division of Statistics' program for the biennium 1998–1999, and guided the setting up of several studies on the functions, management, staff policy and structure of
the new Institute. Following the Committee's recommendation, a private firm was engaged by UNESCO (the consulting group Coopers & Lybrand, associate of PriceWaterhouse) to make proposals in these fields. The paper, UNESCO International Institute for Statistics-Report, was prepared and released in 1998. The document is known as the Thompson report (Thompson, 1998). This report restated the Institute's draft statutes and its future functions as approved by the General Conference in 1997, and proposed a number of developments.

As regards statistical activities, one of the main points of the recommendations was harmonization with the methods of the OECD. For instance, referring to the development of suitable indicators, the report stated: "Within OECD, the education statistics function operates with a broad understanding that about one-third of its current statistics should be carried forward unchanged into the next year, one-third should be refined but within the same broad definition, and one-third should be new […]. We think this is a reasonable guideline and one that [the Institute] should adopt" (Thompson, 1998, p. 9). Again in relation to OECD methods, according to the report, the former Division of Statistics was not capable of identifying future needs for lack of contacts (network and partnership) in the field of national education policy (partly in contradiction with the long-established list of national official correspondents, see the first paragraph of Section 2). As in the BICSE report, the foremost practical measures aimed at the reinforcement of data production consisted of the development of software to automatically identify statistical inconsistencies, the systematic keeping of written records on metadata, and the formalization of data collection procedures. The Thompson report also mentioned the development of activities such as the use of outside financing from other organizations, collaboration with the International Institute for Educational Planning (IIEP), the implementation of training workshops or visits in the Member States, the distribution of manuals and technical guides in order to help develop national statistical capacities, co-operation with countries in difficulty, as well as the creation of programs such as NESIS. As noted earlier, these activities were already implemented at the Division of Statistics, although they were certainly affected by budgetary restrictions. No other new recommendations specifically adapted to the production of international
education statistics are to be found in the report[15].

[15] In fact, the Thompson report relegated more technical data quality issues to another group of experts to be created later on (Technical Advisory Panel, TAP).

Moreover, as regards the OECD, the agency most often cited as a reference, its educational database was not comparable to that of the Division of Statistics in the area of worldwide country-to-country comparability and time series (valuable assets UNESCO offered to the academic and political communities). UNESCO's database—to mention only one example—implements specific techniques and methods to ensure and/or improve data coherence, reliability, comparability and accuracy: statistical codifications, updating of population and economic series, revision of estimated series, maintenance of metadata, revision of time series and school-cohort coherence allowing analysis of educational efficiency, etc. Finally, the participation of the staff of the former Division (31 permanent positions in 1998) in the whole data quality process, including the analysis of indicators, was insufficiently taken into account in the Thompson report. Only the creation of a "documentation data base" to store the staff's handwritten notes was recommended.

The omission of the complexity of the personnel's role in statistical production was logically reflected in the drastic recommendations concerning the staff. The Thompson report recommended that early retirement should be proposed to all those 55 years or over who could not be deployed elsewhere in the Organization. As they were assumed not to be specialists in statistics, it was also suggested that the majority of general services staff be re-deployed. Age and professional category were therefore the two relevant criteria. The report underlined that if UNESCO were not able to arrange the early departure and redeployment of existing staff within the Division, the transition would take too long and prove too costly, and as a result the Institute would be unable to renew its activities with the appropriate personnel (Thompson, 1998, p. 24). It is also interesting to note that there was no mention of measures concerning temporary personnel—seven or eight employees whose existence was not even mentioned—who were already performing, at that time, tasks related to the execution of the statistical program. Later on, consultants from the very same firm issued another report, proposing a new structure for the Institute. Submitted in 1999, this
report defined the organizational chart and the major positions. The permanent staff was to be reduced to 26 members. They were to be supported by temporary personnel and relieved of certain activities through subcontracting. The report also recommended that the Institute should operate under its own rules, applying, for instance, a system of flexible recruitment in which salaries are not fixed in advance, a system at odds with UNESCO's rules and regulations. A large number of recommendations in the Thompson report were adopted by the Executive Board of May 1998. Emphasis was put on respecting UNESCO's regulations, and the proposed personnel policy was drawn up without mention of any precise decision concerning the future of the former Division's staff members: "Access [to new posts] for staff of the Division of Statistics on the basis of quality but no preferential status", the latter certainly referring to seniority and professional experience in the Division (UNESCO, 1998a, p. 5). The Steering Committee recommended the new Institute's Director at the end of 1998, the position taking effect at the beginning of 1999, prior to the decision process for the transfer of the Institute (see Section 7). As stated in the Executive Board report, "[the Director] would be in charge of preparing and implementing the transitional work program and of ensuring the migration from the current staffing pattern of the Division of Statistics to the new one" (UNESCO, 1998b, p. 12).

5. The creation of the UIS: personnel and management issues

As recommended in the Thompson report, the creation of the Institute was to take place at the same time as the proposal of early retirement measures, which six staff members accepted in May 1999. The remaining personnel stayed on at the Institute without any definite redeployment plan in place. In the following months, two staff members were transferred and three others took retirement under normal conditions. Four new staff positions were created and filled after the Institute's creation in 1999 but prior to 2002, two of which pertained to administration. It was to take 2 years to create a significant number of permanent positions aimed at consolidating and developing statistical activities. These new positions were to come into effect after the Institute's transfer to Canada by the end of 2001, when the UIS was almost entirely re-staffed.
Temporary staff increased in number during this period of transition: several supernumeraries and two associate experts joined the Institute, the latter being directly financed by Member States. Senior management of the Institute was enlarged by the addition of a Canadian consultant (who later became permanent) and an officer seconded to the UIS by the United Kingdom (both having previously participated in OECD activities). The preceding structure, in principle more hierarchical, was replaced by a newer organization of staff and activities. The staff was divided into four groups working under four team-leaders. For each team different activities were defined and distributed. These were in essence the same duties as executed under the previous structure but presented in the form of projects: definition of statistical questionnaires, data collection, processing and analysis, preparation of international workshops and of publications, etc. The projects were presented by region: Latin America, Sub-Saharan Africa…, complementing the OECD, PHARE and WEI groups of countries. New management and work procedures were also put into practice: several weekly meetings to discuss work in progress and problems encountered, formalization—that is to say restriction—of contact with other sectors and agencies (attendance of meetings, information, official representation), vertical organization of work (direct relations with Member States), continuous evaluation, formalization of contacts between colleagues (in written form), standardization and homogenization of working styles and methods (despite the presence within the personnel of some 20 different nationalities from the five continents). To summarize, staff flexibility and the greatest possible autonomy of teams and projects were to be accompanied by a certain increase in technocracy and formalization of decision-making.

As provided in the statutes of the Institute, the Governing Board was established in 1999 by the General Conference. It was made up of 12 members, six of whom were elected by Member States on the basis of regional representation. The Director-General of UNESCO appointed the other six, including the President of the Board, who was already head of the preexisting Steering Committee. The functions of the Governing Board were mainly the same as those of the former directing body.
6. The definition and execution of the Institute's program

The main lines of action of the statistical program outlined by the UIS for the biennium 2000–2001 were: (i) establishing present and future needs in the area of statistical data and indicators, (ii) improving collection, dissemination and use of international comparative statistics, and (iii) strengthening the statistical capabilities of the Member States (UNESCO, 1999, p. 173). The main lines of action for 2002–2003 were: (i) improvement of the statistical database; (ii) developing new statistical concepts, methodologies and standards; (iii) statistical capacity building, and (iv) strengthening statistical analysis and dissemination of policy-relevant information (UNESCO, 2001, p. 191).

Problems related to the UIS's transition into a new structure and the transfer of the Institute to Canada partially explain the limited implementation of those objectives, especially in terms of improvement of statistical production. Nevertheless, the lack of development of significant technical and operational activities is also to be noted, especially those related to data management and processing. As mentioned earlier, while issues such as changes of structure, management and mission of the statistical services—which were very detailed in the BICSE and Thompson reports—are important factors in data quality, they do not completely encompass it (Carson and Liuksila, 2000; Eurostat, 2002). Recommendations more directly related to data production were more general in nature, most such proposals being already well-known (e.g. updating computer systems or developing manuals aimed at helping national data classification). This may be one reason why priority was given by the UIS to the organization of international workshops and experts' meetings, rather than to the strengthening of the database, at least during the first 4 years of existence of the UIS. The resumption of workshops was certainly responsible for an increase in response rates to questionnaires as well as a better classification of national educational programs. However, the effect of these improvements was, to some extent, "diminished" since other activities essential to data reliability, comparability, coherence and accessibility were not developed simultaneously. This was notably the case for the technical and statistical activities necessary for the maintenance of the database: classification and codification of new variables, automation of calculations,
updating of demographic and economic series, calculation and/or revision of estimated time series… It was also the case for activities related to data processing, verification and analysis, such as checking the consistency of time series or verifying the inclusion of all educational programs against national publications. The development of these activities was, during the first 4 years of the UIS, to slow down significantly. For example, demographic series were normally updated to the latest assessment of the United Nations Population Division. The automatic procedure for calculating the indicators based on demographic data is then run again for all the years available. This operation makes it possible to ensure time-series coherence, to check the new indicators and to readjust the estimated series.
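As a concrete illustration of the procedure just described, the minimal sketch below (in Python, with invented enrolment and population figures rather than UIS data) recomputes a gross enrolment ratio for every available year whenever the demographic series is revised, so that the whole time series stays coherent with the latest population assessment.

```python
# Illustrative recalculation of a gross enrolment ratio (GER) series after a
# revision of the demographic series. All figures are invented for the example.
# GER(year) = enrolment(year) / school-age population(year) * 100

enrolment = {1996: 2_450_000, 1997: 2_520_000, 1998: 2_600_000}

# Two successive assessments of the school-age population (e.g. before and
# after a United Nations Population Division revision).
population_old = {1996: 3_000_000, 1997: 3_050_000, 1998: 3_100_000}
population_new = {1996: 2_950_000, 1997: 3_020_000, 1998: 3_090_000}

def gross_enrolment_ratio(enrolment, population):
    """Recompute the indicator for every year present in both series."""
    return {
        year: round(100 * enrolment[year] / population[year], 1)
        for year in sorted(enrolment)
        if year in population
    }

# Rerunning the calculation over all available years, not only the latest ones,
# keeps the time series coherent with the revised denominator.
print("GER, old population series:", gross_enrolment_ratio(enrolment, population_old))
print("GER, new population series:", gross_enrolment_ratio(enrolment, population_new))
```

The disruption described in the following paragraph arises precisely when only part of a series (here, the most recent years) is recomputed against a revised denominator while earlier years are left on the old one.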
The selection of indicators for 1998 presented in the regional publications of the UIS of 2001[16] took into account neither the most recent revisions of demographic series nor the revisions of school data series which were to follow. During 2002 and 2003, selected indicators for 1998–2000 were calculated using the latest revision of population data. Nevertheless, time series were disrupted, since only the indicators for these 3 years had been revised, at least as regards the data included in the Education for All (EFA) Monitoring Reports published in 2002 and 2003 (UNESCO, 2002, 2003) and the statistical series available on the Web site of the UIS (October 2003). It is important to note here that maintaining comparability over time is not inconsistent with the introduction of new indicators. The quality of the latter can be checked against the time series of core educational data. Moreover, educational time series, among others, can help evaluate the impact of international development policies (Cussó, 2006).

[16] Good neighbours: Caribbean students at the tertiary level of education, Regional Report 2001; Latin America and the Caribbean, Regional Report 2001; and Sub-Saharan Africa, Regional Report 2001.

At the same time, the implementation of the UIS's program did not imply the creation and dissemination of new indicators covering all Member States. Appearing at the end of 2001, after the interruption of the UNESCO Statistical Yearbook in 1999 and the publication of the statistical assessment of EFA in 2000 (UNESCO, 2000a), the regional publications on education statistics contain virtually the same kinds of data and indicators as those prepared by the Division of Statistics.
The same can be said for the statistics available on the Web site of the UIS, www.uis.unesco.org (October 2003)[17]. Moreover, the analysis of the education policies of the Member States has not been particularly developed by the UIS. Even though the publications prepared jointly by the OECD and the UIS (OECD-UIS, 2001, 2003a, b) contain some analysis, this concerns essentially the 19 countries of the WEI project[18]. Finally, the development of the ISCED-1997 has not made much progress since its approval by the General Conference in 1997: it appeared in a summary document on the different levels of education (UNESCO, 1997b).

[17] Primary completion rates had also been calculated (UIS, 2003). Data on full-time equivalent teaching staff were requested in the statistical questionnaires.
[18] The publication on the Programme for International Student Assessment (PISA) (OECD-UIS, 2003b) concerned four countries participating in the WEI project, five countries participating in the PHARE project, as well as Israel and Hong Kong. See also the analytical feedback on education policies in the UIS Policy Notes.

In brief, and as mentioned above, difficulties related to the transition towards the UIS's new structure and to the transfer of the Institute to Canada certainly influenced the development of statistical production. These difficulties render any evaluation of the implementation of the new statistical program complex and, in some ways, incomplete. Part of the Institute's activities are extremely general or long-term—e.g. the reinforcement of the statistical capacities of the Member States. Nevertheless, the limited enhancement of the UIS's statistical output, at least during the Institute's first 4 years of existence, should be seen in the context of the experts' recommendations, which gave priority to institutional and mission changes.

7. Transfer of the Institute to Montreal: a random decision?

The relocation of the UIS outside of UNESCO's headquarters was not in the Institute's draft statutes, as approved in May 1999 by the Executive Board. It was only during the October 1999 session that the Director-General was asked to send a circular to Member States, soliciting proposals for the UIS site. The General Conference approved this bidding procedure in November 1999. The circular set detailed specifications for proposals, to be submitted to the Secretariat no later than February 2000 (UNESCO, 2000d). It was then left to an ad hoc
Committee to study the proposals and to recommend the definitive location of the Institute. The Committee, composed of four members of the Governing Board of the UIS, recommended engaging a firm specialized in "office relocation" (ENGEL Construction Management Services Ltd.) to evaluate the sites proposed by Canada, France, the Netherlands and Great Britain. The choice of the UIS Governing Board was Montreal (Canada). The Executive Board approved the transfer in May 2000.

Despite the Director-General's request for advice on "[…] whether the Institute should be located outside UNESCO Headquarters […]" (UNESCO, 2000d, p. 1), the evaluation of the proposals for the UIS's accommodation was not preceded by an actual transfer feasibility study. It was based upon the Institute's environment, international travel connections and local transportation, security, technical networks, building conditions, and financial aspects and benefits (UNESCO, 2000b, Appendix II). Political, strategic and professional consequences of the relocation of all statistical services were not specifically analyzed. The evaluation gave the item "close to UNESCO Headquarters" in the Parisian proposal as much importance (a similar number of points) as the item "well maintained and durable office building from 1972 in its own site" in the Dutch proposal (UNESCO, 2000b, Appendix II). Following the remarks of the French representative during the session of the Executive Board in May 2000, pointing out that the relocation of the Institute was far from being the best adapted, the most economical or the most equitable solution (UNESCO, 2000c, p. 13), the representative of Great Britain did not hesitate to mention the lack of quality control on the part of the Governing Board of the UIS, its ad hoc Committee and the UNESCO Secretariat in the bidding process: "[…] financial analysis took no account of the duration of the offers, and […] the analysis of travel connections was deficient. […] Availability and inflation had not been taken into account in every case, and only France and the United Kingdom were offering free premises for 99 years" (UNESCO, 2000c, p. 42). Some representatives from developing countries showed their dissatisfaction with the fact that the only countries allowed to participate in the bidding were those able to handle part of the Institute's expenses. In addition, a number of the Institute's staff also requested a feasibility study on the
transfer of the UIS (in the form of a proposal by the staff union, STU, presented to the Executive Board in May 2000). In his reply to the Member States, the Director-General addressed matters of procedure. He pointed out that the Governing Board of the UIS had not provided for any interaction between the Secretariat, the Executive Board, and the ad hoc Committee, and that he himself had made the decision to put his trust in the Governing Board of the UIS, just as the Executive Board had done before. The transfer of the Institute was finally approved by the Executive Board without any open discussion on the merits of transferring the statistical services to a continent where Member States had the least need of statistical assistance. More importantly, there was no evaluation of the extent to which the departure of the statistical unit from UNESCO's headquarters (where the sectors with which it had the most interaction were located) could advance the improvement of the statistical program. Nevertheless, the experts' reports offer some insights as to the motivation for the transfer. Although the transfer itself was not mentioned in the reports, they supported putting a distance between the headquarters (where political decisions are taken) and the implementation of the statistical program; Puryear criticized the fact that UNESCO had to "maintain a broad consensus among [governments] as to the amount and type of data" that could be requested (Puryear, 1995, p. 86). More explicitly, privatization of the statistical services was one of the scenarios in the BICSE report.

8. Conclusion

Criticisms of the quality of UNESCO's statistics, while citing factors of data quality, reveal a marked priority for mission and institutional factors. Recommendations on more technical data quality issues were rather general and often well-known. The limited enhancement of the UIS's data production during the Institute's first 4 years can be understood in this context. Moreover, one of the most valuable contributions of UNESCO's publications, the statistical time series covering all countries, has come to be put into question by the increasing similarity to the OECD's statistical methodology (Cussó and D'Amico, 2005). The recommended change of mission of UNESCO's statistical services was justified in the experts' reports by the need for data on the educational background of the work force (human capital),
and for measuring and comparing educational systems' performances in a strongly competitive world economy. The necessity for further development of countries' statistical capacity building was also noted by the experts. It was the BICSE's opinion that education can be a spur for the worldwide expansion of liberal democracy's values[19]. Nevertheless, these arguments are not fully convincing. While measurement of schools' performances was deemed necessary and desirable for society, these evaluations are not always demanded, notably in countries of the South, as reported even by Puryear: "Why do parents, policymakers and the media seldom demand them?" (Puryear, 1995, p. 87). This would suggest a lack of "assessment mentality" (whether for political reasons, or due to ignorance or inertia). Moreover, it is not clear why, while "resistance" to countries' performance ranking is deemed political, "pressure" for ranking would be objectively necessary[20]… It is relevant here to wonder whether an international organization such as UNESCO should fully embrace recent globalization goals, insofar as free-trade priorities (e.g. reforming and assessing education in terms of human capital) are not yet universally accepted. Moreover, the measurement of students' learning achievement, such as in the OECD's Programme for International Student Assessment (PISA), is an exercise which is not free of ambiguity. How learning achievement should be evaluated and compared, with which objectives, and to what extent, remains complex, as is reflected in the debate on the problems of learning achievement tests (Goldstein and Thomas, 1996; Burgues, 1999; Green, 1999; Kohn, 2000)[21].

[19] As early as the 1950s, the World Bank considered education to be a factor of "modernization" of the non-Western world. Education should facilitate the adoption by the masses of new economic attitudes and work habits as well as of liberal democracy's values such as individual property or modern gender "equality" (Cussó, 2001).
[20] In Puryear's words: "Government must perceive that this is what good government is about" (Puryear, 1995, p. 88).
[21] It is interesting to note the relatively vague definition of "students' learning achievement" assessed by PISA. This is mainly defined as "skills to meet real-life challenges", those that "15-year-olds will need in their future lives" as well as "knowledge and experience of real world issues" (OECD, 2001, p. 14).

Despite the political implications of the reform of the statistical services (especially the change of mission), UNESCO's Member States did not directly address—during the relevant debates—the political and social significance that the
Organization wanted to give to education statistics and to the reform of the statistical services. This can be explained by three interacting factors: loss of leadership, the crisis of multilateralism and increasing international competition in the field of production of educational data and recommendations. UNESCO's loss of leadership, as concerns the production of recommendations and statistical analysis, had affected the critical and decision-making functions of the General Conference as well as the Secretariat's capacity to react to other entities' programs. As a consequence of budget cuts, of the need to obtain outside financing and of the crisis of multilateralism, the UNESCO directing body has adopted a more management-oriented vision of the program and of the decision-making process[22]. The discourse and fundamental texts of the institution continue to affirm vigorously the right to education and respect for cultural diversity, all the while letting their eloquence mask a loss of influence in the international debate. The criticisms made by UNESCO concerning, for example, the negative effects of structural adjustment programs on education were not accompanied by a clear statement of policy recommendations in this regard, as had happened in other institutions in other fields (Mundy, 1999, p. 42). And finally, the fact that some education statistics were already being produced by other agencies was a factor encouraging approval of the reform.

In this general context, the emergence of a critical and pluralistic debate concerning the definition and use of international education statistics was quite unlikely. For some parties, restructuring was supposed to provide a (first) solution to the problem of the development of relevant statistics in the face of new international demands. For UNESCO, the larger problems are still unsolved, especially regarding the definition of the Organization's position among other international agencies. For Heyneman, certain issues should not be put to a vote: "Items of the UNESCO budget are the subject of a popular vote at every General Conference; seminars may be more important to those voting" (Heyneman, 1999, pp. 72–73). For UNESCO, international decision-making, being already consensual, tends to become even less substantial. The reform of UNESCO's statistical services illustrated this tendency.

[22] Referring to the change of approach of the Economic Commission for Latin America and the Caribbean (ECLAC) in the 1980s, Sikkink underlined several explanatory factors: increasing competition between agencies, the fall of the Berlin Wall, new legitimate (neo-liberal) ways to present and explain economic problems, generational change, and strong individual personalities (Sikkink, 1997).
References

Burgues, D., 1999. The Kassel project: an international longitudinal comparative project in secondary mathematics. Oxford Studies in Comparative Education 9 (1), 135–155.
Carson, C.S., Liuksila, C., 2000. Further steps toward a framework for assessing data quality. Working Paper, IMF, 20pp.
Conil Lacoste, M., 1993. Chronique d'un grand dessein: UNESCO 1946–1993. UNESCO, Paris, 551pp.
Cussó, R., 2001. La démographie dans le modèle de développement de la Banque mondiale: entre la recherche, le contrôle de la population et les politiques néolibérales. Doctoral Dissertation, Ecole des Hautes Etudes en Sciences Sociales. EHESS, Paris, 380pp.
Cussó, R., 2006. La Banque mondiale et l'éducation dans les pays 'pauvres': quelques éléments pour une contre-expertise. Questions vives 3 (6), 105–121.
Cussó, R., D'Amico, S., 2005. From development comparatism to globalization comparativism: towards more normative international education statistics. Comparative Education 41 (2), 199–216.
Eurostat, 2002. Definition of quality in statistics. Document No. Eurostat/A4/Quality/02/General/Definition, 4pp.
Goldstein, H., Thomas, S., 1996. Using examination results as indicators of school and college performance. Journal of the Royal Statistical Society 159 (1), 149–163.
Green, A., 1999. Converging paths or ships passing in the night: an English critique of Japanese school reform. Comparative Education 36 (3), 417–435.
Guthrie, J.W., Hansen, J.S. (Eds.), 1995. Worldwide Education Statistics: Enhancing UNESCO's Role. National Research Council, Washington, 65pp.
Heyneman, S.P., 1999. The sad story of UNESCO's education statistics. International Journal of Educational Development 19 (January), 65–74.
Kohn, A., 2000. The Case Against Standardized Testing: Raising the Scores, Ruining the Schools. Heinemann, Portsmouth, NH, 94pp.
Laïdi, Z., 1989. Enquête sur la Banque mondiale. Fayard, Paris, 358pp.
Levi-Strauss, C., 1952. Race and History. UNESCO, Paris, 50pp.
Mundy, K., 1999. Educational multilateralism in a changing world order: UNESCO and the limits of the possible. International Journal of Educational Development 19 (January), 27–53.
OECD, 1992. Education at a Glance: OECD Indicators. OECD, Paris, 148pp.
OECD, 2000. Investing in Education. Analysis of the 1999 World Education Indicators. OECD, Paris, 192pp.
OECD, 2001. Knowledge and Skills for Life: First Results from PISA 2000. OECD, Paris, 322pp.
OECD, 2003. Education at a Glance: OECD Indicators, 2003 ed. OECD, Paris, 451pp.
OECD-UIS, 2001. Teachers for Tomorrow's Schools. Analysis of the World Education Indicators, 2001 ed. OECD-UIS, Paris, 228pp.
OECD-UIS, 2003a. Financing Education—Investments and Returns. Analysis of the World Education Indicators, 2002 ed. OECD-UIS, Paris, 232pp.
OECD-UIS, 2003b. Literacy Skills for the World of Tomorrow—Further Results from PISA 2000. OECD-UIS, Paris, 392pp.
Papadopoulos, G.S., 1994. L'OCDE face à l'éducation 1960–1990. OECD, Paris, 223pp.
Puryear, J.M., 1995. International education statistics and research: status and problems. International Journal of Educational Development 15 (January), 79–91.
Seers, D., 1983. The Political Economy of Nationalism. Oxford University Press, London, NY, 218pp.
Sikkink, K., 1997. Development ideas in Latin America. Paradigm shift and the Economic Commission for Latin America. In: Cooper, F., Packard, R. (Eds.), International Development and the Social Sciences. University of California Press, Berkeley and Los Angeles, pp. 228–256.
Stewart, F., 1995. Adjustment and Poverty: Options and Choices. Routledge, London, NY, 243pp.
Thompson, Q., 1998. UNESCO International Institute for Statistics-Report. Coopers & Lybrand, 27pp.
UIS, 2003. Global Education Digest 2003—Comparing Education Statistics Across the World. UIS, Montreal, 125pp.
UNESCO, 1997a. Records of the General Conference, Twenty-ninth session, Paris, 21 October–12 November 1997. v. 1: Resolutions. UNESCO, Paris, 133pp.
UNESCO, 1997b. International Standard Classification of Education. UNESCO, Paris, 42pp.
UNESCO, 1998a. Report by the Director-General on the Creation of a UNESCO International Institute for Statistics. 154 EX/5, Add. 22 April. UNESCO, Paris, 5pp.
UNESCO, 1998b. Report by the Director-General on the Creation of a UNESCO International Institute for Statistics. 154 EX/5, 2 April. UNESCO, Paris, 17pp.
UNESCO, 1999. Approved Programme and Budget 2000–2001. UNESCO, Paris, 315pp.
UNESCO, 2000a. Education for All. Year 2000 Assessment: Statistical Document. UNESCO, Paris, 69pp.
UNESCO, 2000b. Report by the Director-General on the Choice of the Location of the UNESCO Institute for Statistics (UIS). 159 EX/36, 20 April. UNESCO, Paris, 26pp.
UNESCO, 2000c. Summary Records (Executive Board, 159th session, 15–26 May 2000). 159 EX/SR.1–11, 25 August. UNESCO, Paris, 360pp.
UNESCO, 2000d. CL/3534, 21 January. UNESCO, Paris, 3pp.
UNESCO, 2001. Approved Programme and Budget 2002–2003. UNESCO, Paris, 330pp.
UNESCO, 2002. Education for All Global Monitoring Report 2002. Is the World on Track? UNESCO, Paris, 310pp.
UNESCO, 2003. Education for All Global Monitoring Report 2003. Gender and Education for All: The Leap to Equality. UNESCO, Paris, 416pp.