
Revisiting Knowledge-for-Development

Published online by Cambridge University Press:  05 September 2023

Pierre Jacquet*
Affiliation:
Global Development Network (2012–2022), New Delhi, India

Abstract

The Knowledge-for-Development (K4D) approach has been characterized by a marked concentration of the production of research on development in developed countries. The underlying utilitarian approach to social science research misrepresents the nature and making of policy, raises entry barriers for developing country researchers, and focuses on the production of research to the detriment of its use. Using such research in developing countries requires informed debate, consideration of the local environment, and sufficient local research capacity. Foreign assistance should focus on research capacity building as a specific objective, distinct from the production and publication of research, with its own management, implementation, and monitoring. This is essential if the objective of the K4D approach, namely the better use of scientific knowledge to enhance the quality of policies, is to be achieved.

Type: Original Article

Copyright: © The Author(s), 2023. Published by Cambridge University Press on behalf of The World Trade Organization

The conviction that policy effectiveness requires the mobilization of scientific knowledge and evidence has shaped development approaches and donor strategies for at least a quarter of a century, most notably since the World Development Report on ‘Knowledge for Development’ (World Bank, 1998). L. Alan Winters has demonstrated a strong and sustained commitment throughout his career to strengthening research capacity in developing countries.Footnote 1 This essay pays tribute to this aspect of Alan Winters’ work by discussing ways in which ‘knowledge-for-development’ (K4D) efforts can, and should, be improved.

A well-known characteristic of research in the social sciences (including economics) is the spatial concentration of its production and its thematic focus. More than 80% of citable documents in the social sciences originate in high-income countries, half a percent in low-income countries, and less than 5% in lower-middle-income ones.Footnote 2 Das et al. (2013) note that most of the research in economics published in the top 202 economics journals over the 1985–2005 period is devoted to the United States; only 4 papers were published on Burundi, 9 on Cambodia, and 27 on Mali. Such concentration of production is not necessarily problematic. Scientific knowledge is a global public good (Stiglitz, 1999). Once generated, it can be freely accessed at zero marginal cost.Footnote 3 Furthermore, its production exhibits economies of scale. An environment conducive to research will benefit researchers and increase their productivity through the synergies and interactions it generates. Geographic concentration may therefore increase research effectiveness and productivity and be preferable to other modes of organization. Casual evidence suggests that developing country students who graduate from leading universities in high-income countries, or who receive postgraduate training there, gain access to more promising research or professional careers and to greater visibility and influence. This is therefore one way to raise research capacity and strengthen the influence of research on policy. It reportedly worked well in Latin America, as highly trained economists educated in the United States returned to their countries and played a major role in shaping policies (e.g., Harberger, 1993).

The K4D effort has mainly espoused the concentration model. Many universities in high-income countries have nurtured centers of development studies and attracted top students from developing countries into their departments. A buoyant academic market for development research has emerged, producing a substantial scientific output, which is a global public good. Why divert resources to train researchers locally if they could be better trained outside their own countries, and if solutions to national development challenges are mainly technical in nature and can be elaborated abroad? Information technology (IT) has contributed to deepening the technical content of much social science research, and the concentration of technical work may increase overall academic effectiveness.

The thesis developed in this article is that, however useful in augmenting development knowledge, this conception of the K4D model is partial at best and misrepresents the nature of development, the politics of policymaking, the various possible uses of research, and the importance, role, and nature of capacity building. As a result, concentration of social science production is sub-optimal for development policies and strategies. This problem was partly recognized and addressed through research capacity building initiatives, but these have been insufficient, inadequately funded, and poorly specified.

1. On Knowledge Use vs. Generation

As a global public good, development knowledge is essentially ‘disembodied’ (Stiglitz, 2000). It can be defined and disseminated without reference to contextual specificities and has a universal value. It includes the broad scientific underpinnings of knowledge that can be universally shared and serve as a guide for thinking and action. Even in that case, however, access, understanding, and interpretation are conditioned by individual capacity and by contextual factors, such as culture, history, experience, ideologies, and politics.Footnote 4 To put it differently, there are different ‘ways of knowing’ that are highly context-dependent.Footnote 5 Using scientific knowledge to define and implement development paths or solutions requires re-embodying knowledge, which implies a local process of ownership that the concentration of research work makes more difficult, both in terms of lack of local capacity and in terms of ‘salience, credibility and legitimacy’ (Cash et al., 2002), three characteristics of knowledge that condition its use. Knowledge is not a commodity that can be bought and put to work with little additional effort (Arocena et al., 2014). This echoes the critique of ‘monoculturality’ (Akude, 2014) and strengthens the case for expanding research capacity-building efforts.

The very generation and dissemination of ‘disembodied knowledge’ reflect value-laden choices, because scientific investigation is always incomplete and selective. Why generate knowledge, and disseminate it, on any given issue as opposed to others? Knowledge being incomplete, any choice to allocate research time to one theme rather than another is a subjective choice. Politics, culture, and ideology matter even for scientific creation. Developed countries’ domination ends up being further accentuated through the overwhelming role of foreign expertise and advice (even when instrumentalized in domestic politics). This runs exactly contrary to the avowed focus on local ‘ownership’, even though the latter was rightly recognized as a major principle of aid effectiveness by the 2005 Paris Declaration on Aid Effectiveness.Footnote 6 Moreover, the lack of ownership may be a deliberate strategy for political economy reasons. Developing country governments may tend to ignore domestic academia and seek knowledge from abroad (Arocena et al., 2014). Calling on foreign experts not only helps raise resources, but also limits political implications and responsibility: it is tempting to ‘buy’ an expertise-based, scientifically legitimate, credible foreign blueprint that can then be presented by local ministers as a ‘result’ of their actions, even though the ministries lack the capacity to implement these blueprints, which then become substitutes for action. One could cynically interpret part of foreign assistance as an ongoing succession of blueprints that are seldom, if ever, implemented.

From a policy perspective, therefore, the global public good dimension of general knowledge cannot justify a de-politicized separation between knowledge generation, knowledge interpretation, and knowledge use. The three aspects of knowledge interact constantly. Scientific knowledge produced in the best universities in high-income countries is not immediately actionable in a developing country context. Research conducted locally is an important component of access to, and use of, such knowledge, of interaction and integration with local implicit knowledge, and of the establishment of nationally based epistemic communities (Haas, 1992). Explicit recognition of this role for local production would be a decisive contribution to the effectiveness of the K4D agenda.

2. Social Returns to Social Science Research

In a financially constrained environment, it is not surprising, and even legitimate, to expect research to demonstrate social returns. Unfortunately, the interpretation of such returns has inexorably narrowed over time toward a restrictive, instrumental, and short-term one. Given the concern about the social returns to the social sciences, this section reviews the various roles research may play and the uses to which it may be put, and discusses the limits of the utilitarian view of research that has emerged as a guiding force of the K4D movement.

2.1 Roles and Uses of Research

Pressure to demonstrate results has considerably boosted policy work linking research with solutions to pressing policy challenges. The normative stanceFootnote 7 has become one of the main functions of social-science-based, policy-oriented work. Economics has notably expanded on an ‘engineer view’ of policy and a rational model of the policy-making process: the economy is represented by a model, whose sophistication testifies to the capacity to understand and summarize complexity, through which inputs (policy instruments) are transformed into outputs (development results) in a more-or-less systematic and pre-determined way. That model is validated by rigorous empirical tests and can be further documented and improved through scientific evaluations. Such a model remains a positive representation of the economy, although the assumptions on which its simplifications are based are hardly value free. It easily leads to an explicitly normative approach by identifying the policy instruments and sequences supposed to generate any desirable mix of given outputs. This representation of the policy process amounts to a depoliticization of knowledge and of policy (Nustad and Sending, 2000). It ignores political economy considerations, which may be particularly relevant in at least four ways: (1) the reasons to focus on any given policy problem (which implies a choice between problems competing for attention); (2) the choice and adoption of the objectives to be assigned to policy (which may not be stable over time or across available information); (3) ideological pressures on methods, assumptions, and conclusions; and (4) the conditions for implementation.

Carol Weiss approaches the utilization of social science research along seven distinct and complementary dimensions (Weiss, 1979). These are summarized below into three categories: instrumental, political, and societal. The ‘instrumental’ approach focuses on the role of knowledge in directly addressing policy issues. For the social sciences, this can take several forms. In the strongest one, policy may be expected to directly implement knowledge-driven prescriptions. Alternatively, social science research may be directed at solving specific policy issues, at evaluating a pending policy decision, or at formalizing learning from policy experiments through scientific evaluations. In a more sophisticated version of the instrumental principle, research may be seen as one source of knowledge interacting with others (cultural beliefs and experience, alongside forms of tacit, non-formalized knowledge) to inform policy challenges and decisions. Research, notably evaluative research, has a role to play in formalizing tacit knowledge and knowledge from experience. It requires knowledge to know what one does not know. At low levels of education, beliefs tend to be considered as true knowledge; at all levels of education, ‘knowledge illusions’ are pervasive (Sloman and Fernbach, 2017). Unfounded claims, increasingly easily circulated through social media, compete with scientific knowledge and can be used as part of arguments to push specific decisions or actions. Such competition is unequal because access to high-quality, scientific knowledge is more constrained, and users may not always distinguish the differences in the intrinsic quality of various claims to knowledge. This interaction between academic research and other forms of knowledge is an important topic for the social sciences.

The ‘political’ approach recognizes that social science research-based insights are used as ammunition to defend predetermined ideas or positions. Research on a given theme can provide arguments for conflicting policy views rather than the clear-cut, unambiguous prescriptions that would correspond to the instrumental approach. The role of research in this case is not to change decision-makers’ views but to help strengthen their arguments. Decision-makers are not primarily interested in new evidence from such research, but use it to support their priors or inclinations. A weaker version of the political approach recognizes that policymakers face constraints that may legitimately prevent them from abiding by scientific advice, even when the latter falls into so-called ‘Econ 101’ (Krugman, 2022). Weiss notes that using research to validate priors is a perfectly legitimate (and important) instrumental use of research, unless of course there is intentional distortion or misinterpretation of research findings. An additional political dimension of research use is tactical: to gain time, or to show that decision makers care about a given issue even if they take no other decision about it. Commissioning research on a specific issue may thus be an instrument of decision, which may then lead to an instrumental use of research, to further deliberation, or, in more cynical cases, to oblivion.

Finally, the ‘societal’ approach looks at social science research as a contribution to education about complex issues related to societal challenges and welfare – as a source of enlightenment. Research feeds into public debates, may provide new perspectives, helps define issues, and supports priorities. It may also lead to shared patterns of thinking (Weiss, 1979). Research in the social sciences is one important dimension of social interaction, alongside other scientific disciplines, journalism, lobbying, culture, and artistic expression.

Social science research has a potential impact through each of these approaches, which may also interact with each other. For example, the societal role of research may well contribute to changing policymakers’ priors, in turn leading to a more instrumental use of research. To some extent, climate change provides such an example, although action has lagged well behind science-based awareness and prescriptions. At the core of the reflection on the kind of research that would be necessary and useful for policy is a debate on the role of scientific evidence in policymaking. Scientific evidence crucially adds new knowledge, but it is unlikely to be the only driving force to generate action (Stone, 1989, 2002; Gluckman, 2016). Different strands of knowledge need to be connected, within academia (across the social sciences) and outside it.

2.2 Limits of the Utilitarian View

The combination of a normative research approach and instrumental use has received almost exclusive consideration in the K4D agenda, converging towards a utilitarian view of the research-to-policy interaction, even though, as noted by many (e.g., Weiss, 1979; Landry et al., 2001; National Research Council, 2012), this is not the most representative or relevant model of policymaking. There are many potential causes for the emergence of the utilitarian view as the dominant one. Ubiquitous funding constraints played a role by focusing funding on results and impacts. Technical factors, including progress in modeling, computing power, and big data, facilitated this evolution by allowing the ‘engineer view’ to address a larger number of issues and by sustaining the trend towards measurement and quantification of objectives and results. Political economy pressures pushed toward explicitly addressing complex issues whose understanding and interactions call for technical explanations, toward calling on technical expertise to hide political motives (in line with the ‘political’ use of research), and toward considering public policy as a ‘de-politicized’, mainly technical and rational process of choice between alternatives that can be properly evaluated.Footnote 8

Overall, there is a tendency to value instrumental research (as the expression ‘research for development’ itself indicates) more than other forms of research, implicitly suggesting that action should be derived from knowledge. It is hard to disagree, but this is only one aspect of the connection. The other direction, namely the contribution of action to knowledge, is also crucial. Advances in evaluation techniques make it possible to close the loop and use research as an analytical process connecting action and knowledge. In this line of thinking, research is also needed to formalize all forms of knowledge and make them transmissible. This implies a need for new methods, notably based on what is called ‘co-creation’ and on rigorous, multidisciplinary monitoring during implementation (Schon, 1983).

The utilitarian pressure has distorted knowledge generation in many ways. It focuses on the explicit and technical components of knowledge to the detriment of implicit and socio-human components, despite their strong bearing on feasibility and implementation. More broadly, it promotes the generation of instrumental knowledge at the expense of non-instrumental dimensions of knowledge. This is problematic, because the latter are essential components of knowledge, determine the environment and conditions of action, and therefore also affect the usefulness of instrumental knowledge. The utilitarian view focuses on the deterministic and technical dimensions of knowledge use, as opposed to the understanding and interpretative components. As noted, knowledge use is not only, or even primarily, ‘technical’, but is linked to the level of education and to social norms that, for example, define the relative importance of various dimensions of ‘known facts’. This matters because decisions and actions are about choices and thus implicit priorities. Important areas of knowledge (social, cultural, etc.) are thus neglected.

The ‘engineer view’ also serves as a basis for the elaboration of development donors’ operational strategies. This tilts research themes and methodologies in a way that enables these actors to intervene in line with their own priorities, biases, and competences (Nustad and Sending, 2000). It may lead to addressing problems specifically identified and formulated to allow donors to implement their ‘solutions’. In such a context, there is a high risk that the choice of research themes in developed countries inspires policy priorities in developing countries, while the reverse should be the case. This is not to deny the considerable benefits of trying to understand how development and development assistance work. Results-based management, intimately linked to the ‘engineer view’, and the generalized recourse to ‘theories of change’ are important contributions to concentrating efforts where they are likely to achieve results and to rationalizing interventions. But when carried too far, this approach may lead to a deterministic vision of development and greater risk aversion: no action will be considered without a clear ‘theory of change’ ex ante, which denies the possibility of using experiments to discover what the theory of change might be in practice. The instrumental approach to knowledge is also related to the increasing focus on ‘evidence-based’ approaches, which tend to reduce useful knowledge to a search for evidence and ignore the fact that ‘evidence’ may mean different things to different people.

A consequence is that knowledge generation is mainly focused on thinking about ‘what should be done’ as opposed to ‘how to do it’ (since the ‘how’ is technically pre-determined). This leads to viewing research as a ‘product’ rather than as a ‘process’. A process view would require more investment in understanding, through critical questioning, analyzing, and interpreting, not just within the confines of a specific technical model, but within the social context. The focus on what should be done supports a view of research as providing ‘solutions’. This in turn leads to a funding and organizational preference for research conducted in the ‘best academic places’, polarizing funding and research approaches, including the choice of research themes, methodologies, and quality standards, to the detriment of what matters in the field from local actors’ perspectives.

The prevalence of the utilitarian approach to social science research holds the seeds of its own disappointment. This is in part because the supply of academic knowledge and the demand for policy knowledge do not meet. Anecdotal evidence suggests that policy advice produced by research is mostly unsolicited, while the demand for immediate knowledge from policymakers is often not satisfied by academic researchers. One way to proceed would be to think about the connection between the supply of (academic) development research and the demand for actionable development knowledge as a specific, professional process that requires its own, autonomous development.Footnote 9 The issue cannot be addressed by producing more research, or research that is more innovative and technical, or by training academics in research communication; it requires a better understanding of how research is used, misused, or not used.Footnote 10 Efforts to address this gap can include research communications, policy labs and experiments, evaluations, and ‘boundary objects’ (Wenger, 1998) specifically designed to connect the two distinct worlds of academic suppliers and policy users.

The utilitarian approach risks tilting democracies toward what could be seen as a form of ‘enlightened despotism’ (Albaek, 1995), denying debate in the face of scientific certainties. The alternative is not relativism, as there are universal scientific truths. The point being made here is that, however well established scientific knowledge is, using it when acting requires debate, interpretation, and deliberation. The deterministic overtones of the current K4D approach challenge democracy by questioning the relevance of democratic debate in the presence of technical certainties that harden into a perceived deterministic view of policy. The main risk, ominously present throughout Western societies, is that discontent with the prescriptions and their results extends to a loss of credibility and a rejection of science as a guide to action and decisions. It is in that sense that a project initially based on enlightenment values may end up weakening their foundations.

Overall, the instrumental approach to knowledge belongs to positivism and relates to a vision of history in which progress takes place as we learn more about how to do things and what works best, science provides ultimate explanations, and one can finally act without error. This vision of progress translates into a ‘quest for certainty’ (Dewey, 1929; Fayolle, 2020) and a belief that knowledge can defeat uncertainty. Citizens’ reactions during the COVID-19 pandemic, especially critiques of public policies elaborated while knowledge was largely incomplete, can be interpreted as a refusal to be exposed to uncertainty and a wish to assign policymakers the responsibility for ensuring certainty. The implicit idea that there was one right way to think is illustrated in the debate around the ‘Washington consensus’ and, more generally, the prevalence of conventional wisdom at any point in time.

More concretely, the virtues of mistakes tend to be ignored, even though they are an important mechanism through which practical learning and ownership occur. Paradoxically, in the name of avoiding mistakes, the illusion of certainty, through the idea that scientific ‘truth’ should inspire one course of action, can sometimes lead to policy mistakes and failures, invalidating the perception that advanced science in high-income countries provides the best insights on how to address development challenges. It is likely that the recent appeal of ‘alternative facts’, ‘false truths’, or ‘post-truth’ politics reflects a (theoretically and conceptually unfounded) reaction and opposition to a vision of knowledge with deterministic overtones that excludes human dimensions, since humans may not be primarily driven by truth (The Economist, 2016, quoting Kahneman).

While economic and social science knowledge may occasionally find its way into specific actions, the instrumental view underplays the role of thinking (and therefore of reasoning) by equating the usefulness of knowledge with the results of action and suggesting that the latter can be subsumed through analytics. In fact, the main usefulness of scientific knowledge is to be found in the quality and power of the thought process and the rhetoric it supports.Footnote 11 The latter may lead to various, contrasting outcomes. In turn, the value of reason lies not so much in establishing the ‘truth’ as in enhancing the power and quality of deliberation (Mercier and Sperber, 2017).

3. Issues of Capacity Building

Not only does this mode of knowledge generation strengthen the dominance of high-income country universities and of their research themes, approaches, and funding constraints; it also erects barriers to entry that exclude many developing country researchers. These barriers go beyond lack of funding or lack of access to information. They are more insidious: demanding quality standards, for example, exclude researchers working in low-capacity environments who never get a chance to build their abilities enough to participate in global research networks. The ‘solution’ of sending the best of them to Western universities treats part of the problem at the individual level, but certainly not at the country level. And it does not address the deep contextualization issues discussed above.

The quality imperative may well be at the core of the research-to-policy conundrum. It is an imperative because poor quality research cannot be expected to be a good guide for policy. Academia has gradually developed a metric for scholarship quality that is largely determined by the capacity to publish in the best peer-reviewed academic journals. This is all the more sensible as only peers can read and assess academic papers, especially frontier research; the assessment of quality should not be left to laymen's judgment. Quality thus defined also acts as a strong signaling device for potential users of academic research (or for those listening to academic researchers).

However, this is essentially a supply-driven definition of quality. Assessment of quality from a demand-led perspective would still emphasize rigor (in the generation of knowledge and in the use of data and existing knowledge), but also timeliness, completeness (i.e., broad-based knowledge on major challenges), and relevance.Footnote 12 Academic researchers have developed a field of policy studies that does focus on policy relevance, but even then the very notion of policy relevance is too often supply driven, with the risk of generating recommendations that will not be heeded.

Thus, academic quality as defended by academic producers may not correspond to the features desired from the knowledge-user perspective. As a result, users may turn to other, potentially inferior, sources of knowledge, which may then be seen to have a competitive advantage over academia. This has considerable implications, both for the perception policymakers have of researchers and of their contribution to development, and for the capacity of high-quality academic research to compete with other sources of legitimate knowledge, and even with inferior sources promoting fake news and low-quality analyses. What could make scientific research more ‘competitive’ in the market for development knowledge? Can the current, scientific definition of quality be expanded to better address demand requirements while maintaining demanding academic prerequisites? How should the latter then be specified? There is a need to develop a second tier of peer-reviewed, policy-oriented journals that provide good outlets for the publication of research work that responds to these more relevant quality standards.

Bardsley (2017) proposes to define quality as a combination of rigor and excellence. By excellence, he means ‘a competitive process based on open and transparent peer review’. However, he further qualifies this by mentioning that, beyond rigor, quality research should ‘break new grounds in terms of theory and methods’. This sets the bar quite high and leads to the question of what is needed to develop a research- and evidence-based policy culture in developing countries. This debate might parallel the distinction proposed by scholars of economic growth (Acemoglu et al., 2006): what is expected from countries well below the technology frontier is not to push the frontier further outwards through investments in innovation, but to adopt and adapt existing technologies. How should quality be defined for social science research undertaken in lower-capacity developing countries? Obvious candidate components for such quality might be rigor in data collection and analysis, and new analytical work on development challenges; it is unclear, however, that quality should emphasize the high-level academic innovations understandably valued by the established academic community.

These questions also pertain to building research capacity. While a well-recognized objective of foreign aid, the implementation of research capacity building has suffered from high barriers to entry, combined with the quest for short-term results and with thematic and methodological biases imported from foreign expertise. Research capacity building often takes a back seat to requirements on research output, and the same quality metric tends to be applied by default to both. Yet there are several qualities that research capacity building should develop, focusing on rigor rather than innovation, which could be assessed as part of a specific, monitorable quality metric. Scientific rigor is particularly important in an era characterized by the data revolution and the power of computers. Technological ease by itself does not solve basic issues that remain at the core of research quality, including keeping a critical eye on the nature and reliability of the data that are collected, interpreting them with rigor, using them to address structured questions rather than specifying questions for the sake of using them, and trying to identify robust ways to address existing problems rather than defining the problems from one's own perspective. There is a risk that the collection and use of data could be essentially supply driven and could divert resources away from the most urgent development challenges. An important role for social science research will be to structure the demand for data.

4. Concluding Remarks

The knowledge-for-development approach is based on the premise that scientific output can be better used to enhance the quality of policies and improve development impacts. This premise is valid, but the approach has only partially succeeded. The emphasis on producing high-quality knowledge has led to a welcome expansion of high-level development research and has provided talented individual researchers from developing countries with opportunities to develop international research careers. But the international academic and development donor community has tended to assume that the whole knowledge-for-development dynamic naturally rested on such efforts. It does not. Mobilizing knowledge for development requires much more than a focus on high-quality production. It must be conceived and implemented as a specific project with its own principles of effectiveness.

Such a project requires a better understanding of the whole process of using research-generated knowledge (Harris, 2015), which requires looking not only at how research is generated and funded but also at how it takes place and is disseminated within a system of personal and institutional interactions between suppliers and users in a given environment.Footnote 13 To make research attractive, more demand-led, and policy-relevant, the choice and formulation of research questions should, whenever possible, be organized interactively, while preserving research independence. This can be done, for example, through policy labs, which the Global Development Network (GDN), among others, uses as part of the inception of many programs. Winters (2016) emphasizes the role of interactions involving researchers, policymakers, and interest groups, together with institutional support, effective implementation, and evaluation, in the success of the temporary migration program put in place through New Zealand's Recognised Seasonal Employer (RSE) Scheme. While he concludes that ‘the creation of the RSE arose from such a fortuitous coincidence of interests that it will be replicable only very rarely’ (p. 4), shaping a research program (or, indeed, a policy experiment) can build the kind of coincidence of interests that is feasible and appropriate in a given environment and transform research into a collective endeavor to provide policy insights.

While it is crucial to shape research programs in developing countries with the objective of producing quality, publishable output, it is also important to include a capacity-building component that is explicitly organized and implemented, with its own theory of change, quality control, and evaluation. Such a component could be part of a country-based research capacity building policy, in line with one of the recommendations of the evaluation of the Think Tank Initiative to ‘transcend expectations that a project can produce a specific policy change, to instead emphasize the capacities that need to be fostered to underpin credibility and relevance within ongoing policy dialogues’ (Christoplos et al., 2019, p. VII). A recent GDN program conducted in partnership with the EIB illustrated that the main capacity-building benefit for young and talented researchers lay not so much in the technical realm as in the interaction with non-researchers and in the capacity to design effective research in a given, constrained environment: research that non-researcher stakeholders find useful and attractive, so that they cooperate in sharing data and providing input; that relies on appropriate methodologies (rather than on a preferred methodology chosen ex ante around which the research is organized); and that discusses preliminary results and frames conclusions accordingly (Jacquet et al., 2021).

The knowledge-for-development agenda needs to be given a new lease of life. There has been a gradual shift over the last half century towards a positivist, technocratic vision of development challenges, suggesting that there were winning, universal ways to think and act. Policies were conceived as solutions to be uncovered rather than as informed choices. This approach led researchers to formulate recommendations and prescriptions. We are now living through the democratic test of these claims: they are often perceived as having generated costly crises and inequality. As a result, an increasing number of citizens, at least in advanced countries, have lost trust in academia, threatening to end the era of enlightenment that has been so instrumental in raising welfare over the last two centuries. When science no longer guides beliefs, societies turn to obscurantism, a particularly ominous development. And yet science produces useful knowledge, and academic knowledge is crucial at a time when our societies must deal with new global challenges that call for deeper understanding and informed action. Mobilizing research for development requires the highest political commitment.

Acknowledgements

The author thanks participants at the Festschrift Conference in honour of Prof. L. Alan Winters for useful comments. He writes here in a personal capacity.

Footnotes

* President of the Global Development Network (2012–2022).

1 For many years, Alan has been a member of the Board of Directors of the Global Development Network (GDN), a public international organization headquartered in New Delhi whose mission is to enhance research capacity in social sciences in developing countries, under the premise that this will lead to better policies and development outcomes. Alan chaired the Board of GDN between 2011 and 2018.

2 Data from the SCImago country rankings database for the period 1996–2021.

3 At least theoretically. Applied research may lead to partly privatized knowledge (through protection and patents) and access may be limited.

4 Baggini (2018) discusses different approaches to thinking in various philosophical traditions. See also Lavis (2021).

5 Wagner (2018), for example, discusses differences between East and West ways of knowing.

6 This commits developed and developing countries to abide by five major principles of aid effectiveness: ownership, alignment, harmonization, managing for results, and mutual accountability (Paris Declaration on Aid Effectiveness, 2005).

7 As opposed to the ‘positive’ and the ‘political economy’ approaches to policy research (Benassy-Quéré et al., 2019).

8 The utilitarian approach also provided a rationale for the intervention of development finance institutions, which need to justify the associated use of funds.

9 For empirical attempts to measure the utilization of social science research along multiple dimensions, see Landry et al. (2001). They show that knowledge utilization depends much less on research outputs than on other factors pertaining to researchers’ and research users’ contexts.

10 See Lant Pritchett's discussion of randomized controlled trial methods, suggesting an explicit distinction between the existence of knowledge and its actual use, and arguing that most of the time the former is understood as naturally or logically implying the latter (Pritchett, 2018).

11 See, for example, McCloskey (1983) for a discussion of the role of rhetoric in economics and the argument that debates about methods and technicalities hide the main issues that should be discussed.

12 Mercier and Sperber (2017) develop a much more critical analysis of the role of reason, in which analytical reasoning is directed not by true, scientific knowledge, but by the careful analytical use of potential interpretations of existing knowledge to serve one's priors.

13 This is what GDN's ‘Doing Research’ program proposes. See www.gdn.int/doingresearch/about.

References

Acemoglu, D., Aghion, P., and Zilibotti, F. (2006) ‘Distance to Frontier, Selection and Economic Growth’, Journal of the European Economic Association 4(1), 37–74.
Akude, J.E. (2014) Knowledge for Development: A Literature Review and an Evolving Research Agenda. Deutsches Institut für Entwicklungspolitik Discussion Paper 2014/18, Bonn (Germany).
Albaek, E. (1995) ‘Between Knowledge and Power: Utilization of Social Science in Public Policy Making’, Policy Sciences 28(1), 79–100.
Arocena, R., Göransson, B., and Sutz, J. (2014) ‘Universities and Higher Education in Development’, in Currie-Alder, B., Kanbur, R., Malone, D., and Medhora, R. (eds.), International Development: Ideas, Experience & Prospects. Oxford University Press, 582–598.
Baggini, J. (2018) How the World Thinks: A Global History of Philosophy. London, UK: Granta Books.
Bardsley, C. (2017) ‘The Pursuit of Impact through Excellence: The Value of Social Science for Development, a Funder's Perspective’, in Georgalakis, J., Jessani, N., Oronje, R., and Ramalingam, B. (eds.), The Social Realities of Knowledge for Development: Sharing Lessons of Improving Development Processes with Evidence. Sussex: Institute of Development Studies.
Benassy-Quéré, A., Coeuré, B., Jacquet, P., and Pisani-Ferry, J. (2019) Economic Policy, 2nd edn. Oxford University Press.
Cash, D., Clark, W.C., Alcock, F., Dickson, N., Eckley, N., and Jäger, J. (2002) ‘Salience, Credibility, Legitimacy and Boundaries: Linking Research, Assessment and Decision Making’, Kennedy School of Government Research Working Paper RWP02-046.
Christoplos, I., Pain, A., Kluyskens, J., and Fruhling, P. (2019) ‘External Evaluation of the Think Tank Initiative (TTI), Phase Two, 2014–2019’, NIRAS. https://idl-bnc-idrc.dspacedirect.org/bitstream/handle/10625/57710/IDL%20-%2057710.pdf?sequence=2&isAllowed=y.
Das, J., Do, Q.-T., Shaines, K., and Srikant, S. (2013) ‘U.S. and Them: The Geography of Academic Research’, Journal of Development Economics 105, 112–130.
Dewey, J. (1929) The Quest for Certainty: A Study of the Relation of Knowledge and Action. Gifford Lectures. New York: Milton Balch and Company.
Fayolle, J. (2020) ‘Philosophie Pour Temps Incertains: A Propos de «La Quête de Certitude», de John Dewey’, https://jackyfayolle.net/2020/11/15/philosophie-pour-temps-incertains-a-propos-de-la-quete-de-certitude-john-dewey/.
Gluckman, P. (2016) ‘The Science–Policy Interface’, Science 353(6303), 969.
Haas, P.M. (1992) ‘Introduction: Epistemic Communities and International Policy Coordination’, International Organization 46(1), 1–35.
Harberger, A.C. (1993) ‘Secrets of Success: A Handful of Heroes’, American Economic Review 83(2), 343–350.
Harris, R. (2015) ‘The Impact of Research on Development Policy and Practice: This Much We Know’, in Chib, A., May, J., and Barrantes, R. (eds.), Impact of Information Society Research in the Global South. Singapore: Springer, 21–43.
Jacquet, P., Jimenez, E., Fardoust, S., and Sarris, A. (2021) Measuring Impacts – The Experience of the EIB-GDN Programme. Luxembourg: European Investment Bank.
Krugman, P. (2022) ‘Europe's Gonna Party Like It's 1979’, The Financial Times, 30 August 2022.
Landry, R., Amara, N., and Lamari, M. (2001) ‘Utilization of Social Science Research Knowledge in Canada’, Research Policy 30(2), 333–349.
Lavis, A. (2021) L'imprévu. Que Faire Lorsqu'on Ne Sait Plus? Paris: Flammarion.
McCloskey, D. (1983) ‘The Rhetoric of Economics’, Journal of Economic Literature 21(2), 481–517.
Mercier, H. and Sperber, D. (2017) The Enigma of Reason. Cambridge, MA: Harvard University Press.
National Research Council (2012) Using Science as Evidence in Public Policy, edited by K. Prewitt, T. Schwandt, and M. Straf, Committee on the Use of Social Science Knowledge in Public Policy. Washington, DC: The National Academies Press.
Nustad, K.G. and Sending, O.J. (2000) ‘The Instrumentalization of Development’, in Stone, D. (ed.), Banking on Knowledge. London: Routledge, 44–62.
Paris Declaration on Aid Effectiveness (2005) ‘The Paris Declaration on Aid Effectiveness and the Accra Agenda for Action’, https://www.effectivecooperation.org/system/files/2020-09/Paris%20Declaration%20on%20Aid%20Effectiveness%20.pdf.
Pritchett, L. (2018) Knowledge or Its Adoption? CGD Note. Washington, DC: Center for Global Development.
Schon, D.A. (1983) The Reflective Practitioner: How Professionals Think in Action. New York: Basic Books.
Sloman, S. and Fernbach, P. (2017) The Knowledge Illusion: Why We Never Think Alone. New York: Penguin Publishing Group.
Stiglitz, J.E. (1999) ‘Knowledge as a Global Public Good’, in Kaul, I., Grunberg, I., and Stern, M. (eds.), Global Public Goods. Oxford University Press, 308–325.
Stiglitz, J. (2000) ‘Scan Globally, Reinvent Locally: Knowledge Infrastructure and the Localization of Knowledge’, in Stone, D. (ed.), Banking on Knowledge: The Genesis of the Global Development Network. London: Routledge, 24–43.
Stone, D. (1989) ‘Causal Stories and the Formation of Policy Agendas’, Political Science Quarterly 104(2), 281–300.
Stone, D. (2002) Policy Paradox: The Art of Political Decision Making. New York: W.W. Norton.
The Economist (2016) ‘The Post-Truth World: Yes, I'd Lie to You’, The Economist, 10 September 2016.
Wagner, R.G. (2018) ‘Can We Speak of East/West Ways of Knowing?’, KNOW: A Journal on the Formation of Knowledge 2(1), 31–46.
Weiss, C.H. (1979) ‘The Many Meanings of Research Utilization’, Public Administration Review 39(5), 426–431.
Wenger, E. (1998) Communities of Practice: Learning, Meaning, and Identity. Cambridge University Press.
Winters, L.A. (2016) ‘New Zealand's Recognised Seasonal Employer Scheme: An Object Lesson in Policy Making But for Whom?’, Migrating out of Poverty Research Program Consortium, January.
World Bank (1998) ‘Knowledge for Development’, World Development Report 1998/99.