[Editor’s note: This is the fifth in a series focusing on supporting think tanks for the evaluation of two pilots for the Indonesian Knowledge Sector Initiative. The views expressed in these publications are those of the author(s) and not necessarily those of the Commonwealth of Australia. The Commonwealth of Australia accepts no responsibility for any loss, damage or injury resulting from reliance on any of the information or views contained in this publication.]
This think piece focuses on lessons from the implementation of a relatively large (USD 500,000) project funded by an international development organisation between 2009 and 2011 to provide capacity development services to the Vietnamese Academy of Social Science (VASS). VASS is a large government research organisation in Vietnam (modelled on CASS in China), which reports directly to the Prime Minister (as opposed to specific Ministers) and is home to over 30 policy- and academic-focussed research institutes. It is hierarchical and political: the president has historically been part of the Party’s Central Committee. In the past, the academy has been seen as largely legitimising government policy, although some policy institutes have increasingly done work to shape policy. [As a concept and model it is worth studying and learning from.]
The project comprised three components: 1) project management; 2) research communication; and 3) the application of capacity developed in both of these components to manage a large research project. I was part of a consulting team involved in the second component, research communication. The terms of reference for the research communications component ticked all the boxes of such a project by addressing internal communication, policy-focussed external communication (through policy and press briefings), and stakeholder engagement; all through a range of activities such as trainings, study tours, workshops to share learning and good practices, the production of toolkits, and, the most innovative part, action learning through coaching and mentoring of researchers throughout the process.
The project was intended to run in four stages: an initial stage to learn about the context through research and surveys to help design interventions; piloting interventions on a small scale through an action-learning approach; rolling out what worked on a wider scale; and finally, a formal evaluation.
The approach outlined assumed (at least implicitly) that a linear connection existed between the various aspects of the capacity development initiative: the provision of inputs such as technical assistance and workshops would lead to the delivery of outputs such as trained researchers and the production of toolkits. These inputs and outputs were expected to lead to better performance (for example more attractive research, which would lead to more policymakers accessing and reading the organisation’s research) and ultimately impact (policymakers would use the organisation’s research to improve policies and the lives of the country’s population).
What actually happened?
Managed by an external project management unit (PMU), albeit situated within and with its staff recruited from VASS (and reporting directly to the president), the project followed through with some of the intended activities whilst making significant changes to others.
The consulting team was based outside Vietnam and delivered its inputs through frequent trips (during one period visits were almost monthly and lasted a week). The initial learning stage comprised a quantitative survey followed by a week-long qualitative assessment of the organisation’s communications practices. In addition, senior staff from the research organisation made a study tour to Europe.
The action-learning component was dropped in favour of a series of ‘work and write’ shops, in which five fairly junior researchers from ten research centres within VASS were trained to ‘translate’ long research reports into shorter formats such as 4-page policy briefs and 2-page executive summaries. A toolkit was designed to help these researchers and those who had not undergone the training. The researchers who were seen as particularly keen and committed during the work and write shops were then asked to be champions for this new approach to communicating research.
However, our initial impressions (drawing on an internally conducted ‘light-touch’ review) were that the project had little impact on the broader organisation. Crucially, too, the PMU decided to drop the formal evaluation, so arguably we will never really know what effects if any the project had.
On project management, the relationship between the consulting team and the client started amicably but by the end had become fairly strained. On one occasion, the consulting team sought direction from the funder, but this did not lead to any sustainable resolution. There may have been several issues at play here, but key among them was confusion within the consulting team as to the role it was supposed to play in the project. We had initially thought we would be partners with the client, VASS, having a say in decision-making with regard to the selection and nature of interventions. However, as the project wore on, it became clear the client expected the team to do what was asked of it; to be, bluntly, a consultant.
Why such little ‘impact’? Bringing context into the organisation and the project
Was this the most appropriate set of interventions? To make an informed judgement, let’s consider the context in which VASS researchers worked.
Communication in VASS (and in other Vietnamese research institutes, and in many other countries with similar social customs) is largely undertaken through a hierarchical, top-down approach. Junior researchers in government research organisations (the main participants in the project) tend to have little or no power in deciding how research is managed and communicated. These decisions lie with research managers or directors. Unfortunately, they were not involved in the project, largely because busier and more senior staff were not keen on workshops, which are a training tool better suited to more junior staff.
When it comes to actually communicating research to policymakers, formal knowledge products have a limited role in Vietnam. Rather, the President, the directors of the institutes, and the heads of departments are the ones who interact with policy processes through private meetings, commenting on draft legal documents, attending technical seminars/workshops and/or appearing in the press and on television. As Martin Rama says in his paper on the transition in Vietnam, influence is the result of research leaders with strong personalities (often seen as ‘bullet-proof’ mediators) who convince the most senior officials in the communist party, with whom they have strong relationships, of the merit of new ideas. For important reforms, the mere technical soundness or attractive packaging of technical inputs is never enough.
As Enrique and I have said in previous posts, workshops, on their own, cannot facilitate longer-term transformation. Change happens outside workshops, when people have the space to test and reflect on the ideas, tools, methods and approaches they have learned about. However, we cannot get away from the fact that workshops help to consume and redistribute large amounts of (donor) funding very quickly and produce quantifiable and demonstrable results: people gathered, speeches delivered, meeting proceedings produced, as well as other traces such as newspaper articles, mentions in annual reports, and banners and posters (always helpful for reports to donors).
That said, taking an action-learning approach (which would have required observing researchers at work and engaging with them in relatively intensive dialogue) would have been impossible, considering that none of the members of the consulting team could speak Vietnamese.
With regard to the study tour to the UK, the participants, even though they had the power to instigate far-reaching changes within the organisation, faced serious impediments to applying the new ideas they had learned. The most crucial factor was probably the huge difference between the institutional set-up of the UK and that of Vietnam. It is not surprising, then, that the Vietnamese government has tended to look to its neighbours when seeking to learn from other countries. (VASS, after all, is inspired by CASS.) As such, although there is much kudos attached to making links with Western counterparts, Vietnamese researchers, like policymakers, were probably better off learning from their East and/or Southeast Asian neighbours, with whom they share several historical, political and cultural attributes. [And even possibly from other developing and emerging countries in Africa and Latin America.]
Promoting changes in internal communication within and amongst institutes in Vietnam is a very challenging endeavour. In a context where researchers are often chasing donor contracts to top up very low salaries (see below), where bureaucracy is often excessive, and where less senior researchers need various permissions to obtain funding from donors, researchers often try to minimise formal links between an externally funded project and the research institution in order to maintain greater control over it. As a result, researchers can become individualistic in their work and tend to keep their activities secret from each other. So while private and informal personal relations are quite common amongst researchers from different institutes, formal horizontal networks, once established, tend not to remain active for very long.
Moreover, as Enrique once said, in some cases the incorrect assumption made by consultants seeking to develop the capacity of think tanks to communicate their research is that if the quality of the research is low, then management and communication can make it better.
Unfortunately, limited funding and the lack of modernisation of some research institutes were a significant problem, resulting in inadequate methodological capacities and weak analytical skills among many researchers. This was exacerbated by an incentive structure that encourages researchers to scramble for short-term consultancy work from donors and government rather than focusing on longer-term projects that might provide the opportunity to strengthen their research skills along the way. In fact, the personal career success of the directors often depends on their ability to secure projects and money from donors, and not necessarily their ability to stimulate the production of new knowledge. This is a situation in which many researchers across the world find themselves.
In this context, such capacity development projects are often seen by researchers as an opportunity to top up low salaries and make their research more attractive to donors (particularly crucial where the government has threatened to reduce funding to research institutes), rather than as a way to improve their ability to promote better-informed policies.
Evaluating the project: not everyone is interested
The formal evaluation may have been dropped by the PMU for various reasons, and it would not be right to be too quick to judge. A lot is at stake when conducting an evaluation: future funding, staffing levels, accountability for the use of resources, career development decisions and professional reputations all depend on positive evaluations. There may be a fear of exposing unintended outcomes and unachieved goals to wider scrutiny.
But if resources were considerable, why then did the donor itself not put pressure on those involved to at least conduct an evaluation as planned?
A possible explanation is that the space and freedom to decide the course of the project that was afforded to the client may have had less to do with wanting to promote their sense of ownership and the effectiveness of the capacity development intervention, and more to do with ‘bigger picture’ political economy issues. Many donor agencies are under great pressure to disburse allocated budgets before the end of the financial year, and the careers of many individuals depend on this. Moreover, given Vietnam’s high growth rates and on-going transition, there is also a distinct desire to ensure continued association with what is seen as a success story. (This might explain why there are in the region of 50 donor agencies and why in 2010 Vietnam was the world’s seventh largest ODA recipient.) Donors have thus been careful to avoid public criticism of government officials and have steered clear of what might be considered ‘unreasonable’ critiques of government approaches and programmes. They have often turned a ‘blind eye’ to instances where money is spent in ways they did not originally intend, in order to ensure good relations with, and some influence over, the Vietnamese leadership, who are well known for taking a strong lead in ‘disciplining’ donors.
Therefore, whether the organisation and the funder actually wanted to embark on a capacity development process at all could be questioned.
Lessons for the Indonesia Knowledge Sector Programme
A number of lessons could be presented for a programme such as the one underway in Indonesia:
- Promoting capacity development should be from within the community: developing capacities sustainably needs an appreciation of many domains of knowledge and many disciplines, as well as a good understanding of the local context and language. Donors should therefore ensure local in-country capacity development providers are part of the team and that they have a range of skills and significant hands-on experience.
- Learn the local language: taking steps to learn the language well won’t just help with getting by on a day-to-day basis; it improves one’s understanding of, and ability to engage with, the context in which technical advisors work. Knowing the language to a decent level enables one to read drafts of policies and other official documents and to make substantive contributions to official meetings and seminars, opening access to policymakers, policy shapers and policy processes. Crucially, it can also lend a certain degree of credibility amongst local researchers and officials alike, help develop strong personal networks, and open up opportunities to engage with a wide variety of people (from taxi drivers to senior officials) in informal settings (such as over lunch or an evening drink), which can provide a rich source of stories, anecdotes and rare insights into the way people really feel about, for instance, different aspects of a project.
- Developing capacities sustainably, especially in Vietnam, has to be a long-term endeavour: building trust with the local client, gaining insights into the inner workings of an organisation, and developing the skills and abilities of individuals and the rules governing organisations can take years, perhaps decades, and requires careful monitoring of shifts in the political context.
- Capacity development as a deliberate process is an inherently political one, and if change processes are not owned and led by those whose capacity is being developed, they are unlikely to happen (or, if they do, to be sustainable). Political pressure for change, preferably from domestic actors, is key.
- Consultants can be enablers: helping actors with sufficient power and influence within the client organisation to understand what is happening in their organisation, develop a vision of what they want it to be in future and a strategy to help them to get there. In places like Vietnam where foreigners and outsiders are kept at arm’s length, consultants may want to take an advisor role where they respond to questions and requests from the client ensuring advice given and products produced are of the highest quality, but more on this below.
- Clarity of roles is crucial: negotiating exactly what the consultant is responsible for (e.g. outputs or outcomes) using a champion’s consulting grid can help all parties to clarify what types of relationship are needed for particular tasks and what approach to managing the project they should take, and allow for structured discussion of the internal political issues.
- Continuous, or at least regular, monitoring and learning are critical activities to help consultants, together with the client, capture both anticipated and unanticipated changes (if any) and confirm, improve or reconfigure the project team’s understanding of how change is likely to come about, and respond appropriately. However, for the project team to be reflexive learners, the client’s, funder’s and especially the consultant’s organisations need to facilitate this through their own learning cultures and systems.
- Core funding isn’t progressive in all contexts. Providing core funding to institutes will require a high degree of donor collaboration (so as not to double fund certain institutes), may reinforce high levels of particularism amongst research institutes (due in part to competition for resources) and only promote more intensive relationships of mutual indebtedness (in contexts where clientelist relations are strong), rather than provide researchers more space to produce higher quality research.
- Capacity development needs to focus not just on the capacities of researchers, policymakers and other actors to produce technical results, but also on what it takes to build more effective and dynamic relationships between them. Therefore, in addition to traditional methods such as workshops and study tours, interventions need to consider more advanced approaches such as action-learning which might feature coaching and mentoring, (informal) knowledge networking and multi-stakeholder platforms.
- Donors and grantees need to be ready to take risks, and to make mistakes: getting donors and clients to agree to more innovative and less well-known interventions will be difficult given high levels of risk aversion. It is clear that there is no quick fix to developing capacities, which requires high levels of energy, patience and flexibility. Thus, consultants and funders alike will need to be realistic about what can be achieved; but not necessarily be conservative.