[Editor’s note: this is a post that follows a discussion about the first Transparify report on think tanks’ financial transparency. This post focuses on the Think Tank Initiative but other posts will look at other national, regional, and global groups of think tanks. Health Warning: do not use this to judge the think tanks. This is neither a scientific nor a foolproof assessment. Do the analysis yourselves, please; although you may use my notes below.]
This is a first effort to rate think tanks within a think tank support initiative; in this case, the Think Tank Initiative. As a reminder, and because I do not have the man-power or time that the Transparify team has, I have ‘reworked’ Transparify’s scale in the following, slightly more straightforward and idiot-proof, way:
- 5-star*: the most transparent: 2 clicks or less from the front page to find who funds the think tank, how much, and for what. Also, the think tank discloses information about the nature of the funding they receive -that is: is it a project-based contract? a grant? And, it provides information about its most senior staff’s salaries.
- 5-star: highly transparent: 2 clicks or less from the front page to find who funds the think tank, how much, and for what. Some information about the nature of funding is offered, too.
- 4-star: average transparency: the information is there but harder to find -i.e. more than two clicks away or it has to be ‘put together’. As if the think tank were not too keen on it being found, or intends to do better but has not managed it just yet.
- 1, 2, and 3-star: incomplete funding information: this includes not providing detail about who funds them, or not putting it into a single table or easy-to-read page.
- 0-star: no (zero, nada) funding information: this is not good.
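For readers who want to apply the same scale themselves, the rubric above can be sketched as a small function. This is my own illustrative encoding, not part of the original exercise: the parameter names are shorthand I have invented, and the real ratings were judgement calls, not checklist outputs.

```python
def star_rating(clicks, who, how_much, for_what, nature=False, salaries=False):
    """Sketch of the reworked Transparify scale used in this post.

    `clicks` is the number of clicks from the front page to the funding
    information; the booleans record whether the site names its funders,
    gives amounts, states what the money was for, describes the nature of
    the funding, and discloses senior staff salaries.
    """
    if not who:
        return 0                       # no (zero, nada) funding information
    complete = how_much and for_what
    if complete and clicks <= 2 and nature:
        # salaries=True would mark the "5-star*" variant of the top rating
        return 5
    if complete:
        return 4                       # information is there but harder to find
    return 2                           # incomplete: somewhere in the 1-3 band
```

In practice most of the ratings below fall between the clean cases this sketch captures, which is why so many of them are ranges like “2 to 3 stars”.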
For an introduction to the exercise, what motivated it, what I hope to achieve, and other sets of ratings, please go to this introductory post.
More think tank programmes to follow.
The think tanks
The think tanks I am going to be looking at are the Think Tank Initiative’s grantees.
Note: This assessment is valid up until the 11th May 2014. Website updates may change the ratings I’ve given them, so please do not use this blog post to judge think tanks, although you may be able to use the approach to do it yourself. It would be much better if you checked Transparify’s report to see if they have been rated, or submit the think tanks to the Transparify team to rate them.
Latin America (average = 2.75)
Asociación de Investigación y Estudios Sociales: 1 or 2 stars. It is hard to find any information about funding on the ASIES website. The “about” section does not list its supporters with the exception of the Think Tank Initiative (but no amounts are provided). From its “history” it is possible to infer that there is domestic private support but no more information is offered. A search for “funding” did not offer any relevant information. Neither did “annual report”.
Centro de Análisis y Difusión de la Economía Paraguaya: 2 stars: CADEP shows who supports them on the front page but this is only provided as widgets along a right-hand sidebar. It is not clear if they provide them with funding or another kind of support. The same searches for “funding” and “annual report” did not lead to any more information.
Centro Ecuatoriano de Derecho Ambiental: 3 stars: CEDA takes three stars because it is easy to find its annual report (“Cuentas” on the main menu leads to them) and this provides information about who funds the think tank. It is not up to date, however, and the list of funders does not offer detail about the amounts or what the funding was for. This could be inferred from the lists of projects, though.
Foro Social de Deuda Externa y Desarrollo de Honduras: 3 stars: FOSDEH also gets a 3 because it is developing a transparency portal (from the main menu) where it presents information about its funders. The about section also leads to a short list of funders and the annual reports. These are all out of date, however. Given all of these components FOSDEH could get 5 stars with little effort. But it could also drop to 2 if the portal does not materialise soon.
Fundación ARU: 3 stars: ARU deserves 3 stars even if it does not provide a clear and detailed list of funders. It has, instead, quite a lot of information about its internal policies (go to About Us/Coordination). There is information about the TTI and lists of organisations with which ARU has a link (from which one can infer it may get funds). Its annual reports are not up-to-date.
Fundación Dr. Guillermo Manuel Ungo: 3 to 4 stars: FUNDAUNGO could have received a 3 star rating as well. It gets a bit more because it provides a long list of funders just two clicks away from the front page. Also, it differentiates between grants, services, and partnerships. This is rare for a think tank. Well done!
Fundación para el Avance de las Reformas y las Oportunidades: 5 stars: Grupo FARO gets 5 stars because it presents information about who funds them, how much, and for what, in several ways (although this could be made tidier): http://www.grupofaro.org/content/resumen-anual-de-ingresos and http://www.grupofaro.org/content/presupuesto-institucional for example.
Fundación Salvadoreña para el Desarrollo Económico y Social / Departamento de Estudios Económicos y Sociales: 3 stars: FUSADES gets 3 stars because it differentiates between its different types of members and includes those that ‘sponsor’ the organisation. There is no information, however, about how much they provide in support of FUSADES.
Grupo de Análisis para el Desarrollo: 1 or 2 stars: GRADE provides some information about who funds them (via alliances and initiatives) but does not go as far as saying so. Its latest annual report doesn’t offer any information about funding, either. A search for “Funders” or “financing” does not lead to any more information.
Instituto de Estudios Avanzados en Desarrollo: 1 or 2 stars: I was about to give INESAD 0 stars but found some funding information in their publications. Their papers mention who supported the research. It did take quite a bit of looking around to get this information. The usual searches did not help, either. So, borderline 1 star. They provide a short list of ‘partners’ at the very bottom of the front page that includes their funders.
Instituto de Estudios Peruanos: 3 to 4 stars: IEP gets almost 4 stars because it provides information about financing two clicks away, but it is not disaggregated. It does list who funds every project, which gives a better sense of who is funding what, but it is not yet clear how much they fund and whether it is a grant or a contract. Still, lots of information and not hard to find, but mostly a 3.
Instituto Desarrollo: 2 or 3 stars: ID does provide information about who supports it on its front page but this is not detailed or clear. It was not possible to find annual reports or any more information about who funds their activities -although there is some information about funders in the ‘areas’ pages.
Africa (average = 2.37)
Advocates Coalition for Development and Environment: 2 to 3 stars: ACODE gets almost 3 stars because it is easy to find a list of donors but there is no information about how much they provide. The link to the annual reports that could offer more detail is broken. I think there may be more information on the website but several links do not seem to work. A small investment may go a long way.
African Heritage Institution: 2 stars: AHI does offer some information about who funds it but this is not presented under “funders”; instead they list them as partners, networks or affiliations. There is no detail about how much is provided or if the relationship is current.
Center for the Study of the Economies of Africa: 1 but almost 0 stars: CSEA does not provide any easy to find funding information. Only once one goes into the project report is it possible to get a sense of who funds them. This is in contrast with the very good idea of a funding request “Support Us” tab on the main menu. In a way, I feel that if you are going to ask for funds in this way you need to open your books first.
Centre d’études, de documentation et de recherches économiques et sociales: 1 star: CEDRES gets 1 star as it was not possible to find any information about their funders except as a reference to partners in a project list. There is, however, no information about the size or the type of funding or if it is current or not. It could be 0.
Centre for Population and Environmental Development: 3 stars: CPED gets just about 3 but could be 4 because it provides information about funding very much on the front page: just one click away. Still, the list is a simple list with no detail. The annual report did not have a financial section.
Consortium pour la recherche économique et sociale: 1 star: CRES does not offer easy-to-find funding information. It has a tab for partners but this is limited to a description of their importance, and IDRC is the only one listed on the front page. The papers themselves offer some indication of where the money may be coming from but this is not clear, either.
Economic and Social Research Foundation: 2 stars: ESRF offers some information about funding but this can only be inferred by reading through the annual report (which, granted, is easy to find), as they describe who commissioned or supported some of the projects. The About section, however, does not offer much information.
Economic Policy Research Centre: 1 star: EPRC showcases only its main partners (or funders, it is not clear) but does not provide details about funding amounts or mechanisms. There may be a problem with some of their links as I could not access the annual reports.
Ethiopian Development Research Institute: 2 to 3 stars: EDRI provides a list of funders/partners via two separate links (easy to find) but does not offer any more detail. The annual report also mentions the funders but does not offer any more detail.
Ethiopian Economic Association / Ethiopian Economic Policy Research Institute: 0 stars: There is a ‘related links’ tab at the bottom of the front page that could be linking to funders (but I feel that ‘related links’ is too vague). The annual report does not include financial information. There is a reference to support for their foundational event but no more.
Initiative prospective agricole et rurale: 3 to 4 stars: IPAR-Senegal gets almost 4 stars because, although it does not offer detail of the funding it receives, it does make the effort to separate funding partners from technical partners, etc. This is an important difference that most think tanks do not make (or do not want to make). Still, it is not detailed enough to make it to a clear 4.
Institut de recherche empirique en économie politique: 2 stars: if Princeton University and the Think Tank Initiative are their only funders then they offer enough information for a 3, but this seems unlikely -or at least it is not clear. In any case, they are only described as funders via a FAQ link.
Institute of Economic Affairs: 3 to 4 stars: IEA does provide information about who funds them but not about how much. It does also give a sense of the funding it gets via its membership scheme. It gets close to 4 stars because its list of projects mentions who pays for what. There is no information about how much they fund, however. Not much needed to get a full 4 or even a 5.
Institute of Economic Affairs (Ghana): Could not rate it as its website was not working. Does not count against the regional average.
Institute of Policy Analysis and Research – Rwanda: 3 to 4 stars: IPAR-Rwanda gets close to 4 because it has a very interesting table in which it describes all its stakeholders by whether they offer funding, collaboration, etc. This is a table worth mentioning because it shows that it is possible to have several types of relationships with a single organisation. IPAR-Rwanda also offers some information about their funders on the about page but no indication of the amounts.
Institute of Statistical, Social and Economic Research: 2 to 3 stars: ISSER has a list of partners (or funders) that is easy to find and a page about its endowment trust. They do not offer information about amounts, though. I feel that an organisation with an endowment could offer more information.
Kenya Institute for Public Policy Research and Analysis: 2 to 3 stars: KIPPRA mentions that TTI, the ACBF and the Government of Kenya support it. But it does not offer more information about how much they get from each. (While I am at it, KIPPRA requires that you register to download a publication! This is not a very good idea.)
Makerere Institute of Social Research: 2 to 3 stars: MISR received funding from the TTI, Ford Foundation and USAID. But it does not say how much or when it received the funding. It is also possible to infer that it is cross funded by its teaching programme. I could not find an annual report.
Research on Poverty Alleviation: 5 stars (just about): REPOA does not quite get a full 5-star rating because one has to go to its annual report to find the very detailed information about its funding. This should not be necessary given that REPOA has a basket fund that can be easily and clearly represented through a pie chart (as they do). It could be a page in the About us section. Let’s give it an honorary 5, OK?
Science, Technology and Innovation Policy Research Organization: 2 stars: STIPRO offers some information about who their funders are but it is not clear if this is the complete list, how much they offer and for what. It was not possible to find annual reports or any financial statements.
South Asia (average = 2.84)
Center for Study of Science, Technology and Policy: 2 to 3 stars: CSTEP provides an easy to find list of funders but no additional information. There is also no financial information in the annual report so it could have been a 2 star rating.
Centre for Budget and Governance Accountability: 3 to 2 stars: CBGA provides a list of funders but no detail. It does offer information about its finances in the annual report (though not up to date), and also has a list of travel by staff (interesting). It does, however, focus on budget transparency, so I feel it may deserve a 2 for lack of consistency.
Centre for Policy Dialogue: 3 stars: It was hard to find it but CPD has the opportunity to offer a lot more information. In its annual report it has a list of projects that include the name of the project and the researchers involved. In the names there is some indication about who funds them (but not for all) but no indication about the amount provided. This is a missed opportunity. It could be easily transformed into a 4.
Centre for Poverty Analysis: 3 stars: It takes a while to get to CEPA’s list of clients and partners. But they do offer information about the type of funding they get (although not by funder). Closer to 2 for not being right there to find, but the information available deserves a 3.
Centre for the Study of Developing Societies: 2 to 1 stars: CSDS provides information about who their funders are although it isn’t clear if the list is complete. They have an endowment, and request funding from the public, but do not offer easy-to-find information about the size of the endowment. No annual reports were found.
Indian Institute of Dalit Studies: 3 to 4 stars: The information is hard to find but it is there. It could be a clear 4 had they offered more detail on their About us section. And a 5 if the tables in their annual report also included the amount provided by each sponsor for each project. Not hard.
Institute for Social and Environmental Transition – Nepal: 3 to 2 stars: This was hard. ISET provides a link to its donors on the main menu but does not offer information about amounts or more detail about why they support it. I tried finding this in the annual report but the latest one was unavailable (broken link).
Institute of Economic Growth: 1 to 0 stars: IEG was hard to rate. There is a link to a donor receipt that shows IDRC funding but that was it. So, credit for that, but there isn’t anything else. It could be a 0 too.
Institute of Governance Studies: 2 to 1 stars: It appears that IGS gets funding from BRAC University but we know it also gets funding from the TTI. So it is not entirely transparent. There is no information about who funds its work. We can infer but cannot be sure. The annual report did not include financial statements.
Institute of Policy Studies of Sri Lanka: 3 stars: It took a while to find information about funding for IPS. Their research programmes have a section on partners that lists their funders. Still, it is not clear how much they each fund. But it is easier to infer who funds what.
Institute of Rural Management Anand: 2 or 3 stars: IRMA presents a list of members of the IRMA Society that includes a number of private sector bodies and that, I assume, have supported the centre. Its teaching work must be another important source of income. And we know that TTI is too. But this is not clear.
National Council of Applied Economic Research: 5 stars: NCAER provides an easy-to-find list of sponsors and partners but no detail beyond that. The annual report, up to date and easy to find, provides a wealth of information about who funds what and how much they fund. It provides details of where they invest their reserves, how much they spend on everything from stationery to salaries, etc. It is not all linked up as in the case of FARO, for instance, but there is so much more information that it deserves a 5 -and if it was better organised I think it could possibly make it to 5*.
Public Affairs Centre: 4 to 5 stars: PAC offers all the necessary information but slightly hidden in a financial statements section. It could present the same information in a more friendly manner in the main body of the website. Still. Close to a 5.
Social Policy and Development Centre: 2 stars: SPDC mentions that IDRC/TTI and the Royal Norwegian Embassy support it but it does not mention the amount or who else supports them (via contracts). I could not find an annual report.
Sustainable Development Policy Institute: 3 stars: SDPI has published a long list of partners (is it code for funders?) but offers no more detail. The annual report has an even larger list in the annexes. The research projects include information about which of these partners are involved -no information on amounts, though.
The Think Tank Initiative
And what about the TTI? Well, we know how much the total fund is, and who the funders are, but finding how much each funds is not that easy (out-of-date annual report). So, on balance, I would say 4 stars. If it updated this information and offered it on the main About us page it could be upgraded to a 5. There is also a case for publishing how much each think tank gets from the initiative and how much it spends on individual projects.
The graph below shows the Think Tank Initiative’s rating in perspective:
Among the TTI’s grantees (those on their website at the moment of writing -TTI is entering a second phase soon so this list might change):
- There are four think tanks with 5 stars (my rating): Fundación para el Avance de las Reformas y las Oportunidades (Grupo FARO), Research on Poverty Alleviation (REPOA), Centre for Policy Research (CPR), and the National Council of Applied Economic Research (NCAER). The Public Affairs Centre was close to a 5. That is 1 out of 12 (8.3%) think tanks in Latin America, 1 out of 20 (5%) in Africa, and 3 out of 16 (18.8%) in South Asia.
- The averages also show that the regions are not far from each other: 2.75 in Latin America, 2.24 in Africa, and 2.83 in South Asia -with South Asia in the lead (marginally).
- The medians too: 3 for Latin America, 2.5 for Africa, and 2.75 for South Asia.
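The regional summary figures above are simple means and medians over the per-region star ratings. The snippet below shows the calculation with made-up placeholder scores: the ratings in this post are often ranges (“2 to 3 stars”), so a real reproduction would first have to resolve each range to a single number, and these lists do not claim to do that.

```python
from statistics import mean, median

# Placeholder scores only -- NOT the actual ratings from this post,
# which are often ranges that would need to be resolved first.
ratings = {
    "Latin America": [1.5, 2, 3, 3, 3, 3.5, 5, 3, 1.5, 1.5, 2.5],
    "Africa": [2.5, 2, 0.5, 1, 3, 1, 2, 1, 2.5, 0, 3.5, 2, 3.5, 2.5, 5, 2],
    "South Asia": [2.5, 2.5, 3, 3, 1.5, 3.5, 2.5, 0.5, 1.5, 3, 2.5, 5, 4.5, 2, 3],
}

for region, scores in ratings.items():
    print(f"{region}: mean = {mean(scores):.2f}, median = {median(scores)}")
```

Nothing fancier than that is needed to check or redo the regional comparison once you have assigned your own scores.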
It is also significant that, when there is funding information, this is provided in four possible ways:
- There is a dedicated place on the website to present information about its funding: these cases are most likely to show more information and in greater detail. There is an effort and an intention to be transparent. This reflects a transparency policy.
- Information is half on the website and half in annual reports: these cases do not always provide all the detail that would be expected of a highly transparent think tank. The think tanks ‘leave it to the reader’ to put the information together.
- Information is only in the annual reports or financial statements: this is the least ‘transparent’. It doesn’t necessarily mean that the think tank is hiding it but that it is not making an effort to publish it. After all, who reads annual reports?
- Information is scattered throughout the website: this could be on project pages, in publications themselves, in events pages, etc. It would be, in theory, possible to put all the information together but it would be quite a task.
It seems that a good way forward is to provide funding information in different places. A good approach would be, in order of usefulness:
- Have a “Who funds us” page with a table like the one that CGD or Grupo FARO have.
- Mention the funder, the amount, and the type of ‘contract’ (e.g. grant, consultancy, or own funds) in each project or initiative page.
- Include financial information in your Annual Reports; NCAER’s detail is unmatched.
We also know that there are great variations per region. There is a think tank in Africa that scored 5 and another that scored 0. In South Asia and Latin America the extremes are also there. So we cannot talk of a ‘regional’ level of transparency. It might be more interesting to explore why some think tanks are more transparent than others. I bet we will find that there are a number of factors that explain this and that a key one will be the values of their directors and board members.
While there is some truth to the idea that poorly designed websites are also less likely to present funding information, this is not always the case. NCAER, for instance, has an out-of-date website but scored 5. There are also think tanks with pretty good websites that provide little financial information. I think this case is much better made by Transparify, since their sample of think tanks is broader and includes developed country think tanks.
As a final conclusion to this exercise it is worth saying that Transparify has opened a door to a practice that should be more common. It did not take too long to do this: one morning of surfing through websites (which is why I am not claiming this to be as robust an effort as Transparify’s -although I am happy to report that there is quite a lot of ‘coincidence’ in the ratings), an afternoon of writing the analysis, and another morning going over the ratings as a way of double-checking.
Donors should be able to do this; think tanks themselves should, too. The media should not find it hard to come up with their own national/domestic ratings. Basically, I think the message should be clear to all by now: it is not hard to judge a think tank’s transparency; if you (the think tank) do not do something about it soon, someone else will and may use it against you.
This should not be seen as a final judgement. Rather this is a baseline that may be used or transformed into a springboard to .. well, Transparify.