have a higher probability of being cited. The number of internet users in a subject country does not significantly affect citations. As with downloads, more complex reports for larger countries tend to be cited more frequently. Multi-project, multi-report, and multi-sector reports, as well as core diagnostic reports, are more likely to be cited. Reports on countries with larger populations also tend to be cited more frequently. In contrast to downloads, reports for upper-middle-income countries are not cited significantly more often.

Reports pushed by the OMBC received a higher number of citations. The 17 reports pushed by the OMBC were cited an average of 7 times, significantly more than the mean of 0.9 citations for reports not pushed by the OMBC (Table E). One of these reports was cited 51 times, potentially benefitting from a New York Times op-ed by the author. In contrast to downloads, the OMBC dummy remains significant even when controlling for costs.

Some policy reports may not be cited simply because they could not be located in Google Scholar. About 410 policy reports were neither cited nor located in Google Scholar. Verification through other search engines suggests that these reports have indeed not been cited. The results do not change qualitatively if we assume that reports not located in Google Scholar have zero citations.

Development objectives do not seem to matter for citations. As expected, we find no systematic evidence that reports with a development objective of informing the public debate receive a significantly higher or lower number of citations.

VI. Measuring Internal Knowledge Sharing

Measuring internal knowledge transfers is difficult.
The key issue is that the costs and, more importantly, the benefits of knowledge sharing among staff are difficult to assess, both because the inputs and outputs are not systematically monitored and reported, and because of the heterogeneity of dissemination methods, such as team-based support, sector-wide support, or individual training.

Two recent papers have tried to assess the demand for and value of research among the World Bank's operational staff. Ravallion (2011) finds that two-thirds of staff place high value on Bank research. It also shows, however, that approximately 23 percent of Bank staff place a low valuation on the relevance of Bank research for their work, and are uninformed about and unfamiliar with its knowledge products. According to IEG (2012), sector- and anchor-unit-based staff rely most often on policy reports from the anchor units within their own sector, and least often on reports from other units.30 There is little evidence on the contribution of cross support to policy reports. This is surprising, as some FCOs such as DEC, the HDN Anchor, and the PREM Anchor devote more than 8 percent of their staff time to cross support.

To provide knowledge efficiently to its external clients, any large international institution must build effective mechanisms for internal knowledge sharing. When the concept of the World Bank as a Knowledge Bank was articulated in 1996, networks were created and given the responsibility to address issues within their fields and to share knowledge with the regions (via sector management

30 Regarding substantial use, 28 percent of staff used Policy Reports from the anchor unit within their own sector, 19 percent used Policy Reports from sector units in other regions, 17 percent from sector units outside their sector, and 7 percent from DEC.
