Report: what we've learned about evaluation
An independent review of the evaluation reports of 200 Heritage Grants projects completed between 2008 and 2013 found that the level of investment a project made in evaluation affected the quality of the finished report.
Reports were judged on the extent to which they provided a logical framework, robust evidence and data analysis, objectivity, clear presentation, and sufficient conclusions and recommendations.
Just over a third of reports, 37%, were graded as good or excellent, while 64% were judged to be adequate or poor.
The review, carried out by Ruth Flood Associates, found: "Many reports seemed to be written primarily as project history documents, rather than documents which sought to evaluate objectively and identify openly where things did not go well and could be improved in the future."
The good and excellent reports tended to be those where "the individuals tasked with evaluating projects are involved sufficiently early in the project to be able to collect data systematically throughout the process, or where the grantee put in place robust evaluation systems from the outset."
"Few evaluations reports were the product of robust systems."
Key characteristics
The best quality reports had three things in common:
- Expertise - external consultants or organisations tended to write better-quality reports.
- Funding for evaluation - reports tended to be of better quality where more expenditure had been allocated to evaluation from the outset.
- Report length - longer reports, which contained more explanation and more data, tended to be of better quality than shorter ones.
"Judging by the quality of the reports, organisations are generally aware of the need to count the activity that they have undertaken but their understanding of the wider rationale and value of undertaking more enhanced evaluation activities is very mixed."
Recommendations
The review includes recommendations to help HLF improve the quality of the evaluation reports submitted, such as providing:
- More detailed evaluation guidance, with each element of the guidance brought to life with specific examples.
- A mandated percentage of funding to be spent on evaluation activities.
- More support for grantees.
HLF’s head of evaluation, Kion Ahadi, said: "This report has provided us with valuable insight into how to better support projects we fund. We are increasing the amount we allow projects to allocate to evaluation and updating our guidance with information on what we expect from reports and the criteria we use to assess their quality. One of our main objectives is to raise the overall quality of project self-evaluations. This work has provided us with a baseline and we will be monitoring our progress against this baseline annually."
Find out more
- Explore the full Heritage Grants self-evaluation review
- And we’ll be sharing review excerpts and advice in the Online Community over the coming weeks