Voices of War & Peace: Impact report and closing thoughts

We recently delivered our report for the Voices of War & Peace project. The intention was to look at some of the concrete dimensions, such as how the website is received, used and perceived, alongside some of the less immediately tangible concerns, such as the running of a project like this with various funders and project partners. We talked recently about that process, in which we made sure to include voices from all those involved, and it proved really useful in identifying the everyday dimensions that can help or hinder a project.

Without sharing the findings of the report itself, there are some things we can pass on:

  • In doing such project evaluations, it’s important to identify exactly what you will be looking for and exploring. While there may be unexpected outcomes, it helps to have at least two or three guiding areas of interrogation: for example, if the project set out to engage particular public communities, can we say they were reached? This acts as a contract between project and evaluator, and as the backbone to come back to when reporting, keeping that sense of focus.
  • It is possible to get to the bottom of concerns, issues or problems that might have seemed too intangible while the project was actually being delivered, or where there wasn’t time during delivery to stop, reflect and fix them. If some aspect of bureaucracy or administration was routinely holding up the project, the best that could be achieved at the time might have been a quick fix or stopgap. Post-project, with a report in hand, conversations can be had higher up in the institution about how these problems consistently affected this project, and potentially others, looking for systemic changes that ensure better project delivery in the future.
  • For this evaluation, part of our team was already involved in the project, which offered insight from the ground level: the perspective of being involved and then commenting on that experience. But one of the other evaluators hadn’t been involved at all, and that critical distance helps. In interviews, talking to ‘the stranger’ means participants have to describe the project and their involvement without leaving anything out, and the anonymity offered means they can reflect openly on where the strengths and problems were found. Where project hierarchies, or simply getting caught up in the flow, mean people don’t always feel they have the opportunity to comment at the time, the space offered at the end, in an evaluation process like this, can be very helpful.
  • Finally, reporting on projects like this should not simply present findings in isolation, but recommendations that can be acted upon. Yes, projects can be evaluated in terms of whether they offered value for money, or failed or succeeded in their aims, but in this case the partners are also looking ahead to further work in this area. So if further funding applications can draw on what has been learned from the process so far, then all the better. And if there are findings that reflect more generally on the practice of planning and running projects of this type (here, involving communities, artists and various funding bodies), and can help all parties learn from the experience, that is also more widely helpful in developing a healthy, robust approach to such projects within the various institutions.