On the politics of evidence: seeking philosophical counselling
Sue Soal, CDRA
So we are an NGO that has made its name facilitating and promoting organisational learning as essential for NGO effectiveness.
What’s more, we say, it’s the very activity of organisation-wide/systemic learning that will create the internal coherence and rigour needed to continue doing good work and, simultaneously, to justify anything at all. Learning is the central nervous system. It helps NGOs to stand upright and know what’s happening in all parts of their world. It promotes vertical integration. It is the rich vein of organisational life that links strategy and practice. It’s a busy corridor. It is, really, essential to the work of social change organisations and the means by which we access both the data and the insight upon which further strategy is based, not to mention our claims to goodness and validity.
With this rock-solid conviction in place, we were well placed, organisationally, to segue into our current M&E-centric era, happy to suggest that organisational learning is the basis for any M&E system worth its salt.
So much so that – not quite understanding the term “M&E capacity development” – we sallied forth and created a programme for organisational capacity development that puts learning at the centre. And, determined to work with integrity and consistency, we built in our own M&E system to test the proposition that a learning-oriented M&E system can meet accountability requirements in a results-driven era while simultaneously serving the internal needs of continued organisational health. Ingeniously, we co-opted the language of a results-driven approach and pressed it into the service of learning and development.
Ironically and somewhat impishly, we suggested that the best way to pursue results might be to work responsively with the outcomes of one’s own previous efforts.
And then we started the programme.
So here we are: three practitioners – one leading process, one leading programme and one leading research – and a collective of nine impressive NGOs. They are all based in our home town, they all have a concern with supporting social change, and they all have some kind of vertical relationship between their grounded field-work and their more abstract and systemic aims. They are fabulous NGOs. Each has its own special interest in being on an M&E capacity development programme; each sees its potential to strengthen the organisation; and each has a particular need.
The programme will develop over 18 months; it will be punctuated by group gatherings, but much of the work will happen in-house in each organisation, supported by us (or one of us, depending on their need). We have great international connections to detailed and specialised evaluation expertise. We have our own deep expertise in organisational systems development and are devoted to ensuring that good ideas are translated into practice. There is an idea that collectively we may generate things that we can share more broadly: with other NGOs, with the international community. This programme truly is a thing of beauty.
So during the first get-together we sketch all the programme dimensions, offer feedback on the initial survey, and give guidance on where the best resources are. Collectively the group begins to chart its course and find its way. As a team, we grapple with how we will raise the question of our own research, which is the same thing as our own M&E agenda. Raise it too soon and too prominently, and it distracts from the real work. Too late and understated, and it seems shady.
Our research leader is keen for some uniformity of methods and consistency. A bit of baseline. She doesn’t much mind if it’s a survey or a discussion … but consistency would be nice. And the idea of a baseline suggests some urgency to get the study aspect of the programme onto our collective agenda.
I am, to be honest, not that much concerned with any of it. I am intrigued that things are changing and developing already, but not so much that I want to pause any of it … or even say it. Actually, I have a real fear that to say it might be to stop it. It’s all so mobile. I am not disinterested in impact (or results). In fact I am so concerned with achieving these that I want to preserve that potential from the interfering ‘eye’ and ‘word’ for as long as I possibly can. I retain a connected and keen interest in how each participant is doing and how the group as a whole is evolving. I carry the practitioner’s complete absorption in the process, annoying as it is for my colleagues who carry other mandates.
But I also carry a concern that we only have this beautiful sharing of experience and continual transparency of evolving experience because we have not objectified any of it. We invited these friends, fellow travellers, partners onto this programme because we thought it would offer something of value to them. Because we are responsive and participatory in how we intervene. Because we are simultaneously at one with these systems we serve, and also outside of them. Because we are keen to help make NGOs stronger and better able to do their work in the current climate.
It is true though that we also want their participation in this programme to be of value – and use – to us (and so by extension, to NGOs in general). We want it to be an opportunity for us to test our ideas. Perhaps even ‘prove’ ourselves right. And to do that we have to ask to study them, and ask them to study themselves, to shine the light of self-awareness on their own evolving processes. Not, any longer for pure learning – not for that insight to be fed back immediately, like compost, into the system from which it came – but to extract it for another purpose. For each participant’s evolving contributions to and experience of the programme to be made into data.
We decide to bring the dilemma to the group – this is, after all, exactly what we have come together to learn more about, and what better way to do it than for us all to subject our experience to scrutiny and, particularly, for us as convenors to put our own M&E needs and issues up for discussion? But still, in the session in which this is raised, the atmosphere in the room changes palpably. People become withdrawn, cautious. The tone shifts. The timbre hardens. Contributions shift from questions that enquire to questions wanting answers.
“OK, OK,” we say. Let’s work with this. Here’s an exercise: look at this dilemma; explore it from all its angles; come to some reasonable and pragmatic resolutions. Then – and here’s the clincher – look at your own experience in the field and identify a similar situation where you have been in the position we now find ourselves in. Look at it from that point of view. What are we learning here?
This question unlocks something. The tone softens. We laugh together at the insights it yields. For some it is a big wake-up: “We do exactly this in the field, but we don’t ask” and “We’re not so honest.” For others it’s confirmation of why the results-based agenda is best avoided, relegated to procedural compliance. Kept out of the special place in which the real work happens. For others still it’s an eye-opener to the dilemmas and difficulties of the current times. We are truly at the start of a journey.
And for our team it’s a relief to be told, “Well OK – you can make me your lab rat – study me, get me to study myself – as long as I get my cheese at the end.”
But still … when the group meeting is over and the team reconvenes, I find myself admitting my apprehension to my colleagues. I don’t want to pull participants into this condition of permanent monitoring. I don’t want to yoke their M&E system development to any measures of our success. I am happy to subject our programme to our own review. I am happy to account for what I can be held directly responsible for. I cannot make my problem their problem. I feel a big ‘no’ that is currently coming out as childish distraction from programme objectives. No. I want to protect the learning and relationships in this programme, and the organic development that it nurtures, from the eye of evidence and the onus of proof.
“So,” says our researcher, “you don’t think it’s possible, after all, for a learning-oriented M&E system to meet accountability requirements in a results-driven era?” I am stumped. My clever optimism so easily found wanting. So soon into the programme.
But we are also rather delighted that the dilemmas of the politics of evidence are so present in our programme. We resolve to work further with it all.
And I resolve to seek out philosophical counselling on the dilemmas of method and their implications for practice …