Ever had the thought, after listening to a number of mildly interesting presentations and conclusions at a conference, colloquium or any other fancy gathering, that you’d heard it all before? That it all sounded strangely familiar? Put differently, have you ever left a “high-level” research event with the odd feeling that the conclusions could just as well have been written before actually doing the research, as similar research with similar methods based on similar theories had already been done more than a few times, only in a different setting perhaps? Didn’t we learn from the previous research, or did we simply (and conveniently?) forget about it, again? And so we go back to the same people who were the subjects of the study, and ask them more or less the same questions, again and again.
Nevertheless, when writing up the findings and conclusions from research involving a large, often vulnerable population (as a ‘target’ or ‘researched’ population), we usually claim to present an analysis and/or synthesis of what these people have said and explained to us. But do our findings and conclusions really mirror what they brought forward? How often do we actually go back to these people with our analysis and conclusions before presenting or publishing them at a conference or in a journal, and ask them what they think? How often do we ask them whether they recognise their viewpoints in our many ‘deliverables’? Are we so (over)confident in our methods that we are sure of an unbiased outcome that reflects the reality in the field? We could be in for a surprise.
Recently we tried to do just that (more or less) when presenting the preliminary results of the research conducted by the Health Inc consortium partners – research on socially inclusive health care financing in West Africa and India. At the 3rd Health Systems Research Symposium in Cape Town we invited a number of representatives from vulnerable population groups to comment on our findings and conclusions, all – that goes without saying – neatly summarized in PowerPoint slides with great figures and tables.
Well, we got some serious slaps in the face. It’s amazing how much we researchers can lose in the process of analysing and drawing “conclusions” without being aware of it, biased as we are by the research questions and topics we put forward ourselves – not always paying attention to new elements, or not recognising the different approach or perspective the people brought in themselves.
More often than not, the whole research process tends to cut off the sharpest corners and soften the sharpest conclusions. How bland recommendations can turn out to be, remaining vague on who should do what at which (decision) level! Well, you know the drill.
To put it in the words of an invited ‘expert by experience’ in poverty: we researchers love to turn things into a debate on whether the glass is half full or half empty, while in reality people in poverty don’t even have a glass!
As researchers and academics we can be quite smug when our work results in a so-called “wake-up call” for a policymaker or some other important ‘stakeholder’, but we researchers need a wake-up call too. And we’ll have to organise that ourselves, because (as we are considered to be the “experts”) nobody else is going to make us do it, out of fear of being blown away by an avalanche of misplaced highbrow, neutrality-claiming rhetoric. We need to set our own alarm clocks, and we can do that by bringing the people (we ‘researched’) back in. By asking them if we got it right, if we really captured what they intended to communicate to us. We can do that by bringing them to our ‘fancy gatherings in lavish buildings with posh rooms’ (as one activist put it in Cape Town), and making time and space for these kinds of – admittedly, often somewhat awkward – confrontations. Or we can ‘humble’ ourselves, go back to the field, and have the courtesy to discuss our findings in the people’s own environment, where it actually really matters. Where they are more at ease, and where researchers are confronted again with the often harsh realities. No matter how one goes about it, I’m convinced it would be a revelation to bring the people back in when we “finish” our research. Let’s see whether our scientific ‘nuance’ stands the heat.
Isn’t that the essence of “going public” after all? Going back to the public, the people. Not just presenting the lot to the ‘in-crowd’ or a number of politicians, policymakers and/or other important “stakeholders” who can pretend they don’t feel addressed by the research or claim to use a near-neutral lens.
No, let’s bring our research to the only stakeholders who really matter. The ones for whom our research should actually matter the most.