
Interviewing in Realist Evaluation

Just a quick post with some reflections on Dr. Ana Manzano’s seminar last week (8 Feb 2017) at the University of Leeds on The craft of interviewing in realist evaluation. It’s a great benefit to have the Realism Leeds team an hour away, and they have a busy programme of further sessions planned for 2017 – keep track via @RealismLeeds on Twitter. It was a great talk and nice to meet Ana and members of the team!

Firstly, Ana shared that she is frequently asked how to describe a realist interview, and that the answer comes back to how you approach the interview in the first place.

Recognising the importance of context-mechanism-outcome (CMO) configurations to realist evaluation, preparing for interviews involves thinking through the CMOs of the programme or phenomenon in advance to get a broad idea of the lines of enquiry. However, it is important not to set these in stone.

One of the key characteristics of a realist interview is treating it as an open card game. Rather than the interviewer keeping prior knowledge close to their chest and acting with false naiveté (with the intention of avoiding data contamination), a realist interview is more of a shared journey between interviewer and interviewee, with knowledge laid out on the table in an effort to improve both parties’ understanding of the programme theory.

Importantly, this makes realist interviewing more of an iterative process where the lines of enquiry and CMO focus may be refined over the course of a programme of interviews.  This process of eliciting the programme theory involves theory gleaning, theory refining/testing and theory consolidation – for more detail, see links below.

Qualitative social research interviews already place power in the hands of the interviewer to shape the research outcomes, but the ongoing iterative analysis that characterises realist interviews appears to give the interviewer even more power. This perhaps makes it especially important to view the research or evaluation findings as a product shaped by the researcher(s) leading the interviews. It also suggests the value of building some form of tracking into the research design so that the iterative process of refining the lines of enquiry and questioning is acknowledged in the reporting – a possible sketch of such a log is given below.
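As one purely illustrative option (not something proposed in Manzano’s paper or the seminar), a simple structured log could record which CMO hypotheses each round of interviews explored and how the lines of enquiry changed as a result. The short Python sketch below shows one way this might look; all names, fields and example values are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class CMOHypothesis:
    """One candidate context-mechanism-outcome configuration."""
    context: str
    mechanism: str
    outcome: str

@dataclass
class InterviewRound:
    """A round of interviews: its phase, the CMO hypotheses explored,
    and notes on how the lines of enquiry were refined afterwards."""
    round_date: date
    phase: str  # e.g. "gleaning", "refining/testing", "consolidation"
    hypotheses: List[CMOHypothesis]
    refinements: List[str] = field(default_factory=list)

# Hypothetical usage: append one entry per round so the final report can
# show how the programme theory and interview questions evolved.
theory_log: List[InterviewRound] = [
    InterviewRound(
        round_date=date(2017, 2, 8),
        phase="gleaning",
        hypotheses=[CMOHypothesis(
            context="participants new to the programme",
            mechanism="peer support builds confidence",
            outcome="sustained engagement",
        )],
        refinements=["added a follow-up question on the role of mentors"],
    ),
]
```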

Manzano, A. (2016) The craft of interviewing in realist evaluation. Evaluation, 22(3), 342–360. http://journals.sagepub.com/doi/abs/10.1177/1356389016638615

Event summary: http://www.sociology.leeds.ac.uk/events/2017/dr-ana-manzano-the-craft-of-interviewing-in-realist-evaluation


Into 2017 on the Front Foot…

ABRE is looking forward to an exciting year of research and evaluation in 2017 – a year that will be shaped by the 2016 EU referendum, the changing role of evidence, innovative new research methods and opportunities for professionalisation in evaluation. Here are some reflections and useful links for the year ahead!

Firstly, of course, the UK’s decision to leave the EU is seismic and begins a long-term process of transforming uncertainty into opportunities for economic development and for other sectors. While UK Government funding commitments to 2020 are welcome, they also heighten the need for organisations to get their ducks in a row for a new operating environment in the near future. For example, funders’ increasing emphasis on demonstrating accountability and value for money already appears to be generating a culture change in the voluntary and community sector. The message for all seems to be: get ready for change.

The past year has also seen a public diminishment of facts and experts on both sides of the Atlantic, in politics at the macro level but also at much more micro levels as reported in the ABRE blog one week before the EU vote. The research and evaluation community has a lot to do to win confidence in a changing world. Sites such as Sense about Science, Full Fact, BBC R4 More or Less and The Guardian’s Reality Check provide useful starting blocks (for the UK at least). Similarly, from this year’s personal reading list, the paper on New Political Governance by Jill Anne Chouinard and Peter Milley provides a helpful primer from North America on some key considerations when a ‘take it or leave it’ approach to evidence exists.

More positively, 2016 also brought some exciting new research approaches and innovations. ABRE has recently been collecting video feedback from participants in a Big Lottery-funded youth employment programme, and development of the Assess-Evaluate-Develop Framework has progressed, with the website launch in July and continued refinements into 2017.

ABRE also continues to support steps to enhance the evaluation profession in the UK, which has involved Andrew’s participation in the UKES Voluntary Evaluator Peer Review (VEPR) pilot in March, attendance at the UKES national conference in May, and election to the UKES national Council in December.

Many thanks to colleagues, clients and friends in 2016; ABRE looks forward to continued work together throughout 2017. Finally, end-of-year thoughts are with the family, friends and colleagues of David Kay, who will be sadly missed but whose enthusiasm and kindness of character will be fondly remembered.

EU Referendum: Lessons for evidence-informed decision making?


At a pub quiz last week, the quizmaster announced there would be European themed questions to recognise the Euro 2016 tournament and upcoming EU referendum. Brilliant and topical… even though we still came second in the end and missed out on the prize of several free pints!

The quizmaster largely kept politics out of it, but he still made a few comments in favour of leaving the EU and, in particular, protecting “British jobs”. Being in favour of remaining in the EU myself, I had a chat with him at the end to find out why he favoured leaving; his overriding feeling was that “nothing much would change” even if we left, so why not go it alone.

The conversation was revealing in a couple of ways: 1) uncertainty and apathy in response to the walls of noise and numbers (both fact and fiction) being presented by individuals or groups who don’t usually have much to do with people’s everyday lives; and 2) a resulting reliance on gut instinct, which might ultimately lead to a decision to remain based on a sense of inertia or reduced risk, or to leave based on a sense that, if there is no difference, why not “back Britain” on its own. Even where evidence is presented by organisations that are generally respected as independent and credible, it appears to be getting lost in the overall fog of facts and fiction.

Working in the research profession, I know that evidence-informed decision making is difficult at the best of times, even for organisations managing small projects and policies. Organisations might choose not to use commissioned research findings because they don’t align with existing plans, because there are numerous other research findings to bear in mind, or because of other intangible strategic or political factors.

The conclusion is perhaps not to be surprised that some UK citizens are uncertain about what evidence to trust in the EU referendum and, further, that it is understandable if decision-makers in organisations rely on instinct or considered personal judgement when numerous pieces of evidence are presented to them. The extent to which this counts as evidence-informed decision making is debatable, but it reflects the reality facing research commissioners and, at the least, underlines the importance of researchers reporting study findings as clearly as possible.