
UK Evaluation Society Conference 2017 – Initial Reflections

This is a short recap of the 2017 UKES Conference which was held on 10-11 May and themed around the use and usability of evaluation.

In the opening keynote, Michael Anderson from the Center for Global Development highlighted that publication bias in journals towards positive results is being replicated as publicity bias in the media, and that slogans are effective where they have intuitive appeal, even if they are factually disputed (e.g. Michael Howard’s “Prison Works” from 1993). This presents evaluators with a conundrum about using slogans to present findings: on one hand, soundbite headlines are effective communication devices (so ‘if you can’t beat them, join them’); on the other, evaluators have a role in providing nuanced judgments.

Lorraine Dearden provided the second keynote on the same day she was actively defending her evaluation evidence in the media (clarifying the effectiveness of free school meals and breakfast clubs in light of Labour’s proposals). This was consistent with Lorraine’s position that impact evaluations need to be wholly rigorous to add significant value… while also recognising that process evaluations have an important role in making sense of the numbers.

Martin Reynolds ran a thought-provoking session on professionalism in evaluation, showing how systems thinking can help interrogate the ways in which evaluators balance facts and values using boundary judgments, and arguing that a transparent, civic-led model of professionalism is currently missing in evaluation.

Raman Srivastava provided an insightful update from Canada on the new Policy on Results combining the previous Policy on Evaluation with federal resource management functions, shifting the use of evaluation findings to the point when Ministerial decisions are made about programmes and budgets. In contrast to some conference conversations which viewed politics as a threat to effective evaluation use, Raman helpfully reminded attendees that “good policy can be good politics.”

Sue Holloway was the third keynote and identified three challenges to greater evaluation use in the voluntary and community sector: motivation; capacity (to produce AND use evaluations); and money. In response to these factors, Sue highlighted that evaluation needs to clearly demonstrate its value, and that includes having a clearer idea of what quality standards apply among evaluation practitioners.

At the AGM, the UKES Council presented the 2017-2020 Business Plan and invited evaluation practitioners to become more involved with the Society, including taking part in the Society’s working groups and keeping in touch via LinkedIn and Twitter.

Three of the What Works Centres gave a timely progress update, along with an insistence that RCTs aren’t the only studies they consider; rather, as Beth Shaw from NICE noted, “the strength of their recommendations reflects the strength of evidence they have available,” hence well-qualified impact studies are given more prominence. It was also excellent to hear Will Finn’s policing examples involving PC Positive and Sgt Cynical, and his comparison between a successful evaluation and his mother-in-law’s curry!

All of the other sessions that I attended were enjoyable, such as the discussions on engaging beneficiaries in evaluation co-production, managing evaluation steering groups, and protecting evaluators and their role from internal politics in large organisations. There was also an important update from UKES Council members on the Voluntary Evaluator Peer Review (VEPR), with significant interest from international societies in how the Society is using this to support professional development among UKES members. A full list of the sessions I attended is presented below.

Thanks to all who organised and participated in this year’s conference!

  1. Giving Voice to Evaluations in an Era of Slogan and Snapchat, Michael Anderson, Visiting Fellow, Center for Global Development
  2. Beyond burden: Engaging beneficiaries as equal partners in evaluation, Bethan Peach and Tarran Macmillan, OPM Group
  3. Balancing and managing different perspectives in evaluation steering groups, Joe Duggett, SQW Ltd
  4. Good and Not So Good Practice in Quantitative Impact Evaluation, Professor Lorraine Dearden, Professor of Economics and Social Studies, University College London and Research Fellow, Institute for Fiscal Studies
  5. The challenges of developing a system to develop evaluators’ capabilities through a voluntary peer review grounded in reflective practice, Derek Poate, Dr Dione Hills and Professor Helen Simons, UKES VEPR Sub-group
  6. Evaluation as public work: An ethos for professional evaluation praxis, Dr Martin Reynolds, The Open University
  7. Canada’s new Policy on Results: How the Department of National Defence is evolving its evaluation function scheme, Dr Raman Srivastava, Federal Department of National Defence, Canada
  8. Challenges of achieving both accountability and learning: Case study of the evaluation of a payment by results programme, Alex Hurrell, Oxford Policy Management
  9. The politics of utilisation focused evaluation, Lydia Richardson, IPE Triple Line and Alison Napier, INTRAC
  10. The Evidence Journey: A View from the Other Side, Sue Holloway, Chief Executive, Project Oracle
  11. The work of HMG’s What Works Centres, Sara MacLennan, What Works Centre for Wellbeing, Beth Shaw, National Institute for Health and Care Excellence (NICE), Will Finn and Abigail McNeill, College of Policing
  12. Monitoring, evaluating and learning at multiple levels: 25 years of the Darwin Initiative, Dr Simon Mercer, LTS International
  13. Deepening our understanding of challenge fund evaluation, Clarissa Poulson and Lydia Richardson, IPE Triple Line

Interviewing in Realist Evaluation

Just a quick post with some reflections on Dr. Ana Manzano’s seminar last week (8 Feb 2017) at the University of Leeds on The craft of interviewing in realist evaluation.  It’s a great benefit to have the Realism Leeds team an hour away, and they have a busy programme of other sessions planned for 2017 – keep track via @RealismLeeds on Twitter. It was a great talk and nice to meet Ana and members of the team!

Firstly, Ana shared that she is frequently asked how to describe a realist interview, and that the answer comes back to how you approach the interview in the first place.

Given the importance of context–mechanism–outcome (CMO) configurations to realist evaluation, preparing for interviews involves thinking through the CMOs of the programme or phenomenon in advance to get a broad idea of the lines of enquiry. However, it is important not to set them in stone.

One of the key characteristics of a realist interview is treating it as an open card game. Rather than the interviewer keeping prior knowledge close to their chest and acting with false naiveté (with the intention of avoiding data contamination), a realist interview is more of a shared journey between interviewer and interviewee, with knowledge laid out on the table in an effort to improve both parties’ understanding of the programme theory.

Importantly, this makes realist interviewing more of an iterative process where the lines of enquiry and CMO focus may be refined over the course of a programme of interviews.  This process of eliciting the programme theory involves theory gleaning, theory refining/testing and theory consolidation – for more detail, see links below.

Qualitative social research interviews already place power in the hands of the interviewer to shape the research outcomes, and the ongoing iterative analysis that characterises realist interviews appears to give even more power to the interviewer.  This perhaps makes it especially important to view the research / evaluation findings as a product that is, in part, particular to the researcher(s) leading the interviews. It also implies that it is important to build some form of tracking into the research design so that the iterative process of refining the lines of enquiry / questioning is acknowledged in the reporting.
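As a minimal sketch of what such tracking might look like in practice, the Python structure below records how each interview refined the working CMO configurations; the fields and the example entry are my own invention, not anything prescribed by Manzano or the realist literature.

# Hypothetical audit trail for iterative realist interviewing.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class InterviewRecord:
    interview_id: str
    when: date
    stage: str                                # "gleaning", "refining" or "consolidation"
    cmo_changes: list[str] = field(default_factory=list)

audit_trail: list[InterviewRecord] = []
audit_trail.append(InterviewRecord(
    interview_id="INT-01",
    when=date(2017, 2, 8),
    stage="gleaning",
    cmo_changes=["Added candidate mechanism: peer support sustains engagement"],
))

# Reporting can then show how the lines of enquiry evolved across interviews:
for record in audit_trail:
    print(record.when, record.stage, "-", "; ".join(record.cmo_changes))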

Manzano, A. (2016) The craft of interviewing in realist evaluation, Evaluation 22: 342-360. Sage. http://journals.sagepub.com/doi/abs/10.1177/1356389016638615

Event summary: http://www.sociology.leeds.ac.uk/events/2017/dr-ana-manzano-the-craft-of-interviewing-in-realist-evaluation

Into 2017 on the Front Foot…

ABRE is looking forward to an exciting year of research and evaluation in 2017 – a year which will be shaped by the 2016 EU referendum, the changing role of evidence, new innovative research methods and opportunities for professionalisation in evaluation. Here are some reflections and useful links for the year ahead!

Firstly of course, the UK’s decision to leave the EU is seismic and begins a long-term process of transforming uncertainty into opportunities for economic development and other sectors. While UK Government funding commitments to 2020 are welcome, they also heighten the need for organisations to get their ducks in a row for a new operating environment in the near future. For example, funders’ increasing emphasis on demonstrating accountability and value for money already appears to be generating a culture change in the voluntary and community sector. The message for all seems to be: get ready for change.

The past year has also seen a public diminishment of facts and experts on both sides of the Atlantic, in politics at the macro level but also at much more micro levels as reported in the ABRE blog one week before the EU vote. The research and evaluation community has a lot to do to win confidence in a changing world. Sites such as Sense about Science, Full Fact, BBC R4 More or Less and The Guardian’s Reality Check provide useful starting blocks (for the UK at least). Similarly, from this year’s personal reading list, the paper on New Political Governance by Jill Anne Chouinard and Peter Milley provides a helpful primer from North America on some key considerations when a ‘take it or leave it’ approach to evidence exists.

More positively, 2016 has allowed for some exciting new research approaches and innovations. ABRE has recently been collecting video feedback from participants in a Big Lottery funded youth employment programme, and development of the Assess-Evaluate-Develop Framework has progressed with the website launch in July and continued refinements into 2017.

ABRE also continues to support steps to enhance the evaluation profession in the UK, which has involved Andrew’s participation in the UKES Voluntary Evaluator Peer Review (VEPR) pilot in March, attendance at the UKES national conference in May, and election to the UKES national Council in December.

Many thanks to colleagues, clients and friends in 2016, and ABRE looks forward to continued work together throughout 2017. Finally, end-of-year thoughts are with the family, friends and colleagues of David Kay, who will be sadly missed but whose enthusiasm and kindness of character will be fondly remembered!

New Resource for Research and Planning Studies

[Diagram: the assess–evaluate–develop reflective cycle]

For individuals and organisations involved in managing projects, programs or policies, it can sometimes be unclear how and why to use evaluative research and planning. The idea of using research evidence to inform practice might be viewed positively, but understanding what this involves might be less certain.

At the same time, research practitioners are regularly challenged to design and undertake new research and planning assignments without necessarily having access to consistent up-to-date guidance that can be applied across a wide range of contexts.

To help address these issues, assessandevaluate.com has been launched to support the design and development of research and planning studies. It does this by providing guidance through a structured Research and Planning Framework, and by explaining some of the main concepts and approaches that are useful for practical research and planning. This includes distinguishing between exercises involving assessment (what happens and why), evaluation (understanding the implications) and development (planning what happens next) which have a cyclical relationship, as shown in the diagram.

The website is available as a resource for project managers, policymakers, researchers or others with a professional interest in the subject. It is grounded in a global review of research and planning guidance, combined with practical experience. It is also expected to evolve over time so feedback is welcomed.

To share comments and ideas, please contact me at andrewberry@assessandevaluate.com

EU Referendum: Lessons for evidence-informed decision making?


At a pub quiz last week, the quizmaster announced there would be European themed questions to recognise the Euro 2016 tournament and upcoming EU referendum. Brilliant and topical… even though we still came second in the end and missed out on the prize of several free pints!

The quizmaster largely kept politics out of it but still made a few comments in favour of leaving the EU and protecting “British jobs” in particular. Being in favour of remaining in the EU, I had a chat at the end to find out why he favoured leaving and his overriding feeling was that “nothing much would change” even if we left, so why not go it alone.

The conversation was revealing in a couple of ways: 1) uncertainty and apathy in response to the walls of noise and numbers (both fact and fiction) being presented by individuals or groups who don’t usually have much to do with people’s everyday lives; and 2) a resulting reliance on gut instinct, which might ultimately produce a decision to remain based on a sense of inertia or reduced risk, or to leave based on a sense that, if there’s no difference, why not “back Britain” on its own. Even where evidence is presented by organisations generally respected as independent and credible, it appears to be getting lost in the overall fog of facts and fiction.

From a research profession perspective, evidence-informed decision making is difficult at the best of times, even for organisations managing small projects and policies. Organisations might choose not to use commissioned research findings because they don’t align with other existing plans, because there are numerous other examples of research findings to bear in mind, or due to other intangible strategic or political factors.

The conclusion is perhaps not to be surprised that some UK citizens are uncertain what evidence to trust in the EU referendum and, further, that it is understandable if decision-makers in organisations rely on instinct or a considered personal judgment where numerous examples of evidence are being presented to them. The extent to which this counts as evidence-informed decision making is uncertain, but it reflects the reality facing research commissioners and, at the least, illustrates the importance of researchers reporting study findings as clearly as possible.

UK Evaluation Society Conference 2016 – Initial Reflections

Two days (27-28 April), 16 learning sessions, lots of networking!  Lots of international development content which took a bit of navigating for evaluators working in a UK/EU context (while still picking up some very helpful info from globally-based colleagues!). Some initial reflections below, along with a full list of sessions attended.  I’d expect that all of the presenters would be happy to field any requests for further information, although I’d also certainly be happy for any follow-up discussions, particularly on the Voluntary Evaluator Peer Review pilot/process which I’ve participated in and would highly recommend.

Choosing methods.  There was a sense of some RCT pushback among attendees and presenters, not least from the newly launched CECAN, which will be looking into alternative approaches using realist approaches, simulated RCTs / policy modelling, participatory expertise and ongoing monitoring / iteration.  Later in the first day, I became slightly worried that evaluators might no longer be needed for choosing appropriate methods… well, maybe not that worried yet, but the resource being developed by Dr. Barbara Befani and Michael O’Donnell was a really interesting development: it not only sets out a list of available methods, but also draws on expert feedback to advise on which may be most suitable for different situations.

Simple, complex and complicated.  CECAN Director Nigel Gilbert introduced the challenges of upward and downward causation, and the example of steering a wheelbarrow to illustrate a wobbly evaluation pathway… and I learned that ‘complex’ is derived from the Latin to ‘intertwine’ (…I’ll probably be using that as a random fact in conversations). Prof. Picciotto also ran a very lively session trying to convince us that the frequently used ‘simple’ example of following a recipe (as opposed to sending a rocket to the moon or raising a child – Glouberman & Zimmerman, 2002) is not even simple, especially when celebrity chefs get involved!

Demonstrating policy influence and intangible added value. This applied to many of the discussions on how to measure strategic influence in an international development context, and session 9 was a great introduction to process tracing and Bayesian confidence updating (using hoop tests, smoking gun tests, doubly-decisive tests and straw-in-the-wind tests) for assessing confidence in causal attributions.  In an EC context, a similar challenge is being tackled to demonstrate the value of European-level approaches for certain issues and to address calls for subsidiarity (session 2). Carol Candler shared experiences of conducting strategic consultations in Singapore, where the emphasis on saving ‘face’ meant that critical assessments needed sensitivity and delicate handling, but also that attempts by strategic leaders to delegate their interviews should be resisted.
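To make the Bayesian confidence updating idea concrete, here is a small illustrative sketch in Python. The probability values attached to each test type are invented purely for illustration – they are not from the session, and real applications would justify them case by case.

# Bayesian updating of confidence in a causal claim H after a piece of
# evidence E passes one of the four classic process-tracing tests.
def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E) via Bayes' rule."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# Illustrative (P(E|H), P(E|not H)) pairs for each test type:
TESTS = {
    "straw_in_the_wind": (0.60, 0.40),  # weakly probative either way
    "hoop":              (0.95, 0.50),  # failing it strongly undermines H
    "smoking_gun":       (0.30, 0.02),  # passing it strongly supports H
    "doubly_decisive":   (0.95, 0.02),  # decisive in both directions
}

prior = 0.5  # agnostic starting confidence in the causal claim
for name, (p_h, p_not_h) in TESTS.items():
    print(f"{name:>17}: passing moves P(H) from {prior} to {update(prior, p_h, p_not_h):.2f}")

Under these assumed values, passing a straw-in-the-wind test barely moves confidence (0.50 to 0.60), while passing a doubly-decisive test moves it to around 0.98, which matches the intuition behind the four test names.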

Engaging audiences. From Dr Beatriz Garcia’s presentation on cultural / economic impacts of the Liverpool European Capital of Culture 2008, I liked how a pedagogic approach made it simpler to map/present investments allocated to the event itself, wider city regeneration, and meeting wider European objectives; and secondly, how using regular, glossy updates on impact during the event(s) helped promote engagement and strategic buy-in. Claire Hutchings made the good point that we should be moving towards evidence-informed policy, not evidence-based policy.

History of evaluation.  Finally, Bradford Rohmer gave an enjoyable presentation on the history of evaluation in the EC, from early written evaluation guidance in the late 1990s which was restricted to mid-term and ex-post; to the inclusion of ex-ante from the 2000s; and the emphasis on DG standardisation and evaluation working documents in last year’s Better Regulation Package.

Sessions attended below:

  1. Evaluating complexity, Professor Nigel Gilbert, Director, Centre for the Evaluation of Complexity Across the Energy-Environment-Food Nexus (CECAN)
  2. What is EU-added value and how can it be measured? Andrew Hetherington, Coffey
  3. Evaluating complexity every day: Practical approaches to evaluating complexity in European funding, Laura Hayward, ICF International
  4. Evaluating the culture of major events: The long-term view, Dr Beatriz Garcia, Head of Research, Institute of Cultural Capital, University of Liverpool
  5. Unpacking methodological appropriateness for Impact Evaluation: Presentation of an online tool for selecting appropriate methods, Dr Barbara Befani, University of East Anglia; Michael O’Donnell, Bond
  6. Understanding what works: Do we know how to mix methods? Professor Bob Picciotto, King’s College London
  7. Making the infinite countable? Responding to the challenges when evaluating innovation policy, Jonathan Cook, SQW
  8. youngballymun’s performance story report: A rigorous and pragmatic evaluation of a complex community change initiative, Dr Gemma Cox, youngballymun
  9. From assessing impact to assessing confidence about impact: Harnessing the potential of Process Tracing and Bayesian confidence updating to evaluate policy influence in complex and uncertain settings, Dr Barbara Befani, University of East Anglia; Gavin Stedman-Bryce, Pamoja UK Ltd; Stefano D’Errico, IIED; Francesca Booker, IIED and Centre for International Forestry Research
  10. Unpacking and optimising mixed methods evaluation: Insights from the Carers’ Employment Pilot Evaluation, Dr Annette Cox, IES
  11. Halfway house: The confused past and uncertain future of evaluation in the European Commission, Bradford Rohmer, Coffey International
  12. UK Evaluation Society AGM
  13. Recognising messiness and embracing real world complexities: Evaluation and the SDGs, Claire Hutchings, Head of Programme Quality, Oxfam
  14. Voluntary Evaluator Peer Review update, Derek Poate, Chair, UKES VEPR Sub-group
  15. Learning from our successes: A positive approach to assessing Public Value, Carol Candler, Voluntary and Philanthropy Sector Development Advisor; Helen Highley, Brightpurpose
  16. Joining the dots…between services, evaluators and funders: The Project Oracle journey, Professor Georgie Parry Crooke, London Metropolitan University

This year’s full programme is available at: www.profbriefings.co.uk/ukes2016/

Thanks to all the organisers and presenters at this year’s event.

Making a Difference in 2015


2015 was the UN Year of Evaluation and you may have seen that EvalPartners sponsored the Evaluations that Make a Difference initiative which provides a global viewpoint on how research and evidence can really help programs to change people’s lives.

From ABRE’s perspective, it was very pleasing to work on several program evaluations during the course of the year covering activities as diverse as climate change mitigation, hi-tech sector support, enterprise promotion, community and voluntary services, and skills enhancement and training.  Although some of these examples might be of smaller scale or scope than the EvalPartners examples, the same premise remains that well researched and grounded evidence can make a big impact on how organisations or groups move forward, whether this is progressing a pan-European approach to climate change innovation or helping charitable organisations to improve their success rate when bidding for funding (with Kada Research).

It was also insightful to share thoughts with evaluation colleagues at the UKES conference in May and York RCTs conference in September, including presenting on simplicity in evaluation… which of course is fairly complex!

ABRE’s work in 2015 included research support for multiple economic strategy assignments in the south east of England (Kada), as well as assisting with rapid evidence assessments and bid writing elsewhere. It was also the year of moving into a new office in the centre of Chesterfield. Into 2016, work is already underway on the exciting evaluation of a sector innovation program in the West Midlands.

Best wishes and a happy new year to colleagues and clients, and ABRE looks forward to working together throughout the year!

Keeping Evaluations Simple (Without Being Stupid)

Although the opportunity exists to undertake highly technical evaluations, several established maxims recommend keeping things simple wherever possible to improve effectiveness.  This includes the well-known acronym K.I.S.S., or ‘Keep it Simple Stupid’, usually traced to the 1960s US Navy and to Lockheed engineer Kelly Johnson, who emphasized that the benefits of more advanced aircraft depend on whether they are repairable by an average mechanic operating in combat conditions.  Similar viewpoints are attributed to figures such as Albert Einstein, “everything should be made as simple as possible, but no simpler,” and Leonardo da Vinci, “simplicity is the ultimate sophistication.”

The technical depth and breadth of methodologies available to evaluators and the complexity of issues being investigated can make it challenging for evaluators to pick the right research approach and communicate their work effectively to participants and end-users.  There are at least five aspects that can determine the level of complexity for a particular study:

  • Area of study. For example, Zimmerman (2001) differentiates simple issues (where there is agreement and certainty) from complicated (where there is either agreement or certainty) and complex (where there is neither) – a toy encoding follows this list.
  • Evaluation design and methods. Many standalone and mixed methods are available to investigate and triangulate towards reliable evidence.
  • Research processes, planning and project management. Different approaches are possible for managing the research process.
  • Presenting and communicating. Reporting of research findings (and the methods that produced them) is arguably as important as the quality of the research itself.
  • Overall understanding of evaluation. Evolving and overlapping roles of evaluators can lead to misunderstandings among researchers and research audiences alike.
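As a toy illustration of the first bullet, Zimmerman’s agreement/certainty distinction can be encoded as a simple rule; the recipe, rocket and child examples are Glouberman & Zimmerman’s, and the encoding itself is just a sketch.

# Toy encoding of the agreement/certainty matrix from Zimmerman (2001).
def classify(agreement: bool, certainty: bool) -> str:
    """Classify an area of study as simple, complicated or complex."""
    if agreement and certainty:
        return "simple"        # e.g. following a recipe
    if agreement or certainty:
        return "complicated"   # e.g. sending a rocket to the moon
    return "complex"           # e.g. raising a child

print(classify(agreement=True, certainty=True))    # simple
print(classify(agreement=False, certainty=True))   # complicated
print(classify(agreement=False, certainty=False))  # complex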

‘Simple’ is a relative concept when it comes to maintaining one of these. Photo credit: CnOPhoto / Shutterstock.com

Example 1: Evaluation design and methods

Sophisticated tools and techniques may make a compelling case for improving the quality and rigour of evaluations.  Yet at the same time, there is an argument that they can sometimes be akin to fitting jet engines for mechanics to repair with just a spanner.  Several factors that can influence simple evaluation design are highlighted below.

  • End-user needs. Deciding whether the evaluation will primarily be driven by a research theory perspective to enhance subject-matter knowledge or by a utility approach that emphasizes how the findings could be used by practitioners and/or research participants. In short, for whom is the study intended?
  • Focus of the research question. The choice of evaluation design may be between a method that is less precise but directly addresses the research question and one that is more rigorous but only offers a proxy result.  This will depend on the purpose and aims of the evaluation.
  • Research scope and technical detail. Paring back the research scope and/or structure can be helpful, particularly if there is openness for results that are less specific and more generalisable. For example, evaluation rubrics advocated by Davidson (2014) focus on using high-level evaluation questions and synthesizing responses into a consistent rubric format (a toy rubric is sketched after this list).
  • Understanding causation. Where applicable, a program theory approach helps to open the grey/black box of causation.  Further, the UK Magenta Book (2011) notes that “where the logic model is particularly complex, restricting the scope of the evaluation to consider shorter, simpler links in the logic chain can increase the ability of process evaluations to provide good evaluation evidence.” 
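As a hypothetical illustration of the rubric idea above, responses to a high-level evaluation question might be synthesised against a shared set of descriptors along the following lines; the labels, descriptors and median rule are invented for this sketch rather than taken from Davidson (2014).

# A toy evaluation rubric and a simple synthesis rule.
RUBRIC = {
    "excellent": "Clear evidence of strong outcomes for nearly all participants",
    "good":      "Solid outcomes for most participants, with minor gaps",
    "adequate":  "Mixed outcomes; meets minimum expectations",
    "poor":      "Little or no evidence of intended outcomes",
}

def synthesise(evidence_ratings: list[str]) -> str:
    """Roll individual evidence judgements up into one overall rubric level
    using the median rating (an assumed rule, for illustration only)."""
    order = ["poor", "adequate", "good", "excellent"]
    ranked = sorted(evidence_ratings, key=order.index)
    return ranked[len(ranked) // 2]

print(synthesise(["good", "excellent", "adequate"]))  # good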

Sometimes however, the benefits of increased complexity outweigh the benefits of keeping things simple.  The UK Magenta Book (2011) lists a series of threats to validity and generalisability, and a method that neglects them could be described as being ‘too simple’:

  • Internal and external validity – ensuring that research results are interpreted accurately and study sample results are representative.
  • Strategic context – making sure that important contextual issues are accounted for.
  • Additionality – identifying substitution, displacement and other elements of net impact.
  • Unintended consequences – capturing spillover effects, both positive and negative.

Example 2: Simple reporting and communicating

Policymakers and other end-users of evaluations have different levels of tacit knowledge and limited time, so the evaluator’s challenge is to provide technically accurate reporting that the target audience(s) can understand simply and quickly.  This applies both to outlining the evaluation method and to translating research findings so that end-users understand what they mean for policy and practice. Examples of challenges and solutions related to simple reporting include the following.

  • System modelling. As noted by the W.K. Kellogg Foundation (2004), outcome mapping and logic models provide “a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan to do, and the changes or results you hope to achieve.”
  • Visual presentation. Increasing use of infographics, illustrations and video that can be easily disseminated, including through social media.
  • Keeping messages brief and punchy. This applies to both verbal and written means of communication.  Vaughan & Buss (1998) advocate reports that “communicate reasoning as well as bottom lines; use numbers sparingly in the summary reports; [and] elucidate, don’t advocate,” while  Oliver et al. (2014) note that “academics used to giving 45-minute seminars do not always understand that a hard-pressed policymaker would prefer a 20-second phone call.”
  • Hierarchy of detail. A verbal summary of the main research messages is the tip of the iceberg, but this is typically backed up by greater levels of detail provided in presentation slides, factsheets, executive summaries, full reports and supporting technical annexes. For example, the widely used 1:3:25 model (one page of bullet-point main messages, a three-page executive summary and a 25-page report).
  • Promoting engagement and involvement. An inclusive, two-way dialogue between researchers and end-users can help to simplify communication, with Funnell and Rogers (2011) highlighting that “those who have contributed to developing a theory of change often feel much more connected to it.”

It’s important to know whether a 45-minute seminar or a 20-second phone call would work best. Photo credit: iofoto / Shutterstock.com

Happy mechanics and evaluators

There are several aspects that determine the complexity of an evaluation, yet maxims like K.I.S.S. serve as a reminder for evaluators to keep things simple where possible in order to improve effectiveness.

Consideration of whether an evaluation method or design is simple enough will typically come down to deciding what works best for whom. Similarly, the most suitable means of reporting will depend on the requirements, evaluation culture and capacity of the study audience.  In this regard, the background story to K.I.S.S. offers an interesting parallel between on-the-ground engineers fixing aircraft and in-the-field evaluators seeking to keep research simple, relevant and accessible.

This article is based on a presentation given to the UKES conference in May 2015. My thanks to the session participants for their valuable Q&A feedback.

References:

Davidson, J. (2014) The Rubric Revolution: Practical Tools for All Evaluators, UKES London/South East Regional Evaluation Network Event 26th September 2014

Funnell, S. C. & Rogers, P. J. (2011) Purposeful Program Theory: Effective Use of Theories of Change and Logic Models, San Francisco, Jossey-Bass/Wiley

HM Treasury (2011) The Magenta Book: Guidance for Evaluation, TSO, London https://www.gov.uk/government/publications/the-magenta-book (accessed October 2015)

Oliver, K., Innvær, S., Lorenc, T., Woodman, J. & Thomas, J. (2014) Negative stereotypes about the policymaking process hinder productive action toward evidence-based policy, LSE website http://blogs.lse.ac.uk/impactofsocialsciences/2014/06/02/how-to-get-policymakers-to-use-more-evidence/ (accessed October 2015)

Pawson, R. & Tilley, N. (1997) Realistic Evaluation, London, Sage

Vaughan, R. J. & Buss, T. F. (1998) Communicating social science research to policymakers, Thousand Oaks, CA, Sage

W.K. Kellogg Foundation (2004) Logic Model Development Guide, Battle Creek, MI https://www.wkkf.org/resource-directory/resource/2006/02/wk-kellogg-foundation-logic-model-development-guide (accessed October 2015)

Zimmerman, B. (2001) Ralph Stacey’s Agreement & Certainty Matrix, Schulich School of Business, York University, Toronto, Canada http://betterevaluation.org/resources/guide/ralph_staceys_agreement_and_certainty_matrix (accessed October 2015)

Remember, remember the 6th of November, a memorable day for the North?

Originally posted 6th November 2014


Deputy PM Nick Clegg addresses the Northern Futures summit in Leeds, 6th November 2014

The Northern Futures summit was hosted by the Centre for Cities today and was an opportunity for civic leaders, businesses, residents and other stakeholders to discuss ideas on how to create an economic core in the North of England that’s globally competitive.

The day involved the pitching of nine ideas contributed to the Northern Futures team over the past four months, along with keynote speeches from Jim O’Neill, Chair of the City Growth Commission, and Prof. Ed Glaeser of Harvard University.

Five quick points from the day:

  1. Enhancing skills was probably the most cited and widely agreed on theme of the day, both at the level of schools (importance of teaching quality) and employers (increasing demand for skills uplift). There was also a recognition of entrepreneurial skills as key drivers of growth.
  2. Transport improvements, largely in terms of road/rail infrastructure and services, received popular support among participants, but there were questions raised whether they could be treated as fundamental to growth. Local connections within city-regions and inter-city connections across the North were both viewed as high priorities.
  3. There was no clear consensus on the importance of decentralisation to economic growth in the North, although there was support from councils for greater autonomy in an environment of budgetary pressures. There was recognition that the Greater Manchester agreement represents the thin end of the wedge.  There was a sense that a lot of the ideas are not especially new (i.e. references to The Northern Way), but the opportunity for cities to take greater control of their economic growth was seen as a new development.
  4. The pitched ideas have implications across a range of scales: from local empowerment, to selling the North as the economic alternative to London at the national level, to calls for cities to work together to be relevant at the global level. There was also debate on whether the greater emphasis on cities leading growth would be at the expense of rural areas and smaller towns.
  5. This is just the beginning. The day was a success in terms of bringing together many civic leaders and policy professionals, but there was also recognition that the audience did not represent the diversity of the population at large, and that change would be gradual.  The call from two local Year 13 students at the beginning of the event to be “youthfully ambitious” was frequently referenced throughout the day, fittingly given that many of the proposed structures and collaborations are at a youthful stage of development.  The Chancellor’s Autumn Statement would appear to be the next opportunity to hear about next steps in the run-up to the 2015 general election.

References

Deputy Prime Minister’s Office (2014) Northern Futures Summit Draft Communiqué, https://www.gov.uk/government/consultations/northern-futures (retrieved 6 November 2014)

Ideas for the North of England in 2030

Originally posted 21st October 2014


How should cities like Sheffield develop in the future?

If you’re aware of the UK Government’s Northern Futures consultation, you will have seen that it has been seeking ideas on future growth for the north of England and a vision for 2030.  The range of opportunities and challenges this opens up is vast and makes it difficult to know where to start.  A limited scan of the evidence, however, finds that certain areas may have a meaningful impact on creating an economic core to compete with the biggest cities in the world.  These include:

  • Connecting the north through world class transport services and infrastructure
  • Promoting inward investment and international trade in the north
  • Advancing existing sector specialisms to develop renowned centres of industry and learning
  • Encouraging and embracing sustainability as part of a northern identity

What will happen as an eventual outcome from the Northern Futures consultation (with a general election seven months away) is a separate question.  At this point in time, this set of suggestions simply scratches the surface and highlights areas that government and other stakeholders may wish to consider when developing plans in the north of England.  It also advocates the importance of having a solid evidence base to inform future decision-making. Further commentary on each idea is provided below.

Ideas for the North – pdf

1. Connecting the North

Improved transportation and travel services across the north of England could offer several important benefits, not least the economic impacts from reduced congestion and shorter journey times.

Focusing on the Local Enterprise Partnership (LEP) regions between Lancashire and the Humber, Figure 1 highlights how many people commute to work by car, van or train within each of the LEP regions and how many commute in and out of the surrounding LEP regions.  It shows that the number of people commuting between cities in the north is a small fraction of the number travelling within the city regions.

From one perspective, this could suggest that any future investment should perhaps be directed at improving services within the city regions that people actually use already.  Alternatively, it could be interpreted that people aren’t commuting between northern cities precisely because of the current state of transport.  This is an area that would certainly benefit from further research.


Figure 1: Number of People on a Daily Commute Within and Between Northern LEP Regions by Car, Van or Train. Source: Office for National Statistics (ONS), Census 2011, Origin-Destination by Mode of Transport
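For readers interested in reproducing this kind of analysis, a hedged sketch of the Figure 1 calculation is below (in Python, using pandas). The column names and counts are placeholders for illustration; the actual ONS origin-destination extract is structured and labelled differently.

# Aggregating an origin-destination commuting table into within-region
# and between-region flows. All figures below are made up.
import pandas as pd

od = pd.DataFrame({
    "origin_lep": ["Leeds City Region", "Leeds City Region", "Sheffield City Region"],
    "dest_lep":   ["Leeds City Region", "Sheffield City Region", "Sheffield City Region"],
    "commuters":  [420_000, 9_000, 310_000],
})

od["flow_type"] = od.apply(
    lambda r: "within" if r["origin_lep"] == r["dest_lep"] else "between", axis=1
)
print(od.groupby("flow_type")["commuters"].sum())
# In the real data, within-region flows dwarf between-region flows.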

2. Promoting the North

There is potential to re-examine how the north of England is promoted around the world for inward investment, tourism and trade.  At the moment, LEPs in the north are playing an active role in inward investment but also rely on UK Trade & Investment (UKTI) for the delivery of investment projects. This is in contrast to the devolved authorities of Scotland, Wales, Northern Ireland and London that have separate teams with responsibility for project delivery.

Publicly available data on inward investment is limited, especially at regional and local scales.  Figure 2 provides a breakdown of the 1,773 inward investment projects undertaken by UKTI in 2013.  It also includes a column chart which shows that the number of projects in England (excluding London) is generally lower than in other parts of the UK relative to the size of population, number of businesses and value of the economy.

The relatively strong performance of Scotland, Wales and Northern Ireland suggests that the north of England could do more to get to these levels.  For example, it might be advantageous for the north to have an individual team responsible for project delivery, similar to the devolved nations.


Figure 2: UKTI Inward Investment Projects in 2013. Source: UKTI Annual Report 2013-2014; ONS Mid-Year Population Estimates; ONS Business Count; ONS Regional Trends
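The per-denominator comparison behind the Figure 2 column chart can be sketched as follows; every number here is a placeholder rather than an actual UKTI or ONS figure.

# Normalising inward investment projects by population, business count
# and economic output. All values are illustrative placeholders.
regions = {
    # region: (projects, population_millions, businesses_thousands, gva_billions)
    "Scotland":         (90, 5.3, 340, 110),
    "North of England": (200, 15.0, 1050, 290),
}
for name, (projects, pop_m, biz_k, gva_bn) in regions.items():
    print(f"{name}: {projects / pop_m:.1f} projects per million people, "
          f"{projects / biz_k:.2f} per 1,000 businesses, "
          f"{projects / gva_bn:.2f} per £bn of GVA")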

3. Centres of Industry and Learning

The north of England has a considerable number of large companies and institutions that are well embedded in the region and offer valuable supply contracts to surrounding firms.  There are also examples like the AMP in Rotherham, the Centre for Life in Newcastle and Sci-Tech Daresbury in Halton where long-term investment and partnerships are helping centres to become world-renowned.

Figure 3 presents the number of enterprises employing over 1,000 people in the north and it finds that there are up to 325 large organisations in the private sector alone.  Dedicated local or regional support to key employers and greater recognition of firms that call the north home could potentially encourage a shared spirit of growth among northern business leaders.


Figure 3: Enterprises Employing 1,000 or More People by LEP Region, 2013. Source: ONS, Business Count

Centres of excellence and innovation do not only emerge from world class employers, though; they also come from world class universities.  As shown in Figure 4, higher education institutions across the north had over 500,000 higher education students enrolled in 2012/13, including many international students.  The impression they take away of the north will likely influence whether they stay in the region or come back to work there in the future, bringing their skills with them.


Figure 4: Number of Higher Education Students Enrolled on Courses by LEP Region, 2012/13. Source: Higher Education Statistics Authority (HESA), Students by HE Provider, 2014

4. Sustainable and Green

A final idea is to encourage and embrace sustainability to the extent that it becomes part of the north’s identity, at a level that competes with leading sustainable cities in Europe such as Copenhagen and Amsterdam.  This could help to address issues as diverse as affordable housing, supporting an ageing population, energy security, environmental protection, and improving health and quality of life.

There are relatively few indices that track the sustainability performance of UK cities but one example is the Sustainable Cities Index produced by Forum for the Future between 2007 and 2010 which covered 20 UK cities and 13 indicators. Figure 5 lists the average ranking for each city over the four years of the index.

As can be seen, Newcastle is the highest ranking city in the north, followed by Leeds and Sheffield.  Four other northern cities are in the bottom half of the ranking: Manchester, Sunderland, Liverpool and Hull.  If steps were taken to transform the north into a more sustainable economy, the northern cities ranking poorly between 2007 and 2010 could be setting the highest standards for the rest of the country by 2030.


Figure 5: Sustainable Cities Index Average Ranking (1-20) between 2007 and 2010. Source: UK Sustainable Cities Index, Forum for the Future, 2007-10

References

The Northern Way (2011) Transport Compact, http://www.northernwaytransportcompact.com/ (retrieved 16/10/14)

Campaign for Better Transport (2014) Right Track North webpages, http://www.bettertransport.org.uk/better-transport/right-track-north-campaigning-better-trains-north-england (retrieved 16/10/14)

UK Trade & Investment (2014) Annual Report 2013-2014, https://www.gov.uk/government/publications/ukti-inward-investment-report-2013-2014 (retrieved 16/10/14)

HESA (2014) 2012/13 Students by HE Provider, https://www.hesa.ac.uk/stats (retrieved 16/10/14)

Forum for the Future (2007-2010) UK Sustainable Cities Index, https://www.forumforthefuture.org/sites/default/files/images/Forum/Projects/Sustainable_Cities_Index/Sustainable_Cities_Index_2010_FINAL_15-10-10.pdf (retrieved 16/10/14)

Siemens and Economist Intelligence Unit (2009) Green Cities Index, http://www.siemens.com/entry/cc/en/greencityindex.htm (retrieved 16/10/14)

City Growth Commission (2014) Unleashing Metro Growth: Final Recommendations of the City Growth Commission, Royal Society for the encouragement of Arts, Manufactures and Commerce (RSA) – http://www.citygrowthcommission.com/publication/final-report-unleashing-metro-growth/ (retrieved 22/10/14)

Swinney, P. & Bidgood, E. (2014) Fast Track to Growth: Transport priorities for stronger cities, Centre for Cities – http://www.centreforcities.org/research/2014/10/20/fast-track-to-growth/ (retrieved 22/10/14)

West Midlands Integrated Transport Authority (2014) Midlands Connect: How better connectivity will maximise growth for the Midlands and the nation – http://www.wmita.org.uk/media/1069/midlandsconnect_a4brochure_final_lowres.pdf (retrieved 22/10/14)

One North (2014) A Proposition for an Interconnected North – http://www.manchester.gov.uk/news/article/6940/one_north_region_s_cities_unveil_joint_plan_for_improved_connections (retrieved 22/10/14)