
‘But the RCT shows it’s effective!’: Seven reasons why health commissioners may be reluctant to use your evidence-based and effective health intervention in practice.

Katherine E Brown PhD1, Kristina Curtis PhD1, Lou Atkinson PhD2

1Centre for Advances in Behavioural Science, Coventry University, and Public Health Warwickshire, UK.
2School of Life & Health Sciences, Aston University, UK.

As the UK-based academic community steels itself for enlightenment about the specifics of the next Research Excellence Framework (REF), one aspect we can be certain about is the increased importance that will be placed on the impact of research (e.g. see Lord Stern’s review published in July 2016). Given the gradual proliferation of similar national research assessment systems (e.g. ERA in Australia, VQR in Italy and FCT in Portugal), it seems likely that a focus on research impact will gain ever greater sway amongst governments worldwide. For anyone engaged in research on the development and evaluation of health-related interventions, one kind of impact they will likely hope for, and may plan to achieve, is uptake and use of their interventions by the target population. In particular, where research funding has been invested in randomised controlled trials (or other methodologically robust evaluations) of interventions, and those interventions are shown to be effective and cost-effective at addressing identified health problems, there is arguably a clear need for them to be brought into practice.

There is, it would seem, a long road to travel before we see routine application of theory- and evidence-based interventions in healthcare and public health practice. Concern around this issue, and the wider issue of applying the knowledge base from health psychology, was highlighted at the 2016 joint conference of the European Health Psychology Society and the BPS Division of Health Psychology in Aberdeen. An extremely well attended roundtable discussion there focussed on how to get health psychology research more commonly used in policy-making and practice. There was interesting and useful input from a range of panel speakers and audience members, but a stand-out comment came from Prof Mike Kelly, former lead for the National Institute for Health and Care Excellence (NICE) in the UK. Prof Kelly spoke about the different world of policy-makers and the different agendas and priorities at play. He made the point that academics cannot expect policy-makers to come to them and simply accept and apply their work. Instead, academics need to go to policy-makers, and seek to understand as well as influence them.

At the end of the last REF period, Wallace, Brown and Hilton (2014) set out the case that health psychologists, and those working to develop and evaluate health promotion and behaviour change interventions, may need to plan more effectively for their interventions to have impact. Included in the impact planning framework we set out was the need to involve key stakeholders in the research from the outset (Wallace et al., 2014). This is not a novel idea, and has indeed been promoted for many years in intervention development frameworks such as Intervention Mapping (Bartholomew et al., 1998; 2001; 2006; 2011; 2016). If it can be achieved, however, it is one of the best approaches for gaining interest in, and the potential for implementation of, interventions by the public sector. Many of the best theory- and evidence-based interventions are nonetheless developed by researchers and academics without partnership with commissioners and other key stakeholders, and in such circumstances academics may find it challenging to promote their interventions to those who hold the public purse strings. We have seven years’ experience of working embedded within a public health department, and well over a decade’s experience of working in partnership with, and delivering commissioned work for, NHS trusts and public health departments. Based on this, and on findings from our research with health commissioners and service providers assessing barriers to and facilitators of the use of behavioural science evidence in the commissioning cycle (Curtis, Fulton & Brown, accepted), we present seven main reasons why health commissioners may be less than enthusiastic about evidence-based and effective health interventions brought to them by academics, and how academics might start to overcome these barriers.

1. Commissioners often act on what looks good to them at the time of need. Commissioning typically works in cycles, with contracts of specified lengths, and where commissioners are in that cycle will determine how likely they are to show an interest in an intervention. Understanding how and when relevant commissioners are looking to renew service contracts can help you make timely and relevant approaches to them. This may mean looking for ways to start meeting and engaging with commissioners; good places to start include local and national professional conferences and learning events (e.g. BSPHN conferences, the PHE annual conference, Health and Wellbeing Board conferences, or healthcare specialism conferences).

2. Commissioners are often motivated by what works locally and addresses local need. Although academics and researchers are often impressed by the strength of evidence provided by a well-designed randomised controlled trial (RCT), the limitations of RCTs (particularly for public health interventions) have been well documented (e.g. Victora, Habicht, & Bryce, 2004). In particular, many evaluation designs, including RCTs, ‘seek to eliminate contextual confounders’ (May, Johnson & Finch, 2016; p1). Our research has shown that health commissioners like evidence about what will work locally for their own populations (Curtis et al., accepted). They are interested in what worked elsewhere for similar populations, and they like this to be presented in a more ‘anecdotal’ way than the typical write-up of an RCT or systematic review of such evidence. That is not to say that commissioners do not access and take account of formal evidence. They do, but often contextual, local factors and ‘real stories’ are simply more compelling. Making links between the trial population from your own research and that of commissioners may help to promote your intervention. In addition, using qualitative evidence from participants and associated professionals (e.g. from process evaluation work) will support the message that the intervention will likely work for them.

3. Local authorities and other public organisations may be fearful about digital solutions due to information governance and data protection issues. Many interventions are making use of the internet and the growing ubiquity of mobile technologies to support people to improve health outcomes. The evidence base suggests that digital interventions have huge potential to support health improvement and the preventive health agenda in areas that include sexual health (e.g. Bailey, Mann, Wayal et al., 2015), weight management (e.g. Sherrington, Newham, Bell et al., 2016) and smoking cessation (Griffiths et al., 2016; 2018; Lorencatto et al., 2016). Our experience in public health has shown that even when intervention development using digital technologies has been commissioned by public health teams, the broader organisational structures in which those teams reside (local authorities in England, though this may apply equally to other health and government organisations worldwide) may resist uptake because of fears about data security and maintaining good information governance. In the UK, breaching the Data Protection Act (1998) or the more recent General Data Protection Regulation (GDPR) can mean substantial fines for organisations, personal liability for those at the top of them, and huge reputational damage. Hence, caution on these issues is understandable. It is essential to seek good advice and support from information governance and data protection specialists, and to make sure that messages about the robustness of data security and the protection of identifiable data are included in communications with health commissioners.

4. Commissioners and others responsible for what gets funded may not appreciate the value of evidence-based behavioural or behaviour change interventions. There are still many professionals working in healthcare and public health, and in positions of power over the purse strings, who know little about the behavioural sciences and the potential value they offer for improving health and wellbeing outcomes. When you work in this field it can be difficult to fully appreciate the perspective of the politicians, policy-makers, epidemiologists, medics and the whole host of others who may hold the power to make or break the translation of your work into practice. Often they will be focussed on outcomes alone (e.g. Have we reduced sexually transmitted infections? Did we reduce hospital admissions? Did we increase flu vaccinations?) and have little appreciation of the behavioural factors at play, or of the size of the scientific evidence base that can help to illuminate and facilitate actions to influence behaviour. Even those with a good appreciation of this field of work will likely have a whole host of other priorities baying for their attention when it comes to decision-making. As behavioural scientists we hold a collective responsibility to work on our ‘comms and marketing strategy’, and to consider how we communicate the value of what we do to those who sit outside, but work alongside, our discipline.

5. Cost-effective may still not be affordable. Even if RCT results show that an intervention is not only effective but cost-effective, it may remain outside the budget constraints of those who could potentially put it into practice. Even where you are not trying to draw income or profit from the roll-out, there may be costs associated with your approach or product that put it beyond commissioners’ reach. It is important, therefore, that researchers consider any potential cost implications at the planning and development stage.

6. Commissioners select service providers they know and trust, or who have a demonstrable track record of delivery. Employing an organisation that has previously provided a good quality service (for that commissioner or elsewhere) reduces the risk that the commissioned service will be poor or not delivered to the desired specifications. However, established providers may not necessarily be knowledgeable about behavioural science or able to design or offer evidence-based solutions. Rather, providers tend to offer pragmatic solutions which fit the budget and play to their strengths in service delivery. Where evidence-based, effective interventions have been designed and tested by academics or via research trials, the originators of such interventions rarely have the remit or ability to provide this as a service on an ongoing basis, leaving commissioners wishing to deploy that intervention to find a reliable provider who is willing to adopt and deliver someone else's intervention. This is not only more difficult to achieve but adds an extra element of uncertainty and risk for both provider and commissioner. For these understandable reasons, commissioners are often reluctant to risk their limited budgets on potentially ground-breaking solutions with no track record in the ‘real world’. Partnering with service providers through the intervention development process and conducting pragmatic trials (where interventions are delivered to ‘real’ patients and service users), may create the necessary pathway to implementation and increase established service providers’ knowledge and competence to deliver services which maximise the potential for behaviour change.

7. Commissioners are sometimes reluctant to approach academics for support. Our research with commissioners indicates that they feel more comfortable approaching a healthcare professional than an academic for information and support (Curtis et al., accepted). Some report negative experiences of communicating with academics. It is important, therefore, that we change commissioners’ perceptions of academics so that we are seen as more approachable. Increasing our ‘user-friendliness’ will help. For example, in the West Midlands, PHE has been acting as a conduit, bringing together academics from local universities with those from public health departments and third sector providers to engage in workshops focused on behaviour change and evaluation skills training. Other regional PHE departments will likely be keen to take similar approaches to support the workforce in their area. In addition, the BSPHN aims to promote collaboration between research and practice, so get in touch, join our events, and get involved!

All of this means that some of us need to get better at working in the middle ground between research and practice. Those who want to keep their careers focussed within academia can still contribute, but should link more often with academics also working in practice, or include practitioners, commissioners or other relevant members of the workforce in their research (e.g. on advisory groups and steering committees). Academics should seek to understand a little more about the world of practice to which their work relates. For those working within the public health workforce, the mission needs to be to seek out academics interested in the application of their research in practice, and to teach them what practitioners need from research, as well as taking the opportunity to use their academic expertise. Heads of departments and public health leaders should also take note: we need you to support your workforces to work in this way.


Bailey J, Mann S, Wayal S, et al. (2015). Sexual health promotion for young people delivered via digital media: a scoping review. Southampton (UK): NIHR Journals Library; 2015 Nov. (Public Health Research, No. 3.13.) Available from: doi: 10.3310/phr03130

Bartholomew Eldredge, L. K., Markham, C. M., Ruiter, R. A. C., Fernández, M. E., Kok, G., & Parcel, G. S. (2016). Planning health promotion programs: An Intervention Mapping approach (4th ed.). Hoboken, NJ: Wiley.

Bartholomew, L. K., Parcel, G. S., Kok, G., Gottlieb, N. H., & Fernández, M. E. (2011). Planning health promotion programs: An Intervention Mapping approach (3rd ed.). San Francisco, CA: Jossey-Bass.

Bartholomew, L. K., Parcel, G. S., Kok, G., & Gottlieb, N. H. (2006). Intervention Mapping: Designing theory and evidence based health promotion programs. San Francisco, CA: Jossey-Bass.

Bartholomew, L. K., Parcel, G. S., Kok, G., & Gottlieb, N. H. (2001). Intervention Mapping: Designing theory and evidence based health promotion programs. Mountain View, CA: Mayfield Publishing.

Bartholomew, L. K., Parcel, G. S., & Kok, G. (1998). Intervention Mapping: A process for developing theory- and evidence based health education programs. Health Education and Behavior, 25(5), 545–563.

Curtis, K., Fulton, E., & Brown, K. E. (accepted). Factors influencing application of behavioural science evidence by public health decision-makers and practitioners, and implications for practice. Preventive Medicine Reports.

Griffiths, S., Brown, K. E., Fulton, E., Naughton, F., & Tombor, I. (2016). Are digital interventions for smoking cessation in pregnancy effective? A systematic review protocol. Systematic Reviews, 5, 207. DOI: 10.1186/s13643-016-0390-6

Griffiths, S. E., Parsons, J., Fulton, E., Naughton, F., Tombor, I., & Brown, K. E. (2018). Are digital interventions for smoking cessation in pregnancy effective? A systematic review and meta-analysis. Health Psychology Review.

May, C. R., Johnson, M., & Finch, T. (2016). Implementation, context and complexity. Implementation Science, 11, 141.

Sherrington, A., Newham, J. J., Bell, R., Adamson, A., McColl, E., & Araujo-Soares, V. (2016). Systematic review and meta-analysis of internet-delivered interventions providing personalized feedback for weight loss in overweight and obese adults. Obesity Reviews, 17(6), 541-551.

Stern, N. (2016). Research Excellence Framework Review: Building from success and learning from experience.

Victora, C. G., Habicht, J-P., & Bryce, J. (2004). Evidence-based public health: Moving beyond randomized trials. American Journal of Public Health, 94(3), 400-405. DOI: 10.2105/AJPH.94.3.400

Wallace, L.M., Brown, K.E., & Hilton, S. (2014). Planning for, implementing and assessing the impact of health promotion and behaviour change interventions: a way forward for Health Psychologists. Health Psychology Review, 8(1), 8-33. DOI:10.1080/17437199.2013.775629