
Five minutes with Professor Don Nutbeam AO on how to get evaluation right.

Read time: 2 minutes

Date: 06/2025

Professor Don Nutbeam AO is Executive Director of Sydney Health Partners, Professor of Public Health at the University of Sydney and an international leader in evaluation and in translating research into public health practice. His career has spanned leadership positions in universities, government, health services and consultancy, with roles including Vice-Chancellor of the University of Southampton (UK), Academic Provost at the University of Sydney and Head of Public Health in the UK Department of Health in the Blair Government.

He is co-author of the seminal text, Evaluation in a Nutshell, now in its third edition. I asked Don to share his insights into designing successful evaluations of healthcare programs.

Top Tips: Professor Don Nutbeam on how to conduct a successful evaluation of healthcare programs

  • It is vital, right from the beginning, to initiate dialogue with those implementing and affected by the intervention you are evaluating, so you understand their needs and answer the questions they want answered.
  • Strike a balance between the scientific quality of the evaluation and how practical and useful it will be for end users.
  • Evaluation is not a linear process. You need both qualitative and quantitative research to build a full picture of what happened and why.

Successes

Mary Haines: You’ve been working in evaluation for several decades. What do you think the key is to designing both a high-quality but also a useful evaluation of a healthcare program?

Don Nutbeam: Throughout his career, Professor Nutbeam says, he has grappled with the tension between ensuring the methodological quality of evaluations and considering their practical use in improving the program or intervention being evaluated.

“If you took a conventional scientific view of the world, you would end up answering questions that no-one asked. The reality is, particularly in public health when you’re dealing with quite complex causal mechanisms… they can’t be boiled down to something that can be easily studied in a conventional randomised trial. So you have a point of tension between the usefulness of the interventions and scientific quality.”

“We need to conduct evaluations that are both of reasonable scientific quality, to give you confidence in the observed outcomes, and that are sufficiently practical and useful to give the potential subject of your evaluation confidence to use the learnings.”


Mary Haines: So how do you chart that middle course?

Don Nutbeam: “The approach I’ve always taken to this is that you have to work out where you want to end up, as a really important starting point. If your motivation is to win a research grant and publish a paper – and I don’t mean this disparagingly – then you have to focus on conventional, scientific definitions of what makes a quality evaluation. If you actually want to have an impact in a healthcare system or in our public health system, then you have to balance that with what’s the user’s perspective on this? How might something like this be done in less than ideal conditions?”

“It means it’s not all driven by the needs of the researcher, but that’s balanced by the needs and requirements of the intended user. And the way to get it right is to start that dialogue from the beginning, and not in some tokenistic way.”


Mary Haines: Can you give an example of how that approach has worked for you?

Don Nutbeam: He describes an evaluation of COVID-19 public communications, conducted early in the pandemic by University of Sydney researchers in western and south-western Sydney, prompted by the realisation that communications were failing to reach a significant proportion of the population. The evaluation was designed in a matter of weeks, in direct consultation with local communities representing 10 of the most commonly used languages other than English in the area.

“We were able to develop, test and provide feedback on different ways of communicating with different population groups in western Sydney and we were able to expose, very quickly, huge differences in their consumption of media, in the type of messaging that they were more or less responsive to, and able to provide all of that feedback, both to the local health districts and importantly, to the Ministry in a matter of weeks.”

“That led to a fundamental change in the way in which we communicated about COVID from that point onwards.”

The starting point was to ask what the community needed and to open a dialogue with them, he says.

“We were able to answer useful questions that ultimately influenced, in an important way, both the way in which we communicated with the public, and the impact of that communication.”

“We need to start getting more of the right type of answer to the right type of question, rather than the perfect answer to the question no-one asked.”

– Professor Don Nutbeam AO

Pitfalls

Mary Haines: What are some of the learnings about what to avoid when designing evaluations?

Don Nutbeam: “There are a couple of big pitfalls that I’ve definitely fallen into. One is that bringing about change in complex behaviours requires substantial and sustained intervention. Substantial meaning that it often needs multiple dimensions to it. And sustained means that what might appear to be success, for example, in smoking cessation after a few weeks, might not be sustained.”

He cites the example of an evaluation of a state-of-the-art intervention to reduce smoking among school children in the UK in the 1980s. While strongly supported by evidence, the program was ultimately found not to have any sustained impact on smoking behaviours, likely due to broader external influences on children’s behaviours.

“Definitely one thing I’ve learned is that if you’re going to conduct an evaluation, you need to be sure that your intervention is actually strong enough and can be sustained.”

Tips

Mary Haines: So what are your key tips for ensuring evaluations are really useful for end users?

Don Nutbeam: “It’s common sense to me. If you want someone to use something that you’ve been working on, it’s really important to understand their world.”

“Don’t imagine this is a linear process. I always think of it as building a wall. You know, there’s a brick here and a brick there, and you’ve got to put the whole lot together in order to fully understand what happened and why, and with what effect. That requires a mix of qualitative and quantitative evaluation. It requires regular and meaningful connectivity to potential users – those who might be most affected by whatever intervention you’re seeking to evaluate.”

“We have to give far more consideration to where the evaluation leads, how practical it is for subsequent use and how it might become adopted.”

On the future

Mary Haines: How do you see evaluation evolving in the future?

Don Nutbeam: “The direction I’d like to see things go is a better cultural balance between scientific integrity and consideration of where this leads and how practical it is for subsequent use.”

“There will be innovations driven by the desire to achieve efficiency, but I think there are some significant opportunities for the digitisation of the evaluation process. I think we’re on an accelerating curve in that regard,” he says.

“There are important things that we can use some of the large language model platforms to help us with – AI has shown its potential to create synthetic data, which in essence allows researchers to conduct a wide range of experiments and simulations without the risk of exposing patients’ identities. This can open data up for use by researchers in a way that we could never have done before.”


Interview by Mary Haines, Founder and Director of MH Consulting Group (MHC), a boutique consultancy specialising in research strategy, evaluation and review, strategy and programs, and facilitation. MHC developed the five-minute interview series as a platform for leading professionals to share their know-how.

You are welcome to republish this article. Please include the following attribution:

This article was first published by MH Consulting Group: www.mhcgroup.com.au.

Discover more 5-minute interviews by Mary Haines