Not everything that counts can be counted, and not everything that can be counted counts

This thought piece was first published as part of the NextGen Research page. The NextGen project is Australia's largest study on engagement and infrastructure delivery.

Over the last few months in my role as CI on the NextGen engagement project, I've been lucky to talk with a range of dedicated community engagement professionals and committed infrastructure practitioners. We launched our successful national survey and facilitated launch events and workshops all over the country. I have been asking people why they think community engagement fails, and what is needed. In our workshops, we asked:

‘What do you need and what do you want to learn?’

We’ve heard lots of different responses, but one idea that keeps coming up is that we need data. In our workshops, for example, practitioners asked “are there more quantitative means of measuring engagement?” and “is there a way to quantify the value of engagement?”

Practitioners are telling us that quantitative data and metrics are needed because this is the language of finance, of engineers, of the business and of senior executives. Community engagement practitioners need to be able to speak this language to show why community engagement is important. Data could prove how many days, and therefore how much money, are lost on a project through community protest or community blockades when community engagement is lacking. It could also demonstrate the cost of government intervention on behalf of communities if a project does not invest in developing a social licence. Quantitative evidence on the importance of engaging with the community, and on the risks to projects if the community isn’t engaged, is seen to play an important role in building the business case internally. Building the evidence base builds the case for resources, and for ensuring community engagement is a focus of infrastructure development and delivery.

I know from my own experience working with community engagement professionals that they have one eye turned outward, engaging with the community, and one turned inward, defending and arguing for the importance of what they do. Very often this is to colleagues with backgrounds in finance or engineering, whose day-to-day business is calculating and working with numbers and data.

And yet, I can’t help thinking, what then for the role of other types of data?

In my other research in public policy, I have recently written about the evidence-based policy movement. Here, academics and practitioners are calling for data to inform and evaluate policy, seeking evidence on ‘what works’. On the surface, there isn’t much to argue with. After all, why wouldn’t we want our government’s policies to be based on evidence? But there are concerns that a particular type of evidence, evidence using quantitative data, is seen as more valid and more legitimate than other forms of knowledge. This leads us to ask: what role, then, is there for the knowledge of policy makers and the public service in understanding what works? What about the experiences and expertise of the stakeholders, NGOs or frontline staff who may be involved in delivering policies? Or what, even, of the stories of the clients, or ‘policy targets’?

The parallels to community engagement, to me at least, are clear. I suggest that in using data and metrics as evidence for community engagement, there is a risk that other forms of knowledge may be sidelined. What becomes of the expertise of community engagement staff, or the knowledge and experiences of the affected community? These experiences and this knowledge are, I would argue, equally important, but they are often qualitative in nature. Indeed, what about reasons beyond saving time and money: is there a role for arguing that community engagement is simply the right thing to do? Or is the failure of that argument what is driving the call for metrics?

What do you think? Is this a valid concern, or is the risk to community engagement of not speaking the language the greater concern?