26 July 2019
Times are changing. Not only do you have to be good at what you do, increasingly you need to show the outcomes of what you do. CSSA’s former Research Director, Brenton Prosser, explores some simple strategies for social service providers to display their good work with more impact.
Around the world, governments need harder proof to justify their public spending. Cost-benefit analysis and reporting on return on investment are standard for most policy initiatives. Program managers demand evidence that a service has delivered or an outcome has been achieved. Even academic research needs to incorporate more real-world application.
The social service sector is not immune to these changes. The implications reach from Chief Executive to coalface delivery. Capturing outputs is central to justifying funding. Measuring outcomes is instrumental to winning the next tender. Information collection has become part of everyone’s everyday work.
For many in the sector, this is unfamiliar territory. They were not trained as evaluators or researchers. The prospect of a blank program logic can give some people nightmares, let alone producing them across many programs. So while there may not be many ways to make this new demand less demanding, there are ways to maximise its impact.
1. Plan ahead – you can’t capture evidence retrospectively
You have won the funding, but time is tight. The funding body took longer than planned to approve your application, but the start date is the same. All hands are on deck, the inputs and outputs are updated, and the service starts.
Such scenarios present the risk that the only data captured will be output counts for departmental databases. The only case studies will be the ones written by senior staff well after events.
But what about that inspiring personal story of change shared at a team meeting? What about client surveys that capture service experience? What about the picture that is worth a thousand words?
New possibilities are everywhere with the spread of hand-held and tablet technologies, and with new audio, video and online opportunities (such as technologies that translate well into the social services sector). However, making the most of them requires processes for informed prior consent and training for staff. Such things need to be planned while preparing the tender, not left until after funding is won.
2. Think big – develop a theory of change
Program logics are everywhere in our sector. Most lack two things: measurable outcome indicators and a theory of change.
Often program outcome statements, even from government departments, are vague. They do not identify what has changed, how we know it has changed and what tangible evidence there is to prove it. Across the CSSA membership, excellent examples of going beyond subjective measures are emerging. In the past, CSSA has collaborated with members to develop family outcomes frameworks. These help build more compelling cases of real impact and change.
However, if we are to assess the real impact and change of a program, we also need to assess both the current and desired situations. This is part of building a theory of change (TOC) and pursuing the necessary steps to achieve it.
Such TOC plans enable us to design evaluation questions that capture the full value of a program. They also support our ambition beyond minimal reporting requirements.
Our members also provide services in a way that respects human dignity and works toward a fairer society. Demonstrating this in a wider theory of change is vital for our mission, as well as for capturing the unique value of our services.
3. Mix it up – use multiple types of evidence
Our sector is very good at sharing inspiring accounts of personal change. Often it is these accounts that sustain us in our service when times are tough. For some time, we have used these accounts to show the impact of our work, as well as support lobbying to government. While such stories remain powerful, we now also need other evidence to support them.
Using multiple types of evidence increases the reliability of our claims. If numerical measures (quantitative), descriptive analysis (qualitative) and concrete examples (case studies) all point to the same conclusion, then it is likely to be true. Academic researchers call this triangulation.
This strategy also works well when it comes to communicating impact. Early on, I learned to include statistics, situation and stories into every project. My research mentor explained that numbers get the media interview, description explains the problem and narratives provide the human interest hook. Over the years, I have found this to be a principle that also works with politicians, policymakers and evaluators. But to succeed, it has to be part of pre-design, not post-reporting.
4. Identify audiences – use different data for different targets
Following on from this, a good program evaluation and impact plan does not focus just on the funding body. Social service programs need to communicate value to many stakeholders.
Do you need to communicate to the local community for the program to work? Does the Board need reassurance or your staff need to be more engaged? Are funding sources declining and you need support from elsewhere? These will vary by situation, but each audience will value different evidence.
The first questions I ask in all my research and evaluation training sessions are:
- who are your most important stakeholders?
- what do they need to be persuaded about impact?
- what do you know and can show, and what do you need to collect?
Getting the answers to these questions will require pre-planning. Those working in university impact centres often describe a ‘triple writing strategy’: if you put in the effort to write a grant, report or paper, make sure you also convert it into a professional piece and a social media post.
Finally, take time to think about the information culture in your organisation. New processes are important, but non-compliance is a challenge we all recognise. Think about strategies that will help your team see how collecting data fits into the big picture, and how their extra time adds up for an organisation that not only has to be good at what it does, but has to show it too.