SROI with a rubric?
A simple addition to more fully meet Social Return on Investment’s potential
Social Return on Investment (SROI) is an approach to accounting for the value an organisation, policy or program creates, relative to the value it consumes. While a traditional set of financial accounts only looks at financial value, SROI can also include social, environmental and economic impacts. SROI quantifies these impacts in monetary units with the aim of providing a comprehensive view of the value created per unit of investment. This method involves working with stakeholders, mapping outcomes and costs, valuing them, calculating a benefit-cost ratio (“SROI ratio”), and reporting results. SROI guidance can be found here.
The information provided by an SROI analysis can be used to help organisations improve their performance, make informed investment decisions, and communicate the value they add. However, as noted previously, the method has been applied with variable quality, with some analyses resting on questionable assumptions and leaps of faith, especially regarding causal attribution and the valuation of benefits.
I’ve written before about ways in which SROI is similar to and different from cost-benefit analysis (CBA) and their respective strengths, challenges and limitations. In particular, I think SROI could benefit from CBA guidance which emphasises rigour, consistency and replicability in obtaining and attributing sound estimates of benefits and costs. Conversely, SROI guidance is stronger on involving stakeholders and the practice of CBA could learn a thing or two from this. Full articles here and here if you want to dig into the details.
In this article I focus on one aspect of SROI that could be beefed up: the potential to address multiple criteria in an explicitly evaluative SROI.
Single-criterion evaluation
CBA single-mindedly pursues one criterion concerned with investments creating more incremental value (benefits) than they consume (costs), making society better-off in the aggregate regardless of equity and other implications. (See this article for a more detailed explanation of CBA’s Kaldor-Hicks efficiency criterion).
This single, preordained criterion in CBA contrasts with program and policy evaluation more generally, which can accommodate multiple, contextually defined criteria. This is important because there’s more to value for money (VfM) than just one model of efficiency. Other aspects of good resource use include equity, sustainability, productivity, and more. Here’s something I wrote earlier on VfM criteria.
SROI is somewhere in between CBA and evaluation-in-general: it sums up the change in value consumed and created in an SROI ratio (essentially, the monetary value of impacts divided by the monetary value of resources used). Equivalent to a benefit-cost ratio (BCR) in CBA, the SROI ratio in effect tracks something akin to Kaldor-Hicks efficiency without formally referencing it. However, SROI guidance emphasises that the ratio isn’t the whole story, and recommends stakeholder engagement and mixed methods, including interviews, focus groups, surveys, document review and impact mapping to investigate changes (positive or negative) and the value that stakeholders place on those changes. I’m told many SROI practitioners downplay the ratio and emphasise the wider story too.
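For readers who think in code, the ratio calculation is simple arithmetic. The sketch below is purely illustrative: the outcome categories and dollar figures are hypothetical, not drawn from any real SROI study, and real analyses involve discounting, deadweight and attribution adjustments that are omitted here.

```python
# Illustrative sketch only: hypothetical figures, not from any real SROI study.
# The SROI ratio divides the total monetised value of impacts by the
# monetary value of the resources invested.

def sroi_ratio(monetised_impacts, investment):
    """Sum of monetised impacts divided by total investment."""
    return sum(monetised_impacts.values()) / investment

# Hypothetical example: three monetised outcome streams, in dollars.
impacts = {
    "stable_housing": 180_000,
    "improved_wellbeing": 60_000,
    "reduced_service_use": 30_000,
}
ratio = sroi_ratio(impacts, investment=100_000)
print(f"SROI ratio: {ratio:.2f}")  # prints "SROI ratio: 2.70"
```

Note how little of the performance story a single number like this can carry, which is exactly the point the guidance makes about the ratio not being the whole story.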
These participatory, mixed-methods processes can potentially address more than just the cost-benefit test.
The possibility of multi-criterial SROI
I’d like to thank Mark Forsyth for pointing out that SROI impact mapping hints at multiple criteria that are often used in VfM assessment, specifically (quoting Mark):
Who and how many stakeholders? Equity when considered alongside all other criteria
At what cost? What will/did they invest and how much? Economy
What are the outputs? Efficiency when combined with inputs above
What changes and by how much? Effectiveness
When do they start and how long do these outcomes last; do the outcomes drop off in future years? Sustainability
How valuable, what is the relative importance of the outcome? Relevance
What will happen/would've happened without the activity? Impact when combined with effectiveness and others
What activity would/did you displace? Who else contributes to the change? Coherence
What is the overall value per amount invested (using all the above)? Cost-effectiveness
While the SROI impact mapping questions don’t cover every aspect of the corresponding criteria, it’s easy to see how they’re related. The purpose of these questions is to explore pieces of the performance story and support the process of identifying, quantifying, attributing and monetising costs and benefits that can be included in the SROI ratio. For the most part, these questions seek answers that are descriptive, not evaluative - though I can see a couple of exceptions (e.g., “how valuable…”).
But what if we give it a few mods? These questions could be expanded just slightly to actually evaluate resource use at each step, including details of resource management, productivity and social justice that are not reflected in the SROI ratio. For example, to assess: how economically its resources are managed; how efficiently organisational processes are carried out; how equitably the program’s inputs, actions, outputs and outcomes are distributed; and so on.
So it appears that SROI can (and nearly does) accommodate multiple criteria.
However:
The criteria are not made explicit;
There’s no inbuilt transparent basis for rating performance against the criteria individually; and
There’s no clear basis for synthesising the individual ratings to make an overall evaluative judgement from all of the criteria collectively.
The SROI ratio is the part that comes closest to having an explicit criterion, standard and synthesis process - and being simple and compelling to communicate, it is this ratio that seems to capture attention when the findings of an SROI study are publicised, whether intended or not. Yet the ratio is not a complete evaluation, for reasons I summarised here.
This strikes me as a missed opportunity to do more and better with SROI.
Explicitly evaluative SROI
Another topic I’ve written about recently is evaluative reasoning - a process underpinned by a general logic which involves judging value, based on:
Criteria: Aspects of a policy, program, organisation (etc) that matter to people;
Standards: How good performance would have to be for people to consider it good enough; and
Evidence: Observations of the policy/ program/ organisation and its impacts.
Evaluative judgements (e.g., “we find that this program is worth investing in because it creates value of $2.70 for every $1 spent”) rest on criteria, standards, evidence (and supporting rationale about the validity of each), whether stated or not. For example:
Criterion: The relative magnitude of benefits and costs, as indicated by the SROI ratio.
Standard: Benefits greater than costs; SROI ratio >1 is good, bigger is better.
Evidence: Estimated monetary values of social, environmental, economic and financial outcomes and costs attributed to the program.
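The criterion–standard–evidence triad above can be made concrete in a few lines. This is a hypothetical sketch of my own, not an implementation of any SROI tool: the function name and wording are invented, and it applies only the single "ratio > 1 is good" standard listed above.

```python
# Hypothetical sketch: applying a standard (ratio > 1 is good) to evidence
# (an estimated SROI ratio) to produce an evaluative judgement.
# Function name and judgement wording are illustrative inventions.

def judge_cost_effectiveness(sroi_ratio):
    """Apply the standard 'benefits greater than costs; ratio > 1 is good'."""
    if sroi_ratio > 1:
        return f"worth investing in: creates ${sroi_ratio:.2f} of value per $1 spent"
    return "does not break even on monetised benefits alone"

print(judge_cost_effectiveness(2.7))
# prints "worth investing in: creates $2.70 of value per $1 spent"
```

Making the standard explicit like this is what turns a descriptive number into an evaluative judgement - and it also exposes how much the judgement depends on that one criterion.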
Despite the guidance asserting that SROI is more than the ratio, it’s the ratio that seems to be the overriding basis for judging the value of a program. SROI still looks like single-criterion evaluation to me. But it could be more.
In professional evaluation, involving value judgements by, with, or for people affected by policies and programs, conclusions are strengthened by making the evaluative reasoning process explicit, transparent and inclusive, as I argued here.
I often recommend a participatory, power-sharing approach to evaluative reasoning that involves co-developing and using rubrics with stakeholders, for reasons I’ve shared before (in short, because values come from people; see link in the previous paragraph), though I acknowledge that there are alternatives to rubrics (see here and here for options).
A participatory, rubric-based approach strikes me as a natural fit with SROI’s objectives and principles, and could open the door to using SROI to address multiple criteria and present evaluative judgements that are just as compelling as the number, but more comprehensive and more nuanced.
SROI on steroids
SROI analysis could be strengthened by:
1. Making criteria explicit
An SROI study could declare what aspects of performance, quality, or value are in scope and, for each aspect, define what it means for the specific investment. The following are, of course, only examples, because criteria need to be determined contextually:
Cost-effectiveness: what would we see when the investment is creating enough value to be considered worthwhile? (For example, SROI ratio greater than 1).
Equity: what would it look like when the organisation, policy, or program is addressing inequities, and fairly distributing resources, actions, impacts and value? (For example, identifying and reaching those most in need).
Effectiveness: what short- to medium-term impacts would tell us the investment is progressing toward meeting its longer-term value proposition? (For example, reducing rates of homelessness in a locality).
Efficiency: what would we see if the right actions were being carried out productively and in the right ways to deliver the expected outputs? (For example, providing services in ways that are a good cultural fit with the community; meeting performance targets for timeliness and numbers of people assisted to find stable accommodation).
Economy: what would good stewardship of resources look like? (For example, managing costs of staff and overheads within the available budget; securing ongoing funding to sustain service provision; minimising environmental waste).
Additional criteria such as relevance, coherence, and sustainability may also be included.
2. Setting standards for rating performance on each criterion
For each of the criteria, how will we judge whether performance is excellent, good, adequate or poor (or some other set of levels)? To illustrate, the following standards can either be used ‘out of the box’ or adapted to more specifically define levels of performance for each criterion. For example, we could define adequate performance on the cost-effectiveness criterion as “a credible prospect of breaking even” and excellent as “breaks even beyond a reasonable doubt”, as detailed here.
3. Adding a framework for synthesising and balancing multiple criteria in the final judgement
The following table illustrates how a series of ratings can be summarised for each criterion and for VfM overall, together with supporting evidence. Overall VfM considers the ratings for the individual criteria collectively, weighted according to the relative importance of different criteria, bearing in mind factors such as the life cycle of the policy or program. For example, economy and efficiency may be more important during implementation, whereas effectiveness, cost-effectiveness and equity may receive more weight later on. Examples of this approach in real-world evaluations can be found here.
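To make the synthesis step tangible, here is a deliberately simplified sketch. The criteria, weights and ratings are hypothetical, and a purely mechanical weighted average is only an illustration - in practice, synthesis involves deliberation and judgement, not just arithmetic.

```python
# Illustrative sketch only: criteria, weights and ratings are hypothetical.
# Maps each criterion's rating to a score, takes a weighted average, and
# maps the result back to the nearest rating label as a suggested overall VfM rating.

RATING_SCORES = {"poor": 1, "adequate": 2, "good": 3, "excellent": 4}

def overall_rating(ratings, weights):
    """Weighted average of per-criterion ratings, mapped back to a label."""
    total_weight = sum(weights.values())
    score = sum(RATING_SCORES[ratings[c]] * w for c, w in weights.items()) / total_weight
    # Return the rating label whose score is closest to the weighted average.
    return min(RATING_SCORES, key=lambda label: abs(RATING_SCORES[label] - score))

# Hypothetical mid-implementation weighting: economy and efficiency weighted up.
ratings = {"economy": "good", "efficiency": "excellent",
           "effectiveness": "adequate", "equity": "good"}
weights = {"economy": 2, "efficiency": 2, "effectiveness": 1, "equity": 1}
print(overall_rating(ratings, weights))  # prints "good"
```

Even a toy model like this makes the weighting choices visible and contestable, which is the point of making the synthesis framework explicit rather than leaving it implicit in a single ratio.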
How?
A fully evaluative SROI can be designed and implemented by following the steps of the Value for Investment approach. VfI isn’t a competing method or tool; it’s a complementary system (a set of principles and a process) to support evaluators in selecting and coordinating an appropriate mix of methods and tools for each context. Guidance here and here. Training is also available.
Bottom line
The addition of explicit criteria and standards, in a rubric or other framework co-created with stakeholders, would align SROI with Program Evaluation Standards which call for explicit evaluative reasoning.
This addition would also allow SROI analysis to make more meaning from information it already collects (or could collect), making SROI a truly evaluative approach.
Thanks for reading!
If you found value in this article, please share the ❤️ to increase its visibility on Substack and LinkedIn.
Acknowledgement
Thanks to Mark Forsyth for the inspiration for this article and for peer review. All opinions expressed are my own, as are any errors.