Evaluating a programme to support young people with care-experience at risk of homelessness in England

EXPERTISE
Policy, Development & Evaluation


Challenge

Evaluating complex policies and programmes often means working without the clean comparisons of a randomised controlled trial. That doesn’t mean we have to compromise on rigour. As part of the first-of-its-kind Test and Learn Programme, funded by the Ministry of Housing, Communities and Local Government (MHCLG) and delivered by the Centre for Homelessness Impact (CHI), Verian is using robust quasi-experimental methods and data that have already been collected to retrospectively evaluate a programme supporting individuals who have left the care system.

There is a mountain of evidence showing that people who have been through the care system in the UK have worse outcomes than those who have not. The difference in housing outcomes is one of the starkest. Young people with care experience face a ‘care-cliff’ when transitioning towards independent living, with many reporting that they do not feel sufficiently prepared for independent life.1 Statistics back this up:

a 2020 UK government survey of rough sleepers found that over a quarter of respondents had been in care,2 and official figures show a 54% increase in homelessness among care leavers between 2019 and 2024.3

The ‘Support for Care Leavers at Risk of Homelessness and Rough Sleeping’ programme was introduced in 2018. This programme provided funding to local authorities for specialist advisor roles aimed at offering intensive individualised support to care leavers transitioning to independent living. As part of the Centre for Homelessness Impact’s Test & Learn Programme, we have been commissioned to evaluate this programme to understand its effectiveness.

Approach

What is the Centre for Homelessness Impact Test and Learn programme?

The Test & Learn programme is a ground-breaking programme to trial innovative approaches and test what works to reduce homelessness and end rough sleeping.

Several projects within this programme are testing interventions using randomised controlled trials (RCTs) – the ‘gold standard’ of evaluation evidence. This programme, however, was not randomly allocated: it was introduced in 2018 and assigned to local authorities according to pre-set criteria, with the number of high-risk care-experienced people in a local authority determining whether it would be funded. To date, three groups of local authorities have received funding at three separate points in time (2018, 2021, and 2022), while a large group of local authorities has never received funding.

When randomisation isn’t feasible, policy evaluators can turn to quasi-experimental methods to understand a policy’s impact. These methods establish causal relationships using observational rather than experimental data, and so are applicable to real-world settings – for example, evaluating interventions that have been allocated according to some criteria. When the underlying assumptions of these methods are met and supported by rigorous sensitivity analyses and robustness checks, causal estimates from quasi-experimental methods are as credible and policy-relevant as those from RCTs.

The quasi-experimental methodology we are using to estimate whether the programme had an impact is an approach called difference-in-differences. This approach exploits the fact that we have local authorities that received funding and local authorities that didn’t. The crux of the method is that the estimated impact is the difference between the change in outcomes for local authorities once they were funded and the same change in outcomes for unfunded local authorities – the difference in changes… or the difference-in-differences!
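The arithmetic behind this is simple enough to sketch in a few lines. The figures below are entirely hypothetical and are not from the evaluation – they just illustrate how the two changes are compared:

```python
# Hypothetical mean outcome (e.g. number of care-experienced people
# experiencing homelessness per local authority), before and after
# the funding point, for funded and unfunded groups.
funded_before, funded_after = 40.0, 30.0      # hypothetical figures
unfunded_before, unfunded_after = 38.0, 35.0  # hypothetical figures

# Change over time within each group.
change_funded = funded_after - funded_before        # -10.0
change_unfunded = unfunded_after - unfunded_before  # -3.0

# The estimated impact is the difference in those changes:
# the difference-in-differences.
did_estimate = change_funded - change_unfunded      # -7.0

print(did_estimate)
```

In this made-up example, homelessness fell in both groups, but it fell by seven more in funded local authorities than the unfunded comparison suggests it would have otherwise – that extra fall is the estimated programme effect.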

The key assumption of this method – known as the parallel trends assumption – is that the change in outcomes for unfunded local authorities represents what would have happened to funded local authorities if they hadn’t received funding. We need this counterfactual change in outcomes to estimate the impact of the programme, and we can be more confident in the assumption if trends in outcomes were similar across funded and unfunded local authorities before funding was given. The outcome we are looking at for this evaluation is the number of care-experienced people experiencing homelessness, measured using secondary data from MHCLG and the Department for Education (DfE).
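One informal way to probe the parallel trends assumption is to compare how the two groups were trending before any funding was given. A minimal sketch, again using hypothetical numbers rather than evaluation data:

```python
# Hypothetical pre-funding outcomes for each group, by year.
pre_years = [2015, 2016, 2017]
funded_pre = [44.0, 42.0, 40.0]    # hypothetical figures
unfunded_pre = [41.0, 39.5, 38.0]  # hypothetical figures

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs (one predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Broadly similar pre-funding slopes lend support to the
# parallel trends assumption; very different slopes would
# undermine it.
print(slope(pre_years, funded_pre), slope(pre_years, unfunded_pre))
```

In practice, an evaluation would test pre-trends more formally – for example with event-study style estimates and the sensitivity analyses mentioned above – but the intuition is the same: the comparison group should have been moving like the funded group before funding arrived.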

Impact

Earlier this year, we published our evaluation protocol.4 This sets out in detail how we’ll conduct the difference-in-differences analysis. Importantly, by pre-registering the analytical steps for the impact analysis, we remove any doubt that the research engages in ‘p-hacking’: the manipulation of data and methodological decisions to report only findings that are favourable or statistically significant. The practice of pre-registering impact evaluations contributes to more reliable and reproducible findings, ultimately supporting stronger evidence for decision-making – whether you’re using an RCT or a quasi-experimental design.

We are now in the analysis and reporting stage of the evaluation. We are following our pre-registered plan for the difference-in-differences analysis and are analysing the extensive qualitative and quantitative data we have collected for our implementation and process evaluation and economic evaluation strands. We look forward to publishing the first evaluation report for this Test and Learn Programme project. It has been a fantastic opportunity for us to be part of this ground-breaking programme, working to tackle such an important issue.

Stay tuned for our evaluation report, to be published shortly!

 

[1] Van Breda et al., 2020; Atkinson & Hyde, 2019

[2] MHCLG, 2020

[3] https://becomecharity.org.uk/press-release-54-increase-in-homelessness-among-young-care-leavers/

[4] Also registered on the Open Science Framework: https://osf.io/vhdpj

 
