Guidance

UK Shared Prosperity Fund: evaluation strategy

Updated 16 April 2025

This strategy sets out further detail on evaluation plans for the UK Shared Prosperity Fund (UKSPF). It sits alongside the UKSPF additional information, which contains associated guidance for local partners on participating in the UKSPF evaluation, and the 2025-26 technical note, which sets out key updates to the UKSPF for 2025-26.

Introduction and context

Purpose of the UKSPF

The UK government has set out an ambitious plan for change, focused on five national missions: ambitious, measurable, long-term objectives that provide a driving sense of purpose for the country. The UK Shared Prosperity Fund (UKSPF) is a central pillar of this ambitious agenda and a significant component of its support for places across the UK. 

The UKSPF programme – a total of £3.5 billion in the period March 2022 to March 2026 – provides funding for local investment, with a focus on building community pride and opportunities across the UK[footnote 1].

There are 3 investment priorities:

  • communities and place: investment in activities that enhance physical, cultural, and social ties and access to amenities, such as community infrastructure, local green space, and innovative approaches to crime prevention
  • supporting local business: investment in business support and entrepreneurship programmes, local retail, hospitality, and leisure facilities, and productivity-enhancing, energy efficient, and low carbon technologies in collaboration with the private sector
  • people and skills: investment in measures to enable adults to join and progress within the labour market, boosting their core skills through support for qualifications, training and to overcome barriers to work

Figure 1 shows how the UKSPF’s investment priorities, themes, and sub-themes align with the government’s Missions.

Figure 1: UKSPF investment priorities and the government’s Missions

This image sets out the government’s Missions.

  • Mission 1: Kickstart economic growth
  • Mission 2: Make Britain a clean energy superpower
  • Mission 3: Take back our streets
  • Mission 4: Break down barriers to opportunity
  • Mission 5: Build an NHS fit for the future

It then shows the 3 investment priorities for the UKSPF (Communities and Place, Supporting Local Business, and People and Skills), their underlying themes and the sub-theme/scope of each. It shows how the 5 Missions will be delivered by UKSPF activity under each priority and theme.

Communities and Place

There are two themes under the priority of Communities and Place: Healthy, Safe and Inclusive Communities, and Thriving Places.

Healthy, Safe and Inclusive Communities supports:

  • Mission 2: Make Britain a clean energy superpower
  • Mission 3: Take back our streets
  • Mission 4: Break down barriers to opportunity
  • Mission 5: Build an NHS fit for the future

Its scope includes improving health and wellbeing, reducing crime and the fear of crime, bringing communities together and tackling homelessness.

Thriving Places supports Mission 1: Kickstart economic growth. Its scope includes development of the visitor economy, and high streets and town centres improvements.

Supporting Local Business

The Support for Business theme sits under the Supporting Local Business priority. This is the only theme under this investment priority.

Support for Business supports:

  • Mission 1: Kickstart economic growth
  • Mission 2: Make Britain a clean energy superpower
  • Mission 4: Break down barriers to opportunity

Its scope includes advice and support to business, enterprise culture and start-up support, as well as business sites and premises.

People and Skills

There are two themes under the priority of People and Skills: Employability and Skills.

Both themes support:

  • Mission 1: Kickstart economic growth
  • Mission 2: Make Britain a clean energy superpower
  • Mission 4: Break down barriers to opportunity

The scope of the Employability theme includes supporting people, including those who are economically inactive, to progress towards and into sustained employment and support for young people who are at risk of becoming NEET.

The scope of the Skills theme includes essential skills (including numeracy, literacy, ESOL, and digital) and employment related skills.

The Fund’s light-touch, delegated delivery model enables local government to continue investing in local priorities and target funding where it is needed most. It enables truly local decision making, planned and delivered by councils and Mayoral Strategic Authorities across England, Scotland, and Wales (‘lead local authorities’).

The Fund is being delivered by lead local authorities (LLAs); comprising councils and Mayoral Strategic Authorities in England, councils and a regional grouping in Scotland, and four regional groupings in Wales. In Northern Ireland (NI), the Ministry of Housing, Communities, and Local Government (MHCLG) is delivering the UKSPF - working with local partners including local authorities (LAs), the private and third sectors, and Northern Ireland Executive departments - to deliver projects that best reflect the specific needs of Northern Ireland’s economy and society.

UKSPF funding includes additional targeted programmes:

  • the Rural England Prosperity Fund (REPF) - starting in 2023-24, REPF is an England-wide £143 million rural top-up to the UKSPF programme, providing support for rural businesses to diversify their income streams and strengthen local economies, alongside funding for rural-specific community and civic infrastructure in England. The REPF objectives sit within the UKSPF’s communities and place and supporting local business investment priorities. REPF evaluation is funded and led by the Department for Environment, Food and Rural Affairs (Defra) as an add-on to the place level case studies portion of the core UKSPF evaluation, as well as factoring into intervention and programme level evaluation where appropriate.

  • Multiply (2022-25) - a 3-year, UK-wide skills initiative worth up to £559 million and aimed at increasing the level of functional numeracy in the adult population across the UK, to improve labour market outcomes and close skill gaps. For 2025-26, the Multiply programme will not continue as a specific, ringfenced programme. Local authorities retain the flexibility to deploy their local UKSPF allocation according to need and this can still include funding for adult numeracy. Multiply’s programme evaluation is delivered by the Department for Education (DfE), as distinct from the wider UKSPF evaluation delivered by MHCLG. Place level case studies in Scotland, Wales, and Northern Ireland will include a process evaluation of Multiply, led by MHCLG. Further details on Multiply’s investment priorities and delivery model can be found in the Multiply prospectus. (PDF, 708KB)

Purpose of this strategy

The UKSPF evaluation aims to:

  • evaluate impact and process, through understanding what the UKSPF has delivered; whether it has successfully achieved its own policy objectives aligned to the government’s Missions; how effectively it has been implemented in places (including whether this varies across and within different nations); and whether it has constituted good value for money
  • build the evidence base on ‘what works’ for community pride and opportunities across the UK; developing MHCLG’s understanding of how success in these areas can be measured and using this insight to inform future local growth programme design at both the programme and place level
  • provide accountability that MHCLG has delivered funding in the most effective way to support realisation of the UKSPF’s outcomes

This document builds on the ‘evaluation’ section of the UKSPF additional information (for England, Scotland, and Wales) and the NI additional information, providing further detail on:

  • evaluation strategy: the aspects of the programme that are being evaluated and why, the methods and data sources that will be used to enable evaluation, and how the different components of the evaluation will work together to build a holistic picture of the UKSPF’s delivery and impact at the intervention, place, and programme level
  • evaluation delivery: how the strategy will be implemented by MHCLG, what outputs will be produced, how LLAs should expect to support evaluation activity, and what timelines the different components of the evaluation will operate to

The UKSPF 2025-26 technical note and additional information have been updated alongside the publication of this strategy to include further guidance for LLAs on how they should expect to be involved in each of the evaluation components.

The delivery of UKSPF is carried out in devolved areas across Scotland, Wales, and Northern Ireland. For this reason, the approach to evaluation has been adapted to the specific context and delivery needs across Scotland, Wales, and Northern Ireland. 

Documents relating to the UKSPF evaluation components - including feasibility, interim, and final reports - can be found on the UK Shared Prosperity Fund: evaluation page.

MHCLG has also published a ‘how to guide’ for local authorities interested in planning and designing their own evaluations of UKSPF funded projects.

Summary of evaluation approach

Core UKSPF and Multiply evaluation

Alongside the UKSPF prospectus, MHCLG published a menu of intervention types for 2022-25. This has since been updated, via the UKSPF 2025-26 technical note, to themes and sub-themes for 2025-26, aligned to the government’s Missions. Originally, each nation had a specific set of interventions under the three investment priorities, with additional Multiply intervention types in Scotland, Wales, and Northern Ireland. Wherever possible, the evaluation activity set out in this strategy is consistent across all parts of the UK; however, in some cases a different approach was necessary in line with equivalent variance in the wider UKSPF delivery model (e.g., in the case of Multiply).

The UKSPF is a flexible fund that allows places to design and implement projects that best suit their particular local needs. This broad scope, combined with MHCLG’s multi-level, multi-component evaluation plan - as set out later in this section - means that constructing a single programme-level logic model is impractical. Rather, LLAs and local partners will work together with MHCLG to construct theories of change on a per-intervention theme/per-place basis to fit the specific aims of each evaluation component.

Figure 2 sets out a high-level logic model for the UKSPF, including the five government Missions that it supports. This will be tested through the programme level impact evaluation.

Figure 2: High level logic model for the UKSPF

This image sets out the high-level logic model for the UKSPF as a whole, illustrating the step-by-step process of how the Fund’s central and local inputs will support UKSPF projects to generate outputs, outcomes, and impacts.

Inputs include the £3.5 billion UKSPF funding itself, alongside guidance and support for places to spend it effectively. Place level inputs include local delivery capability, other funding streams, and places’ wider socioeconomic contexts.

These inputs will support projects, aligned with the UKSPF themes and sub-themes, across the UK, leading to outputs (e.g., places receiving funding to deliver projects) and then impacts (e.g., in bringing communities together, supporting people to progress towards and into employment, and developing the visitor economy). The chart then illustrates the Missions that UKSPF impacts will support.

Please note that a previous version of the UKSPF high-level logic model referred to the Levelling Up White Paper missions and related impacts, including improved pride in place and life chances. This has been updated. The UKSPF inputs have also been updated, to take account of the 2024 Autumn Budget announcement of a further £900 million of funding for local investment by the end of March 2026.

As the UKSPF is a centrally designed fund with delegated delivery, evaluation activity will take place across 3 levels: programme level (focusing on the UKSPF as a whole), place level (focusing on the UKSPF within individual LLAs and in Northern Ireland as a whole), and intervention level (focusing on specific intervention themes across a range of places).

Across these 3 tiers the evaluation aims to ascertain:

  • programme level: what is the overall impact of the UKSPF on community pride, opportunities, and the government’s Missions? How effectively was the Fund delivered as a whole?
  • place level: how was the Fund delivered across different types of place and across different parts of England, Scotland, Wales, and Northern Ireland? How did the Fund interact with other local growth funds and local projects within these places? How effectively was the Fund delivered within places?
  • intervention level: what was the impact of specific projects and intervention themes? How did this compare to their original aims? How effectively were projects delivered, and did they achieve good value for money? Did this vary across different intervention themes?

Table 1: UKSPF evaluation components

Spatial tier | Component | Aims
Intervention | Intervention level evaluation | To assess the impacts of nine study groups across the UKSPF’s three investment priorities, and how well they have been delivered, using a quasi-experimental approach with treatment and control groups
Intervention | Randomised controlled trials | To gain a deeper understanding of the impacts of the subset of projects that are appropriate for RCTs, above and beyond that made possible by the quasi-experimental approach of the wider intervention impact evaluation. Note that, as set out later, the RCT work is no longer being taken forward
Place | Place level case studies | To develop a detailed understanding of the UKSPF’s effectiveness across different types of place, considering their unique local characteristics and challenges, and focusing on interactions between stakeholders, local decision making, process efficiency, and other local growth funds
Programme | Programme level impact evaluation | To assess the extent to which the programme has impacted the core aims of supporting the government’s Missions and building community pride and opportunities across the UK and, by extension, whether the programme level theory of change has worked in practice, through a theory-based evaluation approach
Programme | Multiply evaluation | To assess the impact and value for money of Multiply, how effectively it has been delivered by places and the UK government, and to build the evidence base regarding promising practice in the delivery of local adult skills interventions

Across the 5 evaluation components in Table 1, MHCLG is carrying out multiple types of evaluation:

  • process evaluation: to understand how the UKSPF has been implemented in practice by places, and the aspects of its design that have helped or hindered said implementation
  • impact evaluation: to understand the extent to which economic, social, and environmental outcomes relating to community pride and improved opportunities have changed as a result of the UKSPF – for both the Fund as a whole, as well as with respect to specific intervention themes and places
  • value for money evaluation: to understand, where impact has been evaluated, how the value of the impact achieved compares to its cost according to The Green Book principles of economy, effectiveness, and efficiency

Figure 3 illustrates non-exhaustively the data types and data sources that will feed into the components in Table 1, and subsequently how these components will collectively build MHCLG’s understanding of what works at the intervention, place, and programme levels.

Figure 3: Summary of evaluation inputs and outputs

This image illustrates the different data types and data sources that will support the UKSPF evaluation, the components of the evaluation described in Table 1, and how each of these components feed into the evaluation’s core aims of understanding what works for intervention design, local process and delivery, and programme design.

Data sources shown in the chart include:

  • monitoring data from projects
  • data collected by evaluation contractors directly
  • administrative data
  • information about UKSPF beneficiaries
  • survey data

Rural England Prosperity Fund evaluation

The Rural England Prosperity Fund (REPF) is a rural top up to the UKSPF programme providing allocations to eligible local authorities in England to help address the additional needs and challenges facing rural areas.  

As outlined in the Rural England Prosperity Fund: prospectus, REPF funds capital projects for small businesses and community infrastructure. This will help to improve productivity and strengthen the rural economy and rural communities, and is aligned with the government’s Mission 1: Kickstart economic growth.

The UK government launched the REPF in September 2022, with funding covering April 2023 to March 2025. In March 2025, an additional year’s funding was announced - with the programme totalling £143 million.

Defra and MHCLG will evaluate REPF through several additional rural-specific add-ons to the place-based case study components of the main UKSPF evaluation. The separate Defra-funded evaluation of REPF focuses on the process and delivery. Further detail can be found in the ‘Rural England Prosperity Fund Process Evaluation’ section.

Evaluation outputs

For each of the components in Table 1, MHCLG will at a minimum publish a comprehensive report following the conclusion of the evaluation, setting out lessons learned, evidence, and recommendations applicable to the wider local growth space. The scope and scale of these reports (including plans to publish additional documents) will vary across the different components, and further detail can be found in the specific sections dedicated to each.

In addition to the main reports, LLAs and relevant Northern Ireland partners participating in the evaluation may also receive access to analysis ahead of wider publication. MHCLG will look to supplement any written reports with interactive data visualisation tools and output dashboards to allow for meaningful engagement by stakeholders with the evaluation’s underlying data and analysis, where this is useful, feasible, and non-disclosive.

Implications for lead local authorities

Lead local authority participation in evaluation activity

The UKSPF evaluation is not intended as a means to hold LLAs to account for their selection or delivery of projects funded using UKSPF allocations. It is intended to develop a UK-wide understanding of what works, in what context, and by what means in support of delivering the plan for change and the government’s Missions, as well as what doesn’t work. The lessons learned from both successful and unsuccessful projects will be an invaluable resource for both local and central government in the development of future local growth funding. LLAs will not be penalised or otherwise disadvantaged by the outcome of any part of the UKSPF evaluation.

Places invited to participate in the evaluation have been informed and participation requirements communicated. Not all LLAs have been asked to participate in all components of the evaluation, and some may have been asked to participate in multiple components depending on their institutional type, allocation size, and chosen projects.

Participation in evaluation activity is not compulsory for LLAs to receive funding. However, as set out in the UKSPF additional information, LLAs are encouraged to assist MHCLG with the local level aspects of the UKSPF evaluation. LLAs can draw on their management and administration budget to cover engagement in any MHCLG-led evaluation activity. LLAs were not required to gather data in advance of the evaluation; however, MHCLG asks that any information gathered proactively is shared to support the evaluation.

MHCLG will coordinate the evaluation using specialist evaluation contractors to help carry out fieldwork and analysis as appropriate, while retaining overall responsibility for design and delivery. MHCLG will work closely with contractors, LLAs, and grant recipients (where relevant) throughout the design and delivery phases of the evaluation to minimise burdens, maintain clear and regular communication between all parties, and make sure progress is in line with the evaluation’s core aims.

This co-designing approach means that many of the specific details of how each evaluation component should be implemented – including the precise data sources and evaluation methods employed – were finalised at each component’s feasibility stage. LLA views were sought, where possible, to ensure that the final evaluation approaches are proportionate and deliverable.

The distinct roles and responsibilities of Northern Ireland partners are reflected in the UKSPF’s Northern Ireland delivery model; MHCLG is managing delivery in collaboration with a range of local partners (including local authorities), which will deliver at a range of spatial scales. Unless explicitly stated in this strategy, this will not affect the scope or aims of the UKSPF evaluation in Northern Ireland. Key partners and grant recipients are encouraged to take part in certain aspects of the evaluation. This could be helping to identify beneficiaries and providing feedback and views on the delivery and impacts of the UKSPF in Northern Ireland to support, for example, the Northern Ireland process evaluation or intervention level evaluation relevant to Northern Ireland.

The ‘evaluation’ section of the UKSPF additional information has been updated with additional guidance for local areas covering each component, to complement this strategy.

Monitoring and reporting

Routine reporting data is required by MHCLG for monitoring purposes. Details of reporting requirements can be found in the UKSPF additional information. To minimise additional burdens, MHCLG will draw on this same reporting data to support multiple evaluation components, including the intervention, place, and programme level impact assessments.

To capture local intelligence on how the UKSPF is being implemented and achieving its desired results, LLAs in England, Scotland, and Wales are encouraged, but not required, to conduct light-touch reflective “lessons learned” exercises (rather than full local evaluations). These exercises will ensure LLAs are deploying proportionate capacity and capability towards building local intelligence. Proposed methods could include measuring outputs, contribution analysis, developing case studies, or conducting surveys and interviews with stakeholders and beneficiaries. As set out in the additional information, MHCLG asks to be informed through routine reporting when this is taking place.

Evaluation components

Intervention level evaluation

LLAs have the flexibility to invest across a range of activities that represent the right solutions for their areas. The UKSPF 2025-26 technical note sets out the simplified UKSPF themes for 2025-26, aligned with the Government’s Missions. The five themes and twelve new sub-themes – as shown in Figure 1 – map to the previous interventions, as set out in the previous interventions list.

As part of their original investment plans, LLAs were required to identify the outputs and outcomes they wished to target and the intervention types they wished to prioritise. LLAs now have more flexibility to fund any combination of themes, and projects can cut across themes as appropriate to the specific circumstances of the place. Projects are delivered by LLAs themselves or by a local organisation or partner (e.g., a charity, local community group, or contractor).

MHCLG published an investment plan developed with local partners to set out the intervention types supported by the UKSPF in Northern Ireland for 2022-25, and has been guided by partners in selecting themes to invest in for 2025-26.

Aims

The aim of the intervention level evaluation is to understand what intervention themes work, for whom, and in what conditions, for the three investment priorities (people and skills, supporting local businesses, and communities and place). 

The intervention level component of the UKSPF evaluation combines impact, process, and value for money evaluation in an integrative approach, aiming to understand the extent to which individual UKSPF funded projects have affected community pride and opportunities at the local level. This will in turn show MHCLG and places what works and what doesn’t within the local growth policy space, and support places to efficiently target their local growth spend over the longer-term. 

This component of the evaluation uses a mix of theory, quasi-experimental, survey, and interview-based methods to:  

  • build an evidence base around the most effective UKSPF intervention themes across different types of place

  • understand how to design and deliver these projects in a way that achieves maximum impact and value for money, building an evidence base that places and UKG will be able to draw on when delivering local growth and communities funding in the future

Structure

For the intervention level evaluation, a feasibility study was undertaken to inform the design of the main phase.

In the feasibility study, priorities for evaluation were identified from the published UKSPF intervention types: those where the wider evidence base is weakest, where expenditure is highest, and those most suitable for robust causal impact evaluation. MHCLG and evaluation contractors worked with LLAs and local partners in Northern Ireland to develop theories of change and evaluation methods for each priority intervention theme.

Through this exercise, nine groups of similar projects were identified – see Figure 4. These ‘study groups’ form the focus of the intervention level evaluation. Around three projects in each study group have been chosen, with those selected sharing similar core characteristics in terms of their aims or objectives, outputs (tangible deliverables from the investment), intended beneficiaries, and the mechanisms through which they are intended to bring about change. 

Conducting the evaluation at the study group level provides the opportunity to pool evidence from projects within the study group. This increases the scale and robustness of evidence generated.

Further information on the process of identifying study groups can be found in the UKSPF intervention level feasibility report (PDF, 2023KB).

Figure 4. Intervention level study groups

This image shows how the 9 evaluation study groups relate to the three UKSPF investment priorities: people and skills, supporting local businesses, and communities and place.

  • Study Group 1: Helping improve the employability of economically inactive people (People and Skills)
  • Study Group 2/3: Helping improve the employability of economically inactive young people (People and Skills)
  • Study Group 4: Involving local businesses in helping improve the employability of economically inactive people (People and Skills)
  • Study Group 5: Supporting the digital development of local businesses (Business Support)
  • Study Group 6: Providing grants to local businesses (Business Support)
  • Study Group 7: Helping businesses decarbonise through audits and grants (Business Support)
  • Study Group 8: Major refurbishments of community buildings (Communities and Place)
  • Study Group 9: Large investments in sports pavilions or pitches (Communities and Place)
  • Study Group 10: Investments in new or improved playground equipment (Communities and Place)

Following the publication of the feasibility report (PDF, 2023KB), an opportunity to deliver more robust and innovative analysis was identified by pooling projects across study group 1 (helping improve the employability of economically inactive people) and study group 2/3 (helping improve the employability of economically inactive young people) and focusing on the concept of ‘distance travelled towards employment’. This means that participants’ journeys towards employment can be tracked at an earlier stage, delivering learning on the projects’ effectiveness without having to wait years to observe employment outcomes. More information on the distance travelled methodology can be found in the early update report.

The main phase involves the delivery of a 3-pronged evaluation (impact, process, and value for money) of the nine study groups identified at the feasibility stage. MHCLG will work closely with participating LLAs to understand details of the projects grouped within each and to access and collect information and data.

Methodology and data collection

The feasibility study identified the most suitable methodologies for producing robust learning in a proportionate way.

The precise data sources used to support the evaluation will vary by study group. As well as contractor-led primary data collection, examples of data sources MHCLG expect to draw on include, but are not limited to:

  • monitoring data collected through wider UKSPF reporting procedures (including information on the intervention themes each place is targeting, quantum of spend toward each project, and progress toward agreed outputs and outcomes)
  • survey data measuring community pride and opportunities, alongside associated national level annual surveys such as DCMS’ Community Life Survey
  • administrative data held centrally, such as socioeconomic indicators and travel, labour market and crime statistics from the Office for National Statistics (ONS)
  • data identifying individuals and businesses who are taking part in or would otherwise benefit from UKSPF interventions[footnote 2]. For a people and skills project - like a training course - this could include the details of those attending. For an area-based or capital project - like a new park - this could include the details of those in the park’s catchment area who might be expected to use it.

Process evaluation

The process evaluation seeks to produce learnings about the design, planning, implementation, management, and monitoring of the projects. It will triangulate evidence from documents (such as UKSPF guidance and project documents); in-depth interviews with a range of stakeholders, including LLAs, delivery partners, local community representatives, and local businesses; focus groups with stakeholders who deliver and receive the projects; and site observations which provide an understanding of projects on the ground.

The early update report covers early process insights specific to each UKSPF investment priority, based on ten interviews with LLAs and delivery providers, conducted between February and March 2024.

Impact evaluation

The impact evaluation approach set out below focuses on impacts to 2025. However, given the likely longer-term nature of the impacts of many UKSPF funded projects, the feasibility of extending the evaluation beyond 2025 is being considered.

An integrative evaluation approach is being used for the impact evaluation of each study group – summarised in Figure 5. This maximises learning by bringing together several evaluation methods - including several data sources and analytical approaches to enhance rigour - with the aim of generating robust evidence on what works, for whom, how, and under what conditions.

Figure 5: Intervention level integrative evaluation approach

A flow chart showing how different data sources used in the evaluation will support different analytical techniques, which will come together to form the integrative evaluation approach.

The data sources include:

  • monitoring data: from project delivery partners
  • qualitative data: from focus groups and interviews
  • bespoke survey data: from local residents or programme participants
  • secondary data: existing surveys and administrative data

The analytical methods that these data sources support include:

  • descriptive analysis
  • before-versus-after analysis
  • quasi-experimental analysis (where possible)
  • theory-based contribution analysis

The methods include:

  • descriptive analysis of the projects (within each study group) using data from bespoke surveys, project monitoring data, and other secondary data. This will be used to understand contextual factors, which are important in attributing impacts to the evaluated projects and exploring how impacts might vary across contexts
    • descriptive assessment of the extent to which the strategic objectives of the projects (within each study group) have been met. This is important to understand for both the impact and value for money evaluation.
  • before versus after analysis to indicate what has changed over time, using data that is being collected in bespoke surveys and local monitoring data. This method does not provide evidence on what is attributable to the projects, but it highlights what has changed, where, and for whom since their inception
  • quasi-experimental difference-in-differences analysis will bolster rigour by generating evidence on what impacts can be attributed to the projects within each study group. This will be used for the communities and place study groups, using synthetic control groups derived from Community Life Survey data - see ‘Evaluation tools and data sources’ section. This analysis will focus on comparing outcomes at a range of specified distances from the project site. The merging of people and skills study groups - as described above - has enabled within-group analyses, to understand which combinations of skills and employability support work best for participants with different characteristics
  • theory-based contribution analysis will triangulate evidence, from the process evaluation, bespoke fieldwork, and above methods, to test the theory of change. This will be used to articulate an evidence-based narrative of the extent to which it is reasonable to claim that the intervention themes, represented by each study group, have contributed to the observed outcomes. It will explore how this varies across cohorts of participants or contextual factors, as well as potential reasons for variations
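
As a purely illustrative sketch (the figures and function names below are hypothetical, not UKSPF data), the core difference-in-differences comparison underpinning the quasi-experimental analysis can be expressed as:

```python
# Minimal difference-in-differences sketch (hypothetical data, not UKSPF figures).
# Compares the change over time in a treated area against the change in a
# comparison (e.g. synthetic control) area, so that background trends common
# to both areas are netted out of the estimated effect.

def did_estimate(treated_before, treated_after, control_before, control_after):
    """Return the difference-in-differences estimate of the treatment effect."""
    treated_change = treated_after - treated_before
    control_change = control_after - control_before
    return treated_change - control_change

# Illustrative values only: mean pride-in-place scores (0-10 scale).
effect = did_estimate(treated_before=6.1, treated_after=6.8,
                      control_before=6.0, control_after=6.2)
print(round(effect, 2))  # prints 0.5
```

The treated area improved by 0.7 points, but 0.2 of that would have happened anyway (as seen in the comparison area), leaving an estimated attributable effect of 0.5.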

Value for money

The value for money evaluation will produce an assessment of the extent to which, and under what conditions, the projects represented by each study group deliver value for money. The approach for doing so, as set out in The Green Book, includes:

(i) providing a qualitative assessment of the extent to which the projects have met the local strategic objectives they were designed to achieve
(ii) using both scenario-based analysis and break-even analysis to determine the extent to which social benefits could be expected to exceed the costs for these types of projects over their lifetimes
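
By way of illustration, the break-even element of this approach amounts to a simple discounted cash-flow calculation: finding the level of annual social benefit at which a project's discounted benefits just cover its costs. The figures and function names below are hypothetical, not UKSPF data; the 3.5% rate is the Green Book's standard social discount rate.

```python
# Break-even analysis sketch (hypothetical figures, not UKSPF data).
# Finds the annual social benefit a project would need to generate for its
# discounted benefits to cover its costs over its lifetime, using the
# Green Book central discount rate of 3.5%.

def present_value(annual_amount, years, discount_rate=0.035):
    """Discounted sum of a constant annual amount received over `years` years."""
    return sum(annual_amount / (1 + discount_rate) ** t for t in range(1, years + 1))

def break_even_annual_benefit(total_cost, years, discount_rate=0.035):
    """Annual benefit at which discounted benefits exactly equal total cost."""
    pv_of_one_pound_per_year = present_value(1.0, years, discount_rate)
    return total_cost / pv_of_one_pound_per_year

# Illustrative: a £500,000 project appraised over 10 years.
required = break_even_annual_benefit(500_000, years=10)
print(f"£{required:,.0f} per year")
```

The evaluator can then judge whether benefits of that scale are plausible for the project type, even where benefits cannot be fully monetised.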

The scope of the value for money evaluation will be determined by numerous factors, such as the quality of project level information, monitoring data, and survey response rates. However, the evaluation provisionally seeks to answer the following 5 key questions: 

  1. to what extent were local strategic objectives met by the projects within each study group?

  2. to what extent were the financial costs in line with projections?

  3. to what extent are the social benefits (monetised and non-monetised) likely to exceed the costs, over the life of the projects within each study group?

  4. what are the conditions under which value for money is likely to be relatively higher or lower, and for whom?

  5. to what extent is there evidence that any observed benefits are displaced from another group or locality to the target group or locality?

Potential evaluation beyond 2025

The current evaluation looks at the short-to-medium term impact of projects. The feasibility of extending the evaluation beyond 2025, to enable longer-term impacts to be captured, is currently being explored. The feasibility report (PDF, 2023KB) suggested methods for extending the evaluation and the evidence that could be generated, for an illustrative 3-year period.

Outputs

Key outputs will include, for each of the 9 study groups:

  • process:  a detailed summary report drawing out key specific findings, alongside an overarching process evaluation report setting out generalisable findings and recommendations
  • impact: a detailed summary report drawing out key specific findings, alongside an overarching impact evaluation report setting out generalisable findings and recommendations
  • value for money: a high-level summary of the value for money assessment of the study groups

Timeline 

Milestone | Timeframe
Feasibility phase | Spring 2023(#)
Feasibility phase report published | February 2024(#)
Main phase | Summer 2023 - Summer 2025
Your Community, Your Say survey 2024 delivered | February - March 2024(#)
Early update report published | October 2024(#)
Your Community, Your Say survey 2025 delivered | February – April 2025
Final impact, process, and value for money evaluation reports published | Winter 2025/6

(#) This milestone has been completed (as of April 2025)

Randomised controlled trials

The randomised controlled trials (RCT) element of the UKSPF evaluation sought to assess the impact of the UKSPF at the level of individual projects in specific places. The RCTs were designed to complement the wider intervention level impact evaluation by:

  • operating at the more granular level of individual projects in specific LLAs, rather than looking at broader study groups of similar projects across multiple LLAs
  • using a fully experimental approach to measuring impact, with random allocation of treatment and control groups. This is not possible for the wider intervention level evaluation as most UKSPF projects are inherently unsuitable for a random allocation-type approach

Between March and May 2023, MHCLG invited LLAs to volunteer their UKSPF projects through an expression of interest (EOI) process to serve as the subject of an RCT evaluation delivered by MHCLG.

Despite a high level of initial interest from LLAs, only one potentially suitable UKSPF project was identified. However, following more detailed feasibility work, this project was ultimately not considered viable for a full RCT and was not taken forward.

A lessons learnt paper (PDF, 637KB) was published for those considering RCTs in local growth.

Place level case studies

The UKSPF takes place in the context of a complex local government landscape across multiple tiers of institution – with allocations to Mayoral Strategic Authorities, unitary authorities, county councils, and lower-tier authorities – each of which has its own set of powers and responsibilities with regard to the kind of local growth projects that the UKSPF is designed to deliver. Even within a particular type of institution, there is substantial variance between places in terms of size, capability, demography, and socio-economic circumstances.

The UKSPF is designed to bring benefit to all types of places across the UK – not only those most capable or well-resourced to take advantage of the funding it offers – and therefore it is crucial to understand how and to what extent its outcomes are moderated by the characteristics of different types of LLA. 

Aims

The place level case studies element of the evaluation is focusing on up to 34 LLAs[footnote 3] with the aim of building a detailed understanding of what works and what doesn’t in different places. The case studies will focus on the local design, delivery, and impacts of projects, generating robust evidence on how combinations of UKSPF funded projects within a locality work together to support local priorities across the three key investment priorities: communities and place, supporting local business, and people and skills. This builds on the UKSPF intervention level evaluation by evaluating the delivery process and impact of projects within specific place contexts. 

Local growth programmes often have overlapping aims and objectives within places. Therefore, where relevant, the case studies will also investigate the interaction between the UKSPF and other local growth funds, such as the Towns Fund.  

Methodology and data collection

Each case study will include process, impact, and value for money evaluations[footnote 4]. The impact evaluation will take a theory-based approach - primarily using contribution analysis - supported by mixed-methods research including surveys, interviews, focus groups, and quasi-experimental approaches, where feasible and appropriate.

Process and impact research questions within each evaluation are based on a common framework, with additional bespoke questions tailored to each LLA. MHCLG-commissioned Your Community, Your Say (YCYS) surveys will be undertaken in up to 20 places to support the evaluation of communities and place projects.

The Multiply element of the UKSPF programme in Scotland, Wales, and Northern Ireland will be investigated through short process evaluations in each of these case studies. Evidence gathered as part of these evaluations will support the Multiply programme level evaluation, led by the Department for Education (DfE).

Furthermore, a summary report analysing evidence from across the place level case studies will draw out wider narratives and conclusions about what worked and what didn’t across implementation of the UKSPF as a whole, and between different types of place.  

The place level evaluation methodology report provides further detail on the main methods expected to be used.

Structure

Case studies will be designed and developed across 2 core phases: feasibility and main.

The feasibility phase was completed in March 2024 and involved:

  • identification of a longlist of places using desk-based research and a set of secondary data
  • creation of a shortlist through collaboration with selected places and MHCLG

The place level methodology report presents the feasibility stage in more detail, as well as the evaluation themes and methodologies to be explored and used across the case studies. Evaluation plans for the case studies are available to read on the UKSPF place level evaluation page.

Following the feasibility phase, the main phase is being conducted between April 2024 and early 2026 and covers delivery of the case studies themselves. This is broken down into: 

  • evaluation design and planning: a review of programme documents and monitoring data, interviews with UKSPF leads and a theory of change workshop to create an evaluation plan within each case study
  • interim fieldwork and reporting: a first wave of fieldwork focusing on the process evaluation, followed by the delivery of an interim report and revised theory of change. YCYS fieldwork will be undertaken during this stage, as well as Multiply process fieldwork
  • mid-stage contribution analysis workshops to assess the draft contribution claims. These will follow the process evaluation findings and offer an opportunity to reflect on the place level theories of change and hypothesised causal chains, due to be tested in the final stage of the evaluation
  • final fieldwork and reporting: a second wave of fieldwork focusing on the impact evaluation and development of a final theory of change. This stage includes analysis of YCYS surveys for the 20 places

Figure 6 provides examples of the research questions the case studies will look to address across the process and impact evaluations. The research questions are illustrative, not exhaustive, and further questions have been developed which are sensitive to each case study LLA’s particular local circumstances, priorities, challenges, and projects.

Some UKSPF funded projects will be delivered to individual people or businesses (e.g., skills courses or business support schemes). Other projects will be ‘area-based’, delivered to a specific area within a place (e.g., public realm improvements or crime prevention measures). Case studies will include a mix of both.

Figure 6: Case study example research questions for UKSPF

For the process evaluation, research questions include:

  • Design:
    • What projects did UKSPF deliver and why/how were these selected?
    • What type of needs did the projects address?
    • Has funding been accessed from other sources (public funds or local match funding)?
  • Implementation:
    • How did local authorities implement UKSPF projects?
    • How successful was engagement with potential suppliers and delivery of any procurement processes?
    • What approach was taken to ensuring the right types of participants would benefit from the projects?
    • How did projects interact with other local regeneration projects (e.g., Towns Fund)?
  • Delivery:
    • How were UKSPF projects delivered in practice, including engagement with target populations, the skills, expertise, and quality of delivery?
    • What worked well and less well in delivery?
  • Monitoring:
    • How did local programmes collect data on expenditure, outputs, and outcomes?
    • How was this resourced?
    • How was this data used and shared?

For the impact evaluation, questions include:

  • What types of outcomes and impact have the UKSPF funded projects had?
  • How have these been achieved?
  • What was the impact of delivering UKSPF projects in conjunction with other funding streams and local strategies?
  • What has driven local success?

This will be combined with outputs from local evaluations and the UKSPF intervention level evaluation.

For the economic evaluation, questions include:

  • Has UKSPF offered value for money, according to the NAO 4 E’s approach?
  • How effectively has UKSPF funding been used in combination with other public funding to achieve outcomes?

Within the evaluations for places in Scotland, Wales, and Northern Ireland, MHCLG will also conduct short process evaluations of the Multiply programme. These will be framed by the research questions in Figure 7. 

Figure 7: Case study example research questions for Multiply in Scotland, Wales, and Northern Ireland

Research questions for the Multiply process evaluation in Scotland, Wales, and Northern Ireland include:

  • How were spending decisions made?
  • Which specific cohorts did you choose to target?
  • What specific barriers have you faced in delivering Multiply projects and how have you responded to them?
  • What level of confidence would you have in delivering numeracy support again?

Outputs

Detailed interim and final reports will be published for each case study, combining analysis of local socio-economic conditions with both qualitative and quantitative evidence to assess: 

  • process: design, implementation, and delivery of places’ investment plans; what worked well, what didn’t, and the key enablers/blockers of effective delivery
  • impact: the extent to which the place level theories of change matched up with the ultimate outcomes of those projects; whether hypothesised contribution claims were accurate
  • value for money: an economic evaluation of UKSPF delivery in each place, following the NAO’s 4 E’s approach of economy, efficiency, effectiveness, and equity

Interim reports will focus on the process evaluation and early impact findings. Final reports will focus on the impact evaluation, including quasi-experimental evidence where feasible.

A summary report will be published that will bring together evidence from across all case studies. This will draw out overarching themes, patterns, and narratives across different places, with the aim of identifying common good practice and lessons learned that may apply to other places in the UK.

Qualitative Comparative Analysis will be used to analyse the factors which are more likely to have supported the achievement of the programmes’ outcomes of interest, including the success of delivery in specific areas (e.g., identifying factors that have worked well in rural areas compared to urban). 
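
A minimal sketch of the crisp-set style of comparative analysis can illustrate the idea; the condition names and case codings below are invented for illustration only, not drawn from the actual case studies.

```python
# Crisp-set QCA-style sketch (hypothetical case codings, not real case study
# data). Each case study place is coded 1/0 on candidate conditions (e.g.
# rural location, high delivery capacity) and on an outcome of interest; the
# truth table then shows which combinations of conditions co-occur with a
# positive outcome, as a precursor to identifying necessary/sufficient
# condition sets.

from collections import defaultdict

cases = [
    {"rural": 1, "high_capacity": 1, "outcome": 1},
    {"rural": 1, "high_capacity": 0, "outcome": 0},
    {"rural": 0, "high_capacity": 1, "outcome": 1},
    {"rural": 0, "high_capacity": 0, "outcome": 0},
    {"rural": 1, "high_capacity": 1, "outcome": 1},
]

# combination of conditions -> [number of cases, number with positive outcome]
truth_table = defaultdict(lambda: [0, 0])
for c in cases:
    key = (c["rural"], c["high_capacity"])
    truth_table[key][0] += 1
    truth_table[key][1] += c["outcome"]

for combo, (n, positive) in sorted(truth_table.items()):
    print(f"(rural, high_capacity)={combo}: {positive}/{n} positive")
```

In this toy data, high delivery capacity tracks the outcome regardless of rurality, which is the kind of cross-case pattern the real analysis would probe further.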

Timeline

Milestone | Timeframe
Feasibility phase: longlisting and shortlisting of candidate LLAs | March 2024(#)
Feasibility phase: piloting of place-based approach with initial tranche of places | April 2024(#)
Invitation for tranche 2 places to participate | July 2024(#)
Evaluation planning & theory of change workshops phase | Summer – Autumn 2024(#)
Interim fieldwork & reporting | Late 2024 – Spring 2025(#)
Your Community, Your Say survey 2025 delivered | January – March 2025(#)
Mid-stage contribution analysis workshops | June 2025
Final fieldwork & reporting | Summer 2025 – Early 2026
Summary report | February 2026

(#) This milestone has been completed (as of April 2025)

Programme level evaluation

The UKSPF is a national programme, and its evaluation will include a national, programme-level component, in addition to the intervention and place level deliverables covered in earlier sections of this strategy. 

‘Programme level’ in this context refers to evaluation of the impact of MHCLG-delivered parts of UKSPF only. It does not include Multiply and the Rural England Prosperity Fund, which are covered in their own sections of this strategy.

MHCLG will also seek to carry out a programme level process evaluation. Rather than being a standalone component, the programme level process evaluation will bring together the individual place level case studies to understand how effectively the UKSPF has operated across different types of place. The rest of this chapter focuses solely on the programme level impact evaluation.

As noted in the UKSPF additional information, MHCLG has engaged with the What Works Centre for Local Economic Growth (WWG) to address some of the challenges of assessing the overall impacts that are inherent to a programme like the UKSPF. MHCLG will continue to draw on expert advice from WWG and elsewhere - including through the UKSPF Technical Advisory Group - as development of the programme level evaluation continues.

Aims

The programme level evaluation aims to understand the extent to which the UKSPF programme as a whole - rather than specific intervention themes or specific places - has impacted the core aims of building community pride and opportunities across the UK, and by extension whether the programme level theory of change as set out in Figure 2 has worked in practice.

This will: 

  • assist MHCLG and UKG more widely in understanding what works for local growth and communities funding in terms of delivery models, local autonomy, and investment priorities

  • complement the intervention level evaluation in helping to build the evidence base on the broad categories of projects that deliver the highest levels of benefit for every £ spent

  • hold MHCLG (rather than LLAs) accountable for the UKSPF making a positive contribution to people’s community pride and opportunities across the UK in support of the government’s Missions

The programme evaluation will employ a theory-based evaluation approach using contribution analysis. This will test the UKSPF’s programme level theory of change and assess whether (rather than to what extent) a link between UKSPF funding and its intended outcomes can be reasonably inferred – and, if not, to identify where the theory of change breaks down. By focusing on establishing the existence of causal links, rather than quantifying them, contribution analysis will allow us to draw on a wider range of evidence and data from other components of the evaluation to construct a high-level, non-technical narrative to explain the overall impacts (or lack thereof) of the UKSPF.

Structure

The wider design characteristics of the UKSPF present multiple obstacles to effective programme level impact evaluation. In particular:

  • all parts of the UK receive a UKSPF allocation – therefore, no natural control groups exist
  • funding has been allocated using a mixed methodology based on a combination of past EU structural funding, population, and a need-based index. Some places have also received a REPF allocation based on a different set of criteria
  • funding has been allocated across multiple tiers of LLAs, including Mayoral Strategic Authorities, unitary authorities, and lower-tier authorities in different parts of the UK, and at regional scale in Northern Ireland

Any impact evaluation requires an estimate of what would have happened if the project had not taken place – a counterfactual – for comparison purposes. The characteristics of the UKSPF, as set out above, mean that use of RCT or even quasi-experimental (QED) evaluation methods is not feasible at the programme level. Table 2 below sets out more detail on why this is the case for each approach considered.

Table 2: Alternative programme evaluation methods considered

Methodology | Description | Obstacles
RCT | Using an RCT to evaluate the overall impact of the UKSPF would require the random allocation of some places to receive funding, and others not, to allow comparison of ‘with vs without’ impacts | Not feasible under any circumstances; all places have been allocated a share of the UKSPF, as it replaces existing funding from EU programmes
QED: treatment vs control | Comparison of places which receive funding with ‘similar’ places which do not – similar to an RCT-based approach but without the random allocation of treatment and control groups. This approach will be used in the UKSPF intervention level evaluation | Not feasible for the programme evaluation, for the same reasons as RCTs; all places have been allocated (and must receive) funding
QED: relative intensity | Comparison of places according to the size of their UKSPF allocation - assessing whether places which received more funding experienced concomitantly larger positive impacts | Requires that the measures influencing places’ allocations (population size, index rank, and prior EU funding) are independent of the outcomes being assessed (community pride and opportunities). Without a concrete, objective measure of these outcomes that is demonstrably independent in this way, this approach is unlikely to yield robust insight
QED: staggered delivery | Comparison of places delivering a project against those that have yet to do so, using the latter group as a counterfactual for the former | UKSPF’s funding horizon and in-year spend requirements mean that places will largely draw on funding concurrently across the delivery timeframe. In addition, any evaluation-exploitable variations in delivery timing between places would have to be independent of the outcomes being assessed, as for the relative intensity approach

In addition to the UKSPF-specific obstacles identified in Table 2, there exist challenges inherent to evaluating any local growth programme rooted in the difficulty of isolating the impact of the UKSPF from that of other funding sources that may contribute to delivering similar outcomes at the same time. These include (non-exhaustively):

  • other centrally delivered local growth programmes, like the Towns Fund, Levelling Up Fund (LUF), or devolution deals
  • baseline central government expenditure, such as funding for adult social care, crime prevention, and education
  • LLAs’ own locally funded initiatives, which may vary in their scope and overlap with UKSPF priorities across different places
  • projects funded by the devolved governments in Scotland, Wales, and Northern Ireland
  • residual funding from the EU programmes replaced by the UKSPF – which may continue to support projects up to December 2023 – as well as areas successful in receiving pilot funding from the UKSPF’s predecessor, the UK Community Renewal Fund
  • instances where UKSPF money only partially supports UKSPF projects: for example, where a project is a continuation from the LUF or Towns Fund, or if a project is receiving significant match funding from private sector partners or from the LLA internally

Finally, the variance in baseline characteristics and capabilities across different LLAs – that is, the extent of their local challenges and ability to deliver the UKSPF effectively in addressing them - must be accounted for. For example, a capital-focused project to build a new park in a place with very little green space would be expected to have a greater relative impact than the same project in an already green place, for reasons unrelated to the intrinsic value-add of the activity.

Similarly, a complex multi-modal project aimed at decarbonising local transport, housing, and civic buildings could be far more impactful when delivered by a more capable LLA with the necessary expertise, experience, and resources than in a place without these attributes. Characteristics such as rural-urban classification (which affects a place’s eligibility to receive the REPF) may also affect the UKSPF’s impacts across different types of place.

Methodology and data collection

Considering the barriers to alternative options as set out in Table 2, MHCLG has concluded that a theory-based approach is the only viable option for impact evaluation of UKSPF at the programme level. Specifically, contribution analysis is the most appropriate choice of method. This approach is compatible with The Magenta Book and will attempt to interrogate if, how, and why the UKSPF made an impact rather than trying to estimate the size of that impact.

Contribution analysis can be used to better understand how effective UKSPF funding has been across different types of place in a more holistic and less purely quantitative manner. It can also operate in the context of complex programmes and employ a wide range of evidence. However, it lacks the robust inferential power of experimental and quasi-experimental approaches - contribution analysis may enable us to reasonably conclude that UKSPF has had an impact on community pride and opportunities, but it will not in isolation be able to quantify those impacts or state them with certainty.

In addition, some local authorities will have used UKSPF funds to continue projects funded by EU structural funds or other funding sources – albeit with a different set of investment priorities and delegated delivery model. This makes it harder to measure clear ‘before and after’ effects, even when using a non-experimental, before vs after type approach.

Outputs

The main output of the programme level impact evaluation will be a detailed report setting out:

  • the methodology and underlying data behind the contribution analysis, alongside any evidence that UKSPF projects – rather than other factors – were critical in achieving impacts at both the programme level and with regards to themes and sub-themes

  • accompanying analysis of how different aspects of the UKSPF design (investment priorities, delivery models, local structures) have supported or undermined said impacts, and how this may have varied across different types of place

Timeline

Milestone | Timeframe
Development of three theories of change and testing with the Technical Advisory Group and other stakeholders | Autumn 2025
Data collection begins | End Autumn – Winter 2025
Further development and iteration of three theories of change | Autumn – end 2025
Initial results ready for publication | Summer 2026

Multiply programme evaluation

Between 2022 and 2025, MHCLG administered Multiply in Scotland, Wales, and Northern Ireland. In England, the Department for Education (DfE) was responsible for administering Multiply to places, whereby Mayoral Strategic Authorities and other upper-tier LAs could draw down their allocated amounts via DfE-approved Multiply-specific investment plans that are distinct from places’ core UKSPF investment plans. The Multiply prospectus sets out further detail on the aims and funding model for Multiply in England.

Additional Multiply funding during 2022-25 was also allocated to DfE to run a ‘what works’ programme of evaluation activity, comprising a systematic review of the current evidence base on numeracy projects, a programme level process, impact, and value for money evaluation, and a set of randomised controlled trials to build the evidence base on what works to improve adult numeracy. This section largely focuses on the programme level evaluation.

Some aspects of Multiply’s evaluation as outlined in this section will be UK-wide, with additional activity in England. This is in line with the devolved nature of adult skills policy and the fact that Multiply’s wider delivery in Scotland, Wales, and Northern Ireland is managed by MHCLG and delivered locally through a delegated delivery model. Specifically:

  • a complete process evaluation and impact evaluation will be carried out in England only, with involvement of English LLAs only. This is led by DfE
  • UKSPF place level case studies in Scotland, Wales, and Northern Ireland will include process evaluation of Multiply and will be supported by involvement from LLAs in Scotland and Wales and local partners in Northern Ireland. This is led by MHCLG – see place level case studies section

Aims

The core objectives of Multiply’s programme level process, impact, and value for money evaluation are to:

  • understand how effectively Multiply is being implemented in places, in order to improve delivery of adult numeracy projects in the latter years of the programme
  • assess the impact and value for money of the Multiply programme, including whether it has achieved its aims and delivered against its success metrics using a mix of quasi-experimental, survey-based, and qualitative methods
  • summarise lessons learnt and identify promising practice, especially at the local level, to feed into the evidence base around, and inform delivery of, future adult skills programmes

Structure

The Multiply programme evaluation will be split into three phases – feasibility, main, and final reporting and dissemination, as summarised in Figure 8:

Figure 8: Multiply evaluation structure

This image sets out visually the evaluation structure of Multiply as described verbally in the rest of this section, split into the feasibility, main, and final reporting phases.

Within the main phase, the diagram expands on the activities under the headings of impact, process, and value for money assessment. Below these, the diagram lists the cross-cutting activities that will impact all three, including secondary data analysis, quantitative surveys, qualitative research, and use of interim reporting from places. Further details of how each of these will be used in the evaluation can be found in the remainder of this section.

The feasibility phase involved the development and iteration of the overarching evaluation questions and the individual research questions nested within them. Through this, DfE identified which questions were most important to answer and agreed on data collection methods.

In England, this phase also involved a preliminary descriptive analysis of data sources including Individual Learner Records (ILR), the National Pupil Dataset (NPD), large-scale representative panel data, and places’ local Multiply investment plans, to understand:

  • which groups are more or less likely to engage in Multiply
  • equivalent changes in total learner numbers in relevant courses over time
  • how best to measure impacts on numeracy and earnings to establish counterfactuals and outcome measures

The feasibility phase drew on evidence gathered in support of wider work across government around evaluation of adult skills programmes and devolution of the Adult Education Budget (AEB) and, from Academic Year 2024/25, the Adult Skills Fund (ASF). Likewise, the Multiply evaluation will help feed into this work.

The evaluation activities phase will comprise the bulk of the fieldwork, split into three broad strands: impact evaluation, process evaluation, and value for money assessment, with specific activities under each strand outlined in Figure 8.

The final reporting and dissemination phase will build on and tie together all research findings across the various workstreams in a series of detailed reports that will be shared with LLAs and published.

Methodology and data collection

The methods and data sources comprise:

Counterfactual development (England only): to compare the progress of Multiply beneficiaries with a comparable set of learners not involved in the programme, a counterfactual group will be constructed from statistically matched non-Multiply learners, drawn from DfE data on all AEB-funded learners in the ILR and NPD datasets. To enable this, DfE have undertaken a survey with a random sample of non-Multiply AEB-funded learners, matched on demographics and numeracy level, to inform the counterfactual. To further account for selection bias, DfE will also examine an additional counterfactual group, comprising adults with low numeracy skills who are not engaged in basic skills learning (i.e. non-AEB learners), to improve the robustness of any comparisons. Development of Multiply-specific counterfactuals will take account of parallel counterfactual work for the core UKSPF intervention level evaluation.
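As an illustration only, statistical matching of this kind can be sketched as follows. The learner records, field names, and nearest-neighbour matching rule below are hypothetical and deliberately simplified; they are not DfE's actual methodology, which draws on the full ILR and NPD datasets and more sophisticated matching techniques.

```python
# Illustrative sketch: nearest-neighbour matching on observable
# characteristics, one simple way to build a matched comparison group.
# All field names and values here are invented for illustration.

def match_comparison_group(treated, pool, features):
    """For each treated learner, pick the closest untreated learner
    (Euclidean distance on standardised features), without replacement."""
    # Standardise each feature using the pooled sample.
    all_recs = treated + pool
    stats = {}
    for f in features:
        vals = [r[f] for r in all_recs]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        stats[f] = (mean, var ** 0.5 or 1.0)  # guard against zero spread

    def z(rec):
        return [(rec[f] - stats[f][0]) / stats[f][1] for f in features]

    available = list(pool)
    matches = []
    for t in treated:
        tz = z(t)
        best = min(available,
                   key=lambda c: sum((a - b) ** 2 for a, b in zip(tz, z(c))))
        matches.append((t, best))
        available.remove(best)  # match without replacement
    return matches

# Hypothetical Multiply learners and a pool of non-Multiply AEB learners.
learners = [{"id": "T1", "age": 34, "prior_score": 2},
            {"id": "T2", "age": 51, "prior_score": 1}]
pool = [{"id": "C1", "age": 33, "prior_score": 2},
        {"id": "C2", "age": 22, "prior_score": 4},
        {"id": "C3", "age": 50, "prior_score": 1}]
pairs = match_comparison_group(learners, pool, ["age", "prior_score"])
```

In practice, matching quality depends on which observable characteristics are available, which is why the additional non-AEB counterfactual group matters for addressing selection on unobservables.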

Quantitative surveys (England only): a separate set of surveys will be conducted among randomly selected samples of those involved in the Multiply programme to assess satisfaction with the programme, views on its effectiveness, its strengths, and what could be improved. The surveys will also be used to gather extra information on the backgrounds, experiences, and attitudes of learners in the programme, in order to help DfE understand more about who Multiply is reaching, how it is being delivered, and how views about Multiply differ amongst different groups. Surveys will include (non-exhaustively) learners, practitioners, providers, LLA staff, and local employers.

Qualitative research: will be used to explore in more detail themes and trends emerging from the quantitative survey work. This includes:

  • (England only) in-depth interviews with Multiply learners who dropped out of their courses, research providers undertaking the Multiply RCTs, and central UKG staff involved in developing and delivering the programme across DfE, MHCLG, and the Education and Skills Funding Agency (ESFA)
  • (England only) case studies within specific local authorities to understand in detail how investment plans were developed, how they have spent Multiply funding, how they have assessed value for money of this spend (especially non-classroom approaches) at the local level, and how they have avoided duplication with other local AEB spend. These will be distinct from the place level case studies carried out for the core UKSPF evaluation
  • (UK wide) collecting detailed examples of promising practice delivered as part of the Multiply programme from local area commissioners and education providers

The Multiply evaluation will also rely heavily on secondary analysis of data, matching and linking data collected through surveys and other primary research methods using common identifiers to develop as complete a picture of learners as possible. This may include:

  • in England, linking of NPD, ILR, and Longitudinal Education Outcomes (LEO) data to track progression into further learning or the labour market
  • quarterly and six-monthly reporting data, management information, and data from local investment plans
  • analysis of wider datasets collected to support the wider UKSPF impact evaluation, particularly where they relate to people and skills-type projects that may overlap with Multiply’s aims
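A minimal sketch of identifier-based linkage of this kind is shown below. The dataset names, fields, and learner references are invented for illustration and do not reflect the actual NPD, ILR, or LEO schemas.

```python
# Illustrative sketch: linking survey returns to administrative records on
# a shared identifier, keeping unmatched cases visible for follow-up.

def link_records(survey_rows, admin_rows, key):
    """Join survey rows to admin rows on a common identifier."""
    admin_by_key = {r[key]: r for r in admin_rows}
    linked, unmatched = [], []
    for row in survey_rows:
        match = admin_by_key.get(row[key])
        if match is None:
            unmatched.append(row)            # retain for non-response analysis
        else:
            linked.append({**match, **row})  # survey fields take precedence
    return linked, unmatched

# Hypothetical survey and administrative extracts.
survey = [{"learner_ref": "A1", "satisfaction": 4},
          {"learner_ref": "A9", "satisfaction": 2}]
admin = [{"learner_ref": "A1", "course": "numeracy-L1"}]
linked, unmatched = link_records(survey, admin, "learner_ref")
```

Keeping unmatched records, rather than silently dropping them, lets evaluators check whether linkage failures are concentrated in particular learner groups, which would otherwise bias the linked picture.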

Outputs

Key outputs will include:

  • a UK-wide systematic review of the evidence base around numeracy interventions, which will help provide context on the landscape in which Multiply will be delivered, highlight historic good practice, and inform the refinement of evaluation questions at the feasibility stage
  • four interim reports, with at least one per year of delivery, which will present any emerging and cross-cutting findings from evaluation activity up until that point. The findings from these reports have been shared with LLAs
  • standalone reports, grouped by similar research activities and covering the top line findings of each group of research activities
  • learner survey interim reports for local areas, where more than 100 learners per area have responded to the survey. These show the results for the local area alongside the England average and are updated monthly. The reports are for local area internal use only until the final programme evaluation report is published
  • a final written evaluation report providing comprehensive answers to all evaluation questions, organised by evaluation component (impact, process, and value for money). It will draw on the standalone reports to set out a narrative for the impacts, successes, and pitfalls of Multiply across the UK, alongside clear recommendations for future programme design and implementation
  • a practitioner facing report, which will be an accessible summary of key evaluation findings, lessons learned, and recommended best practice aimed at local level stakeholders involved in delivery of Multiply interventions
  • a searchable database of examples of promising practice identified by the sector from the Multiply programme

Timeline

Please note this timeline relates to the DfE-led Multiply programme evaluation. Timings for the MHCLG-led Scotland, Wales, and Northern Ireland case studies can be found in the place level case studies chapter.

Milestone Timeframe
Feasibility phase End 2022 - Spring 2023(#)
Fieldwork for learner surveys Jan 2022 - Spring 2025
Fieldwork for most other research activities Spring 2023 - Spring 2025
In-depth interviews with stakeholders (of which LLAs are included) End 2023 - Early 2024(#)/End 2024 - Early 2025(#)
Interim report 1 disseminated to LLAs Summer 2023(#)
Interim report 2 disseminated to LLAs Spring 2024(#)
Interim report 3 disseminated to LLAs Summer/Autumn 2024(#)
Interim report 4 disseminated to LLAs Spring 2025
Final reports published Autumn 2025

(#) This milestone has been completed (as of April 2025)

Rural England Prosperity Fund Evaluation

For the REPF, Defra will lead a process evaluation.

Aims

The REPF evaluation aims to:

  • examine the effectiveness of the administration and implementation of REPF at a department, local authority, and individual project level

Structure

The REPF evaluation focuses on understanding processes and impacts to build an evidence base on local growth projects in rural areas. In parallel to the REPF evaluation, Defra are undertaking additional strategic research to understand the particular challenges to growth that rural areas experience. This research is not the focus of this section.

Methodology and data collection

The evaluation is split into 2 phases: developing the theory of change and process evaluation.

Phase 1: developing the theory of change will involve:

  • a review of literature, documentation, and secondary data to develop the theory of change for the REPF

Phase 2: process evaluation will focus on understanding how the REPF is working and what has been delivered, using:

  • analysis of monitoring data: examining factors such as levels of expenditure and the delivery of outputs
  • survey of LLAs: an online survey of local authorities covering processes, how LLAs have used the REPF, views on the effectiveness of the REPF as an instrument for stimulating economic development, and whether projects would have occurred in the absence of the programme
  • depth research with LLAs: interviews with stakeholders from LLAs participating in the REPF. The interviews will cover a range of process evaluation questions, as well as the broader strategic policy questions about factors that may be holding back prosperity in rural areas
  • case studies: ten local authority case studies comprising:
    • a review of the local socio-economic context and the performance of the rural economy
    • site visits to local authorities
    • follow-up interviews with a sample of business and community groups awarded funding through the REPF

Outputs

The REPF evaluation will produce two outputs: an interim and a final evaluation report.

The interim report will set out the theory of change and lines of enquiry to be investigated in the final evaluation report.

The final evaluation report will consider the implementation of REPF, its theory of change, and the findings from the process evaluation of the programme. This report will also provide scoping and feasibility analysis on the recommended approach for a rigorous impact and economic evaluation of the REPF.

Timescale

Milestone Timeframe
Evaluation planning Winter 2024(#)
Phase 1: developing the theory of change Spring 2025
Phase 2: process evaluation Spring 2025
Interim process evaluation submitted to Defra Spring 2025
Ready for publication Summer 2025

(#)This milestone has been completed (as of April 2025)

Evaluation tools and data sources

UKSPF surveys package 

The UKSPF surveys package facilitates other components of the UKSPF evaluation by providing a new and direct way to understand the impact of UKSPF activity, focusing on specific projects and places.  

Survey sub-components 

Survey data is being collected through 4 channels: 

  • Community Life Survey (CLS): the CLS has run annually as an online survey since 2012, gathering data from around 10,000 respondents per year in England only, aggregated to regional (ITL1) level. It includes questions on satisfaction with place, community engagement, civic pride, and social cohesion and is already a key evidence source for policymakers across UKG. Delivery of the ‘core’ CLS is the responsibility of DCMS
  • CLS local level boost: to better serve UKSPF-specific evaluation needs, MHCLG and DCMS developed an additional component of the CLS – the ‘local level boost’. The local level boost uses the same methodology as the core CLS but boosts sample sizes to provide LLA level estimates. The question set has also been extended to aid UKSPF evaluation. This is essential for those parts of the evaluation – such as the intervention level impact evaluation and place level case studies – that require more granular data at LLA level. The first wave of the CLS boost, for 2023/24, is published, and second and third boosts will run in 2024/25 and 2025/26
  • Your Community, Your Say (YCYS) surveys: alongside expanding the scope of the CLS, MHCLG have in parallel run a series of UK-wide surveys that focus on - within those geographies participating in the intervention and place level components - building an understanding of local views on the UKSPF, the projects it is supporting, and its delivery in places. This will be achieved through two variants:
    • in case study areas: additional questions focusing on each area’s local context, people’s familiarity with the UKSPF as a whole alongside related local growth programmes (such as the LUF and Towns Fund), and baseline levels of wellbeing
    • in study group areas: additional questions focusing on people’s perception of the communities and place study groups in a particular geography rather than the UKSPF as a whole. For this variant, survey participants were selected based on them being identified as direct UKSPF beneficiaries through residing in the impact geography of an area-based project (e.g., living near to a planned new green space)
  • local survey tool for LLAs: a collection of guidance has been developed to enable LLAs and local stakeholders to run surveys and collect data on community perceptions themselves, to support local evaluations where planned, and to enable a comparison with other geographies. Use of the tool is optional; LLAs are not expected to use it to collect data for the overall UKSPF evaluation

The rest of this chapter will focus on the latter two bullets, for which MHCLG are responsible for delivering specifically in support of the UKSPF evaluation.

Figure 9 illustrates how the different surveys described above will contribute to our understanding of the UKSPF’s impact across a range of spatial scales.

Figure 9: Survey sub-components

This image breaks down the survey sub-components as described earlier in this section, grouping them visually on the basis of: 

  • whether they are delivered by DCMS (CLS and CLS local boost) or MHCLG (YCYS place and intervention specific surveys, local survey tool) 

  • the spatial scale they will operate at: regional and above (CLS), LLA level (CLS local boost, YCYS place surveys) or intervention level (YCYS intervention surveys) 

The diagram then shows that these sub-components will all feed into the intervention, place, and programme level components of the wider UKSPF evaluation.

Survey implementation

Work on the surveys was split into 2 phases: the development phase and the implementation phase.

The development phase involved: 

  • developing the core set of questions to be used in the surveys 

  • developing tools to be used for collection of survey data 

  • identifying broad sampling strategies for both the study group and case study-focused variants of the YCYS survey 

  • developing the local survey tool for LLAs, in line with the principles in the above 3 bullets 

The implementation phase involved: 

  • refining the survey design for each study group and place (including the inclusion of intervention and place level evaluation specific questions) 

  • refining the individual sampling strategies identified during the development phase 

  • subsequent delivery of the surveys themselves, carrying out (initially) two survey sweeps within each of the study groups and identified counterfactuals, and one sweep of a selection of case studies 

To allow for comparison between the different survey components set out in Figure 9, the base question set for the YCYS surveys built on, rather than replaced, that used for the CLS - meaning that all surveys will share the same core content. Further UKSPF-specific questions were added through the development phase relating to study groups and/or places as needed, attempting wherever possible to follow a similar tone and format to the core CLS. As part of question development and refinement for both the place and intervention level surveys, cognitive testing was used to make sure that all questions and contextual information are as clear, unambiguous, and user-friendly as possible. 

For the intervention level strand of YCYS, surveys ran twice over the UKSPF funding cycle, with the first sweep running in Spring 2024 and the second sweep running in Spring 2025. For the place level strand of YCYS, a single sweep ran in Spring 2025. Based on examination of previous iterations of the CLS, MHCLG estimates that to effectively measure the long-term impacts of the UKSPF, further regular sweeps will be necessary after the current programme period comes to an end in March 2026.

Methodology and data collection

For the intervention level YCYS surveys, sample populations were determined with reference to the treatment and control groups constructed as part of the intervention level communities and place evaluation. Nine projects were targeted alongside nine counterfactuals, which aggregate to three communities and place study groups. Each survey then targeted a catchment area around the identified project. 

For the place level surveys, samples were drawn from across the case study LLA (or LA area in NI) in question, though depending on the size of the LLA and the spread of projects within its boundaries, it was necessary to draw samples from a more narrowly defined geographic area within the LLA in some cases. 

In all cases, sample sizes were designed to support statistically robust results. For the intervention level surveys, the achieved sample size was 500 respondents. For the place level surveys, a larger sample of around 1,500 respondents per sweep was achieved, given the aim of the case studies to cover a broad range of population groups and projects within each LLA/place.
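As a rough illustration of what these sample sizes imply, the 95% margin of error for a proportion under simple random sampling can be computed as follows. This is a textbook back-of-envelope calculation only; it ignores the weighting and design effects that the actual surveys would apply.

```python
# Back-of-envelope sketch: 95% margin of error for an estimated proportion
# under simple random sampling, using the conservative p = 0.5 assumption.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

intervention = margin_of_error(500)   # roughly +/- 4.4 percentage points
place = margin_of_error(1500)         # roughly +/- 2.5 percentage points
```

At roughly +/- 4.4 percentage points for 500 respondents and +/- 2.5 for 1,500, the larger place level samples support the finer breakdowns by population group described above.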

Survey data for both the core questions and the intervention and place level variants were collected directly from participants using an online survey platform, following the methodology of the CLS.

Alongside the MHCLG-led YCYS surveys, a survey tool was developed to allow LLAs to capture their own data from respondents in support of optional local level evaluation and their own policy development. The local survey tool mirrors the questions developed for the YCYS surveys, with scope for LLAs to add additional bespoke questions where appropriate.  

The local survey tool contains detailed guidance setting out how to develop a sampling strategy, a subset of the CLS core question set, and how to collect and process the data, in a manner consistent with the strategy followed by YCYS surveys. Development of the survey tool itself was funded by MHCLG. 

Survey outputs

The CLS results and methodology will be published in full, as per previous years. This will include the additional information collected as part of the local level boost.

The YCYS surveys will form a crucial component of the intervention and place level evaluations covered elsewhere in this strategy. The UKSPF-specific YCYS surveys will not serve as outputs in their own right; rather, they will help facilitate other parts of the evaluation. For this reason, MHCLG does not currently plan to publish a separate report covering the YCYS survey results in isolation, though it may choose to make selected outputs and aggregated results available, either within reports covering other evaluation components or separately, as appropriate.

The local survey tool, alongside guidance on how to use it, will be published shortly and made available on the UK Shared Prosperity Fund: evaluation page.

Timeline

Milestone Timeframe
Community Life Survey and local level boost  
Development of survey questions and sampling strategy Spring – Summer 2023(#)
First survey sweep October 2023 – March 2024(#)
Second survey sweep October 2024 – March 2025(#)
Your Community, Your Say surveys  
Development of survey questions and sampling strategies Spring – Summer 2023(#)
First sweep of survey for intervention February – March 2024(#)
Second sweep of survey for intervention and first sweep for place February – April 2025
Local survey tool  
Local survey tool published April 2025

(#) This milestone has been completed (as of April 2025)

Overview of data sources

UKSPF evaluation will draw on a broad range of data sources, both UKSPF- and place-specific. Some data sources – for example, ONS-held administrative data on subregional productivity, economic activity, and demography – will feed into multiple components, while others will be specific to a particular component.

Table 3 sets out a broad, non-exhaustive taxonomy of the different types of data that MHCLG expect to use across the evaluation as a whole.

Table 3: High-level taxonomy of UKSPF data sources

Data type UKSPF-specific? Spatial scale Description
Beneficiary Yes Individual Identifiable information (NINOs and CRNs) concerning individuals and businesses who stand to theoretically benefit from a given UKSPF funded project. Will be used alongside matching data to track the impact of UKSPF funded projects at a granular level
Matching No Individual Universal Credit, earnings, and tax records of individuals held by HMRC and DWP, to be combined with UKSPF-specific beneficiary data in order to track the impact of UKSPF funded projects
Survey-based Mixed Individual Survey responses from individuals participating in the DCMS-run Community Life Survey and additional UKSPF-specific Your Community, Your Say surveys at both intervention and place level
Monitoring Yes Project Information collected from every LLA through the 6-monthly UKSPF reporting cycle covering progress and spend by project and progress toward the outputs and outcomes agreed locally
Administrative Mixed LLA ONS, UKG, and LLA-held contextual data covering local demography, economic activity, labour markets, and detail of other funding (current and historic) received by places that may overlap with the UKSPF. Also, may include UKSPF-specific information around project delivery structures, private and third sector partners, and local stakeholders

To minimise burdens on LLAs and local partners in Northern Ireland across all parts of the evaluation, MHCLG will try to use existing data wherever possible. In some cases, it will be necessary for LLAs, local partners, or grant recipients (depending on which parts of the evaluation they participate in) to assist in gathering additional information.

Data will be collected via:

  • regular UKSPF reporting: the UKSPF’s light-touch reporting structure requires all LLAs to report six-monthly on project and theme level spend and progress towards agreed outputs and outcomes. MHCLG will utilise the data captured for the evaluation strands, minimising the number of data collection channels that LLAs will need to service
  • contractors directly: as well as the monitoring data collected by all LLAs, contractors for each of the components will collect additional data as needed to support evaluation delivery. Contractors across different components will be required to collaborate and share data where possible to minimise the number of discrete data requests for LLAs involved in multiple parts of the evaluation. The individual sections for each component contain more detail on the types of data they will draw on

Outside of regular UKSPF reporting, LLAs are not expected to start collecting data for any part of the evaluation unless contacted and asked to do so by MHCLG or their evaluation partners.

Data management

Where appropriate and in line with data protection guidelines, MHCLG will seek to publish the aggregated data and analysis underpinning the main evaluation reports to allow for additional examination and investigation by external experts and stakeholders.

All data sources to support the UKSPF evaluation - including personal data where appropriate - will be collected and processed in full compliance with data protection guidelines as set out in the Data Protection Act 2018 and General Data Protection Regulation (GDPR). MHCLG has carried out a data protection impact assessment for each evaluation strand and will be responsible for establishing data protection agreements with LLAs and other relevant local stakeholders, where needed, to facilitate the collection, sharing, and processing of data with MHCLG, other government departments, and contractors.

  1. The UK government’s 2024 Autumn Budget announced a further £900 million of funding for local investment by the end of March 2026, extending the original timeline and spend of the UKSPF. 

  2. All data sources to support the UKSPF evaluation - including personal data, where appropriate - will be collected and processed in full compliance with data protection guidelines as set out in the Data Protection Act 2018 and General Data Protection Regulation (GDPR). 

  3. The place level case studies element focused on an original 36 LLAs. Two case studies have not been taken forward due to local capacity, leaving a total of 34 LLAs. 

  4. Except Northern Ireland, where only a Northern Ireland-wide process evaluation will be conducted.