3. Determine scope and approach

Once a decision has been taken to do an evaluation, you need to determine:

  • the scale of effort and resources that will be allocated to do the evaluation
  • the key evaluation questions that will be addressed, and
  • the optimal design to conduct the evaluation.

You will also need to consider where the activity or program is in the policy cycle, and whether the necessary data and evidence base are available to conduct the evaluation.

Preparing a detailed evaluation plan that describes how an activity or program will be monitored and evaluated, and, importantly, how the results will be used, can help to ensure there is a common understanding of exactly what needs to be evaluated and why.

Defining the evidence and data required for an evaluation, and how they will be collected, is an important aspect of scoping a fit-for-purpose approach. For more information see Define evidence and data sources and Collect evidence and data.

Things to consider

What are the key evaluation questions that will be addressed by the evaluation?

  • Have the evaluation questions been developed collaboratively with all stakeholders?
  • Have the people who will use the evaluation, and any other key stakeholders or delivery partners, helped to prioritise and select which questions the evaluation will focus on?
  • Are the key evaluation questions clear, measurable and culturally appropriate?

Have you developed an evaluation plan?

  • Do you have a plan that outlines exactly what you want to evaluate, what information you need to collect, who will collect it, and how it will be done?
  • Have specific evaluation activities, tasks, roles, resource allocations and deadlines been identified?
  • How will the evaluation be governed?
  • How will the results be used to prove or improve the way an activity or program is delivered?

Have the scope and approach for the evaluation been agreed? If so, can the evaluation be done within available resources?

  • Have the scope and approach for your evaluation been agreed to by key stakeholders?
  • Have the resources available to do the evaluation been taken into account when determining the scope and approach?

Develop an evaluation plan

Before commencing an evaluation, it is important to understand:

  • your operating context
  • the purpose and objectives of undertaking an evaluation
  • who the main audience is
  • how and when the results will be used to improve outcomes.

Once a decision has been taken to do an evaluation, it is important to determine:

  • the scale of effort and resources that will be allocated to do the evaluation – this is usually proportional to the value, impact, strategic importance and risk profile of the program or activity
  • the key evaluation questions that will be addressed by the evaluation – these should outline exactly what you hope to learn through the evaluation
  • the optimal design – evaluations should be well‑designed, robust, ethical and culturally appropriate. Evaluation approaches and methods also need to be appropriate for the relevant stage of the policy cycle (for example, before, during or after a program or activity has been implemented, or as part of a rapid evaluation to support urgent decision‑making).

Preparing a detailed evaluation plan that describes how an activity, program or policy will be monitored and evaluated, and, importantly, how the results will be used, can help to ensure that there is a common understanding of exactly what needs to be evaluated and why.
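For illustration only, the sketch below shows one way the core elements of an evaluation plan could be set out in a structured form. It is written in Python, and every program name, task, date and resource shown is a hypothetical example rather than a template drawn from any actual plan.

```python
# Illustrative only: a minimal outline of what a detailed evaluation plan
# might record. All names, dates and figures are hypothetical.
evaluation_plan = {
    "what_to_evaluate": "Effectiveness of a (hypothetical) skills program",
    "key_evaluation_questions": [
        "Did participants' employment outcomes improve?",
        "Was the program delivered efficiently?",
    ],
    "information_to_collect": ["administrative data", "participant survey"],
    "activities": [
        {"task": "Collect administrative data", "role": "Program area",
         "resources": "existing staff", "deadline": "2024-03-31"},
        {"task": "Run participant survey", "role": "Evaluation team",
         "resources": "external provider", "deadline": "2024-06-30"},
    ],
    "how_results_will_be_used": "Inform delivery improvements and future budget decisions",
}

# Print a simple schedule of the planned activities.
for activity in evaluation_plan["activities"]:
    print(f'{activity["deadline"]}: {activity["task"]} ({activity["role"]})')
```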

Data for different evaluation designs

The design of an evaluation identifies:

  • the types of information that the evaluation will need to generate (that is, descriptions, judgements and interpretations)
  • the types of comparisons that will be made (for example, before and after; between sites; clients compared with non‑clients) – evaluations of appropriateness, efficiency and/or effectiveness require different types of information and comparisons and therefore require different designs
  • the size and composition of samples to be used.

Deciding on the information, comparisons and samples needed for an evaluation sets the parameters for selecting your methods of data collection and analysis.
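For illustration only, the short Python sketch below works through two of the comparison types mentioned above: a before-and-after comparison and a clients-compared-with-non-clients comparison. The outcome scores are made-up figures chosen purely to demonstrate the calculations.

```python
import statistics

# Hypothetical outcome scores (not real data) illustrating two common
# comparison types: before/after, and clients vs non-clients.
before = [52, 48, 61, 55, 47, 58]        # participants before the program
after = [60, 55, 66, 63, 52, 64]         # the same participants afterwards
non_clients = [51, 49, 57, 54, 50, 55]   # a comparison group not assisted

# Before-and-after comparison: average change for participants.
mean_change = statistics.mean(a - b for a, b in zip(after, before))

# Clients-compared-with-non-clients comparison: difference in group means.
group_difference = statistics.mean(after) - statistics.mean(non_clients)

print(f"Average before/after change for clients: {mean_change:.1f}")
print(f"Clients vs non-clients difference:       {group_difference:.1f}")
```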

Appropriateness evaluations

Appropriateness evaluations make judgements about the level of need for a government program or activity, and assess whether the strategies proposed or adopted to address that need are likely to be successful.

This often involves using the major statistical collections maintained by the Australian Bureau of Statistics (demographic, social, labour force, balance of payments, foreign trade, and so on) and other sources of Big Data.

It may involve a review of research and evaluation literature relating to similar programs/activities, and may also seek feedback from community consultations or focus groups to assess appropriateness.

Efficiency, effectiveness, and cost‑effectiveness evaluations

Efficiency, effectiveness, and cost‑effectiveness evaluations often make substantial use of existing data created for the day‑to‑day management of the program (referred to as “administrative data”).

This includes data about inputs (staff, administrative costs and program expenditure) and outputs (for example, clients assisted, research projects funded, conferences held, training courses provided, and so on). Such information is essential to enable descriptions and judgements as the basis for interpreting outcomes.
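For illustration only, the sketch below shows the kind of unit-cost (efficiency) indicator that can be derived from administrative data on inputs and outputs. The field names and dollar figures are hypothetical assumptions, not drawn from any actual program.

```python
# Illustrative only: unit-cost calculations from hypothetical administrative data.
admin_data = {
    "program_expenditure": 1_200_000,    # input: dollars spent on program delivery
    "administrative_costs": 300_000,     # input: dollars spent on administration
    "clients_assisted": 2_500,           # output: clients assisted this year
    "training_courses_provided": 40,     # output: training courses delivered
}

total_inputs = admin_data["program_expenditure"] + admin_data["administrative_costs"]

# Efficiency indicators: cost per unit of output.
cost_per_client = total_inputs / admin_data["clients_assisted"]
cost_per_course = total_inputs / admin_data["training_courses_provided"]

print(f"Cost per client assisted: ${cost_per_client:,.2f}")
print(f"Cost per training course: ${cost_per_course:,.2f}")
```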

Effectiveness evaluations

Effectiveness evaluations often make use of existing data and statistics to make comparisons between program outcomes and national norms.
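For illustration only, the sketch below compares hypothetical client outcomes with an assumed national norm, the kind of comparison an effectiveness evaluation might make. The benchmark value and client results are made up.

```python
import statistics

# Illustrative only: comparing program outcomes with a national norm.
national_norm = 67.0                                 # an assumed published national average
client_outcomes = [71, 64, 70, 73, 66, 69, 72, 68]   # hypothetical client results

program_mean = statistics.mean(client_outcomes)
difference = program_mean - national_norm

print(f"Program mean outcome: {program_mean:.1f}")
print(f"National norm:        {national_norm:.1f}")
print(f"Difference from norm: {difference:+.1f}")
```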

Measuring effectiveness in health, welfare and education programs often requires individual case data, which may not be monitored or updated regularly because of the cost involved when a program has many clients.

Existing data may need "topping up" with additional research on participants' attitudes, opinions, behavioural intentions, or perceptions of whether the program has been effective for them. Such data would need to be collected from individuals (rather than groups) using methods such as questionnaires, interviews, diaries or activity logs.
