Glossary

A common language and shared understanding are critical to good MEL practice.

Our glossary of terms encompasses language from MEL, primary prevention practice, and project management.

A more extensive list of terms related to the primary prevention of family violence and all forms of violence against women can be found in these resources:

We expect our glossary to expand and be refined as our knowledge and experience grow. If you have any advice on terms that should be changed, removed or added, please email our MEL team.

Definitions may vary slightly across sectors and organisations, so it is important to refer to your grant funding program, organisation or department for its preferred MEL definitions.

 

Activity

Activities are what a program delivers: the services and initiatives undertaken, and the everyday work done (e.g. providing workshops and training), which use inputs (such as funds or staff) to produce outputs.

(Source: Adapted from Outcomes Reform Statement and OECD – Glossary of key terms in evaluation and results based management [PDF])

Analytical tools

Methods used to process and interpret data and information during an evaluation. Also used during monitoring activities.

(Source: Adapted from WHO Evaluation Handbook [PDF])

Attribution

The ascription of a causal link between observed (or expected) changes and a specific intervention. Attribution refers to that which is credited for the observed changes or results achieved.

(Source: WHO Evaluation Handbook [PDF])

Baseline

The status of output- and outcome-related measures, such as knowledge, attitudes, norms, behaviours and conditions, before an intervention. A baseline is the point against which progress can be assessed or comparisons made.

(Source: Adapted from UNAIDS – Basic Terminology and Frameworks For monitoring and evaluation [PDF])
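
As a small, hypothetical illustration of how a baseline is used (the scores and variable names below are invented for this sketch, not drawn from any real program), the following Python example compares follow-up survey results against baseline results to assess progress:

    # Hypothetical attitude-survey scores (0-100), invented for illustration only.
    from statistics import mean

    baseline_scores = [52, 47, 60, 55, 49]    # collected before the intervention
    followup_scores = [61, 58, 66, 63, 57]    # collected after the intervention

    baseline_mean = mean(baseline_scores)
    followup_mean = mean(followup_scores)

    # Progress is assessed against the baseline rather than in isolation.
    change = followup_mean - baseline_mean
    print(f"Baseline mean: {baseline_mean:.1f}")
    print(f"Follow-up mean: {followup_mean:.1f}")
    print(f"Change from baseline: {change:+.1f} points")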

Beneficiaries

The individuals, groups or organisations that benefit, directly or indirectly, from an intervention.
(See also: ‘target group’)

(Source: OECD – Glossary of key terms in evaluation and results based management [PDF])

Capacity-building

Strengthening individual, community and organisational capability, and developing the long-term skills and knowledge that promote self-sufficiency.

(Source: Adapted from Department of Foreign Affairs and Trade [DFAT])

Collective impact

Collective impact is a collaborative approach to addressing complex social issues, consisting of five conditions: a common agenda; continuous communication; mutually reinforcing activities; backbone support; and shared measurement.

(Source: Australian Institute of Family Studies)

Confidentiality

The obligation of people not to use private information – whether private because of its content or the context of its communication – for anything other than its intended purpose.

(Source: National Statement on Ethical Conduct in Human Research (2007) – Updated 2018)

Consent

A person or group’s agreement, based on adequate knowledge and understanding of relevant material, to participate in research (or an intervention).

(Source: National Statement on Ethical Conduct in Human Research (2007) – Updated 2018)

Contribution

Contribution describes a recognition that an initiative is only one cause of (contributing factor to) change.
(See also: ‘attribution’)

(Source: Centre for Evaluation and Research – Department of Health and Human Services Evaluation Guide [PDF])

Data

Specific collections of quantitative and qualitative facts, figures and numbers. Data must be analysed to become information. Data can refer to raw data, cleaned data, transformed data, summary data and metadata.

(Source: Adapted from UNAIDS – Basic Terminology and Frameworks for monitoring and evaluation [PDF] and National Statement on Ethical Conduct in Human Research (2007) – Updated 2018)

Data analysis

Data analysis is a systematic process that involves organising and classifying data, tabulating it, summarising it and comparing the results with other appropriate data to extract useful information and reach a broader understanding. The data analysed can be both quantitative and qualitative.

(Source: Adapted from INTRAC – Data Collection [PDF])
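
As a minimal sketch of the steps described above (organising, tabulating, summarising and comparing data), the following Python example works through hypothetical workshop feedback records; the field names and values are illustrative assumptions only:

    from collections import Counter
    from statistics import mean

    # Hypothetical feedback records combining quantitative ratings and qualitative comments.
    records = [
        {"site": "North", "rating": 4, "comment": "Useful examples"},
        {"site": "North", "rating": 5, "comment": "Clear facilitation"},
        {"site": "South", "rating": 3, "comment": "Too rushed"},
        {"site": "South", "rating": 4, "comment": "Good discussion"},
    ]

    # Organise and classify: tally responses by site.
    counts_by_site = Counter(record["site"] for record in records)

    # Summarise: average quantitative rating per site.
    ratings_by_site = {
        site: mean(r["rating"] for r in records if r["site"] == site)
        for site in counts_by_site
    }

    # Tabulate the results so sites can be compared.
    for site, count in counts_by_site.items():
        print(f"{site}: {count} responses, average rating {ratings_by_site[site]:.1f}")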

Data collection

The process and activities of identifying information sources and generating information. Data can be collected through specific collection tools (e.g. interviews, observation and surveys).
(See also: ‘data collection instruments’, ‘data analysis’)

(Source: Adapted from INTRAC – Data Collection [PDF])

Data collection instruments (or tools)

Documents, whether hard copy or virtual, designed to inform the collection of data and/or record that data for research or evaluation purposes. Example instruments include questionnaires, surveys, interview schedules, journals or record books.
(See also: ‘survey’, ‘semi-structured interview’, ‘focus group’, ‘data collection’)

Data disaggregation

Disaggregation is the breaking-down of observations in data, usually within a common branch of a hierarchy, to a more detailed level (e.g. by gender, age, ethnic group, location, disability or socioeconomic status).

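A minimal sketch of disaggregation, using hypothetical participant records (the categories and counts are illustrative assumptions only), is shown below: the same data is reported in total and then broken down by gender and age group.

    from collections import Counter

    # Hypothetical participant records, invented for illustration only.
    participants = [
        {"gender": "woman", "age_group": "18-24"},
        {"gender": "woman", "age_group": "25-34"},
        {"gender": "man", "age_group": "18-24"},
        {"gender": "non-binary", "age_group": "25-34"},
        {"gender": "woman", "age_group": "18-24"},
    ]

    # Aggregate figure: total participation.
    print("Total participants:", len(participants))

    # Disaggregated figures: the same observations broken down to a more detailed level.
    by_gender_and_age = Counter((p["gender"], p["age_group"]) for p in participants)
    for (gender, age_group), count in sorted(by_gender_and_age.items()):
        print(f"{gender}, {age_group}: {count}")
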
Data management

The functions and processes that provide access to data and that perform or monitor its storage.

(Source: Centre for Evaluation and Research – Department of Health and Human Services Evaluation Guide [PDF])

Data sources

The resources used to obtain the data needed for monitoring and evaluation activities. These sources may include (among many others) official government documents, clinic administrative records, staff or provider information, client-visit registers, interview data, sentinel-surveillance systems, and satellite imagery.

(Source: M&E Fundamentals: A self-guided mini-course [PDF])

De-identification

The process of ensuring that data does not reveal an individual’s identity, or enable someone to reasonably guess a person’s identity.
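
As a hedged sketch only (real de-identification should follow your organisation’s data governance and privacy requirements), the Python example below removes direct identifiers from hypothetical records and replaces names with one-way hashed codes; note that indirect identifiers, such as postcode, may still allow re-identification and need separate review:

    import hashlib

    # Hypothetical records containing direct identifiers, invented for illustration only.
    records = [
        {"name": "Alex Smith", "email": "alex@example.org", "postcode": "3000", "score": 4},
        {"name": "Sam Lee", "email": "sam@example.org", "postcode": "3350", "score": 5},
    ]

    def de_identify(record, salt="replace-with-a-secret-value"):
        """Drop direct identifiers and keep only the fields needed for analysis."""
        code = hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:10]
        return {
            "participant_code": code,        # stable pseudonym instead of a name
            "postcode": record["postcode"],  # indirect identifier: review before release
            "score": record["score"],
        }

    print([de_identify(r) for r in records])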

Developmental evaluation

A structured way to monitor, assess and provide feedback on the development of a project or program while it is being designed or modified; that is, where inputs, activities and outputs may not yet be known, or may be in a state of flux.
(See also: ‘process evaluation’)

(Source: Australian Institute of Family Studies. See also Patton, M.Q., Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, The Guilford Press, New York.)

Dissemination

The process by which findings and learnings of monitoring and evaluation are communicated to the relevant audiences/stakeholders.

Effectiveness

The extent to which an intervention’s objectives were achieved, or are expected to be achieved, taking into account their relative importance.

(Source: OECD – Glossary of key terms in evaluation and results based management [PDF])

Efficiency

A measure of how economically resources/inputs (funds, expertise, time, etc.) are converted into results.

(Source:  WHO Evaluation Handbook [PDF])

Ethical standards

The principles and standards of conduct that direct a group or individual in research, monitoring or evaluation activities.

(Source: Save the Children International M&E Handouts Package 2011)

Evaluation

The formal, systematic and objective assessment of the value of an ongoing or completed program or project. This process determines the relevance and fulfilment of objectives.

(Source: Free from Violence Monitoring and Evaluation Strategic Framework [PDF])

Family violence

Family violence is violence directed by one or more family members against another or others. Intimate partner violence is the most common, and most commonly recognised, form of family violence. However, as defined by the Family Violence Protection Act 2008 (Vic), family violence can also occur in other family relationships, including between people who share accommodation and are considered family, across generations (including elderly people and children), in kinship relationships, and in any other relationships recognised by communities as constituting family relationships.

(Source: Respect Victoria)

Feedback

The sharing of findings generated by monitoring and evaluation to help facilitate learnings.

Finding

A factual statement based on evidence from one or more evaluations.

(Source:  WHO Evaluation Handbook [PDF])

Focus group

Facilitated discussions held with a small group of people who have specialist knowledge or interest in a particular topic.

(Source: INTRAC – Focus Group Discussions [PDF])

Formative evaluation

Evaluation during the period of investment or intervention that focuses on implementation. Formative evaluation supports improvements, re-designs and the development of new initiatives to address identified gaps.

(Source: Free from Violence Monitoring and Evaluation Strategic Framework [PDF])

Gender analysis

The variety of methods (from planning to data analysis) used to understand the relationships between genders, their access to resources, their activities, and the constraints they face relative to each other. Gender analysis recognises that gender and its relationship with race, ethnicity, culture, class, age, disability, and/or other status is important in understanding the different patterns of involvement, behaviour and activities in economic, social and legal structures.

(Source: Adapted from Government of Canada)

Impact

Positive and negative, primary and secondary long-term effects produced by an intervention – directly or indirectly, intended or unintended.

(Source: OECD – Glossary of key terms in evaluation and results based management [PDF])

Impact evaluation

A methodological approach that shows how much of the observed change in intermediate or final outcomes, or ‘impact’, can be attributed to the program or project. It requires an evaluation design that compares the outcomes of having the program or project with those of not having it.

(Source: M&E Fundamentals: A self-guided mini-course [PDF])
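
As a deliberately simplified, hypothetical illustration of the with/without comparison described above (a real impact evaluation needs a credible design for constructing the comparison group, not just a difference in averages), the Python sketch below estimates the difference in average outcome scores between a program group and a comparison group:

    from statistics import mean

    # Hypothetical outcome scores, invented for illustration only.
    program_group = [68, 72, 65, 70, 74]      # received the program
    comparison_group = [61, 63, 60, 66, 62]   # did not receive the program

    estimated_effect = mean(program_group) - mean(comparison_group)
    print(f"Estimated difference in average outcomes: {estimated_effect:.1f} points")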

Indicator

A quantitative or qualitative factor or variable that provides a simple and reliable means to measure achievement and reflect the changes connected to an intervention. Indicators can apply to both outcomes and outputs.
(See also: ‘outcome indicators’)

(Source adapted from: OECD – Glossary of key terms in evaluation and results based management [PDF])

Indigenous Data Sovereignty

Refers to the right of Aboriginal and Torres Strait Islander peoples to exercise ownership over Indigenous data. Ownership of data can be expressed through the creation, collection, access, analysis, interpretation, management, dissemination and reuse of Indigenous data.

(Source: Mayi Kuwayu – The National Study of Aboriginal & Torres Strait Islander Wellbeing – Indigenous Data Sovereignty Principles)

Inputs

The resources or investments that will be allocated (or required) to deliver activities and achieve outputs.

(Source: Adapted from Outcomes Reform Statement [PDF])

Intersectionality

How characteristics such as gender, ethnicity, ability, sexual orientation, sex, gender identity, religion, age or location can compound and interact on multiple levels to create overlapping forms of discrimination and power imbalances, which may increase the risk of experiencing family violence or violence against women.

(Source: Free from Violence)

Lessons learnt

Generalisations based on evaluation experiences with projects, programs or policies that abstract from the specific circumstances to broader situations. Lessons frequently highlight strengths or weaknesses in preparation, design and implementation that affect performance, outcome and impact.

(Source adapted from:  WHO Evaluation Handbook [PDF])

Logic model

A program or project design, management, and evaluation tool that describes the main elements of a program or project and how these elements work together to reach a particular goal. The basic components of a logic model are inputs, activities, outputs, and outcomes (short-, medium- and long-term).
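
As an illustrative sketch only (the program content below is hypothetical), the basic components of a logic model can be captured in a simple data structure such as the following Python dictionary:

    # A hypothetical logic model for a workplace-based primary prevention program.
    logic_model = {
        "inputs": ["funding", "trained facilitators", "partner organisations"],
        "activities": ["deliver bystander workshops", "develop workplace policy guidance"],
        "outputs": ["number of workshops delivered", "number of staff trained"],
        "outcomes": {
            "short_term": ["increased knowledge of respectful relationships"],
            "medium_term": ["improved attitudes and workplace norms"],
            "long_term": ["reduced tolerance of violence against women"],
        },
    }

    for component, items in logic_model.items():
        print(component, "->", items)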

Meta-evaluation

A meta-evaluation is a process used to aggregate findings from a series of evaluations. It also involves an evaluation of the quality of this series of evaluations and its adherence to established good practice in evaluation. Meta-evaluation can also be a systematic tool for the quality control of evaluation studies.

(Source:  WHO Evaluation Handbook [PDF])

Monitoring

Monitoring is a continuing form of assessment that uses the systematic collection of data on specified indicators and wider information during the delivery of an activity, project or intervention.

(Source: Adapted from OECD DAC Handbook on Security System Reform [PDF])

Monitoring and evaluation plan

A written plan outlining processes and standards for the monitoring and evaluation of a program, project or initiative. The plan describes the data needs linked to a specific project, and the standardised measures that need to be collected.

(Source: Adapted from UNAIDS – Basic Terminology and Frameworks For monitoring and evaluation [PDF])

Most-significant change

An approach that generates and analyses personal accounts of change, then decides which of these accounts is the most significant – and why.

(Source: Better Evaluation)

Objective

A specific statement detailing the desired accomplishments of a program or project.

Outcome

Clear, unambiguous and high-level statements about the changes that indicate success.

(Source: Adapted from Outcomes Reform Statement [PDF])

Outcome evaluation

Outcome evaluation measures program or project effects in the target population by assessing progress in the outcomes that the program aims to address.

(Source: CDC – Types of Evaluation [PDF])

Outcome indicators

Indicators that specify what needs to change in order to achieve a desired outcome. Outcome indicators reflect the key drivers and influences on progress.

(Source: VSG Outcomes Reform Statement [PDF])

Outcome measures

Granular, specific criteria that detail the size, amount or degree of change.

(Source: VSG Outcomes Reform Statement [PDF])

Outputs

The number of activities delivered, products produced, or clients served by a program.

(Source: VSG Outcomes Reform Statement [PDF])

Participatory evaluation

An approach that involves stakeholders (including beneficiaries) of an initiative in the evaluation process. It can occur at any stage of the process.
(See also: ‘evaluation’)

(Adapted from: Better Evaluation)

Primary data

Original or raw data collected using methods such as surveys, interviews or experiments.

(Source: Free from Violence Monitoring and Evaluation Strategic Framework [PDF])

Primary prevention

An approach that seeks to prevent family violence and all forms of violence against women before they occur.

(Source: Free from Violence Monitoring and Evaluation Strategic Framework [PDF])

Process evaluation

A systematic appraisal that determines whether program or project activities are being implemented as planned. Process evaluations may be done periodically or continually.
(See also: ‘evaluation’)

Program

An organised set of projects and/or activities, and the allocation of resources directed toward a common purpose, objective, or goal. It often has a long-term focus.

(Adapted from: Centre for Evaluation and Research – Department of Health and Human Services Evaluation Guide [PDF])

Project

A structured undertaking of limited duration and clearly defined scope that delivers activities to produce specific outputs or results, which contribute to achieving an objective.

(Adapted from: Centre for Evaluation and Research – Department of Health and Human Services Evaluation Guide [PDF])

Qualitative data

Descriptive data collected using methods such as interviews, focus groups, observation and key informant interviews. It is usually expressed in narrative form or pictures (not numerically). It can provide an understanding of social situations and interactions, as well as people’s values, perceptions, motivations and reactions.

(Source: Adapted from UNAIDS – Basic Terminology and Frameworks For monitoring and evaluation [PDF])

Quantitative data

Measures of values or counts that are expressed as numbers.

Quantitative data use numeric variables (e.g. how many, how much, how often).

(Source: Australian Bureau of Statistics)

Research

The systematic investigation of information to create new knowledge and/or the use of existing knowledge in a new way so as to generate new concepts, methodologies and understandings. This could include synthesis and analysis of previous research to the extent that it leads to new or adjusted outcomes.

(Source: National Statement on Ethical Conduct in Human Research (2007) – Updated 2018)

Rights-based approach

An approach that aims to integrate international human rights and humanitarian law norms, standards and principles, and the values that underpin them, into how initiatives are planned, designed and delivered.
(See also: ‘strengths-based approach’)

Secondary data

Secondary data is data that has already been collected (through primary sources) by a researcher or others and is then used for another research purpose.

(Source: Free from Violence Monitoring and Evaluation Strategic Framework [PDF])

Semi-structured interview

A meeting in which the interviewer does not strictly follow a formalised list of questions. Instead, they ask more open-ended questions, allowing for a discussion rather than a straightforward question-and-answer format. Semi-structured interviews are useful for collecting information on people’s ideas, opinions or experiences.
(See also: ‘data collection tools’)

SMART

Criteria for guiding the design of good measures and indicators. The acronym stands for: specific, measurable, achievable, realistic and time-bound.

Stakeholder

A person, group, or entity who has a direct or indirect role and interest in the goals, objectives and implementation of a program or project intervention and/or its evaluation.

(Source: Adapted from UNAIDS – Basic Terminology and Frameworks For monitoring and evaluation [PDF])

Strengths-based approach

An approach that identifies and builds on the existing strengths of individuals, communities and organisations instead of starting with a focus on their deficits.

(Source: Adapted from Social Care Institute for Excellence)

Summative evaluation

An evaluation that seeks to judge the worth of an initiative, typically at the end of a cycle of intervention or investment. Its focus can be on the short-, medium- and longer-term outcomes and causal mechanisms of change, as well as the appropriateness of an overall strategy, program or project design.

(See also: ‘evaluation’, ‘impact evaluation’)

(Source: Free from Violence Monitoring and Evaluation Strategic Framework [PDF])

Survey

A tool to collect and record information from multiple people, groups or organisations in a consistent way. Surveys and questionnaires can be used on their own as data collection tools and at any time within a project cycle.

(Source: Adapted from INTRAC – Surveys and Questionnaires)

Target group

The specific people, communities or organisations that your primary prevention intervention is aiming to benefit and engage with.
(See also: ‘beneficiaries’)

Tertiary data

Information based on a collection of primary and secondary sources (e.g. bibliographies, dictionaries, indexes).

(Source: University of Saskatchewan)

Theory of change

A theory of change explains how the activities undertaken by an intervention (such as a project, program or policy) contribute to a chain of results that lead to the intended or observed impacts.