Step 7: Interpret your results

Making sense of the data

Step 7 is where you make sense of the data you have collected and analysed. Steps 6 and 7 are closely related: while analysing data in Step 6, you will already have noticed information that tells you something about your measures and key evaluation questions.

Step 7 is about taking a more organised approach to the use of data. Your job is to gather the relevant parts of the data that show how you are progressing against each indicator. It is at this point that you are using data as ‘evidence’ of success. We call this interpreting the data. As you read through the analysed data, you might ask questions such as, ‘What can this interview feedback or survey response tell me about Indicator A or Indicator B?’

Remember – evaluation is about making a ‘value’ judgment drawing on the information in front of you. The more sources of data you have, the more likely you are to form an accurate picture of your indicators. You might be content with reporting some basic findings (through descriptive data) or you might bring together different sources or types of data to form a view about an indicator or evaluation question (this is called triangulation).

Remember, even when an aspect of your project has not worked quite as you anticipated, that is in itself a finding! In MEL, learning is a critical part of the process. It is just as important to learn from what doesn't work as from what does.

The process of interpretation may not be linear. For example, interpreting your data may lead to further questions, as the following example shows.

Data interpretation example

Indicator: Increased confidence amongst participants about putting bystander learning into action.

Data source & data:
  • Participant feedback sheet.
  • Question asking the degree to which the workshop was a key contributing factor in an increase in confidence to take bystander action: 65% of participants ‘strongly agreed’ or ‘agreed’ with the statement.
  • Question about the likelihood that, because of the workshop, participants would take some kind of bystander action in the future: 80% of participants strongly agreed or agreed.

Application:
  • You now have two data points from which to interpret the answer to your question as to whether the workshop has increased confidence. Your written analysis can explain that, together, these results show a self-reported increase in confidence from the workshop and, added to that, a reported intention to act.
  • This data prompts further questions that can improve planning and practice. For example, what aspects of the workshop were most important to those whose confidence increased? For those whose confidence did not increase, why might this be the case? Follow-up data collection activities might include a focus group (where participants can share experiences), interviews, or an open-ended survey question.
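Agreement percentages like those in the example above can be tallied with a short script once survey responses are entered. This is a minimal sketch assuming simple Likert-style text responses; the sample data and the function name are hypothetical, not part of any particular survey tool:

```python
# Minimal sketch: tally the share of 'agree'/'strongly agree' responses
# for a Likert-style survey question (hypothetical data).
from collections import Counter

AGREE = {"strongly agree", "agree"}

def agreement_rate(responses):
    """Return the percentage of responses that agree or strongly agree."""
    counts = Counter(r.strip().lower() for r in responses)
    agreeing = sum(counts[a] for a in AGREE)
    return round(100 * agreeing / len(responses), 1)

# Hypothetical feedback-sheet responses for the confidence question
confidence = ["Strongly agree", "Agree", "Neutral", "Agree", "Disagree"]
print(agreement_rate(confidence))  # 60.0 for this sample
```

Keeping the raw response counts (not just the final percentage) makes it easier to answer the follow-up questions later, such as how many participants sat in each response category.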

A note on the differences between findings, conclusions, and recommendations

It is important to understand the differences between findings, conclusions, and recommendations in your report. Findings are factual statements, conclusions are a synthesis of findings, and recommendations are suggestions for action (liii). Each plays a different role in your report.

Suggested resource:

United Nations World Food Programme: How to Manage an Evaluation and Disseminate its Results, M&E Guidelines [PDF].

Tip – Hold a reflections workshop

After you have interpreted your data and have some early findings, it’s a good idea to engage with the key stakeholders identified in Step 1 to present your early findings for discussion and feedback. This will enable you to get their input into the results and to test their interpretation of the findings.


Don’t forget ...

  • Do your data entry as soon as possible after data collection.
  • It can often be difficult to take the first step with analysis and findings – so just dive in and start brainstorming or writing! Keep in mind that interpretation is a process, not a product. Put your ideas down and keep working on them.
  • Seek out different perspectives on your interpretation.
  • Besides engaging your stakeholders, it is also worth engaging someone outside the project – fresh eyes bring objectivity.
  • Less is more – focus on quality over quantity of data.
  • Take photos, videos, or recordings during activities (after obtaining consent) as this can also be a good source of data, and can add to the way in which you communicate analysis or findings.
  • Always link your results back to the outcomes and indicators in your MEL framework.
  • Keep an open mind after drafting conclusions or assumptions and make sure you test them.
  • Be flexible and have a back-up plan – what if your preferred method of analysis does not work (or is not the type of information you were looking for)?

Suggested resources:

See this great primer (book) – Davidson, E. Jane (2005) Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation. Thousand Oaks, CA: Sage.

Endnotes

(liii) United Nations World Food Programme (n.d.) Monitoring & Evaluation Guidelines: How to Manage an Evaluation and Disseminate its Results, p. 17. Rome, Italy. Available at:
[https://www.betterevaluation.org/sites/default/files/14-WFP%20-%20How%20to%20manage%20an%20Evaluation%20and%20disseminate%20its%20Results.pdf]