Step 6: Collect and analyse your data

This section provides advice on collecting data – which might include distributing and collecting feedback sheets or surveys, conducting interviews or focus groups, or collecting project administrative data. It also provides advice on how to think about and analyse your data.

Preparing for data collection

In addition to planning your data collection methods and instruments, there are a range of practical considerations for collecting data. Different considerations apply to different data collection methods.

Considerations for surveys

  • Have you tested the survey by sending it to supportive colleagues for feedback about errors or glitches?
  • Do you need to send the surveys out through another project manager or important stakeholder? If so, have you asked this other person, and made them aware of the timeline?
  • If you are sending your survey out electronically (as a Word document or using another digital tool or link), do you have the email addresses for distribution? Have you paid attention to privacy issues (e.g., are participants aware they may receive a survey? Have you blind copied (bcc’d) participants into the email)?
  • Are there other areas of work or activity in your organisation that need to be aware you are commencing data collection? If so, how should you notify them?

Considerations for interviews and focus groups

  • Where will you conduct the interviews or focus groups? Do you need to book a venue or room? Are there costs? Will they be online? Do you need to send instructions to participants about how to use the technology? Is it an accessible venue?
  • How will you send invitations? Will they be a formal or informal invitation? How long will you give people to reply?
  • How will you record the session? Will you take notes on the computer or record audio?
  • If you are working with people from vulnerable communities, is there value in sending a text reminder a few days before, and again on the morning of the interview?
  • Do participants need support with transport, accessibility, or other resources?
  • Is the time scheduled suitable for participants? If it is after hours, have you planned for how participants will get home (especially when it is dark)?
  • Have you considered aspects of cultural safety, cultural inclusion, and victim-survivor safety?

Considerations for observational work (e.g., a network meeting)

  • How will you record the data? Will you take notes or record the session? Do you require a second person present to support with administrative tasks?

Considerations for accessing administrative data or literature

  • Are there ‘gatekeepers’ to the data or literature from whom you need permission? Are you, or is a manager, best placed to make these requests?
  • In what form do you want the data?
  • Are there privacy considerations?

General administrative considerations

Planning is key to your data collection. Refer to your data collection plan (Step 5) and take the time to think about the more practical elements of your data collection.

  • As noted above, have trusted colleagues review your final data collection instruments. It is okay to update or change them before use.
  • Clarify roles and responsibilities for those working on the evaluation. For example:
    • Who will be printing out the hard-copy feedback forms or consent forms?
    • Who is setting up the survey? If it is online, will someone be responsible for putting it into specialised software (e.g., SurveyMonkey) and sending it to participants?
    • Who is facilitating the focus group and who is observing and/or taking notes?
    • How many people will conduct the focus groups or interviews?
  • Consider whether you need to hold a briefing or run some training for people who will be collecting data (conducting interviews or facilitating focus groups).
  • Liaise with your stakeholders about logistics and their involvement, and check in about how the tools will be distributed.
    • Confirm with your stakeholders the appropriateness of the times and location set for interviews and/or focus groups. Is it safe? Will people turn up at the planned time? Is the location too far away?
  • Get consent forms ready for participants. If you will be taking pictures or recording (audio or visual) during interviews and focus groups:
    • Have you asked your stakeholders if taking photos or recording is okay?
    • Will you be getting written or verbal consent? This may depend on the context of your participants and your ethics obligations.
    • Will participants understand the detail? Is there a simpler way to get consent or will it need to be translated into another language?
  • Identify where you will store your data after collection:
    • On your laptop or USB if a survey was done on SurveyMonkey or using other software
    • In a folder and locked cabinet if forms were filled out in hard copy.
  • Factor time into your workshop or training agenda for participants to complete feedback forms or post-session surveys.

Tip – Baselines and comparisons (timing is everything!)

If you are hoping to collect data that compares something before your project begins with something that occurs after your project has started (for example, how confident participants are before and after participation in your project), you will need to collect baseline data before or during the first session. Baseline data is ‘starting point’ data(xlviii).

It provides data from a point in time before your activities so that you can measure afterwards whether your activities have made a difference. Make sure to do some further reading about collecting baseline data; questions or concepts in follow-up surveys or interviews will need to precisely mirror those in the baseline survey (otherwise the comparisons will not be valid).
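
The following is a minimal sketch of a before-and-after comparison, using Python purely for illustration (a spreadsheet works just as well). The confidence question, the 1–5 rating scale and the ratings shown are hypothetical examples, not real data.

```python
# Minimal sketch: comparing baseline and follow-up survey results.
# The confidence question, the 1-5 scale and the ratings below are
# illustrative assumptions; use whatever your own instruments contain.

baseline = [2, 3, 2, 4, 3, 2]    # ratings collected before or during the first session
follow_up = [4, 4, 3, 5, 4, 3]   # the same question, asked the same way, afterwards

def average(ratings):
    """Average rating, rounded to one decimal place."""
    return round(sum(ratings) / len(ratings), 1)

print(f"Baseline average confidence:  {average(baseline)} out of 5")
print(f"Follow-up average confidence: {average(follow_up)} out of 5")
print(f"Change: {average(follow_up) - average(baseline):+.1f} points")
```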

Please note: it is not always necessary to compare data in this way. For example, it is okay simply to ask participants at the end of a workshop whether their confidence has increased.

Quality of data

Evaluators need to think about the quality of data they collect. Quality of data generally refers to two things – that data is reliable and valid(xlix). Reliability is about consistency; that is, whether answers or results will be the same if measured under the same conditions. Validity is about accuracy and means ensuring that a tool measures what it is supposed to measure, and that any inferences/judgments you draw from the data are reasonable. Take steps to ensure your data is accurate. It should be traceable back to the de-identified source.

The WHO Evaluation Handbook(l) suggests three broad strategies for improving the quality of your data:

  • Improve the quality of sampling (to improve representativeness)
  • Improve the quality of data gathering (e.g., by testing your tools)
  • Use mixed methods in data collection (so that you can cross check information and perspectives).

Suggested resource:

See World Health Organisation’s (2013) Evaluation Practice Handbook [PDF].

Ethical responsibilities

Meeting your ethical responsibilities means taking the following steps:

  • Ensure everyone in your evaluation is a willing participant by seeking ‘informed consent.’ For a survey, this might mean creating a front page with information about purpose, risks, and benefits of the research and asking people to select a checkbox if they’re willing to proceed. In an interview, ‘informed consent’ could mean providing a written ‘informed consent form’ for participants to sign or seeking verbal consent.
  • Provide details of support services for your participants in case any of them experience distress during your data collection and/or general project activities (e.g., interviews, training, workshops, focus groups and surveys).
  • Understand the demographics of your participants and what type of consent is needed (e.g., are translated materials helpful/necessary?)
  • Consider safety and the appropriateness of how and where you are collecting the data (family violence victim/survivors might need extra privacy protections).
  • Protect participants’ confidentiality.
  • Understand and respect the cultures, norms and roles participants have in their communities.
  • Keep things simple where you can. Not everything needs to be complex.
  • Maintain appropriateness, respect and relevance to the individuals, communities and organisations from whom data is being collected.
  • If you have sought approval from a formal ethics committee, do not commence any data collection until you receive an approval number.

For further information, please refer to the Ethics section.

Suggested resources:

Check out information from the Australian Evaluation Society on ethics and informed consent.

Tip – Remember the importance of monitoring

Monitoring means collecting and using information about your project and activities to understand how you are progressing. When you collect data throughout your project, use it to assess whether you are on time, on budget and on track to achieving your measures for success.

Data management

Data needs to be securely stored, processed and shared so that it can be accessed only by authorised people. Data management needs to be relevant and planned for according to the requirements, size and complexity of your project.

Create a checklist for data management that includes consideration of some or all of the following:

  • What format will the data be collected in? When and where will it be stored? Will data be visual, audio, descriptive, hard-copy form or electronic?
  • Will the data be organised by location, format or time? Will it be easy to understand and access?
  • Will the data be securely stored and available for the key project staff/stakeholders? Will passwords be needed for online data or specific files locked so data is not editable? Will the data be easily searchable by key project staff/stakeholders?
  • What is in place to control data quality? Are there protocols and procedures in place to ensure consistency? How will you minimise data entry errors or accidental deletions?
  • Who is responsible for managing the data? If protocols are in place, does everyone who uses the data understand them? How will data confidentiality be enforced?

Cleaning your data

Raw data is the form of data you collect in a survey or interview. It is ‘raw’ responses that have not been treated, organised, or amended by anybody else. It might also be contained in spreadsheets provided to you by another data collector.

An important part of preparing your data for analysis is making sure that it is clean. To ‘clean’ data means to make sure the data is correctly entered and ordered, that it is as free from errors as possible and that it is as consistent as it can possibly be(li).

Data cleaning applies to both quantitative and qualitative data. It might involve entering data into computer software (for example, typing interview notes into Microsoft Word so they are easy to read and rearrange), rearranging exported data or removing irrelevant data (for example, from survey data exported to Microsoft Excel), removing duplicates (for example, survey responses that appear multiple times), or checking the accuracy of the data collected.
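
As an illustration, here is a small sketch of cleaning a survey export, using Python and the pandas library (you can do the same steps by hand in Excel). The file name and column name are hypothetical examples; adapt them to your own export.

```python
# A small sketch of cleaning survey data exported to a CSV file.
# The file name and the "confident_after" column are hypothetical;
# adapt them to your own export. Requires the pandas library.
import pandas as pd

raw = pd.read_csv("survey_export.csv")

# Remove duplicate responses (e.g., the same submission appearing twice).
clean = raw.drop_duplicates()

# Drop rows where every answer is blank.
clean = clean.dropna(how="all")

# Tidy up inconsistent text answers, e.g. stray spaces and mixed capitalisation.
clean["confident_after"] = clean["confident_after"].str.strip().str.lower()

# Keep the cleaned copy separate from the raw export.
clean.to_csv("survey_export_cleaned.csv", index=False)
print(f"{len(raw)} raw responses reduced to {len(clean)} cleaned responses")
```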

Tip – Keep in mind that raw data is not information

The data you have collected is most likely raw or unprocessed data. It is not information yet. You need to identify the type of data you have and prepare it so that it is ready to be processed, which will turn it into information about your primary prevention project.

After your data is collected and cleaned, you will need to organise it so that it can be analysed and interpreted to tell you about what you have achieved.

Data analysis

Data analysis is the process of examining the information you have collected to reveal relationships and patterns that help you answer your evaluation questions.

Tools for analysis

There are many tools for analysing different kinds of data. Putting data into categories that you can use for analysis is usually called coding or theming. Widely available software packages like Microsoft Excel and Microsoft Word can be used for recording and organising numbers, text, and other data. The Microsoft suite is usually all you require for a small-scale, internal evaluation. You may also have heard of software packages like NVivo and SPSS, which allow more advanced coding of your data. While these can be good to try out (especially if you have lots of evaluation going on in your organisation), you can still undertake a quality analysis without them.

Coding and analysing data

Different approaches to coding and analysis are used for quantitative and qualitative data. In this section we provide some general information about approaches to analysis.

Quantitative data analysis

Quantitative data usually comes in the form of survey responses and administrative data, such as lists contained in spreadsheets. In many cases, you don’t need specialised skills to analyse this kind of data: a basic descriptive analysis is usually sufficient. This generally means using simple calculations to count survey responses or other administrative data items, or to calculate percentages, rates or averages.

Primary prevention projects are usually interested in understanding how change has occurred – that is, how the project you are working on contributed to a change. So, the numbers you calculate are helpful to show different types of change, for example increases or decreases in something.

For example, you may want to know whether people who have attended your workshop feel increased confidence about taking bystander action. You might do this by counting the raw number of people who were confident before and after (and those who weren’t) and applying a percentage calculation to the figures.
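
Here is a minimal sketch of that kind of descriptive calculation, using Python for illustration (the same counts and percentages can be done in Excel). The yes/no responses are hypothetical example data.

```python
# Minimal sketch: a descriptive before/after calculation.
# The yes/no answers below are hypothetical example data.
before = ["no", "no", "yes", "no", "yes", "no", "no", "yes"]
after  = ["yes", "yes", "yes", "no", "yes", "yes", "no", "yes"]

def percent_confident(responses):
    """Percentage of participants answering 'yes' to the confidence question."""
    return 100 * responses.count("yes") / len(responses)

print(f"Confident before the workshop: {percent_confident(before):.0f}%")
print(f"Confident after the workshop:  {percent_confident(after):.0f}%")
```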

Note: if you are interested in more complex analysis, you can explore inferential statistics, which are used in more formal or bigger-picture evaluations. Inferential statistics use information from a smaller group (a sample) to draw conclusions about a bigger population, or about the likelihood that one thing caused another.

Qualitative data analysis

Qualitative data sources include interview and/or focus-group recordings and transcripts. The point of reading through qualitative data (words) is to find themes or groups of information that help you measure your indicators and that speak to your outcomes. You need a system or method for coding (theming) qualitative data, just as you do for quantitative data.

Before using a formal coding process, it can be useful to do a first read-through of your materials to become generally familiar with them. Decide whether you will take a deductive or inductive approach to qualitative coding. Taking a deductive approach means reading through the data with some precise, pre-determined themes or categories in mind; you look for these themes in the material and record each instance. With an inductive approach, you might have some broad or general topics in mind, and as you read through the material you let the themes ‘speak’ to you. For example, you might be looking for participants’ experiences of what they found most powerful in a workshop. These answers might be similar or different. In either case, you are categorising what you find.
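
If you take a deductive approach, one simple way to record each instance of a pre-determined theme is a tally. The sketch below shows this in Python purely as an illustration; the themes, keywords and interview notes are hypothetical, and most small-scale coding is done by careful reading in Word or Excel rather than keyword matching.

```python
# Minimal sketch of a deductive coding tally: count how often pre-determined
# themes (represented here by keywords) appear across interview notes.
# The themes and notes are hypothetical examples; real coding also relies on
# careful reading and judgement, not just keyword matching.
interview_notes = [
    "The role-play was powerful. I feel more confident calling out sexist jokes.",
    "Hearing other participants' stories built my confidence to speak up at work.",
    "The facilitator made the space feel safe, which helped me share.",
]

themes = {
    "confidence": ["confident", "confidence"],
    "safety": ["safe", "safety"],
    "peer learning": ["other participants", "stories"],
}

tally = {theme: 0 for theme in themes}
for note in interview_notes:
    text = note.lower()
    for theme, keywords in themes.items():
        if any(keyword in text for keyword in keywords):
            tally[theme] += 1

for theme, count in tally.items():
    print(f"{theme}: mentioned in {count} of {len(interview_notes)} notes")
```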

While counting is a part of qualitative analysis, it doesn’t serve the same purpose as in quantitative analysis. Qualitative enquiry is useful for ‘surfacing’ information – for example, hearing about people’s experiences, their thinking about why something matters, or what made something difficult. Counting the number of people who mention the same thing can be valuable, but it is not the only approach. For example, if you ask people to suggest ways of improving the delivery of your course, five separate suggestions from five separate people might all have value.

We have provided the following resources to support your understanding of coding and analysis.

External resources:

The resources and templates page includes links to external resources for data collection and analysis.

Endnotes

(xlviii) Simister, N. & Giffen, J. (n.d.) Baselines. INTRAC, Oxford. Accessed 5/7/22. Available at: [https://www.intrac.org/wpcms/wp-content/uploads/2016/06/Monitoring-and-Evaluation-Series-Baselines-10.pdf]

(xlix) Bamberger, M., Rugh, J. & Mabry, L. (2006). Strengthening the evaluation design and the validity of the conclusions. Sage Publications, Thousand Oaks, CA.

(l) World Health Organisation (2013) WHO Evaluation Practice Handbook, p. 54. Accessed 6/7/22. Available at: [https://apps.who.int/iris/bitstream/handle/10665/96311/9789241548687_eng.pdf?sequence=1]

(li) Better Evaluation (2014) Data Cleaning. Accessed 7/7/22. Available at: [https://www.betterevaluation.org/en/evaluation-options/data_cleaning]