GP UWC

Why We Can’t (Yet) Discuss Certain Methods, Impressions, and Findings

Updated: May 19, 2022

Originally published March 30, 2020

Our team began its study of the United World Colleges movement and other internationally oriented, mission-driven secondary schools in August 2017. Now well into the third year of our research into how such schools affect their students and, through them, the wider world, we have collected a wealth of data in multiple forms and from multiple stakeholders. We have surveys completed by alumni and by two cohorts of students. From UWC specifically, we have gathered observational data from classrooms and activities, along with interview data from hour-long conversations with students, alumni, and faculty/administrators.

With so much information at our fingertips, people outside our team, including school stakeholders and members of the general public, have often asked whether we can share details about what the study is probing, what our impressions are so far, and what we have observed in our analysis of the data to date.

There are certain parts of our research design and methodology that we have been very open to sharing, in accordance with standard robust research practices. These aspects of the project, which we have communicated to many people, include:

· The mixed-methods modes of our data collection, which allow us to draw on multiple sources (surveys, interviews, observations) and inherently different types of information (quantitative and qualitative)

· The types and numbers of participants in the research (over 3,000 students, over 8,000 alumni, and nearly 200 staff members)

· The participation of all 18 UWC schools as well as 11 other diverse mission-based secondary schools in the study

· An overall timeline of the project (results are expected at the conclusion of the 2021 calendar year)

· Partial information about the types of overall questions we are probing in our instruments and their relation to our research questions (we seek to understand the impact of educational experiences in the short-term on students and in the long-term on alums, as well as whether and how the individuals in our study are making an impact on wider society)

However, we have also been very careful in deciding what to reveal about the study and when. For example, we have not shared much more about our motivations for asking certain questions of participants, or about how we code particular themes from our interviews. Some frequent questions we have received in this vein include:

· When you sit in a classroom, what specific behaviors or events are you writing down as you are taking your notes?

· Why did you ask me about ______ on my survey? How will you judge my answer to this question?

· Now that you’ve visited a number of schools, which one is doing the best at ______?

· Did I give the right answer to your interview question?

Occasionally, our reluctance to share more in response to the queries above has led to frustration, and perhaps even doubt about whether we are conducting the study properly. But our reluctance to answer is not due to secretiveness, nor to a lack of familiarity with or thought about the question. Rather, we have what we consider very solid reasons why we should not and cannot speak further about our data just yet.

First, while much of the data has been collected, data collection is not yet complete. We are still surveying students, and we are still carrying out interviews and site visits. Any information we share, no matter how insignificant or tentative it seems, has the potential to influence the data still to be collected and to corrupt the study from this point forward. For example, were we to tell schools that one particular program or another seemed especially important to students, the schools could ramp up that program in the next academic year, thereby potentially changing our survey results from the second participating student cohort. Likewise, if we were to share the meaning or motivation behind a particular question we ask participants, word could spread, changing the way future participants answer. We need to avoid situations like these at all costs to maintain the integrity and neutrality of the research.

Second, our data analysis is not complete; in many ways it is just beginning. We have not carried out the full scope of analysis on any particular question or dimension of the data, and we have the remaining two years of the study ahead of us to perform statistical and qualitative analyses. We therefore cannot yet speak with any authority about what trends might emerge when we do those analyses. Even where we have impressions of what is happening, the data might prove those impressions wrong, or might nuance them in ways we could not anticipate without the full analysis in hand. It is therefore premature for us to provide “results.”

Third, and related to the two points above, it is our intention to consider our data in totality. This research is complex; as described, it includes multiple populations and modes of data collection, and the various pieces are meant to inform and direct one another. If the survey results point us statistically to evidence of certain dispositional outcomes of educational experiences, we will turn to our qualitative survey questions (and, in the case of UWC, interview transcripts) for more information about what people say about those outcomes. The process can also run in the opposite direction: if our coding of interviews demonstrates that UWC interviewees speak frequently and with uniformity about how “impact” on society occurs, we will look to our survey results to see whether a similar trend appears there. Our data sources are meant to “speak” to each other, and until analysis is further along, we do not have a full “picture” to evaluate for sensibility and coherence. Longitudinal studies function best when researchers can analyze the totality of the data over time, and we cannot do that effectively until the whole pattern is spread in front of us.

At present, then, we continue to withhold some details about the meanings behind our specific protocols, the items we are looking for, our impressions, and our nascent “findings.” Yet as our team’s attention turns, day by day, from data collection and logistics toward data analysis, we will be poised to share more information publicly, especially in 2021, when data collection will be complete and we will be well into writing summary reports for our school partners.

Please stay tuned to this blog for future information about our analyses and results. We welcome questions and feedback regarding the study, and we ask for patience as we work towards releasing the results.
