MIT researchers analyzed more than 800,000 online school reviews using natural language processing and found that the reviews were largely tied to schools’ test scores – a measure that is closely linked to race and family income and tends to amplify inequities in educational opportunity – rather than to measures of student growth, which reflect how well schools actually help students learn.
“We hope that parents who learn about our study will be careful when reading school reviews and take what they read with a grain of salt, triangulating subjective ratings with a range of indicators that attempt to capture what is really happening inside schools,” says Nabeel Gillani, a graduate student and research assistant at MIT’s Media Lab and lead author of the study, which was published this week in a peer-reviewed journal of the American Educational Research Association.
Gillani and his fellow researchers — his faculty advisor, Professor Deb Roy; MIT graduate Eric Chu; Media Lab scientist Doug Beeferman; and Rebecca Eynon of the University of Oxford — analyzed approximately 830,000 reviews of more than 110,000 publicly funded K-12 schools across the United States. The reviews were posted by parents between 2009 and 2019 on the school information website GreatSchools.org. GreatSchools, which provided the review data for the study, has updated its rating systems in recent years in an effort to provide information that reduces inequities in educational opportunity.
The study is the first of its kind to characterize these reviews. Gillani, whose volunteer work involves helping families unfamiliar with American public education choose high-quality schools for their children, first thought of the study after a phone call with a mother who had recently immigrated to the United States. As the mother read online reviews to choose a school for her daughter, Gillani says he was struck by one school in particular, whose reviews were very positive, “but considering the various quality indicators, the school itself did not seem to be a good, high-quality school” in terms of supporting students’ learning and development.
“Since then, I have been interested in what information reviews contain about various measures of school quality. What do they say about the quality of education children have access to in their schools?”
Gillani says these questions “fit well with our research group’s goal of using machine learning and natural language processing to understand patterns of human discourse and behavior.”
To conduct the study, the authors linked GreatSchools reviews to the Stanford Education Data Archive and census data on race and socioeconomic status by neighborhood. Preliminary analyses showed that the reviews were posted mainly by parents from urban schools and those serving wealthier families. The authors then developed machine learning models that used the language of the reviews to predict various school characteristics, including test scores, measures of student growth, the percentage of white students, and the percentage of students receiving free or reduced-price lunches. They found that the models predicted test scores and school demographics quite accurately but were virtually unable to predict student growth, suggesting that the information contained in the reviews was closely tied to schools’ test scores and racial and demographic makeup rather than to how much students learn.
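To give a concrete sense of this kind of setup, the sketch below shows one generic way to test whether review language predicts different school outcomes, using TF-IDF features and ridge regression from scikit-learn. It is an illustration under stated assumptions, not the authors’ pipeline: the input file, column names, and model choice are hypothetical stand-ins for the GreatSchools, SEDA, and census data used in the study.

```python
# Minimal sketch of the prediction setup described above, NOT the authors' pipeline.
# Assumes a hypothetical CSV with one row per school containing pooled review text
# and outcome columns; the actual study data are not reproduced here.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

schools = pd.read_csv("school_reviews_aggregated.csv")  # hypothetical file

# Turn each school's pooled review text into sparse n-gram features.
vectorizer = TfidfVectorizer(max_features=20000, ngram_range=(1, 2), min_df=5)
X = vectorizer.fit_transform(schools["review_text"])

# Fit one regression per outcome and compare held-out R^2. Outcomes that review
# language predicts well (test scores, demographics) versus poorly (student
# growth) would mirror the pattern reported in the study.
outcomes = ["test_score", "student_growth", "pct_white", "pct_free_reduced_lunch"]
for outcome in outcomes:
    model = Ridge(alpha=1.0)
    r2 = cross_val_score(model, X, schools[outcome], cv=5, scoring="r2").mean()
    print(f"{outcome}: cross-validated R^2 = {r2:.2f}")
```

The comparison of predictive accuracy across outcomes, rather than any single model’s fit, is what carries the study’s point: if review text explains test scores and demographics far better than growth, the reviews are conveying the former rather than the latter.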
To better understand these connections, the researchers then examined the decision-making processes of the models, identifying the words and phrases most closely associated with school performance measures and demographics. Many of these words and phrases – such as “PTA”, “emails”, “private school”, and “we” and “us” versus “me” and “mine” – were more closely associated with higher-performing, whiter, and wealthier schools. According to Roy, a professor of media arts and sciences, director of MIT’s Center for Constructive Communication, and executive director of the MIT Media Lab, these associations reflect documented trends in education showing that parents at such schools often have more time and comfort to engage in parent groups, greater digital connectivity, more educational opportunities, and are more likely to lead two-parent households. “Our study shows how machine learning techniques applied to large-scale datasets describing human thoughts and behavior can reveal subtle patterns that might otherwise be difficult to detect,” says Roy.
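One simple way to probe a model in the spirit of this analysis is to inspect which n-grams carry the largest weights for each outcome. The snippet below continues the earlier sketch and assumes the same hypothetical `X`, `schools`, and `vectorizer` objects; a linear model’s coefficients stand in for whatever interpretation method the authors actually used.

```python
# Continuation of the sketch above; a simplified stand-in for the authors' method.
import numpy as np
from sklearn.linear_model import Ridge

feature_names = np.array(vectorizer.get_feature_names_out())

def top_terms(outcome, k=15):
    """Fit a ridge model for one outcome and list the highest-weighted n-grams."""
    model = Ridge(alpha=1.0).fit(X, schools[outcome])
    order = np.argsort(model.coef_)[::-1]  # largest positive weights first
    return feature_names[order[:k]]

# Terms most predictive of higher test scores versus a larger share of students
# receiving free or reduced-price lunch (column names are hypothetical).
print("test_score:", top_terms("test_score"))
print("pct_free_reduced_lunch:", top_terms("pct_free_reduced_lunch"))
```

Listing the top-weighted terms per outcome is what surfaces patterns like “PTA” or “private school” clustering around higher-scoring, wealthier schools, though more complex models require other interpretation techniques.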
The results led the authors to conclude that “parents who rely on school reviews may be accessing information and making decisions based on biased perspectives that widen achievement gaps.”
If reviews reflect test scores and demographics, and parents use them to decide where to send their children to school, they may even lead schools to continually prioritize high test scores over student growth and development, Gillani argues.
“In an education system where test scores are notoriously tied to race and income, one concern is that reviews tied primarily to test scores could influence parent and school decision-making in ways that increasingly skew the demographics of a school by race and income,” he says. “As with any marketplace, consumer reviews and preferences are likely to have a major impact on what kinds of products are ultimately created.”