New grant project enhances teachers' toolbox
As a graduate student in educational psychology, Laine Bradshaw developed a psychometric method to assess students' comprehension of key concepts. Now, thanks to a grant from the U.S. Department of Education's Institute of Education Sciences, she is able to apply her work to classrooms, combining research with practice.
The $1.4 million grant, "Diagnostic Inventories of Cognition in Education," funds the development of a new kind of assessment that figures out which misconceptions students have based on which incorrect answers they pick, information that is typically disregarded in assessments.
The aim of the assessment is to give teachers a useful resource to understand where their students are struggling and address the root of the problem.
Bradshaw, an associate professor in the University of Georgia College of Education's department of educational psychology, and her research team are working to develop data science tools that will let students take an online assessment and will estimate, from their responses, the probability that each student holds a particular misconception. The tools analyze patterns of student responses to pick up on specific types of reasoning students show across a set of meaningful problems. These bits of reasoning are smaller than what is typically assessed in schools.
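To make the general idea concrete, here is a simplified sketch, with invented items and probabilities, of how a pattern of answer choices could be turned into a misconception probability. It uses a naive Bayes-style update and is only an illustration of the principle, not the psychometric models Bradshaw's team is actually developing.

```python
# Illustrative sketch only: a naive Bayes-style update that turns a student's
# pattern of answer choices into the probability that they hold one specific
# misconception. Items, choices, and probabilities are hypothetical; the
# project's actual tools rest on more sophisticated psychometric models.

# For each item: (P(choice | does NOT hold misconception), P(choice | holds it)).
ITEMS = [
    {"A": (0.10, 0.70), "B": (0.75, 0.15), "C": (0.15, 0.15)},
    {"A": (0.65, 0.20), "B": (0.20, 0.65), "C": (0.15, 0.15)},
    {"A": (0.15, 0.10), "B": (0.10, 0.75), "C": (0.75, 0.15)},
]

def misconception_probability(responses, prior=0.5):
    """Update P(holds misconception) from a student's answer choices via Bayes' rule."""
    p_no, p_yes = 1.0 - prior, prior
    for item, choice in zip(ITEMS, responses):
        like_no, like_yes = item[choice]
        p_no *= like_no
        p_yes *= like_yes
    return p_yes / (p_yes + p_no)

# A student who keeps picking the misconception-linked distractor:
print(misconception_probability(["A", "B", "B"]))   # close to 1
# A student whose wrong answers look like unrelated slips:
print(misconception_probability(["C", "C", "A"]))   # much lower
```

The point the sketch captures is that each wrong answer carries information: choices linked to a misconception push its probability up, while other wrong answers do not.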
The purpose of assessing these specific bits of reasoning, Bradshaw said, is to give teachers and students feedback on thinking that is not easy to see during typical classroom activities.
Identifying the misconception helps teachers understand how the student is reasoning. And sometimes, Bradshaw said, it's better for students to have incorrect ideas about a concept than to have no ideas about it at all.
"It's not just if they understand a concept or not, but rather when they don't understand a concept—why?" said Bradshaw. "Sometimes it shows thoughtful student thinking to have a misconception. The thinking may be logical and even sophisticated, but it's not quite right. Detecting a misconception is different from detecting a random guess or an error."
For her dissertation at UGA, Bradshaw developed a psychometric method that detected misconceptions in mathematics by analyzing which wrong answer a student picked, not simply whether the response was right or wrong. The test was designed, she said, so that possible misconceptions were represented among the answer choices; other choices represented errors students might make in their calculations. While assessments designed this way are not uncommon in educational research for evaluating interventions, the grant project pushes the field forward by combining this style of assessment with new data science methods to increase the validity and trustworthiness of the results.
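As a concrete, invented illustration of that design, the item below tags each answer choice with what choosing it would suggest: one distractor reflects a well-known fraction-addition misconception, while the others reflect calculation slips. The item is hypothetical and not drawn from Bradshaw's assessments.

```python
# Hypothetical example of a diagnostically designed item: each wrong answer
# is written to correspond to a specific way of thinking, so the choice a
# student makes carries information beyond right/wrong.
item = {
    "stem": "What is 1/2 + 1/3?",
    "choices": {
        "A": {"answer": "5/6", "tag": "correct"},
        "B": {"answer": "2/5", "tag": "misconception: add numerators and add denominators"},
        "C": {"answer": "4/6", "tag": "calculation error: converted 1/2 to 2/6"},
        "D": {"answer": "1/6", "tag": "calculation error: subtracted instead of added"},
    },
}

def diagnose(choice):
    """Report what a student's answer choice suggests about their reasoning."""
    return item["choices"][choice]["tag"]

print(diagnose("B"))  # misconception: add numerators and add denominators
```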
The new grant project builds upon Bradshaw's past work to develop a diagnostic assessment about probability and chance in collaboration with researchers from North Carolina State University and Research Matters. Probability concepts are foundational mathematical concepts taught to middle-grades students, Bradshaw said, although misconceptions about probability and chance are not uncommon among adults as well. One common misconception about probability and chance is known as "the gambler's fallacy."
Imagine you are on a streak of betting losses. The gambler's fallacy says the odds on the next bet are in your favor: you're due to win. But because every roll of the dice is an independent event with the same odds, the idea that your luck is due to turn around is false.
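A quick simulation shows why (the definition of a "win" here is arbitrary, chosen only for illustration): the chance of winning the roll that follows a losing streak is the same as the chance of winning any other roll.

```python
import random

random.seed(0)

WIN_FACES = {5, 6}          # arbitrary definition of a "win" for this example
TRIALS = 200_000
STREAK = 3                  # length of the losing streak we condition on

wins_overall = 0
after_streak = 0
wins_after_streak = 0
recent = []                 # outcomes (win/loss) of the last few rolls

for _ in range(TRIALS):
    roll = random.randint(1, 6)
    win = roll in WIN_FACES
    wins_overall += win
    # Did this roll follow three straight losses?
    if len(recent) == STREAK and not any(recent):
        after_streak += 1
        wins_after_streak += win
    recent = (recent + [win])[-STREAK:]

print(f"P(win) overall:             {wins_overall / TRIALS:.3f}")
print(f"P(win | 3 straight losses): {wins_after_streak / after_streak:.3f}")
# Both come out near 0.333: a losing streak does not make the next win more likely.
```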
Reasoning about probability is key to making good decisions in life, but also to understanding statistics, a relatively recent addition to middle-grades mathematics curricula. Because teachers are less familiar with teaching statistics than with traditional mathematics topics, Bradshaw and her team are developing a web-based assessment to support instruction in this area and making it freely available to all teachers.
"We're trying to develop the tools that essentially students can log on and take a assessment, and it will tell you the probability that you have this misconception," Bradshaw said. "So if a student doesn't do well, the teacher can look at the assessment and say, 'Oh, they are struggling with the gambler's fallacy."
Bradshaw said her goal is to give teachers new information from an assessment. Often, teachers become frustrated with assessments because there is no nuance to the results—if a student isn't performing well in, say, geometry, a poor score only reinforces that.
A better way, said Bradshaw, would be to help teachers identify the specific concepts students don't understand that are dragging down their overall performance in the subject.
"So that's what we're trying to do—to give teachers something they can't already see," said Bradshaw. "We're trying to help develop tools that are sophisticated under the hood but quick and easy for students and teachers to use."