12 April 2010

April 2010 Grading Roundup

This month's cavalcade of grading news comes a bit early. Let's see what's caught in the net, shall we?


The Chronicle of Higher Education has an article on Outsourced Grading. There is now an option for professors to use "expert assessors" in India to provide constructive feedback on student papers. The outsourcing has grown out of professors' frustration that students aren't producing high-quality writing, while the teachers themselves do not have time to provide feedback to large numbers of students (e.g., 1,000). Even teaching assistants are not enough. For those using the service, there is an opportunity to review (and adjust) the comments, but I somehow doubt that anyone who does not have enough time to read 1,000 papers is going to have time to read comments on 1,000 papers. You'd have to take a sample and call it a day. In the meantime, if the purpose of doing this is to support student improvement, then the professor would still need to do two things: teach students how to use the feedback and adjust his/her instruction to help.

My favourite quote from the article is at the end:
"People need to get past thinking that grading must be done by the people who are teaching," says Mr. Rajam, who is director of assurance of learning at George Washington University's School of Business. "Sometimes people get so caught up in the mousetrap that they forget about the mouse."
Really, I'm shaking my head in disbelief. Dude, if anyone is forgetting about the mouse, er, "student," it's you. I can understand that it is unreasonable for one person to score 1,000 papers at a time and provide appropriate feedback, but it is also ridiculous to assume that freeing professors "from grading papers [so they] can spend more time teaching and doing research" is going to benefit students.

In other college news, Loyola Law School in Los Angeles has been retroactively changing grades (back to 2007) in order to alter its scale and make its graduates more competitive. It's not a major shift, but given the continuing conversations about "grade inflation," it is intriguing that a college would choose to boost its students' transcripts (as opposed to looking at what goes into the grades or supporting students in meeting expectations). You can read more at the LA Times.


Speaking of grade inflation, have a look at this graph from I Love Charts:

There's no explanation of its origins, and I'm not sure how meaningful the information is. If I'm reading it correctly, the average GPA rises by about 0.5 of a grade point (roughly 13% on a 4.0 scale) over 40 years. Is that a significant difference? I'm not going to pull out my stats here, but someone else is welcome to whip out a chi-squared analysis on the underlying grade distributions. I also wonder about the change in college populations over that time frame. Draw your own conclusions; I just wanted to offer it up for consideration.
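For the curious, here is a minimal sketch of what such a test might look like, assuming Python with SciPy installed and using entirely invented letter-grade counts for two eras (none of these numbers come from the chart):

from scipy.stats import chi2_contingency

# Hypothetical counts of letter grades awarded in two eras (A, B, C, D/F).
# These figures are made up purely for illustration.
grades_1968 = [250, 400, 300, 50]
grades_2008 = [420, 380, 170, 30]

# Chi-squared test of independence on the 2 x 4 contingency table
chi2, p_value, dof, expected = chi2_contingency([grades_1968, grades_2008])
print(f"chi-squared = {chi2:.1f}, p = {p_value:.3g}, dof = {dof}")

Even a tiny p-value would only tell us the two distributions differ; it would say nothing about why, whether easier grading, different students, or a different mix of courses.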


Finally, Science News has a story on how Homework Makes the Grade. Surprise, surprise: physics students who actually do the homework/practice score higher than students who just copy it from others.
Students at MIT and other universities commonly complete homework using an online system, giving Pritchard and his colleagues a wealth of data to analyze. The team tracked homework for four terms of introductory, calculus-based physics, a requirement for all MIT undergraduates. Since all of the students’ entries were time-stamped, Pritchard and his colleagues knew how quickly the problems were completed once the question appeared on the screen.
In the team’s analysis, three clusters emerged: One group of students solved the problems about 10 minutes after the problem first popped up, another answered a day or two later, and a third typically answered correctly in about a minute. Because the online system presents problems one at a time, it precludes working out all of the answers ahead of time and entering them all at once.
“Our first reaction was ‘Wow, we must have some geniuses at MIT,’” Pritchard says. The team soon realized that the answers in this quick-solving group were entered faster than the time it takes students to read the question, raising suspicions that these students had a cheat sheet of copied answers.
Equating speedy answers with copying, the team concluded that about 10 percent of the students copied more than half of their homework, about 40 percent copied 10 to 50 percent of their homework, and about half the students copied less than 10 percent of their homework. By the end of the semester, students who copied 50 percent or more of their homework earned almost two letter grades below students who didn’t copy very much, the team found. Heavy copiers were also three times more likely to fail the course.
Other patterns emerged from the data as well. Students who copied were much more likely to put off the majority of their homework until the last minute. And copying rates increased dramatically after the first midterm.
In the study, the heaviest copiers were male, and although most of the students in the classes were freshmen and had yet to declare a major, subsequent analyses turned up an interesting trend: “Copying homework is a leading indicator of becoming a business major,” Pritchard says.
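Out of curiosity, here is a rough sketch of the sort of time-based bucketing the article describes. Everything in it is hypothetical: the thresholds, the student data, and the function are mine for illustration, not the researchers', and the actual MIT analysis was surely more sophisticated.

from statistics import median

def classify(delays_in_seconds):
    """Label a student by the median delay between seeing a problem
    and submitting a correct answer."""
    m = median(delays_in_seconds)
    if m < 90:            # faster than the problem can even be read: suspect copying
        return "likely copier"
    elif m < 3600 * 24:   # answered within a day: worked the problem
        return "worked promptly"
    else:                 # a day or more later
        return "late worker"

# Hypothetical students: lists of per-problem submission delays, in seconds
students = {
    "A": [45, 60, 30, 50],          # suspiciously fast
    "B": [600, 900, 700, 1200],     # roughly 10-20 minutes each
    "C": [90000, 120000, 100000],   # a day or two later
}
for name, delays in students.items():
    print(name, "->", classify(delays))

The point is simply that once every submission carries a timestamp, separating "answered faster than the question can be read" from "actually worked the problem" is a pretty small computation.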
As I've written here many times, as long as grades are valued over learning, you will have cheating. This does not mean that homework is evil. It doesn't mean that students don't need practice. What it does mean is that we as teachers need to make it clear to students why we're assigning the work.


That's all the grading news fit to print for the month of April. There appears to be a dearth of K-12 information at the moment, probably because it's testing season. I expect another fit of discussion as the year draws to a close in May.
