Thursday, September 04, 2008

Grade Inflation


The link in the title to this post will take you to an article in this week's Chronicle of Higher Education, "Just Say 'A': Grade Inflation Undergoes a Reality Check." Thomas Bartlett and Paula Wasley offer an in-depth examination of the current debates and recent efforts to rein in grade inflation. They note, for instance, that grade inflation was already being decried at Harvard in 1894, and that some academics today argue that the rising numbers of A's and cum laudes reflect increasing competence among students, based on rising standardized test scores.

But the issue has been debated at my school as well, along with the unfairness of some profs grading harder in one section than in another. We tried unsuccessfully to impose a mandatory curve on ourselves; the closest we came was a recommended grade distribution in courses with anonymous grading. The article lists many pressures working against such efforts, some of which came up in our own discussions:

1. Harder grades hurt recruitment and retention. Our discussions focused on the extra burden our students faced in interviewing for jobs, when they had to explain that Suffolk's median grades are one-half to a full grade point lower than those in comparable courses at other area schools, not because of lower performance but because of faculty efforts to grade tougher.

2. Pressure from students trying to keep grades up for scholarship requirements, or from parents helicoptering around them.

3. Students' evaluations of profs and complaints from students are factored into tenure and promotion decisions. And some profs feel pressure not to grade tougher than their easier colleagues, both from students and from subtle peer pressure. One of the profs in the article is quoted: "You can write that A or B, and you don't have to defend it. You don't have students complaining or crying in your office. You don't get many low student evaluations. The amount of time that is eaten up by very rigorous grading and dealing with student complaints is time you could be spending on your own research."

The article also lists a number of efforts at the institutional level, and notes their varying success:

1) Publish grade distributions and try to standardize teaching practices in the large lecture classes (those that tend to be gatekeeper courses, I believe). The University of Colorado says this has dropped the College of Arts and Sciences' grade point average "...from an average of 2.99, in 2004, to 2.94, in 2007." On the other hand, at Cornell, similar efforts to publish the median grades by course resulted in students cherry-picking classes and profs for the easy A.

2) Either add class rank to the grade point data (law schools already do this), or add other data that helps the reader evaluate the student's performance relative to peers.

3) Modify the grade point calculation to help refine the information. This was the part of the article that was most intriguing to me:

Spurred by a report in 2000 that showed a steady rise in grades at Chapel Hill, a faculty committee proposed a GPA alternative called the Achievement Index, a weighted class-ranking system that measures a student's academic performance relative to those of classmates.

Andrew J. Perrin, a professor of sociology who is one of the system's backers, likens the index to the "strength of schedule" system used in basketball to compare teams from different leagues on the basis of wins and losses against common opponents. Similarly, he says, the Achievement Index formula takes into account not only how a student performs vis-à-vis others in the course section, but also how those classmates fare in all of their courses.

The index is a resurrected version of a 1997 proposal by a Duke University statistician, Valen E. Johnson, who found that positive student evaluations correlated with lenient grading. The algorithm he devised was intended to neutralize differences in professors' grading practices and remove incentives for students to choose easier courses to inflate their GPA's.

To date, this weighted "achievement index" has not been adopted at either Duke or Chapel Hill. Students object that it would increase competition among them, and professors seem hesitant because the index complicates the intuitive GPA and they are not certain how it would work in practice. At this point, Chapel Hill students will see their index ranking when they check grades online, but it will not appear on transcripts. Perhaps time will settle whether the information is helpful or even valid. An interesting solution to a complex problem.
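The article doesn't spell out Johnson's formula, and his actual proposal is considerably more sophisticated (as quoted above, it also weights how a student's classmates fare in all of their other courses). But the basic intuition of grading relative to the section can be shown with a toy sketch. Everything in the snippet below, the course names, the grades, and the simple within-section z-score averaging, is invented purely for illustration and is not the real Achievement Index:

```python
# A toy illustration of a "relative to classmates" index. This is NOT
# Valen Johnson's Achievement Index; the courses, grades, and the
# z-score averaging are made up to show the intuition only.

from statistics import mean, pstdev

# Hypothetical grade books: course section -> {student: grade points earned}
sections = {
    "Contracts (hard grader)": {"Ann": 3.3, "Bob": 3.0, "Cid": 2.7, "Dee": 2.3},
    "Torts (easy grader)":     {"Ann": 4.0, "Eve": 3.7, "Fay": 3.7, "Gus": 3.3},
}

def zscores(grades):
    """Standardize grades within one section (mean 0, std 1)."""
    mu, sigma = mean(grades.values()), pstdev(grades.values())
    return {s: (g - mu) / sigma if sigma else 0.0 for s, g in grades.items()}

# Plain GPA: average of raw grades, blind to how tough each section graded.
students = {s for sec in sections.values() for s in sec}
gpa = {s: mean(sec[s] for sec in sections.values() if s in sec) for s in students}

# Relative index: average of within-section z-scores, so a B+ from the
# hard grader can count for as much as an A- from the easy one.
section_z = {name: zscores(g) for name, g in sections.items()}
index = {s: mean(z[s] for z in section_z.values() if s in z) for s in students}

for s in sorted(students):
    print(f"{s}: GPA {gpa[s]:.2f}, relative index {index[s]:+.2f}")
```

In this crude version, a strong grade earned under the hard grader counts for roughly as much as a top grade under the easy one, which is the distortion the Achievement Index is meant to correct; the real proposal goes further by also accounting for how each section's students perform in their other courses.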

The image is courtesy of http://www.wales.nhs.uk/sites3/documents/582/grade%20A.jpg
