Tag Archives: statistical analysis

Robo-graders like long words, not so big on intellectual coherence

When I glanced at the title of a recent New York Times piece on automated essay grading, “Facing a Robo-Grader? Just Keep Obfuscating Mellifluously,” I assumed it was just another fluffy popular science article. Surely no serious organization would use a computer program to grade essays. Not long into the article, however, I discovered that the “robo-grader,” named the E-rater, was developed not by university scientists but by the Educational Testing Service — the organization that administers the GRE and the TOEFL, among other exams.

For now, E-rater only grades essays that are also read by a human grader. Though the grades given by humans and E-rater have been remarkably similar, Les Perelman, an MIT professor, has his reservations about the software. After a month of testing, he determined that E-rater favors long paragraphs and sentences, connecting words like “moreover,” and words with many syllables. Most troubling, E-rater cannot judge the truth or intellectual coherence of statements in an essay, a weakness Perelman exploits to hilarious effect in an example essay.
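To get a feel for how shallow these signals are, here is a toy sketch of the kinds of surface features Perelman describes. This is emphatically not ETS's actual algorithm — the feature names, connective list, and syllable heuristic are all my own illustrative assumptions:

```python
import re

# Hypothetical connectives of the sort Perelman says the grader rewards.
CONNECTIVES = {"moreover", "however", "furthermore", "consequently"}

def count_syllables(word):
    """Crude syllable estimate: count runs of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def surface_features(essay):
    """Compute a few surface statistics; says nothing about truth or coherence."""
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return {
        "avg_sentence_length": len(words) / max(1, len(sentences)),
        "avg_syllables_per_word": sum(count_syllables(w) for w in words)
                                  / max(1, len(words)),
        "connective_count": sum(1 for w in words if w.lower() in CONNECTIVES),
    }

feats = surface_features(
    "Moreover, the quandary proliferates. Consequently, veracity obfuscates."
)
print(feats)
```

A nonsense essay stuffed with polysyllabic words and connectives scores high on every one of these features, which is exactly the gap Perelman's example essay drives through.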

John Snow and the cholera outbreak of 1854: Revealing an unintuitive truth using data

Truth reveals itself to us in many different ways. Sometimes, it takes the form of an amazing revelation, an eye-catching explosion of color, or a terrifying act of nature. Other times, it takes on a more subtle form, discovered only through a combination of patience, knowledge, and determination.

Turn the clock back a few hundred years, and you would find a culture that did not have the sophisticated data analysis techniques we use today to uncover the truths of the natural world. Claims were often backed up by “common sense”. Society lacked a way of quantifying information and letting the data speak for itself. But this mentality began to change in the 1800s, marking an important shift in our scientific culture that continues to this day. While the process spanned several generations and countless individuals, one of the more interesting stories is that of a man named John Snow.

Psi Your Mind?

Earlier this year, Daryl Bem, a professor at Cornell University, published a paper on Psi phenomena (also known as psychic phenomena). Bem’s paper was published in the premier journal of social-personality psychology, the Journal of Personality and Social Psychology (JPSP). In the paper, Bem presents results from eight experiments in which he finds evidence for precognition (conscious cognitive awareness of future events) and premonition (affective apprehension about future negative events). The results have shocked our field!

The experiments themselves are among the most clever I’ve read and are, at least in my mind, evidence of creative innovation. The experiments use normal psychological interventions (e.g., giving participants a list of words to memorize and then testing their memory for the words) in reverse order. Thus, in a memory study, participants’ memory of words is tested before they learn the words. In another experiment, participants choose between sets of neutral images, with the expectation that participants will choose images that avoid subliminally primed negative stimuli. Once again, though, participants are primed after they choose between the stimuli.

Dead salmon finds home. Still dead.

It isn’t breaking news, and it’s hardly science. Still, considering last month’s discussion of statistical rigor and the recent kerfuffle over a paper in a well-respected psychological journal purporting to show extrasensory perception, now is the perfect time to revisit the dead salmon study.

In 2005, a graduate student in the lab of Abigail Baird at Dartmouth College needed to test his fMRI protocols for an upcoming experiment. Having already tested a pumpkin and a Cornish game hen, the obvious next step was to scan a whole salmon from the local supermarket.