Wednesday, April 25, 2012

News that isn't

Yikes.  I know it's not a particularly professional response to what I'm reading this morning, but it's honest.  I guess I could go with 'wow' or 'how frustratingly short-sighted,' but I think I'll stick with the mono-syllabic expression of shock and alarm that preserves the element of humor.  A bit of humor and a strong dose of get-a-grip seem helpful when reading that tools can do some things better than people (gasp!) and how that's all well and good right up until the trite we-can't-let-machines-replace-us fear is invoked.  

It may be the slightest bit hypocritical and/or short-sighted to be thankful for a hammer (which works far better than the heel of my shoe...) or a screwdriver (try using a substitute--it's not pretty) and bemoan tools that specifically--and selectively--affect my own profession.  What has me a bit ruffled this morning is the NPR article which opens with this paragraph:
Computers have been grading multiple-choice tests in schools for years.  To the relief of English teachers everywhere, essays have been tougher to gauge.  But look out, teachers:  A new study finds that software designed to automatically read and grade essays can do as good a job as humans--maybe even better.
It doesn't get a lot better from there, as the NPR article continues to sound the alarm about the tool, how students might game the tool, and the danger(s) of using the tool to replace both good teaching and good teachers.  Credibility is lent to the NPR article through judicious quotes from (with requisite links to) a New York Times article about a study conducted at the University of Akron.

The scary part is this:  The NPR story was largely based on an interview with the author of the New York Times article.  But only the reader (not the listener...) who takes the time to read the NPR article, follow the links to the New York Times article, and then follow those links to the original source will find this:
Shermis, the lead author of the Akron study, says thrift-minded administrators and politicians should not take his results as ammunition in a crusade to replace composition instructors with...robots.  Ideally, educators at all levels would use the software 'as a supplement for overworked [instructors of] entry-level writing courses, where students are really learning fundamental writing skills and can use all the feedback they can get.'
At this point, I have so many browser windows open (love this tool!) that I'm having a bit of trouble keeping them in the order that makes sense to me.  But by now I also have access (because of this amazing tool...) to the original study conducted at the University of Akron.

I can read the study for myself and draw my own conclusions, conclusions which are likely to be far more similar to those of the lead author than those of the NPR writer who does not quote or reference the original work--at all.  Yikes.

Friday, April 6, 2012

Fluffy reads

I'm trying to keep an open mind about virtual books and to listen respectfully to those with whom I disagree.  All too willing to accept that my preference for books (the ones with actual paper in them) may be rooted in emotion or history, I've been pretty quiet about those preferences.  Until now.

As reading levels of American high school students continue to fall, it's time to ask when we accepted the fallacy that reading is supposed to be easy and fun.  Light reading for pleasure while on holiday, sure.  But all reading?  I don't think so.

The best writing (of the caliber of Shakespeare, Plato, Dante, Socrates, or Homer) is intended to challenge the reader, to make the reader think, to force the reader to The Oxford English Dictionary or another study aid, or to require the reader to tackle a paragraph s-l-o-w-l-y one or more times before the dense text becomes clear.  Great writing takes work, both for the author and for the reader.  And it's not the kind of work that lends itself to handy electronic devices far better suited for popular contemporary fiction.

Most American high school and college students write poorly, largely because they read poorly.  And the need for curriculum to develop 'critical thinking' would be diminished, if not eliminated, by requiring students to read prose and poetry above their grade level, grapple with the complexity of what they are reading, and explain the meaning--in writing. That's critical thinking.  The critical thinking employers want.  And learning to read Dante well helps with reading--and writing--case law, historical documents, philosophy, learning objectives, short stories, instructional manuals, position papers, and annual reports.

Consider, for example, Bram Stoker's Dracula, which can be read online courtesy of Project Gutenberg.  Written in 1897, Dracula could be considered 'light' reading of that era.  The writing style, the historical references, and the cultural differences illustrated by the narrative are what make Dracula harder to read in 2012 than it was in 1897.  And those are exactly the reasons for students to read Dracula rather than--or, at a minimum, in addition to--one of the plethora of current teen best sellers.  If students aren't able to grapple with adventure writing from 1897, what are their chances of understanding philosophy from 400 B.C. or poetry from the 16th century?

What we seem to want is the reading and learning equivalent of cotton candy.