It's curious that a most modern of technologies, dictation software, prompted memories of the mimeograph, a printing technology we used years ago to produce a daily newspaper at my high school. I'll explain a bit later.
The idea of talking to my computer has long intrigued me, but early tests of dictation software were discouraging. That's changed with the latest version of Dragon NaturallySpeaking ($100), which does an impressive job of capturing my spoken words and translating them into text.
I gave version 9 of the software an acid test: I skipped training, the routine of reading prepared text so the program can get used to your voice and accent. After I dictated about 200 words, the software had botched only about a half-dozen, a success rate of better than 95 percent. Training might push me to 99 percent accuracy and speeds faster than typing, according to Nuance, the company that produces NaturallySpeaking.
Still, I'm unsure how I might use the software day-to-day. That's partly because of the need to speak clearly; when trying to compose thoughts, as in writing this column, I mumble a bit as I fumble for words. I also need to learn how to edit with spoken commands, and to decide whether that's faster than using a keyboard.
That's what made me think of high school, where our adviser tried to stop us from "composing on the typewriter" because the inevitable changes were messy affairs, with correction fluid splashed onto a waxed mimeograph stencil (think Liquid Paper, only worse). But we persisted because we could type faster than we could write longhand, and we found that the copy flowed better. Soon enough, computers made it easy to compose while typing. Now, is it logical to think that dictation is the next step, producing stories with a more conversational style than typing? Perhaps, once I master its version of correction fluid.