A student in my “History of the English Language” course stopped me after class a few weeks ago and asked, “I was just wondering—how do you feel about the Oxford comma?” She could have asked about the rationale behind the Oxford comma (the comma after the penultimate item in a list—e.g., apples, chocolate, and peanut butter) or about the history of the Oxford comma. But instead, she asked how I felt about the Oxford comma, the suggestion being that a punctuation mark could be meaningful enough to arouse personal feelings.
I like the Oxford comma (she was right: I do have feelings about it), and I told her so. I am not an advocate of comma proliferation, but this one can support clarity and even usefully disambiguate some lists—e.g., my two brothers, the doctor, and the nurse (to indicate there are four people, not just my brothers, who happen to be a doctor and a nurse). That said, such lists are relatively rare. I tend to use the Oxford comma in my own prose, and I tend to notice when others don’t.
Note the verb there: I notice. I don’t judge those who do not use the Oxford comma. I do not maintain that using the Oxford comma is inherently better than not using it. In the end, it’s just a comma. And as a historian of the language, I know that punctuation marks have shifted greatly over time and will probably continue to do so, although mass standardization may slow that process. When I copy-edit others’ work, I don’t force them to use the Oxford comma if they consistently opt not to use it. But I do require comma consistency when I copy-edit, which raises another important question.
Why does consistency in the use of commas seem so important? I myself have bought into the hegemony of comma consistency: If you’re going to use an Oxford comma some of the time, then you should use it all of the time, and vice versa. But why couldn’t I or anyone else opt to use the Oxford comma when it seems most useful and not use it when it seems unnecessary? Couldn’t we respond to the aesthetics or the logical requirements of each list on a more local basis instead of requiring a global decision about comma use?
The linguist Deborah Cameron, at the University of Oxford, has provocatively—and usefully, I think—pointed out that the emphasis on consistency in punctuation conventions, as well as other formatting guidelines, leads to the expenditure of many hours of academics’ and editors’ time, which could productively be spent in other ways. House style guides lay out rule after rule about when to use commas (and other punctuation marks) in the prose and citations of published text.
This laserlike focus on the consistency of each comma is relatively new to English. Standardization of the written language stabilized spelling first, an effort that took hold in the 17th century. In medieval times, scribes could spell a word in different ways even on the same page of text, with the assumption that readers would know that, for example, might and miht were really the same word. (The analogy I use to explain this seemingly foreign lack of consistency is that in my handwriting, I employ both a script “f” and a print “f,” and I assume that readers will understand that they are the same letter.) Efforts to create hard-and-fast rules about punctuation came later, and writers of all levels of educational attainment still struggle to adhere to them, as any copy editor will tell you.
I’m not arguing for punctuation anarchy. And honestly, I haven’t yet bought into my own argument: I still copy-edit my own work to ensure full consistency of comma use. But I think it is worth asking whether these feelings we harbor about the importance of getting our commas “right,” and of getting them “right” in the same way each time, are the best use of our energies. The effectiveness of the comma might even benefit from a bit more flexibility, allowing it to respond to the rhythm, logic, and punctuation needs of any given sentence.