Category Archives: language and linguistics

Sounding Like the “Right Sort”

I was in Columbus for the annual ACES conference for the last few days. I gave a presentation on how we use vocabulary and grammar to filter audiences in and out – often in subtle ways. Here it is!

Unknown knowns

It’s opening the door to meet a stranger and realizing you know the person already. It’s sitting down to do a thing for the first time and finding that somehow you know how to do it. It’s trying long and hard to figure something out and at last realizing that you always knew the answer, but just didn’t know you knew. Sometimes it’s because you didn’t have the chance to see that you had been seeing it; sometimes it’s because you had chosen not to see it sooner.

What am I going on about? Let’s look at what can be called a Rumsfeld square or Rumsfeld matrix. It’s named after Donald Rumsfeld, who famously said, “there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know.” (Some people have characterized this as incomprehensible, but I have no idea what they’re talking about.) If we diagram it out in Greimas style, we see there’s a fourth square he doesn’t mention:

The existence of known knowns, known unknowns, and unknown unknowns implies the existence of unknown knowns. It’s right there. Yet people don’t seem to talk about them* – perhaps that’s why they’re unknown.
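The fourth square falls out mechanically from crossing the two axes of the matrix. Here's a minimal sketch in Python (the labels and the two-axis framing are my illustration, not anything Rumsfeld specified):

```python
from itertools import product

# Cross the two axes of the Rumsfeld matrix: whether we are aware of an
# item ("known"/"unknown") and whether we actually know it ("knowns"/"unknowns").
awareness = ["known", "unknown"]
knowledge = ["knowns", "unknowns"]

quadrants = [f"{a} {k}" for a, k in product(awareness, knowledge)]
print(quadrants)
# ['known knowns', 'known unknowns', 'unknown knowns', 'unknown unknowns']
```

Rumsfeld named the first, second, and fourth; the third – unknown knowns – is the one the diagram supplies for free.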

But the fact we have a space for them in the matrix doesn’t mean that they actually exist. Putting things into tidy diagrams and taxonomies can be rewarding, but it doesn’t necessarily add information any more than organizing your bookshelf adds information about what’s in the books. Mainly, it tells you about how your mind – and the structures it has learned – views and organizes things. But inasmuch as it describes aspects of reality, it may also be a heuristic for discovering things we aren’t aware of yet, or at least for knowing where to look for them – or, as the case may be, for becoming consciously aware of them.

Linguistics is a great place to look for this kind of example, and I mean that in several ways. Here’s a table of the International Phonetic Alphabet symbols for consonants (thanks to Kwamikagami on Wikimedia Commons):

I won’t explain all the terminology because we don’t have all night. But you’ll see that this chart has grey areas: those are sounds that are considered impossible. A pharyngeal lateral fricative, for instance, would require having something like the tip of your tongue stuck back deep in your throat – which, even if it were physically possible, would stimulate your gag reflex. So the grey areas help us confirm aspects of reality. They also force us to clearly define what we mean by our terms. For instance, “lateral” means going around the side(s) of the tongue, like the sound [l]; a labiodental lateral fricative is impossible because it would require going around the sides of your tongue without using your tongue, because “labiodental” means using a lip and the teeth (and not the tongue). If “lateral” could mean the side of something other than the tongue, such as air going out the sides of the mouth while biting your lip, the labiodental lateral fricative box would not be grey.

But it still might be empty, like the labiodental trill, which is considered possible (I think it would take practice!) but which no known language uses as a speech sound. But look right above that empty box, and you’ll see the symbol ⱱ, which represents a labiodental flap: a sound you make by flipping your lower lip out, brushing it past your upper teeth. That box was empty until fairly recently, when the people who agree on the chart were made aware of an African language that uses the sound as a distinct speech sound.

In a way, an empty box is a challenge to fill it – just as a grey area is a challenge to prove it wrong, or to scrutinize the definitions. So these taxonomies help turn unknown unknowns into known unknowns, and sometimes eventually to known knowns, and they also help us understand how we know – but they don’t produce the data themselves; you still have to go find speakers of real languages for that. 

But our choice of what to include in the grid – what questions to ask – can still leave unknown unknowns unknown and unknown, and it can also divert attention away from known knowns. For example, there is no column for linguolabials (tongue and lip), which are altogether possible (you could do it right now: touch your tongue to your upper lip). In this case, it’s not because they’re judged impossible, nonexistent, or unimportant; it’s just that they’re treated as variants on bilabials and coronals, and so are represented with a mark under a letter: [n̼] and [l̼], for instance. They’re known knowns but more easily overlooked because they’re not given equal weight in the taxonomy – which effectively makes them lesser-known knowns.

But there’s one more thing that linguistics tells us about all of this. Every linguistics student comes in feeling sure they know all about the sounds we make in our mouths and how we make them, and every linguistics student comes to realize they were doing things they weren’t aware that they were doing. For instance, before I encountered phonetics, I knew how to say “pot” and “spot” like a usual English speaker, but I didn’t realize that, like a usual English speaker, I was making a puff of air after the p in “pot” but not so much after the sp in “spot.” Every introductory linguistics course has one class where the students are all holding their hands, or pieces of paper, in front of their mouths and discovering that they’re doing something that they know to do but don’t know they’re doing. This is what is often called tacit knowledge.

In fact, most of language is tacit knowledge for most speakers. We know how to put together a sentence, but we aren’t really aware of how we do it or why some things sound right and others sound wrong. And we learn rules in school and think that anything that doesn’t follow those rules doesn’t follow any rules, when in fact “nonstandard” varieties of languages have grammars that are every bit as developed and constraining as “standard” varieties. 

Some of the things we learn don’t just block knowledge, they put false belief in place of accurate knowledge. If you say “doin’” instead of “doing,” for instance, we typically say you’re “dropping the g,” but there is no g. The difference between those two consonant sounds – [n] versus [ŋ] – is only a matter of where they’re said in the mouth; it just happens that we don’t have a separate letter for [ŋ] so we write it as ng, which also sometimes stands for [ŋg].

That’s not just an unknown known; it adds a whole new dimension to the Rumsfeld square. As the popular saying goes (seen in many versions, and often inaccurately attributed to Mark Twain), “It ain’t so much the things that people don’t know that makes trouble in this world, as it is the things that people know that ain’t so.” But I’m not going to redraw the diagram with veracity versus delusion as another dimension right here and now, because that would be a very mentally taxing digression.

Beyond linguistics, and beyond tacit knowledge and knowledge blocked by falsehood, there are also other unknown knowns in life. I think of the time more than 20 years ago when I was looking for a job and a friend said a friend of his needed someone to do some proofing corrections on HTML. So I phoned the friend-of-a-friend and we chatted a bit and he told me to come in. When I walked in the door to the office, this person I had chatted with already on the phone looked at me, and I looked at him, and we both said, “I know you!” We had had a great long conversation at our mutual friend’s place at a party some months earlier, but had not learned each other’s full name, so when we were talking on the phone we didn’t know we knew each other. The knowledge wasn’t tacit, and it wasn’t blocked; it was transiently and accidentally obscured. (He remains one of my closest friends.)

And then there’s the Karate Kid kind of moment. In the first Karate Kid movie, the hero wants to learn karate, so he apprentices himself to an old Japanese man who makes him do menial tasks such as washing and waxing the car according to very specific instructions and painting the fence with exact strokes. When at length the hero complains that he hasn’t learned anything and is just being used for free labour, the master throws some punches at him, which by reflex he blocks using the muscle moves he had internalized through doing the scut work. He knew how to do it, but he didn’t know he knew. This, too, is tacit knowledge, but not one that had already been demonstrated, like speech knowledge; it was first manifest at the point of awareness.

That’s also how I learned how to do structural editing. I picked it up through researching and writing essays and through evaluating and grading very large numbers of student essays as a grad student and instructor. I wasn’t fully conscious of the sense of flow and structure and the intuitions I was developing, but when I first sat down to actually edit articles and books, I realized that I knew how to do something I hadn’t known I knew. Of course, once it’s a known known, it can be further developed – but I have to watch out that I don’t start misleading myself into “knowing” things that aren’t so!

And then there’s the ultimate unknown known: the “enlightenment” (satori, kensho) of Zen practice. If my sense of it from accounts I have read is accurate, it involves seeing the world and realizing that you always knew its true nature, but you just didn’t know you knew… because you were too busy putting it into boxes and matrices and categories and words. Which reminds us again that while logical deductions and categorizations can lead us to discoveries, they can also lead us away from them.

Unknown knowns are some of life’s greatest pleasures, its greatest serendipities. There are also, yes, great discoveries of unknown things you either suspected might exist (as a known unknown) or had no idea would be there like that (unknown unknown). But as T.S. Eliot wrote in “Little Gidding,” 

We shall not cease from exploration 
And the end of all our exploring 
Will be to arrive where we started 
And know the place for the first time.

In an important way, our lives are a course of coming to know ourselves and our worlds – of coming to know the things we had always known but had not been aware we knew. The unknown knowns.

* Well, Slavoj Žižek has – he has used the term for “the disavowed beliefs, suppositions and obscene practices we pretend not to know about, even though they form the background of our public values.” These are not so much unknown as “unknown” – we agree to pretend not to know them so as to avoid cognitive dissonance. He was responding to Rumsfeld’s use of the idea of unknown unknowns to justify attacking Iraq.

“Pay no attention to that man behind the curtain”

Originally published on The Editors’ Weekly, the national blog of Editors Canada

What’s missing from this sample text?

A set of subjects, n = 180, were surveyed using a predetermined questionnaire. Statistical analysis of the responses revealed a statistically significant pattern of association of low-frequency polysyllabic lexemes with greater intellectual value.

It’s not short on words, nor on syllables per word, nor on grammatical complexity. It’s an imposing and impressive display. But who chose and surveyed the subjects? Who predetermined the questions? Who conducted the statistical analysis?

It’s like the Great and Powerful Oz. You’re supposed to pay no attention to whoever’s behind the curtain, making it happen.

What you’re seeing is the effect of a language ideology, the ideology of objectivity – an underlying belief in the association between detachment and authority. It’s a belief that humans are messy, subjective bags of feelings, and that to achieve real, authoritative, reliable, unquestionable truth, you remove people: these facts were not worked out by fallible humans; they were just… revealed. It’s one reason so much academic writing is so hard to read.

It’s not the only reason, of course. There are other ideologies at play too. The effects of one of them are described in the example text above (not quoted from a real study, however): the ideology of mental effort. We know that complex ideas take extra mental effort, and so we assume that greater mental effort is an indicator of greater intellectual value.

Complex syntax is equated with complex thought, and, as the example says, long and uncommon words are associated with rare and rarefied ideas. If something is easy to read, how impressive can it be, really? And, more to the point, if you make the reader sweat to figure out what you’re saying, they might not notice that what you’re saying is really fairly trivial. Once again, watch the Great and Powerful Oz, and don’t look behind the curtain!

This is not to say that everyone who writes that way is consciously trying to be the Great and Powerful Oz. Most authors, academic or otherwise, write in a way that’s considered appropriate for the type of text, and questioning why it’s “appropriate” might itself seem inappropriate – isn’t it obvious that in a research paper you don’t say “really fun,” you say “highly enjoyable”? We seldom stop to look at what’s driving our assumptions about the intellectual value of the way we phrase things. The real “man behind the curtain” is language ideology itself.

But there is no language use without language ideology: we believe that certain qualities go with certain kinds of language. It’s part of how we understand language in its context of usage. And our ideas about language are always ideas about the people we envision using that language. We don’t all agree all the time; there can be competing ideologies, for instance, about whether colloquial speech is a mark of unintelligence or of honesty. But we never come to language without baseline assumptions about what it says about the people who use it – even if it’s language that pretends they’re not there at all.

And from time to time, we can all benefit from pulling back the curtain.

Rules and laws

For Grammar Day, I want to talk briefly about laws and rules, and the fact that some people who should know better get them confused.

Let’s start with laws of nature. Say someone holds a rock in front of them and lets go of it. It flies upward instead of falling. Do you say, “No, you’re doing it wrong – the rock is supposed to fall down”?

Then there’s criminal law. Let’s say that instead of dropping the rock, they throw it through a store window. You might say “Hey!”; a cop who is nearby might arrest them – or they might get away with it.

That’s sort of like the rules of sport. Say the person is playing football, and they throw a rock instead of a football – or maybe they just throw a football the wrong way. The player will get a penalty – if the referee sees it.

But how about the rules of grammar? Let’s say someone writes a sentence: “Person the throw rock football and window at.” Your reaction on reading it is probably something like “Huh? That doesn’t even make sense.”

So let’s say instead that the sentence is “Smashing a window, the person throwed rock and football.” If you’re like a lot of people, you’ll readily utter a correction of one or more errors, even if no one asked you to. You may also say something about the intellect of the writer.

The law of gravity, like any law of nature, doesn’t need anyone to enforce it. If you see a law of nature being broken, you’re wrong: either the law isn’t really being broken (it’s an illusion, or some other law is relevant) or the law as you know it is inaccurate or incomplete and your understanding needs to be revised.

Civil and criminal laws do need enforcement, because they’re human creations. Some of us may believe that laws are there to enforce laws of nature (or of God), but really at most we’ve just appointed ourselves to try and keep people behaving in accordance with our ideas of those laws, which is an us thing. Civil and criminal laws are like the rules of sports, but with broader application and stronger enforcement mechanisms.

And rules of grammar? Rules like those broken in the last example – that it’s “threw,” not “throwed,” that you shouldn’t use dangling participles, that you should be careful with definite and indefinite articles – are also like the rules of sports: in published texts, editors typically serve as referees, following specified style rules; in a broader social context, enforcement is mostly not formalized. The rules may have a certain tidiness, but that tidiness is not a natural law, nor is it inevitable – any editor who works with multiple house styles knows that.

But what about more basic rules of grammatical comprehensibility, such as the ones broken by “Person the throw rock football and window at”? Those, too, are human creations – just at the level of social norms that we rarely stop even to inspect. Using the rules of some other languages, that weird sentence would be entirely coherent. English puts the definite article (“the”) before the noun, but Scandinavian languages tack it onto the end of the noun as a suffix. English can be very fussy with verb conjugations (“throw,” “throws,” “threw”), especially irregular ones, but other languages are less so, and some – such as Mandarin Chinese – don’t conjugate at all. English requires indefinite articles (“a rock,” “a football”), but Gaelic doesn’t, and Slavic languages don’t use definite or indefinite articles. And English expects “and” to go between the things it combines, but in Latin its equivalent can be tacked onto the second item, as in “Senatus Populusque Romanus” – literally “Senate People-and Roman” (in English, “the Senate and People of Rome”).

So, in short, the rules of grammar, even the most apparently essential rules, are not inevitable. Grammar, even the most fundamental grammar, is not a natural law; it is like the rules of a sport. The way you say a thing is not the one logical, inevitable, natural way to say it, even if – within the variety of the language you’re speaking – it’s the only “proper” way to say it. Even the idea that a double negative equals a positive, which seems plainly logical to modern English speakers, seems otherwise to speakers of languages such as Spanish or Italian, where a negative requires agreement (e.g., “No vale nada” and “Non vale niente”: “It’s not worth nothing”). After all, it can’t be a negative statement if it’s positive in some places. Logic!

But some people, even some otherwise well educated people, seem unaware of this. Editors and linguists are wearily used to people priggishly “correcting” them with simplistic grammar rules and ideas that they recall from school, as though those rules were basic truths like natural law. I’ve seen it even from people who have graduate-level educations and clearly ought to know better.

And why does it matter? I’ve written before about how this kind of dogmatic position is used to license social aggression (see What do we care about, really and Why all English speakers worry about slipping up), but the boorishness of grammar snobs is not the biggest thing. The idea that there is one correct, natural, logical grammar gives cover for not just class discrimination but also racism (because different social groups use different varieties of the language) and even sexism (in particular ideas about such things as pronouns and grammatical gender – I’ve given talks on this several times; a video of one time is at A Hidden Gender?). 

A person who understands the socially decided nature of grammar rules can understand that someone who’s using a kind of English that’s not “proper” is not inferior, and that different varieties of English are grammatically coherent even if they’re different from the schoolbook standard. Knowing this also broadens a person’s expressive repertoire.

Does all this mean that grammar is a free-for-all, or that there’s no point in teaching it? Of course it doesn’t mean that. We teach people about the rules of sports and the rule of law. We also teach people about dress codes – there are certain things you just don’t wear in certain places and occasions, not for any matter of intrinsic suitability (sweatshirts are no less functionally suited to formal occasions than tuxedos), but just because of the social implications they have come to have. Likewise, if you use a library, you learn how the books are arranged on the shelves, and it’s a tidy, systematic, enforceable order, but it’s not an inevitable one: the choice of Dewey versus Library of Congress, just for instance, will give quite different orderings. 

Tidiness can be good, and consistent, well-defined rules can be useful. I make a nice bit of money every year tidying up text. But rigidity and narrow-mindedness are bad. And believing that the simple rules you learned in your simple youth are the only true rules is a mistake that will limit your effectiveness – and, on the larger level, can limit others, and our effectiveness and potential as a society. Learn rules – as many different sets as possible – and use them judiciously.

Oh, and have fun.

Tsk, tsk! Or is that tut, tut?

“Tsk!” is a word that stands for something that isn’t a word that we use all the time because it’s not a word, but we mostly don’t use it for what we think we use it for. Here, let me explain in my latest article for The Week:

The not-word you’re always saying


“Ain’t ain’t a word.”

Obviously, that’s functionally false, and the speaker knows it: if ain’t really weren’t an understandable lexical unit, the sentence would make no more sense than “Zcvny zcvny a word.” But what some of us miss – but the people who declare the unwordness of ain’t (and other words) know at least implicitly – is that they don’t mean “not usable as a word.” They mean that it’s not a word in roughly the same way as someone in, say, 1850 might say that an obvious human adult was “not a person.” 

It’s not that the human couldn’t speak, eat, run, or do other things that any human could do. It’s not even that the human wasn’t, in the broader and more common sense, a person. It’s that the human was not legally a person: she or he couldn’t vote. The human was not of the right sort. The human did not belong in certain places, and could not fill certain functions, that were open only to those who were duly enfranchised.

This question of which humans are legally persons has not been so contentious since all adult human citizens regardless of gender or race (though not necessarily regardless of certain other statuses, such as criminal or mental) have been eligible to vote. But the question of what words are words has not gone away, not least because it’s not a question for courts to decide, nor for dictionaries, and especially not for linguists (if you assigned the task to linguists, they would refuse it, run away and hide, or arm up and fight you).

It ain’t for dictionaries to decide? Nope. And I say that not just because dictionaries are field guides, not legislation (you don’t say something that just flew past is “not a bird” just because it’s not in your pocket guide); I say that because even the people who appeal to the authority of dictionaries reject that authority when they don’t like what they find. Such as “ain’t contraction 1 : am not : are not : is not 2 : have not : has not.”

Ain’t is not legally disenfranchised, no (though I suspect its ingenuous use in legal documents would be frowned on). But it is pointedly socially “not our sort, dear.” It is a word that “the better people” want it to be understood they would not consort with. It would not be invited to society weddings. But it would work in the kitchen with the caterers.

And as it worked there, those in the wedding party would studiously avoid seeing or acknowledging it, just as they would any fallen poor relation. “Do not say that Uncle Frederick is working in the kitchen. I won’t have it! That man is just Freddy, a local ne’er-do-well to whom we try to give a bit of charity work from time to time. And he should be kept away from the guests.” Never mind that Freddy, the erstwhile cadet of the family, is doing quite well and in fact the wedding is entirely relying on his skill as a saucier.

Erstwhile cadet? By that I mean younger brother of the heir. But younger brothers, as louche as they may be, are still normally permitted entrance to society, and so was ain’t, at first.

You might think that ain’t was illegitimate, since it doesn’t match anything clearly: not am not, not are not, not is not, not have not, not has not. But if you spend a little more time with the matter, I think you won’t be of that mind for too long. Contractions can change form. Am not became a’n’t for some people for some time, as did are not, and have not and has not became ha’n’t and even ’a’n’t (with varying numbers of apostrophes). And then, with shifts in vowel, that lengthy a came to be a “long a” – the sound that is represented by the ai in ain’t. We also know that respected writers and assorted rich persons were using it in the late 1600s and into the 1700s. The debate has not been concluded as to which sense of it came first, or exactly how it came to cover so many different senses; it may have arisen independently for multiple forms and merged. But its ejection from polite society came as a result of several transgressions to the rigid and fragile roles and rules of privilege.

For one thing, it simply wouldn’t stay in its place, or even know its place. It covered too many senses. This was a problem for reasons of ambiguity, perhaps, though in truth only rarely, since its use for hasn’t and haven’t is only for the auxiliary: “I ain’t a dog” can’t mean “I haven’t a dog.” It was a bigger problem for reasons of flagrant promiscuity, which is frowned on. And – to put it plainly – it was too easy. Which is a terrible sin in English. All of the worst mistakes, my darling, come from trying to make a spelling or inflection too easy. “I goed”? Wretchedly childish. “I been”? Sloppy and lazy. And simplified spelling? Beyond disgusting. It also ran up against an increasing prejudice against contractions, which – starting not too far into the 1700s – were increasingly seen as too informal and lazy, making one syllable where our illustrious forebears had seen fit to make the effort of saying two.

And then it started being associated with the wrong sort of people, which is absolutely death, darling, death. It was heard on the tongues of those rural sorts from the farther reaches of the countryside, and those lower-class sorts from the poorer neighbourhoods of the city – those unpleasant people who sold fish and made deliveries and took away rubbish and cleaned gutters and, in short, did all the essential work without which all the fashionable people would be wallowing starving in the muck – and then it was done. No decent person could be heard to use it.

Except when slumming, of course. Your school teacher, socially vulnerable, might studiously avoid association with the lowlifes, but the assorted lords and barons could afford to consort slyly on the side with the riff-raff if they were the fun or useful sort of riff-raff. And ain’t has become the classic slumming word. With this one word, you can shift the tone and attitude of a whole sentence – “Sir Peter? He ain’t here, darling, so off with you” – or even set the tone for a song in the title – “Ain’t Misbehavin’,” “It Ain’t Necessarily So.” It is, in short, an expert saucier. With its fall from grace came an ability to season a sentence as quickly and effectively as any pepper or aged cheese.

And that is a role it is happy to fill. In fact, it has far more effect and power than any of its more respectable siblings and cousins. It’s not just that it can instantly set the tone as casual, folksy, and thus (thanks to our ideologies around class and language) more honest; it’s that it does not shrink from respectable companions, but they can be frightened by it – one incursion of ain’t into the wrong place could be like a fly in the pudding: “In submitting this update, we acknowledge that we ain’t achieved our goals yet, but we hope that with further funding we will be able to provide conclusive results.” In short, ain’t is misbehaving, and that’s the point.

So I am not making an impassioned plea for the acceptance of ain’t into formal discourse. That would take away its power. It would be telling the best saucier in town that he must rejoin his starchy family and spend the afternoons discussing bank drafts and society weddings and never cook again. But I am saying to stop saying that it’s not a word. A word that is casual is still a word, and it does not demean or degrade anyone to use casual language when the situation calls for it. Our language is capable of almost infinite variety and nuance in tone; let’s make use of it unashamedly. And wave hi to Uncle Freddy in the kitchen.


I’m gonna lay down a three-part fact here:

Eye dialect is hypocritical, handy, and hazardous.

What’s eye dialect? It’s when you spell something “incorrectly” the way pretty much anyone would say it rather than the way it’s officially spelled, to indicate something about the speaker to whom it’s attributed and/or the context in which it’s presented. And by “something” I mean typically a lack of education, or at least a very informal, “folksy” context, which just puts a positive tinge on the same lower social position.

So if, for instance, a character in a book says “I seen my reflekshun,” the “I seen” is nonstandard grammar, but the “reflekshun” is eye dialect: it’s exactly the way everyone says it, so the implication is just that the speaker would spell it that way if they wrote it down because they’re, you know, [makes deprecatory hand gesture].

Among the most common – and consequently least negatively toned – bits of eye dialect are woulda, coulda, shoulda, and, of course, gonna.

Everyone says going to as “gonna” most of the time when it’s used as a modal auxiliary. For one thing, frequent and unstressed grammatical widgets are usually uttered with the minimum necessary effort – heck, I often really say “I’n a” rather than even “I’m gonna”; for another, it allows us to differentiate between auxiliary and main-verb uses, for example between “Are you gonna get that now?” and “Are you going to get that now?” (the latter, spoken with full value, meaning “Are you going now to get that?”). You wouldn’t say or write “I’m gonna the store.”

But, because this is English and we just love showing where things came from and how they’re put together, and – more importantly – we love using spelling as a torture test and badge of virtue, we still insist on the “correct” (socially valued) spelling being going to – and would have, could have, should have – even when we say it in the reduced way.

So I think it’s plain why I say eye dialect is hypocritical: we use it to look down on people for doing exactly what we – and everyone we consider the “right sort” – do on the regular. (Do you protest? OK, tell me what your reaction is when you see that someone has written “I would of done it if I’d of known.” And then tell me the difference between how you would pronounce that and how you normally pronounce “I would have done it if I’d have known.” If you see “would of” in a novel, it’s because it’s attributed to a character who would write it that way.)

Why do I say eye dialect is handy? Ah, because that very class connotation – the one that is arrantly hypocritical when we use it to look down on others – lets us establish tone when we’re using it in our own voice: we can present ourselves as “casual,” “folksy,” “honest” (honesty is a virtue typically viewed as inversely correlated with sophistication – yes, it’s been studied: we tend to see it that way; and yes, we’re wrong about that: in reality there’s no correlation one way or the other). 

Yes, it’s still hypocritical, maybe even doubly so because we’re using it to avail ourselves positively of a distinction we otherwise wield negatively: when other people do it they’re unintelligent, but when we do it we’re folksy and honest. But ya know what? The more we use the spelling gonna generally as a colloquial usage, the more it loses the “unintelligent” connotation, so I’m not opposed to it. Which is fine, because everyone’s gonna use it anyway.

OK, so why do I say eye dialect is hazardous? I don’t mean as a further elaboration on the class distinctions. I mean for people learning English as a second (or later) language. I’ve known people who learned English as adolescents or adults who hadn’t quite processed that gonna is informal when written and relaxed when spoken. A professor I had would use it in comments and letters written in otherwise academic English. A co-worker always said it (in her slight German accent) with very clear and deliberate enunciation: “Are you gun na do that?” – which sounded odder and more awkward than if she had simply enunciated the formal version in full: “Are you going to do that?”

So how long, by the way, have we been doing this?

That’s a two-pronged question, and the answer to the first prong – how long we’ve been reducing “going to” to “gonna” in speech – is that there’s no way of knowing for sure, but the odds are good that it’s just about as long as we’ve been using going to as a modal auxiliary. There are four very common phonological processes involved:

  1. place assimilation, wherein the /ŋ/ moves to the front of the mouth and is realized as [n] because it’s between a front vowel [ɪ] and a stop at the tip of the tongue [t] – either one could be enough to move it forward, as we see from the common and long-established practice of saying -ing as [ɪn]; 
  2. assimilation and deletion, wherein the [t] just gets Borged right into that [n] and disappears – we do tend to reduce /t/ very often, turning it into a flap as in [bʌɾɹ̩] for butter or into a glottal stop as in [bʌʔn̩] for button, and this deletion is just the ultimate reduction;
  3. deletion again, in this case the [ɪ] before the [n]; and 
  4. reduction, when we make the minimum effort in pronouncing the o and it comes out just as [ə] (an argument could be made that the deletion of the [ɪ] is part of this reduction).
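The four processes above can be sketched as a chain of ordered string-rewrite rules applied to an IPA transcription. This is only a toy illustration, not a formal phonological analysis: the transcriptions, the rule ordering, and the names `RULES` and `reduce_form` are all my assumptions for the sketch.

```python
# Toy model of "going to" reducing to "gonna" as ordered rewrite rules.
# The IPA strings and the strict rule ordering are illustrative
# assumptions, not a formal analysis.

RULES = [
    ("place assimilation", "ŋ t", "n t"),  # /ŋ/ fronted to [n] before [t]
    ("t-deletion",         "n t", "n"),    # [t] assimilates into [n] and vanishes
    ("ɪ-deletion",         "ɪn",  "n"),    # the [ɪ] before [n] deletes
    ("vowel reduction",    "oʊ",  "ə"),    # the o reduces to [ə], per the post
]

def reduce_form(form: str) -> str:
    """Apply each rewrite rule once, in order, to the transcription."""
    for name, old, new in RULES:
        form = form.replace(old, new)
    return form

print(reduce_form("ɡoʊɪŋ tə"))  # → "ɡənə" (≈ "gonna")
```

Running the rules in a different order would derail the derivation (e.g., deleting [ɪ] before the [ŋ] has fronted would leave the wrong nasal), which is a small demonstration of why the ordering of such processes matters.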

Basically, we say it as “gonna” because we naturally conserve effort when speaking – there’s a trade-off between conserving the effort of articulating the word and conserving the effort of being understood, and with modal auxiliaries, the effort of being understood is usually the lesser problem.

The answer to the second prong – how long we’ve been writing it as gonna – is just over a century in North America, but about a century longer than that in Scotland, if the available published citations are to be believed. Eye dialect did have a bit of a vogue in the US in the late 1800s and early 1900s, and this spelling was likely encouraged by that.

So there you have it. One of the most common bits of “wrong” spelling, so entrenched that in some contexts these days you’re making more of a point if you spell it the “right” way: picture Busta Rhymes and Janet Jackson’s “What’s It Gonna Be?!” as “What’s It Going to Be?!,” or Led Zeppelin’s “We’re Gonna Groove” as “We’re Going to Groove” (and then why go halfway? why not “What Is It Going to Be” and “We Are Going to Groove”?). Eventually it might even qualify just as nonstandard spelling, not eye dialect. But my points about eye dialect are still gonna stand…

Prescriptivist or descriptivist?

I’m once again serving as a guest expert for a friend’s copyediting course. The students in these courses often ask me interesting questions about points of grammar. But this time, one of them asked me a broader question – or, rather, two of them:

Would you describe yourself as more of a prescriptivist or descriptivist?

What value do you see in each of these approaches to language? 

Since you’re here reading this, you probably know what the difference is between prescriptivist and descriptivist: a prescriptivist is someone who believes in imposition of authoritative prescriptions on language usage – fans of Lynne Truss, for instance, and avid users of Strunk and White’s Elements of Style – while a descriptivist is someone who believes in observing and describing how people actually use language and not holding stern judgmental positions on it. Most modern dictionaries are descriptivist: they include a word if it’s in common use – including, for instance, impactful and misunderestimate – and they try to include all senses that are in common use. Some people believe they should be prescriptivist and forbid certain words and senses of words.

Since I have a graduate degree in linguistics, it’s no surprise that by disposition I’m a descriptivist. I love language in all its forms, and I observe how it’s used in each context. But that doesn’t mean I have an “anything goes” approach in my work as an editor. After all, I’m editing a text that is part of a specific genre and is meant to have a particular effect on a certain audience. I use my observations about how people use language (and how they think about it, which is another important issue) to decide what choices of words and phrasing will work best. 

Generally, of course, there’s plenty of latitude – more than some people think. But we can recognize that, for instance, “Go ask your mommy” will have one effect in a children’s book and quite another in a political speech. Your elementary school teachers may have said “‘Ain’t’ ain’t a word,” but aside from being obviously false (the sentence would be incoherent if it weren’t a word; it would be like saying “‘Zzblgt’ zzblgt a word”), all that does is position ain’t as a very powerful mark of “bad” English (informal, nonstandard, folksy – which is also taken as frank and honest). So in an annual report, if you’re giving forecasts on projects, you would have “It isn’t coming by January” (or even “It is not coming by January”), but you may make use of “It ain’t coming by January” as a momentary excursion in style if you want to convey a particular (refreshing, informal) frankness, which might position the ostensible writer (e.g., the CEO) as a “regular guy.”

So, on the one hand, the idea that you must not ever use ain’t just ain’t true. But on the other hand, we can thank such teachers and others like them for maintaining that opprobrium, which gives the word such power. Likewise, you can have a huge effect by slipping in a vulgarity in the right context, and vulgarities maintain their power by having some people constantly treat them as the most awful things.

In that way, we need prescriptions to give us rules to push against, and to know where we stand; anyway, we will always have them, because some people just love rules (regarding rule-seeking behaviour, see “That old bad rule-seeking behaviour”). Beyond that, it’s useful to have prescriptions just to help us decide what to do where – I regularly look things up in the Chicago Manual of Style, which saves me from having to justify my choices on my own account and ensures that my choices will be consistent with choices in other similar books, which also helps make the reading go more smoothly.

But many of the things that prescriptivists focus on the most have little to do with consistency or clarity. In fact, that’s probably why they focus on them so much. Someone once said “School board politics are so vicious precisely because the stakes are so small,” and the same goes with grammatical and lexical prescriptions: the ones that people get the most exercised about are precisely ones that make the least difference in clarity or effectiveness – which frees them up to function almost entirely as social shibboleths, signifiers of who is “the right sort.” Grammar peevery is just using the rule-seeking instinct to license social aggression while giving a plausible excuse. One of my favourite articles that I’ve written goes into this: “Why all English speakers worry about slipping up.”

So, in short, while many linguists are simply hard-set against prescriptivists, I have a more complex position. In some ways, I am by profession a prescriptivist: I enforce prescriptions within specific contexts – though those prescriptions are often made on the basis of descriptive observation. On the other hand, I don’t correct people’s grammar unless they’re paying me to do it, and I don’t think grammar is a useful indicator of character or intelligence; some very magnanimous and insightful people are not too tidy with grammar, and some people who have perfect grammar are obtuse and obnoxious. I don’t enjoy the presence of outspoken prescriptivists, but I’m sure we will always have them; and they fill a role, modelling a specific idea of propriety that we can choose to flaunt or flout as we fancy.

But what about plural “they”?

This article originally appeared on The Editors’ Weekly, the official blog of Canada’s national editorial association.

Singular “they” is here to stay, and that’s a good thing. There is no decent reason to require that third-person singular pronouns—and only third-person singular pronouns—always specify gender. “He” has never truly covered men and women equally, though starting in the 1800s some people tried to insist that it did, and constructions such as “he or she” or “s/he” are clunky at best. So it’s natural to accept officially what has been an informal workaround for centuries: extending the plural pronoun to cover the singular.

It’s not the first time that English has done this. As early as the 1200s, we started using the plural “you” for individuals of higher status, and by the 1800s, rather than continuing to specify respect—or lack of it—in pronouns, we had almost entirely stopped using the lower-status singular “thou.” If we can use a plural form in place of a singular to erase a status-based distinction, we can certainly do it to erase a gender-based distinction.

But there is one problem that we run into with singular “they,” a problem we have already encountered with singular “you”: how do you make clear when it’s plural?

That’s still a useful distinction, and it’s not always obvious from context. Consider a sentence such as “The CEO met the VPs at a bar, but they drank too much and started singing karaoke, so they left.” If specifying the gender of the CEO is out of the question, to clarify who “they” refers to you’ll need to rewrite it to avoid the pronouns—and if it’s a longer narration, that gets clunkier and clunkier. So what do we do?

Well, what did we do with “you”? For a time—quite a while, in fact, from the late 1600s through the late 1700s—singular “you” got singular verbs: “you was,” “you is,” “you does.” It was so common, Robert Lowth inveighed against it in his 1762 Short Introduction to English Grammar. Even Doctor Johnson used “you was.” Will we try the same kind of thing with “they”—saying “they is” and “they was”? A few people have tried it, but such usages are already strongly associated with “uneducated” English, and so they’re unlikely to become commonplace. And “you was” didn’t last, after all—Doctor Johnson and everyone else ultimately switched to “you were” even for the singular.

So how do we specify plural “you”? You know how: we add further plural specification to it. In the US South, “y’all” or “you-all” is very common, and it’s spreading; in other places, “yous,” “youse,” “you ’uns,” “yiz,” and “yinz” are local favourites. In many other places, we say “you guys” or something similar when we need to make the distinction. And I’ll wager we’ll end up doing the same kind of thing with plural “they.” “They-all” seems readily available; “those ones” and “those guys” are likely to show up; differential usages of “themselves” and “themself” are already in use and may be extended; and others may appear—I’ll be watching eagerly. And in some contexts, for added clarity, something like “the one” might be used for the singular.

What do we do as editors, here and now? We keep an eye on how popular use is changing. When we can, we use our positions to influence it a little. And, as always, we use our judgement to find what’s clearest and most effective for the audience of the text we’re working on. 


Socially, language functions in many ways like a scaffold.

I’ll explain. But first I’ll talk briefly about this word scaffold and where it comes from and what it is used to mean now. Because of course I will.

Scaffold has to do with neither folds nor scafs, nor for that matter with holds. It’s yet another word that came to English from French, and came to French from Latin (and Greek), and changed quite a lot en route. The modern French reflex of it is échafaud; both words came from a word that went through quite a few forms, but had the early form escadafaut, which was es- (from Latin ex-, ‘out’) plus cadafaut, which, like modern French catafalque, comes from later Latin catafalcum (‘viewing platform’), which in its turn was probably made from cata-, from Greek κατα- (‘down, against’), and Latin falicum, in turn from fala (‘wooden gallery; siege tower’).

So it started with a siege tower and then became a viewing platform and then became a… oh, yes, I didn’t say: escadafaut generally referred to a platform for viewing a tournament.

But of course that’s not what scaffold (or scaffolding) is usually used for now. It’s that structure of metal supports and wooden platforms you may see in front of a building. Sometimes the building is being built; sometimes it’s being restored or preserved; sometimes it’s just being kept standing. And, less commonly these days, scaffold can also refer to a platform for viewing something, or for a theatrical performance, or for public executions, or, in some cultures, for disposal of dead bodies. (And let us not forget its cousin catafalque, which in modern English usage is a temporary ornamental platform for a coffin to go on in funerary rites.)

OK, then. So how does language function socially like a scaffold?

To start with, we use language to mediate the development and maintenance of social structures and interactions. Language is an essential social tool; our social structures may not be made of it (though some arguably are, but that doesn’t work with the current metaphor, so let it slide), but they are made with it. You want to add a glorious new tower or wing to the edifice of our culture? You scaffold it with language: new words, new ways of using old words, new turns of phrase, sometimes even new grammar.

But we also use language to shore up, maintain, and refresh existing social structures. Turns of phrase, common idioms, colloquialisms, and metaphors can embed biases and presuppositions (as just one example, are you familiar with the term Indian giver?). Even basic grammatical details can function this way, as for instance insistence on he as the default pronoun (which it never was, though some people starting in the 1800s tried to claim it was in places where that would mean not having to explicitly recognize women, but somehow not in places where it might entail giving women completely equal rights – see Dennis Baron’s great book What’s Your Pronoun? for extensive details on this). And peeving about “new” usages reinforces an ideology of “old” as better – adherence to “tradition,” which always turns out to be just what the speaker remembers having learned in youth, plus some additions that reinforce their prejudices: the linguistic façade of the social structures and hierarchies that the person has learned and participated in and is quite comfortable with, thank you.

Not that all “old” words are acceptable in such a perspective, of course. Social stratification is maintained through ideas of “good” English (as opposed to the kind that people from the wrong region or socioeconomic level speak – by the way, “good” English is just as weird and arbitrary as many kinds of “bad” English, and in fact some things are “bad” because they’re not quite weird and arbitrary enough: just watch someone correct a kid who says “goed” instead of “went”). It is also maintained through taboos based on ideas of purity and sexual propriety. You display your conformity to these social structures by treating “bad” words as “bad” and at the same time by rejecting changes in usage that try to undo social subordination of certain groups of people. A person may argue “politely” that we needn’t change the names of any sports teams, for example, while at the same time objecting to the “bad English” or “bad words” uttered by people on the other side of the debate who are upset at being treated as stereotypes. 

Well. All good buildings have basements, dears, and they will collapse without them, but we don’t go down into them ourselves, do we? Oh, no, dears, we do not. A nice, tidy scaffold helps maintain decorum. And when we focus on the scaffold, we also don’t necessarily notice the structure that it’s there to maintain. We get stuck on the words and ignore the tilting tower of crumbling bricks behind it.

But the language has its own ostensive value too. With it, as on a scaffold (next sense), we can perform our identities and our attitudes – and we can watch others perform theirs. In fact, that’s a central function of language: words are known by the company they keep. We always use our language to let others know things about ourselves, our attitudes, and where we stand. Some of us, for example, will make sure to use some terms and avoid using others so as not to perpetuate social injustices, while others will make sure it’s understood they don’t brook “woke” “politically correct” “virtue signalling” and will stand for “family values” (which assume very specific kinds of families and exclude families that don’t meet the model).

And, of course, with language, as with scaffolds, we can view the tournaments of our societies, we can conduct – and display – executions, and we can show off the resulting corpses and expose them for the carrion birds. Choices of words and phrasing let you know who’s been cut dead, and they help keep it that way.

But at least, unlike (most) real-life scaffolds, language is here to stay – and it is deserving of aesthetic appreciation in its own right. And it is an essential part of culture, not just an accessory. Metaphors have their limits… but language wouldn’t exist without them.