Category Archives: language and linguistics

“Pay no attention to that man behind the curtain”

Originally published on The Editors’ Weekly, the national blog of Editors Canada

What’s missing from this sample text?

A set of subjects, n = 180, were surveyed using a predetermined questionnaire. Statistical analysis of the responses revealed a statistically significant pattern of association of low-frequency polysyllabic lexemes with greater intellectual value.

It’s not short on words, nor on syllables per word, nor on grammatical complexity. It’s an imposing and impressive display. But who chose and surveyed the subjects? Who predetermined the questions? Who conducted the statistical analysis?

It’s like the Great and Powerful Oz. You’re supposed to pay no attention to whoever’s behind the curtain, making it happen.

What you’re seeing is the effect of a language ideology, the ideology of objectivity – an underlying belief in the association between detachment and authority. It’s a belief that humans are messy, subjective bags of feelings, and that to achieve real, authoritative, reliable, unquestionable truth, you remove people: these facts were not worked out by fallible humans; they were just… revealed. It’s one reason so much academic writing is so hard to read.

It’s not the only reason, of course. There are other ideologies at play too. The effects of one of them are described in the example text above (not quoted from a real study, however): the ideology of mental effort. We know that complex ideas take extra mental effort, and so we assume that greater mental effort is an indicator of greater intellectual value.

Complex syntax is equated with complex thought, and, as the example says, long and uncommon words are associated with rare and rarefied ideas. If something is easy to read, how impressive can it be, really? And, more to the point, if you make the reader sweat to figure out what you’re saying, they might not notice that what you’re saying is really fairly trivial. Once again, watch the Great and Powerful Oz, and don’t look behind the curtain!

This is not to say that everyone who writes that way is consciously trying to be the Great and Powerful Oz. Most authors, academic or otherwise, write in a way that’s considered appropriate for the type of text, and questioning why it’s “appropriate” might itself seem inappropriate – isn’t it obvious that in a research paper you don’t say “really fun,” you say “highly enjoyable”? We seldom stop to look at what’s driving our assumptions about the intellectual value of the way we phrase things. The real “man behind the curtain” is language ideology itself.

But there is no language use without language ideology: we believe that certain qualities go with certain kinds of language. It’s part of how we understand language in its context of usage. And our ideas about language are always ideas about the people we envision using that language. We don’t all agree all the time; there can be competing ideologies, for instance, about whether colloquial speech is a mark of unintelligence or of honesty. But we never come to language without baseline assumptions about what it says about the people who use it – even if it’s language that pretends they’re not there at all.

And from time to time, we can all benefit from pulling back the curtain.

Rules and laws

For Grammar Day, I want to talk briefly about laws and rules, and the fact that some people who should know better get them confused.

Let’s start with laws of nature. Say someone holds a rock in front of them and lets go of it. It flies upward instead of falling. Do you say, “No, you’re doing it wrong – the rock is supposed to fall down”?

Then there’s criminal law. Let’s say that instead of dropping the rock, they throw it through a store window. You might say “Hey!”; a cop who is nearby might arrest them – or they might get away with it.

That’s sort of like the rules of sport. Say the person is playing football, and they throw a rock instead of a football – or maybe they just throw a football the wrong way. The player will get a penalty – if the referee sees it.

But how about the rules of grammar? Let’s say someone writes a sentence: “Person the throw rock football and window at.” Your reaction on reading it is probably something like “Huh? That doesn’t even make sense.”

So let’s say instead that the sentence is “Smashing a window, the person throwed rock and football.” If you’re like a lot of people, you’ll readily utter a correction of one or more errors, even if no one asked you to. You may also say something about the intellect of the writer.

The law of gravity, like any law of nature, doesn’t need anyone to enforce it. If you see a law of nature being broken, you’re wrong: either the law isn’t really being broken (it’s an illusion, or some other law is relevant) or the law as you know it is inaccurate or incomplete and your understanding needs to be revised.

Civil and criminal laws do need enforcement, because they’re human creations. Some of us may believe that laws are there to enforce laws of nature (or of God), but really at most we’ve just appointed ourselves to try and keep people behaving in accordance with our ideas of those laws, which is an us thing. Civil and criminal laws are like the rules of sports, but with broader application and stronger enforcement mechanisms.

And rules of grammar? Ones like in the last example, such as that it’s “threw,” not “throwed,” that you shouldn’t use dangling participles, and that you should be careful with definite and indefinite articles, are also like the rules of sports: in published texts, editors typically serve as referees, following specified style rules; in a broader social context, enforcement is mostly not formalized. The rules may have a certain tidiness, but that tidiness is not a natural law, nor is it inevitable – any editor who works with multiple house styles knows that.

But what about more basic rules of grammatical comprehensibility, such as the ones broken by “Person the throw rock football and window at”? Those, too, are human creations – just at the level of social norms that we rarely stop even to inspect. Using the rules of some other languages, that weird sentence would be entirely coherent. English puts the definite article (“the”) before the noun, but Scandinavian languages tack it onto the end of the noun as a suffix. English can be very fussy with verb conjugations (“throw,” “throws,” “threw”), especially irregular ones, but other languages are less so, and some – such as Mandarin Chinese – don’t conjugate at all. English requires indefinite articles (“a rock,” “a football”), but Gaelic doesn’t, and Slavic languages don’t use definite or indefinite articles. And English expects “and” to go between the things it combines, but in Latin its equivalent can be tacked onto the second item, as in “Senatus Populusque Romanus” – literally “Senate People-and Roman” (in English, “the Senate and People of Rome”).

So, in short, the rules of grammar, even the most apparently essential rules, are not inevitable. Grammar, even the most fundamental grammar, is not a natural law; it is like the rules of a sport. The way you say a thing is not the one logical, inevitable, natural way to say it, even if – within the variety of the language you’re speaking – it’s the only “proper” way to say it. Even the idea that a double negative equals a positive, which seems plainly logical to modern English speakers, seems otherwise to speakers of languages such as Spanish or Italian, where a negative requires agreement (e.g., “No vale nada” and “Non vale niente”: “It’s not worth nothing”). After all, it can’t be a negative statement if it’s positive in some places. Logic!

But some people, even some otherwise well educated people, seem unaware of this. Editors and linguists are wearily used to people priggishly “correcting” them with simplistic grammar rules and ideas that they recall from school, as though those rules were basic truths like natural law. I’ve seen it even from people who have graduate-level educations and clearly ought to know better.

And why does it matter? I’ve written before about how this kind of dogmatic position is used to license social aggression (see “What do we care about, really” and “Why all English speakers worry about slipping up”), but the boorishness of grammar snobs is not the biggest thing. The idea that there is one correct, natural, logical grammar gives cover for not just class discrimination but also racism (because different social groups use different varieties of the language) and even sexism (in particular ideas about such things as pronouns and grammatical gender – I’ve given talks on this several times; a video of one time is at A Hidden Gender?).

A person who understands the socially decided nature of grammar rules can understand that someone who’s using a kind of English that’s not “proper” is not inferior, and that different varieties of English are grammatically coherent even if they’re different from the schoolbook standard. Knowing this also broadens a person’s expressive repertoire.

Does all this mean that grammar is a free-for-all, or that there’s no point in teaching it? Of course it doesn’t mean that. We teach people about the rules of sports and the rule of law. We also teach people about dress codes – there are certain things you just don’t wear in certain places and occasions, not for any matter of intrinsic suitability (sweatshirts are no less functionally suited to formal occasions than tuxedos), but just because of the social implications they have come to have. Likewise, if you use a library, you learn how the books are arranged on the shelves, and it’s a tidy, systematic, enforceable order, but it’s not an inevitable one: the choice of Dewey versus Library of Congress, just for instance, will give quite different orderings. 

Tidiness can be good, and consistent, well-defined rules can be useful. I make a nice bit of money every year tidying up text. But rigidity and narrow-mindedness are bad. And believing that the simple rules you learned in your simple youth are the only true rules is a mistake that will limit your effectiveness – and, on the larger level, can limit others, and our effectiveness and potential as a society. Learn rules – as many different sets as possible – and use them judiciously.

Oh, and have fun.

Tsk, tsk! Or is that tut, tut?

“Tsk!” is a word that stands for something that isn’t a word that we use all the time because it’s not a word, but we mostly don’t use it for what we think we use it for. Here, let me explain in my latest article for The Week:

The not-word you’re always saying

ain’t

“Ain’t ain’t a word.”

Obviously, that’s functionally false, and the speaker knows it: if ain’t really weren’t an understandable lexical unit, the sentence would make no more sense than “Zcvny zcvny a word.” But what some of us miss – but the people who declare the unwordness of ain’t (and other words) know at least implicitly – is that they don’t mean “not usable as a word.” They mean that it’s not a word in roughly the same way as someone in, say, 1850 might say that an obvious human adult was “not a person.” 

It’s not that the human couldn’t speak, eat, run, or do other things that any human could do. It’s not even that the human wasn’t, in the broader and more common sense, a person. It’s that the human was not legally a person: she or he couldn’t vote. The human was not of the right sort. The human did not belong in certain places, and could not fill certain functions, that were open only to those who were duly enfranchised.

The question of which humans are legally persons has not been so contentious since all adult human citizens regardless of gender or race (though not necessarily regardless of certain other statuses, such as criminal or mental) have been eligible to vote. But the question of what words are words has not gone away, not least because it’s not a question for courts to decide, nor for dictionaries, and especially not for linguists (if you assigned the task to linguists, they would refuse it, run away and hide, or arm up and fight you).

It ain’t for dictionaries to decide? Nope. And I say that not just because dictionaries are field guides, not legislation (you don’t say something that just flew past is “not a bird” just because it’s not in your pocket guide); I say that because even the people who appeal to the authority of dictionaries reject that authority when they don’t like what they find. Such as “ain’t contraction 1 : am not : are not : is not 2 : have not : has not.”

Ain’t is not legally disenfranchised, no (though I suspect its ingenuous use in legal documents would be frowned on). But it is pointedly socially “not our sort, dear.” It is a word that “the better people” want it to be understood they would not consort with. It would not be invited to society weddings. But it would work in the kitchen with the caterers.

And as it worked there, those in the wedding party would studiously avoid seeing or acknowledging it, just as they would any fallen poor relation. “Do not say that Uncle Frederick is working in the kitchen. I won’t have it! That man is just Freddy, a local ne’er-do-well to whom we try to give a bit of charity work from time to time. And he should be kept away from the guests.” Never mind that Freddy, the erstwhile cadet of the family, is doing quite well and in fact the wedding is entirely relying on his skill as a saucier.

Erstwhile cadet? By that I mean younger brother of the heir. But younger brothers, as louche as they may be, are still normally permitted entrance to society, and so was ain’t, at first.

You might think that ain’t was illegitimate, since it doesn’t match anything clearly: not am not, not are not, not is not, not have not, not has not. But if you spend a little more time with the matter, I think you won’t be of that mind for too long. Contractions can change form. Am not became a’n’t for some people for some time, as did are not, and have not and has not became ha’n’t and even ’a’n’t (with varying numbers of apostrophes). And then, with shifts in vowel, that lengthy a came to be a “long a” – the sound that is represented by the ai in ain’t. We also know that respected writers and assorted rich persons were using it in the late 1600s and into the 1700s. The debate has not been concluded as to which sense of it came first, or exactly how it came to cover so many different senses; it may have arisen independently for multiple forms and merged. But its ejection from polite society came as a result of several transgressions to the rigid and fragile roles and rules of privilege.

For one thing, it simply wouldn’t stay in its place, or even know its place. It covered too many senses. This was a problem for reasons of ambiguity, perhaps, though in truth only rarely, since its use for hasn’t and haven’t is only for the auxiliary: “I ain’t a dog” can’t mean “I haven’t a dog.” It was a bigger problem for reasons of flagrant promiscuity, which is frowned on. And – to put it plainly – it was too easy. Which is a terrible sin in English. All of the worst mistakes, my darling, come from trying to make a spelling or inflection too easy. “I goed”? Wretchedly childish. “I been”? Sloppy and lazy. And simplified spelling? Beyond disgusting. It also ran up against an increasing prejudice against contractions, which – starting not too far into the 1700s – were increasingly seen as too informal and lazy, making one syllable where our illustrious forebears had seen fit to make the effort of saying two.

And then it started being associated with the wrong sort of people, which is absolutely death, darling, death. It was heard on the tongues of those rural sorts from the farther reaches of the countryside, and those lower-class sorts from the poorer neighbourhoods of the city – those unpleasant people who sold fish and made deliveries and took away rubbish and cleaned gutters and, in short, did all the essential work without which all the fashionable people would be wallowing starving in the muck – and then it was done. No decent person could be heard to use it.

Except when slumming, of course. Your school teacher, socially vulnerable, might studiously avoid association with the lowlifes, but the assorted lords and barons could afford to consort slyly on the side with the riff-raff if they were the fun or useful sort of riff-raff. And ain’t has become the classic slumming word. With this one word, you can shift the tone and attitude of a whole sentence – “Sir Peter? He ain’t here, darling, so off with you” – or even set the tone for a song in the title – “Ain’t Misbehavin’,” “It Ain’t Necessarily So.” It is, in short, an expert saucier. With its fall from grace came an ability to season a sentence as quickly and effectively as any pepper or aged cheese.

And that is a role it is happy to fill. In fact, it has far more effect and power than any of its more respectable siblings and cousins. It’s not just that it can instantly set the tone as casual, folksy, and thus (thanks to our ideologies around class and language) more honest; it’s that it does not shrink from respectable companions, but they can be frightened by it – one incursion of ain’t into the wrong place could be like a fly in the pudding: “In submitting this update, we acknowledge that we ain’t achieved our goals yet, but we hope that with further funding we will be able to provide conclusive results.” In short, ain’t is misbehaving, and that’s the point.

So I am not making an impassioned plea for the acceptance of ain’t into formal discourse. That would take away its power. It would be telling the best saucier in town that he must rejoin his starchy family and spend the afternoons discussing bank drafts and society weddings and never cook again. But I am saying to stop saying that it’s not a word. A word that is casual is still a word, and it does not demean or degrade anyone to use casual language when the situation calls for it. Our language is capable of almost infinite variety and nuance in tone; let’s make use of it unashamedly. And wave hi to Uncle Freddy in the kitchen.

gonna

I’m gonna lay down a three-part fact here:

Eye dialect is hypocritical, handy, and hazardous.

What’s eye dialect? It’s when you spell something “incorrectly” the way pretty much anyone would say it rather than the way it’s officially spelled, to indicate something about the speaker to whom it’s attributed and/or the context in which it’s presented. And by “something” I mean typically a lack of education, or at least a very informal, “folksy” context, which just puts a positive tinge on the same lower social position.

So if, for instance, a character in a book says “I seen my reflekshun,” the “I seen” is nonstandard grammar, but the “reflekshun” is eye dialect: it’s exactly the way everyone says it, so the implication is just that the speaker would spell it that way if they wrote it down because they’re, you know, [makes deprecatory hand gesture].

Among the most common – and consequently least negatively toned – bits of eye dialect are woulda, coulda, shoulda, and, of course, gonna.

Everyone says going to as “gonna” most of the time when it’s used as a modal auxiliary. For one thing, frequent and unstressed grammatical widgets are usually uttered with the minimum necessary effort – heck, I often really say “I’n a” rather than even “I’m gonna”; for another, it allows us to differentiate between auxiliary and main-verb uses, for example between “Are you gonna get that now?” and “Are you going to get that now?” (the latter, spoken with full value, meaning “Are you going now to get that?”). You wouldn’t say or write “I’m gonna the store.”

But, because this is English and we just love showing where things came from and how they’re put together, and – more importantly – we love using spelling as a torture test and badge of virtue, we still insist on the “correct” (socially valued) spelling being going to – and would have, could have, should have – even when we say it in the reduced way.

So I think it’s plain why I say eye dialect is hypocritical: we use it to look down on people for doing exactly what we – and everyone we consider the “right sort” – do on the regular. (Do you protest? OK, tell me what your reaction is when you see that someone has written “I would of done it if I’d of known.” And then tell me the difference between how you would pronounce that and how you normally pronounce “I would have done it if I’d have known.” If you see “would of” in a novel, it’s because it’s attributed to a character who would write it that way.)

Why do I say eye dialect is handy? Ah, because that very class connotation – the one that is arrantly hypocritical when we use it to look down on others – lets us establish tone when we’re using it in our own voice: we can present ourselves as “casual,” “folksy,” “honest” (honesty is a virtue typically viewed as inversely correlated with sophistication – yes, it’s been studied: we tend to see it that way; and yes, we’re wrong about that: in reality there’s no correlation one way or the other). 

Yes, it’s still hypocritical, maybe even doubly so because we’re using it to avail ourselves positively of a distinction we otherwise wield negatively: when other people do it they’re unintelligent, but when we do it we’re folksy and honest. But ya know what? The more we use the spelling gonna generally as a colloquial usage, the more it loses the “unintelligent” connotation, so I’m not opposed to it. Which is fine, because everyone’s gonna use it anyway.

OK, so why do I say eye dialect is hazardous? I don’t mean as a further elaboration on the class distinctions. I mean for people learning English as a second (or later) language. I’ve known people who learned English as adolescents or adults who hadn’t quite processed that gonna is informal when written and relaxed when spoken. A professor I had would use it in comments and letters written in otherwise academic English. A co-worker always said it (in her slight German accent) with very clear and deliberate enunciation: “Are you gun na do that?” – which sounded more odd and awkward than if she had just fully enunciated the formal version, “Are you going to do that?”

So how long, by the way, have we been doing this?

That’s a two-pronged question, and the answer to the first prong – how long we’ve been reducing “going to” to “gonna” in speech – is that I have no way of knowing exactly for sure, but the odds are good that it’s just about as long as we’ve been using going to as a modal auxiliary. There are four very common phonological processes involved: 

  1. place assimilation, wherein the /ŋ/ moves to the front of the mouth and is realized as [n] because it’s between a front vowel [ɪ] and a stop at the tip of the tongue [t] – either one could be enough to move it forward, as we see from the common and long-established practice of saying -ing as [ɪn]; 
  2. assimilation and deletion, wherein the [t] just gets Borged right into that [n] and disappears – we do tend to reduce /t/ very often, turning it into a flap as in [bʌɾɹ̩] for butter or into a glottal stop as in [bʌʔn̩] for button, and this deletion is just the ultimate reduction;
  3. deletion again, in this case the [ɪ] before the [n]; and 
  4. reduction, when we make the minimum effort in pronouncing the o and it comes out just as [ə] (an argument could be made that the deletion of the [ɪ] is part of this reduction).

Basically, we say it as “gonna” because we naturally conserve effort when speaking – there’s a trade-off between conserving the effort of articulating the word and conserving the effort of being understood, and with modal auxiliaries, the effort of being understood is usually the lesser problem.
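The four processes above can be sketched in miniature as ordered string rewrites on a rough IPA transcription. This is a deliberate toy, not how phonology actually works (real phonological rules are not simple ordered substitutions), and the transcriptions and rule ordering are my illustrative assumptions, not attested historical stages:

```python
# Toy model: the four reductions from the list above, applied in order
# as plain string substitutions on a rough IPA form of "going to".
# The IPA strings and the strict ordering are illustrative assumptions.

REDUCTIONS = [
    # 1. place assimilation: /ŋ/ fronts to [n] between [ɪ] and [t]
    ("place assimilation", "ŋ", "n"),
    # 2. assimilation and deletion: the [t] merges into the [n] and drops
    ("t-assimilation and deletion", "n t", "n"),
    # 3. deletion: the [ɪ] before the [n] drops
    ("ɪ-deletion", "ɪn", "n"),
    # 4. reduction: the unstressed vowel of "to" comes out as schwa [ə]
    ("vowel reduction", "u", "ə"),
]

def reduce_form(form: str) -> str:
    """Apply the four reductions in order and return the result."""
    for _name, old, new in REDUCTIONS:
        form = form.replace(old, new)
    return form

print(reduce_form("ɡoʊɪŋ tu"))  # ɡoʊnə – roughly "gonna"
```

Running the steps in sequence takes “ɡoʊɪŋ tu” through “ɡoʊɪn tu,” “ɡoʊɪnu,” and “ɡoʊnu” to “ɡoʊnə,” mirroring the order in which the list presents the processes.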

The answer to the second prong – how long we’ve been writing it as gonna – is just over a century in North America, but about a century longer than that in Scotland, if the available published citations are to be believed. Eye dialect did have a bit of a vogue in the US in the late 1800s and early 1900s, and this spelling was likely encouraged by that.

So there you have it. One of the most common bits of “wrong” spelling, so entrenched that in some contexts these days you’re making more of a point if you spell it the “right” way: picture Janet Jackson’s “What’s It Gonna Be” as “What’s It Going to Be,” or Led Zeppelin’s “We’re Gonna Groove” as “We’re Going to Groove” (and then why go halfway? why not “What Is It Going to Be” and “We Are Going to Groove”?). Eventually it might even qualify just as nonstandard spelling, not eye dialect. But my points about eye dialect are still gonna stand…

Prescriptivist or descriptivist?

I’m once again serving as a guest expert for a friend’s copyediting course. The students in these courses often ask me interesting questions about points of grammar. But this time, one of them asked me a broader question – or, rather, two of them:

Would you describe yourself as more of a prescriptivist or descriptivist?

What value do you see in each of these approaches to language? 

Since you’re here reading this, you probably know what the difference is between prescriptivist and descriptivist: a prescriptivist is someone who believes in imposition of authoritative prescriptions on language usage – fans of Lynne Truss, for instance, and avid users of Strunk and White’s Elements of Style – while a descriptivist is someone who believes in observing and describing how people actually use language and not holding stern judgmental positions on it. Most modern dictionaries are descriptivist: they include a word if it’s in common use – including, for instance, impactful and misunderestimate – and they try to include all senses that are in common use. Some people believe they should be prescriptivist and forbid certain words and senses of words.

Since I have a graduate degree in linguistics, it’s no surprise that by disposition I’m a descriptivist. I love language in all its forms, and I observe how it’s used in each context. But that doesn’t mean I have an “anything goes” approach in my work as an editor. After all, I’m editing a text that is part of a specific genre and is meant to have a particular effect on a certain audience. I use my observations about how people use language (and how they think about it, which is another important issue) to decide what choices of words and phrasing will work best. 

Generally, of course, there’s plenty of latitude – more than some people think. But we can recognize that, for instance, “Go ask your mommy” will have one effect in a children’s book and quite another in a political speech. Your elementary school teachers may have said “‘Ain’t’ ain’t a word,” but aside from being obviously false (the sentence would be incoherent if it weren’t a word; it would be like saying “‘Zzblgt’ zzblgt a word”), all that does is position ain’t as a very powerful mark of “bad” English (informal, nonstandard, folksy – which is also taken as frank and honest). So in an annual report, if you’re giving forecasts on projects, you would have “It isn’t coming by January” (or even “It is not coming by January”), but you may make use of “It ain’t coming by January” as a momentary excursion in style if you want to convey a particular (refreshing, informal) frankness, which might position the ostensible writer (e.g., the CEO) as a “regular guy.”

So, on the one hand, the idea that you must not ever use ain’t just ain’t true. But on the other hand, we can thank such teachers and others like them for maintaining that opprobrium, which gives the word such power. Likewise, you can have a huge effect by slipping in a vulgarity in the right context, and vulgarities maintain their power by having some people constantly treat them as the most awful things.

In that way, we need prescriptions to give us rules to push against, and to know where we stand; anyway, we will always have them, because some people just love rules (regarding rule-seeking behaviour, see “That old bad rule-seeking behaviour”). Beyond that, it’s useful to have prescriptions just to help us decide what to do where – I regularly look things up in the Chicago Manual of Style, thereby saving me from having to justify my choices on my own account and ensuring that my choices will be consistent with choices in other similar books, which also helps make the reading go smoother.

But many of the things that prescriptivists focus on the most have little to do with consistency or clarity. In fact, that’s probably why they focus on them so much. Someone once said “School board politics are so vicious precisely because the stakes are so small,” and the same goes with grammatical and lexical prescriptions: the ones that people get the most exercised about are precisely ones that make the least difference in clarity or effectiveness – which frees them up to function almost entirely as social shibboleths, signifiers of who is “the right sort.” Grammar peevery is just using the rule-seeking instinct to license social aggression while giving a plausible excuse. One of my favourite articles that I’ve written goes into this: “Why all English speakers worry about slipping up.”

So, in short, while many linguists are simply hard-set against prescriptivists, I have a more complex position. In some ways, I am by profession a prescriptivist: I enforce prescriptions within specific contexts – though those prescriptions are often made on the basis of descriptive observation. On the other hand, I don’t correct people’s grammar unless they’re paying me to do it, and I don’t think grammar is a useful indicator of character or intelligence; some very magnanimous and insightful people are not too tidy with grammar, and some people who have perfect grammar are obtuse and obnoxious. I don’t enjoy the presence of outspoken prescriptivists, but I’m sure we will always have them; and they fill a role, modelling a specific idea of propriety that we can choose to flaunt or flout as we fancy.

But what about plural “they”?

This article originally appeared on The Editors’ Weekly, the official blog of Canada’s national editorial association.

Singular “they” is here to stay, and that’s a good thing. There is no decent reason to require that third-person singular pronouns—and only third-person singular pronouns—always specify gender. “He” has never truly covered men and women equally, though starting in the 1800s some people tried to insist that it did, and constructions such as “he or she” or “s/he” are clunky at best. So it’s natural to accept officially what has been an informal workaround for centuries: extending the plural pronoun to cover the singular.

It’s not the first time that English has done this. As early as the 1200s, we started using the plural “you” for individuals of higher status, and by the 1800s, rather than continuing to specify respect—or lack of it—in pronouns, we had almost entirely stopped using the lower-status singular “thou.” If we can use a plural form in place of a singular to erase a status-based distinction, we can certainly do it to erase a gender-based distinction.

But there is one problem that we run into with singular “they,” a problem we have already encountered with singular “you”: how do you make clear when it’s plural?

That’s still a useful distinction, and it’s not always obvious from context. Consider a sentence such as “The CEO met the VPs at a bar, but they drank too much and started singing karaoke, so they left.” If specifying the gender of the CEO is out of the question, to clarify who “they” refers to you’ll need to rewrite it to avoid the pronouns—and if it’s a longer narration, that gets clunkier and clunkier. So what do we do?

Well, what did we do with “you”? For a time—quite a while, in fact, from the late 1600s through the late 1700s—singular “you” got singular verbs: “you was,” “you is,” “you does.” It was so common, Robert Lowth inveighed against it in his 1762 Short Introduction to English Grammar. Even Doctor Johnson used “you was.” Will we try the same kind of thing with “they”—saying “they is” and “they was”? A few people have tried it, but such usages are already strongly associated with “uneducated” English, and so they’re unlikely to become commonplace. And “you was” didn’t last, after all—Doctor Johnson and everyone else ultimately switched to “you were” even for the singular.

So how do we specify plural “you”? You know how: we add further plural specification to it. In the US South, “y’all” or “you-all” is very common, and it’s spreading; in other places, “yous,” “youse,” “you ’uns,” “yiz,” and “yinz” are local favourites. In many other places, we say “you guys” or something similar when we need to make the distinction. And I’ll wager we’ll end up doing the same kind of thing with plural “they.” “They-all” seems readily available; “those ones” and “those guys” are likely to show up; differential usages of “themselves” and “themself” are already in use and may be extended; and others may appear—I’ll be watching eagerly. And in some contexts, for added clarity, something like “the one” might be used for the singular.

What do we do as editors, here and now? We keep an eye on how popular use is changing. When we can, we use our positions to influence it a little. And, as always, we use our judgement to find what’s clearest and most effective for the audience of the text we’re working on. 

scaffold

Socially, language functions in many ways like a scaffold.

I’ll explain. But first I’ll talk briefly about this word scaffold and where it comes from and what it is used to mean now. Because of course I will.

Scaffold has to do with neither folds nor scafs, nor for that matter with holds. It’s yet another word that came to English from French, and came to French from Latin (and Greek), and changed quite a lot en route. The modern French reflex of it is échafaud; both words came from a word that went through quite a few forms, but had the early form escadafaut, which was es- (from Latin ex-, ‘out’) plus cadafaut, which, like modern French catafalque, comes from later Latin catafalcum (‘viewing platform’), which in its turn was probably made from cata-, from Greek κατα- (‘back, against’) and Latin falicum, in turn from fala (‘wooden gallery; siege tower’).

So it started with a siege tower and then became a viewing platform and then became a… oh, yes, I didn’t say: escadafaut generally referred to a platform for viewing a tournament.

But of course that’s not what scaffold (or scaffolding) is usually used for now. It’s that structure of metal supports and wooden platforms you may see in front of a building. Sometimes the building is being built; sometimes it’s being restored or preserved; sometimes it’s just being kept standing. And, less commonly these days, scaffold can also refer to a platform for viewing something, or for a theatrical performance, or for public executions, or, in some cultures, for disposal of dead bodies. (And let us not forget its cousin catafalque, which in modern English usage is a temporary ornamental platform for a coffin to go on in funerary rites.)

OK, then. So how does language function socially like a scaffold?

To start with, we use language to mediate the development and maintenance of social structures and interactions. Language is an essential social tool; our social structures may not be made of it (though some arguably are, but that doesn’t work with the current metaphor, so let it slide), but they are made with it. You want to add a glorious new tower or wing to the edifice of our culture? You scaffold it with language: new words, new ways of using old words, new turns of phrase, sometimes even new grammar.

But we also use language to shore up, maintain, and refresh existing social structures. Turns of phrase, common idioms, colloquialisms, and metaphors can embed biases and presuppositions (as just one example, are you familiar with the term Indian giver?). Even basic grammatical details can function this way, as for instance insistence on he as the default pronoun (which it never was, though some people starting in the 1800s tried to claim it was in places where that would mean not having to explicitly recognize women, but somehow not in places where it might entail giving women completely equal rights – see Dennis Baron’s great book What’s Your Pronoun? for extensive details on this). And peeving about “new” usages reinforces an ideology of “old” as better – adherence to “tradition,” which always turns out to be just what the speaker remembers having learned in youth, plus some additions that reinforce their prejudices: the linguistic façade of the social structures and hierarchies that the person has learned and participated in and is quite comfortable with, thank you.

Not that all “old” words are acceptable in such a perspective, of course. Social stratification is maintained through ideas of “good” English (as opposed to the kind that people from the wrong region or socioeconomic level speak – by the way, “good” English is just as weird and arbitrary as many kinds of “bad” English, and in fact some things are “bad” because they’re not quite weird and arbitrary enough: just watch someone correct a kid who says “goed” instead of “went”). It is also maintained through taboos based on ideas of purity and sexual propriety. You display your conformity to these social structures by treating “bad” words as “bad” and at the same time by rejecting changes in usage that try to undo social subordination of certain groups of people. A person may argue “politely” that we needn’t change the names of any sports teams, for example, while at the same time objecting to the “bad English” or “bad words” uttered by people on the other side of the debate who are upset at being treated as stereotypes. 

Well. All good buildings have basements, dears, and they will collapse without them, but we don’t go down into them ourselves, do we? Oh, no, dears, we do not. A nice, tidy scaffold helps maintain decorum. And when we focus on the scaffold, we also don’t necessarily notice the structure that it’s there to maintain. We get stuck on the words and ignore the tilting tower of crumbling bricks behind it.

But the language has its own ostensive value too. With it, as on a scaffold (next sense), we can perform our identities and our attitudes – and we can watch others perform theirs. In fact, that’s a central function of language: words are known by the company they keep. We always use our language to let others know things about ourselves, our attitudes, and where we stand. Some of us, for example, will make sure to use some terms and avoid using others so as not to perpetuate social injustices, while others will make sure it’s understood they don’t brook “woke” “politically correct” “virtue signalling” and will stand for “family values” (which assume very specific kinds of families and exclude families that don’t meet the model).

And, of course, with language, as with scaffolds, we can view the tournaments of our societies, we can conduct – and display – executions, and we can show off the resulting corpses and expose them for the carrion birds. Choices of words and phrasing let you know who’s been cut dead, and they help keep it that way.

But at least, unlike (most) real-life scaffolds, language is here to stay – and it is deserving of aesthetic appreciation in its own right. And it is an essential part of culture, not just an accessory. Metaphors have their limits… but language wouldn’t exist without them.

Global English?

This article originally appeared on the blog of ACES: The Society for Editing.

English is not one language and never has been. Even Old English had different dialects. Global English is a family of varieties, mostly mutually comprehensible but loaded with traps and surprises. And even when you can easily understand English from another part of the world, you will most likely recognize that it’s from somewhere you aren’t… and you’ll eventually get confused by something.

All of that shouldn’t be a surprise to anyone, but some people seem to think it’s possible to produce a neutral, non-regional, truly global English. I will grant that it’s possible to produce an English that seems at least slightly foreign to anyone anywhere – the famous “mid-Atlantic” English you hear in some movies is a spoken version – but it is not possible to produce a variety of English that is taken as unremarkably local by every English speaker everywhere. There are several reasons for this.

Pronunciation

The most obvious difference is in pronunciation. Get someone from Kalgoorlie, Western Australia, someone from Tuscaloosa, Alabama, and someone from Newcastle upon Tyne, England, to have a pleasant chat and see if they can understand each other at all. 

Pronunciation is less of an issue when dealing with the written word – you probably won’t have a person from Buffalo writing “hot” and a person from Toronto thinking it’s “hat,” as you may when it’s spoken. But text is, in fundamental ways, a representation of the spoken word, and it often relies on reference to the spoken word. 

Not just jokes but advertisements and catchphrases rely on rhymes and wordplays that are particular to just some varieties of English – “caught” and “court” sounding the same, for instance, or “quarter” and “border” rhyming. These differences also help ensure the impossibility of English spelling reform: you can’t make a phonetic spelling of one variety of English that won’t be incomprehensible to users of many other varieties.

Spelling

Not that English spelling is the same everywhere of course. Canadians are used to American-style spellings but can be very patriotic about colour and centre in some contexts; if a Canadian book expects a largely American audience, however, you can count on those Canadian spellings to alienate them. And on the other hand, if you just go with British-style spellings in Canada, you’ll soon realise it doesn’t always suit. And there are more striking differences, such as gaol versus jail, oestrogen versus estrogen, and arse versus ass – though that last case is arguably a difference of which word is used, not just which spelling.

Same thing, different word

There are many, many things that have different names in different countries. It’s well known that British cars have boots and bonnets instead of trunks and hoods and that a British lorry is an American truck (of a specific kind); it’s generally famous that what Americans call a barbecue Australians call a barbie. Fewer people will know that South Africans call the same thing a braai, or that instead of saying bro or buddy they say boet (which sounds like “boot”) – while in India, they say yaar.

For that matter, there are regional differences even in America, some of them quite celebrated. Is a Pepsi a pop, a soda, or a Coke (used in defiance of trademarks)? Do children on playgrounds ride see-saws or teeter-totters? The boundaries of such regional differences – which don’t always divide on the same lines – are what linguists call isoglosses, and maps showing the isoglosses are some of linguists’ favourite things.

Same word, different thing

Americans occasionally run up against the fact that pants and fanny mean less publicly acceptable things in British English, and Americans are likely to know that in England and Australia mate refers to a friend rather than a romantic partner.

They’re less likely to know that hotel can mean a restaurant in India; that South Africans call a traffic light a robot; that in India you don’t graduate, you pass out; that tea can be a full meal in England; that a torchlight in Nigeria is a torch in England and a flashlight in the US; that I understand you in the US is I hear you in Nigeria; or that South Africans say shame when they are shown a cute baby or told of happy news such as an engagement.

Americans may not even know what someone from a different part of the US means by boulevard (a grassy strip between sidewalk and street or a wide avenue with a green strip in the middle?).

Turns of phrase

The lexical differences also extend to idiomatic turns of phrase. Where an American might write Main Street on Friday is different from a suburb on the weekend, a Brit would have The High Street on Friday is different to a suburb at the weekend.

A person from England might say I’ll knock you up to mean I’ll drop by and might tell you to keep your chin up by saying Keep your pecker up, but if the hearer is from North America, the results could be… awkward.

Some differences are points of pride: New Yorkers make waiting on line rather than waiting in line a kind of local shibboleth, and for New Zealanders, a phrase like Kiwi as (as in This food is Kiwi as) is, well, as Kiwi as… as what? They expect you to fill in the blank.

Grammatical niceties

There is also the matter of things that are correct usage in one variety but terrible errors in another. I dreamed I dove into a lake may be fine in the US, but I dreamt I dived into a lake is necessary in England. I casted my vote yesterday is terrible in some countries but absolutely correct in Nigeria. I’ll call you when I reach is normal in India rather than I’ll call you when I arrive.

Cultural references

Words and grammar aren’t the only things that vary from place to place though. English-speaking culture is obviously far from uniform, and some baseline assumptions just don’t work the moment you cross a border. Food is different, and passing references can quickly be opaque: not everywhere has food trucks or pretzel carts or chaiwallahs; not everyone can order poutine or grinders or bangers.

And while any Canadian will know what another Canadian means by toque and parka, most other people in the world won’t.

Americanizing and Canadianizing texts is a large and expensive business, and the spellings are the least of the issue. I remember a Canadian colleague who, working on a converted document, discovered a number of instances of underprovinciald; it turned out that someone had done a replace-all from state to provincial without checking. But when a guide to a health care topic starts talking about insurance, no amount of word replacement will fix the disparity between the US and Canada – or, really, between the US and anywhere else.
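That replace-all mishap is easy to reproduce – and easy to avoid with a word-boundary match instead of a blind substitution. A minimal Python sketch (the sample sentence is invented for illustration):

```python
import re

text = "The state health plan is understated in interstate comparisons."

# Naive replace-all, as in the anecdote: every occurrence of "state"
# is swapped out, even inside other words.
naive = text.replace("state", "provincial")
# -> "The provincial health plan is underprovinciald in interprovincial comparisons."

# A word-boundary regex replaces "state" only where it stands as a whole word.
careful = re.sub(r"\bstate\b", "provincial", text)
# -> "The provincial health plan is understated in interstate comparisons."
```

Even the careful version, of course, only fixes the mechanics; it can’t tell you that the surrounding passage about insurance no longer makes sense.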

Houses and other buildings can be different, including what’s called the first floor (ground floor in the US and Canada, the floor above ground in most of the rest of the world).

There are also regional differences. In Canada, for instance, if you talk about a condo in Ontario, you probably mean a high-rise apartment; in Alberta, a condo is more likely to mean a townhouse, possibly a vacation property. What you mean by the word bungalow can vary quite a bit depending on where you are in the US. And in some cities, a duplex is typically side-by-side residences with one common wall, while in others, it’s a house with one residence on the upper level and the other on the lower – meaning that a reference to the people in the other half banging on the wall may be confusing.

Global varieties

How many kinds of English are there? Hmm, get a book of paint colors from a hardware store and tell me how many kinds of white, or blue, or black there are. Get another book and count again. English has national standard varieties, regional varieties within countries, local variants, and socially divided varieties (often people from the same social group in different cities will sound more like each other than like people from other social groups in their respective cities).

And don’t forget that the status of English is not the same in every country where it’s spoken – it’s the historical main language in some, the language of a colonizing class in others, and a lingua franca in still others. 

But in every country where texts are published in English, someone needs to make sure that that English doesn’t seem strange. And that someone may be you. The one thing you can be sure of is that while one variety of English may be comprehensible to speakers of another, it may alienate them – and may give rise to significant misunderstandings.

No exceptions?

Do I see a hand in the back? …Yes? …Labels on boxes? And short warnings and things like that? Yes, it’s true that you can produce some short passages that look local to anyone anywhere. But that’s not a global variety of English; it’s a snippet, and many other similar snippets will not seem so universal. 

It’s like going up to a rail ticket office in a European country and knowing enough of the local language to buy a ticket without their noticing that you’re not a native speaker: it doesn’t mean you’re fluent. You couldn’t carry on a conversation without being smoked out. You sure couldn’t write an article – let alone a book – that would be smoothly idiomatic. 

The same is true with using English from one part of the world in another part of the world. Oh, they’ll understand you, probably. But they’ll know you’re not from there, and there will be extra friction and effort in the communication and comprehension. You may not realise it, but the little differences to what you’re expecting colour your reception. And editing means understanding, appreciating, and working with these subtleties.

In effect, localizing English is like translating from one language into another, just subtler. You should only localize into a variety you have native fluency in – if you try to adapt a text into the English of a country you’re not from, you will eventually make an embarrassing mistake. But you also need to know the variety you’re converting from well enough to understand the local points of usage and cultural assumptions, so you don’t think a Canadian’s toque is a chef’s hat, don’t believe that a South African at a robot is watching an android, and don’t wonder what the big deal is about jumping out a first-floor window.

Which, in my view, seems like an excellent excuse to do some international traveling… when you can.

Authority? What authority?

Look, I know what I’m talking about.

Have you ever said that? And has anyone ever said that to you? It’s an appeal to authority, and, according to some people, it’s an instant fail: the argumentum ab auctoritate (argument from authority) – a famous logical fallacy!

Except when it’s not. Because if appeals to authority were always fallacious, our entire legal and educational systems would be voided. Among many other things. 

I’ll explain.
