
Whither English?

Once again this week I was a guest in the editing class my friend teaches online at a local university. And this time, along with the usual questions about specific points of usage, one student asked what I think will change in English usage, and what changes editors should resist.

Which is a really interesting question! Predicting language change is fun and occasionally one gets it right, but there are always innovations that you just can’t predict – and social and technological changes, too. When you look at how things have changed in the past, it gives some sense of the usual forces of change. As I said in one presentation on the topic (more than a decade ago now), we tend to change language for four general reasons:

  • to make life easier
  • to feel better
  • to control
  • things slip

Fads that become accepted are common. The shifts in the pronunciation of the letter r – and their shifts in social status (for example, the advent of r-dropping in England, its adoption in America as a sign of higher status, its shift over time towards more of a working-class signifier in America but not in England) – are emblematic of this, as I wrote about in an article for the BBC. A lot of it has to do with signifying various kinds of social group belonging.

On the other hand, sometimes changes are invented and propagated – such as the ideas that you can’t split an infinitive (I’ve written about this more than once) and can’t end a sentence with a preposition, and the prescribed distinction between less and fewer. A few of these “rules” have become undisputed standard English now (such as the proscriptions of double negatives and double superlatives); others (such as the ones I just mentioned) are often waved around as rules but aren’t universally accepted, and serve mainly to license social aggression (as I wrote about in another BBC article). Some years ago I did a whole presentation on when “errors” aren’t really errors, and a bit more recently one on when to use “bad” English.

None of which yet answers the question. Let’s see… 

  • I think that social media will continue to be a good vector for the rapid spread of new usages and references (everything is citational, after all), but of course I can’t predict which ones. 
  • I think that punctuation will get to be used more and more variably for subtle significations (after all, the presence or absence of a period at the end of a message can convey tone, sometimes importantly). 
  • I think emoji will keep getting used, including as nouns, verbs, and adjectives, not just interjections, but how far into formal writing they will spread I don’t know. 
  • I think capitalization will continue to be basically haywire, because it’s weird and complicated in English anyway (here’s something I ghostwrote for PerfectIt’s blog about it).
  • I’ve noticed what seems to be a shift (yet another!) in the pronunciation of r among younger people, at least partly under the influence of pop singers who are avoiding the retroflex sound in favour of something closer to a mid-high mid-front vowel. I’m not sure where that’s going, but keep an eye on it.
  • I suspect we will, at length, start using they-all or something similar to convey that we’re speaking of a group of people, rather than a single person of unspecified or neutral gender. I am very much on board with singular they – if you have an hour, watch this presentation I gave on gender in language, including the vaunted history of singular they and the deliberate reactionary imposition of the idea that he is the natural generic default. But singular they can bring the complication that we aren’t always sure of the number of people signified. When we started using you for all second persons rather than distinguishing between singular thou and plural you, various people in various places innovated y’all, youse, yiz, yinz, and so on. So why not the same with they?
  • And, because identity is important to people, especially when threatened, and because language is a key means of conveying that identity, I think Canadian usages and in particular Canadian spellings (centre, colour, you know), which have been slipping a bit in general Canadian usage, will come to be increasingly emphasized in response to threats to Canadian sovereignty. That’s not a change so much as a revitalization. But keep an eye out for innovation of Canadian signifiers too!

And as to the question of what changes to accept and what to resist, as I said in my “when does wrong become right” presentation, there are five questions we should ask when evaluating a change:

  1. What is the change? Really? (Sometimes the “change” is the original form and the “traditional” usage was invented and propagated more recently.)
  2. Where did it come from? When?
  3. Where is it used? By whom?
  4. Who is your text for? (Usages that annoy one audience may charm another.)
  5. What are the gains and losses – what does the change add in expressive value and clarity, and what does it take away?

Oh, and I am definitely in favour of being pragmatic to the point of deviousness in our choices. As Machiavelli said, “consider the results.”

If you were to use the subjunctive…

It’s March fourth. Happy Grammar Day! Today is a day when certain people who like to loudly declare their love for grammar put extra energy and volume into promulgating their favourite rules. Which is kind of a giveaway about their motivations: It’s not grammar itself that they love (since “bad grammar” is also grammar, adhering to a coherent underlying set of rules, just not the rules that they prefer), it’s security in an imposed order. It’s authority, as long as they get to be the authority. It’s like if someone were to say “I love flowers!” but simply could not stand the disorder of a meadow of wild flowers and had to have the tidy order of a strictly planted garden, with no flower out of place.

But there is an important difference here: Many neat grammar rules do have an organic basis in the language, and the imposed rule is intended to keep usage from drifting away from that. (This is not true of all grammar rules, mind you; for example, we know exactly when the strict distinction between less and fewer was invented, and we do not in fact owe allegiance to its inventor.)

But usage does drift. For most English speakers, for example, whom is effectively a foreign word; they have no natural feel for its usage, and so they use it in places where it’s inappropriate according to the rules they’re attempting to preserve. So a bit of freshening up on the established rules, for those who want to follow them, is not unreasonable. And – to get to my subject for this Grammar Day – for many English speakers, the subjunctive is also a strange thing that, even if they use it sometimes, they don’t altogether “get.”

Which isn’t that big a problem in most contexts. But if you were to use the subjunctive, you would need to know not just how to use it but when to use it. And the available guidelines for it are sometimes so detailed as to be confusing. Wikipedia, for example, says, “Subjunctive forms of verbs are typically used to express various states of unreality, such as wish, emotion, possibility, judgment, opinion, obligation, or action, that has not yet occurred.”

Part of the problem is that these do not all require the subjunctive, but they are things it can be used for. Another part is that people get confused about what’s real versus unreal and what you can and can’t use the subjunctive for. So – as it is Grammar Day (or, if you are reading this on another day, imagine it were Grammar Day) – let me give you the quick and easy way of thinking about the subjunctive mood: It can be thought of simply as a hypothetical mood. Note that I say “mood” – it’s not a tense; it’s a perspective that can be applied to any tense, just like the indicative mood (which is the usual mood, talking about things that definitely do or don’t exist). 

And this is where some people get confused, because hypotheses operate differently in the past and present than they do in the future. When we’re talking about things in the past or the present, something that’s hypothetical hasn’t happened and isn’t happening, whereas something that’s indicative has happened or is happening. To use Wikipedia’s term, in the present and the past, the unreal is known to be unreal. But when we talk about the future, it’s all hypothetical; none of it has happened yet, even when we’re using the indicative. None of it is real yet. Which means that the effect of the subjunctive in the future is not the same as in the present and the past. 

Let’s look at some examples:

Past:

Subjunctive: “If you had helped me, I would have been grateful.” (You didn’t, and I wasn’t.)
Indicative: “If you helped me, I was grateful.” (You might have helped me; I just can’t remember. If you did, I was grateful.)

Present:

Subjunctive: “If you were helping me, I would be grateful.” (You aren’t, and I’m not.)
Indicative: “If you are helping me, I am grateful.” (I’m not sure if you’re helping me; if you are, I’m grateful.)

Future:

Subjunctive: “If you were to help me, I would be grateful.” (I’m proposing that you help me, but I’m doing so indirectly, so as to make it clear that it is not expected but merely possible at your discretion.)
Indicative: “If you help me, I will be grateful.” (Just a straightforward conditional, laying out a possible course of action and a consequence of it.)

You can see that both ways of speaking of the future are possible, and both refer to the same case, but one is using the hypothetical framing to put in more distance so as to disavow any air of expectation or transaction – in other words, it’s being more passive and polite – whereas the other is simple and direct.

And this is where we see that choices of grammar are not just about what is technically correct; they are also about negotiations between people. Everything we say, we say to produce an effect, and part of that effect is a negotiation of status and expectations between us and the person(s) we’re speaking to. (Unsolicited corrections of other people’s grammar are an exemplary case and their intended effect is left as an exercise to the reader.) In the case of my example, “If you were to help me, I would be grateful,” the subjunctive is used to make a suggestion or implied request, or wish – none of which, by the way, asserts or implies that the thing is outside the realm of possibility; it simply uses the hypothetical framing to emphasize that it is not a certainty, and it does that so as not to impose or make a claim on the other person.

One more thing, though: All of this is just if you use the subjunctive. You don’t, in fact, have to; there is a version of English that simply doesn’t use distinct forms for the subjunctive. In it, you never say “if I were you”; it’s “if I was you,” even though I have never been you. This version is more common and more accepted in England than in North America, but it’s available everywhere… though it does have a less literary air to it, and it allows the occasional ambiguity, though that’s usually resolved in the next clause with the choice of tense. For example:

Subjunctive user speaking hypothetically: “If I were finished, I would stop writing.”

Subjunctive non-user speaking hypothetically: “If I was finished, I would stop writing.”

Subjunctive user or non-user using indicative: “If I was finished, then obviously I stopped writing.”

Rules and laws

For Grammar Day, I want to talk briefly about laws and rules, and the fact that some people who should know better get them confused.

Let’s start with laws of nature. Say someone holds a rock in front of them and lets go of it. It flies upward instead of falling. Do you say, “No, you’re doing it wrong – the rock is supposed to fall down”?

Then there’s criminal law. Let’s say that instead of dropping the rock, they throw it through a store window. You might say “Hey!”; a cop who is nearby might arrest them – or they might get away with it.

That’s sort of like the rules of sport. Say the person is playing football, and they throw a rock instead of a football – or maybe they just throw a football the wrong way. The player will get a penalty – if the referee sees it.

But how about the rules of grammar? Let’s say someone writes a sentence: “Person the throw rock football and window at.” Your reaction on reading it is probably something like “Huh? That doesn’t even make sense.”

So let’s say instead that the sentence is “Smashing a window, the person throwed rock and football.” If you’re like a lot of people, you’ll readily utter a correction of one or more errors, even if no one asked you to. You may also say something about the intellect of the writer.

The law of gravity, like any law of nature, doesn’t need anyone to enforce it. If you see a law of nature being broken, you’re wrong: either the law isn’t really being broken (it’s an illusion, or some other law is relevant) or the law as you know it is inaccurate or incomplete and your understanding needs to be revised.

Civil and criminal laws do need enforcement, because they’re human creations. Some of us may believe that laws are there to enforce laws of nature (or of God), but really at most we’ve just appointed ourselves to try and keep people behaving in accordance with our ideas of those laws, which is an us thing. Civil and criminal laws are like the rules of sports, but with broader application and stronger enforcement mechanisms.

And rules of grammar? The ones broken in the last example – that it’s “threw,” not “throwed,” that you shouldn’t dangle participles, and that you should be careful with definite and indefinite articles – are also like the rules of sports: in published texts, editors typically serve as referees, following specified style rules; in a broader social context, enforcement is mostly not formalized. The rules may have a certain tidiness, but that tidiness is not a natural law, nor is it inevitable – any editor who works with multiple house styles knows that.

But what about more basic rules of grammatical comprehensibility, such as the ones broken by “Person the throw rock football and window at”? Those, too, are human creations – just at the level of social norms that we rarely stop even to inspect. Using the rules of some other languages, that weird sentence would be entirely coherent. English puts the definite article (“the”) before the noun, but Scandinavian languages tack it onto the end of the noun as a suffix. English can be very fussy with verb conjugations (“throw,” “throws,” “threw”), especially irregular ones, but other languages are less so, and some – such as Mandarin Chinese – don’t conjugate at all. English requires indefinite articles (“a rock,” “a football”), but Gaelic doesn’t, and most Slavic languages don’t use definite or indefinite articles at all. And English expects “and” to go between the things it combines, but in Latin its equivalent can be tacked onto the second item, as in “Senatus Populusque Romanus” – literally “Senate People-and Roman” (in English, “the Senate and People of Rome”).

So, in short, the rules of grammar, even the most apparently essential rules, are not inevitable. Grammar, even the most fundamental grammar, is not a natural law; it is like the rules of a sport. The way you say a thing is not the one logical, inevitable, natural way to say it, even if – within the variety of the language you’re speaking – it’s the only “proper” way to say it. Even the idea that a double negative equals a positive, which seems plainly logical to modern English speakers, seems otherwise to speakers of languages such as Spanish or Italian, where a negative requires agreement (e.g., “No vale nada” and “Non vale niente”: “It’s not worth nothing”). After all, it can’t be a negative statement if it’s positive in some places. Logic!

But some people, even some otherwise well educated people, seem unaware of this. Editors and linguists are wearily used to people priggishly “correcting” them with simplistic grammar rules and ideas that they recall from school, as though those rules were basic truths like natural law. I’ve seen it even from people who have graduate-level educations and clearly ought to know better.

And why does it matter? I’ve written before about how this kind of dogmatic position is used to license social aggression (see What do we care about, really and Why all English speakers worry about slipping up), but the boorishness of grammar snobs is not the biggest thing. The idea that there is one correct, natural, logical grammar gives cover for not just class discrimination but also racism (because different social groups use different varieties of the language) and even sexism (in particular ideas about such things as pronouns and grammatical gender – I’ve given talks on this several times; a video of one time is at A Hidden Gender?). 

A person who understands the socially decided nature of grammar rules can understand that someone who’s using a kind of English that’s not “proper” is not inferior, and that different varieties of English are grammatically coherent even if they’re different from the schoolbook standard. Knowing this also broadens a person’s expressive repertoire.

Does all this mean that grammar is a free-for-all, or that there’s no point in teaching it? Of course it doesn’t mean that. We teach people about the rules of sports and the rule of law. We also teach people about dress codes – there are certain things you just don’t wear in certain places and occasions, not for any matter of intrinsic suitability (sweatshirts are no less functionally suited to formal occasions than tuxedos), but just because of the social implications they have come to have. Likewise, if you use a library, you learn how the books are arranged on the shelves, and it’s a tidy, systematic, enforceable order, but it’s not an inevitable one: the choice of Dewey versus Library of Congress, just for instance, will give quite different orderings. 

Tidiness can be good, and consistent, well-defined rules can be useful. I make a nice bit of money every year tidying up text. But rigidity and narrow-mindedness are bad. And believing that the simple rules you learned in your simple youth are the only true rules is a mistake that will limit your effectiveness – and, on the larger level, can limit others, and our effectiveness and potential as a society. Learn rules – as many different sets as possible – and use them judiciously.

Oh, and have fun.

Prescriptivist or descriptivist?

I’m once again serving as a guest expert for a friend’s copyediting course. The students in these courses often ask me interesting questions about points of grammar. But this time, one of them asked me a broader question – or, rather, two of them:

Would you describe yourself as more of a prescriptivist or descriptivist?

What value do you see in each of these approaches to language? 

Since you’re here reading this, you probably know what the difference is between prescriptivist and descriptivist: a prescriptivist is someone who believes in imposition of authoritative prescriptions on language usage – fans of Lynne Truss, for instance, and avid users of Strunk and White’s Elements of Style – while a descriptivist is someone who believes in observing and describing how people actually use language and not holding stern judgmental positions on it. Most modern dictionaries are descriptivist: they include a word if it’s in common use – including, for instance, impactful and misunderestimate – and they try to include all senses that are in common use. Some people believe they should be prescriptivist and forbid certain words and senses of words.

Since I have a graduate degree in linguistics, it’s no surprise that by disposition I’m a descriptivist. I love language in all its forms, and I observe how it’s used in each context. But that doesn’t mean I have an “anything goes” approach in my work as an editor. After all, I’m editing a text that is part of a specific genre and is meant to have a particular effect on a certain audience. I use my observations about how people use language (and how they think about it, which is another important issue) to decide what choices of words and phrasing will work best. 

Generally, of course, there’s plenty of latitude – more than some people think. But we can recognize that, for instance, “Go ask your mommy” will have one effect in a children’s book and quite another in a political speech. Your elementary school teachers may have said “‘Ain’t’ ain’t a word,” but aside from being obviously false (the sentence would be incoherent if it weren’t a word; it would be like saying “‘Zzblgt’ zzblgt a word”), all that does is position ain’t as a very powerful mark of “bad” English (informal, nonstandard, folksy – which is also taken as frank and honest). So in an annual report, if you’re giving forecasts on projects, you would have “It isn’t coming by January” (or even “It is not coming by January”), but you may make use of “It ain’t coming by January” as a momentary excursion in style if you want to convey a particular (refreshing, informal) frankness, which might position the ostensible writer (e.g., the CEO) as a “regular guy.”

So, on the one hand, the idea that you must not ever use ain’t just ain’t true. But on the other hand, we can thank such teachers and others like them for maintaining that opprobrium, which gives the word such power. Likewise, you can have a huge effect by slipping in a vulgarity in the right context, and vulgarities maintain their power by having some people constantly treat them as the most awful things.

In that way, we need prescriptions to give us rules to push against, and to know where we stand; anyway, we will always have them, because some people just love rules (regarding rule-seeking behaviour, see “That old bad rule-seeking behaviour”). Beyond that, it’s useful to have prescriptions just to help us decide what to do where – I regularly look things up in the Chicago Manual of Style, which saves me from having to justify my choices on my own account and ensures that my choices will be consistent with choices in other similar books, which in turn helps the reading go more smoothly.

But many of the things that prescriptivists focus on the most have little to do with consistency or clarity. In fact, that’s probably why they focus on them so much. Someone once said “School board politics are so vicious precisely because the stakes are so small,” and the same goes with grammatical and lexical prescriptions: the ones that people get the most exercised about are precisely ones that make the least difference in clarity or effectiveness – which frees them up to function almost entirely as social shibboleths, signifiers of who is “the right sort.” Grammar peevery is just using the rule-seeking instinct to license social aggression while giving a plausible excuse. One of my favourite articles that I’ve written goes into this: “Why all English speakers worry about slipping up.”

So, in short, while many linguists are simply hard-set against prescriptivists, I have a more complex position. In some ways, I am by profession a prescriptivist: I enforce prescriptions within specific contexts – though those prescriptions are often made on the basis of descriptive observation. On the other hand, I don’t correct people’s grammar unless they’re paying me to do it, and I don’t think grammar is a useful indicator of character or intelligence; some very magnanimous and insightful people are not too tidy with grammar, and some people who have perfect grammar are obtuse and obnoxious. I don’t enjoy the presence of outspoken prescriptivists, but I’m sure we will always have them; and they fill a role, modelling a specific idea of propriety that we can choose to flaunt or flout as we fancy.

But what about plural “they”?

This article originally appeared on The Editors’ Weekly, the official blog of Canada’s national editorial association.

Singular “they” is here to stay, and that’s a good thing. There is no decent reason to require that third-person singular pronouns—and only third-person singular pronouns—always specify gender. “He” has never truly covered men and women equally, though starting in the 1800s some people tried to insist that it did, and constructions such as “he or she” or “s/he” are clunky at best. So it’s natural to accept officially what has been an informal workaround for centuries: extending the plural pronoun to cover the singular.

It’s not the first time that English has done this. As early as the 1200s, we started using the plural “you” for individuals of higher status, and by the 1800s, rather than continuing to specify respect—or lack of it—in pronouns, we had almost entirely stopped using the lower-status singular “thou.” If we can use a plural form in place of a singular to erase a status-based distinction, we can certainly do it to erase a gender-based distinction.

But there is one problem that we run into with singular “they,” a problem we have already encountered with singular “you”: how do you make clear when it’s plural?

That’s still a useful distinction, and it’s not always obvious from context. Consider a sentence such as “The CEO met the VPs at a bar, but they drank too much and started singing karaoke, so they left.” If specifying the gender of the CEO is out of the question, to clarify who “they” refers to you’ll need to rewrite it to avoid the pronouns—and if it’s a longer narration, that gets clunkier and clunkier. So what do we do?

Well, what did we do with “you”? For a time—quite a while, in fact, from the late 1600s through the late 1700s—singular “you” got singular verbs: “you was,” “you is,” “you does.” It was so common, Robert Lowth inveighed against it in his 1762 Short Introduction to English Grammar. Even Doctor Johnson used “you was.” Will we try the same kind of thing with “they”—saying “they is” and “they was”? A few people have tried it, but such usages are already strongly associated with “uneducated” English, and so they’re unlikely to become commonplace. And “you was” didn’t last, after all—Doctor Johnson and everyone else ultimately switched to “you were” even for the singular.

So how do we specify plural “you”? You know how: we add further plural specification to it. In the US South, “y’all” or “you-all” is very common, and it’s spreading; in other places, “yous,” “youse,” “you ’uns,” “yiz,” and “yinz” are local favourites. In many other places, we say “you guys” or something similar when we need to make the distinction. And I’ll wager we’ll end up doing the same kind of thing with plural “they.” “They-all” seems readily available; “those ones” and “those guys” are likely to show up; differential usages of “themselves” and “themself” are already in use and may be extended; and others may appear—I’ll be watching eagerly. And in some contexts, for added clarity, something like “the one” might be used for the singular.

What do we do as editors, here and now? We keep an eye on how popular use is changing. When we can, we use our positions to influence it a little. And, as always, we use our judgement to find what’s clearest and most effective for the audience of the text we’re working on. 

Global English?

This article originally appeared on the blog of ACES: The Society for Editing.

English is not one language and never has been. Even Old English had different dialects. Global English is a family of varieties, mostly mutually comprehensible but loaded with traps and surprises. And even when you can easily understand English from another part of the world, you will most likely recognize that it’s from somewhere you aren’t… and you’ll eventually get confused by something.

All of that shouldn’t be a surprise to anyone, but some people seem to think it’s possible to produce a neutral, non-regional, truly global English. I will grant that it’s possible to produce an English that seems at least slightly foreign to anyone anywhere – the famous “mid-Atlantic” English you hear in some movies is a spoken version – but it is not possible to produce a variety of English that is taken as unremarkably local by every English speaker everywhere. There are several reasons for this.

Pronunciation

The most obvious difference is in pronunciation. Get someone from Kalgoorlie, Western Australia, someone from Tuscaloosa, Alabama, and someone from Newcastle upon Tyne, England, to have a pleasant chat and see if they can understand each other at all. 

Pronunciation is less of an issue when dealing with the written word – you probably won’t have a person from Buffalo writing “hot” and a person from Toronto thinking it’s “hat,” as you may when it’s spoken. But text is, in fundamental ways, a representation of the spoken word, and it often relies on reference to the spoken word. 

Not just jokes but advertisements and catchphrases rely on rhymes and wordplays that are particular to just some varieties of English – “caught” and “court” sounding the same, or “quarter” and “border” rhyming, for instance. These differences also help ensure the impossibility of English spelling reform: you can’t make a phonetic spelling of one variety of English that won’t be incomprehensible to users of many other varieties.

Spelling

Not that English spelling is the same everywhere, of course. Canadians are used to American-style spellings but can be very patriotic about colour and centre in some contexts; if a Canadian book expects a largely American audience, however, you can count on those Canadian spellings to alienate them. And on the other hand, if you just go with British-style spellings in Canada, you’ll soon realise it doesn’t always suit. And there are more striking differences, such as gaol versus jail, oestrogen versus estrogen, and arse versus ass – though that last case is arguably a difference of which word is used, not just which spelling.

Same thing, different word

There are many, many things that have different names in different countries. It’s well known that British cars have boots and bonnets instead of trunks and hoods and that a British lorry is an American truck (of a specific kind); it’s generally famous that what Americans call a barbecue Australians call a barbie. Fewer people will know that South Africans call the same thing a braai, or that instead of saying bro or buddy they say boet (which sounds like “boot”) – while in India, they say yaar.

For that matter, there are regional differences even in America, some of them quite celebrated. Is a Pepsi a pop, a soda, or a Coke (used in defiance of trademarks)? Do children on playgrounds ride see-saws or teeter-totters? The boundaries between such regional usages – which don’t always fall along the same lines – are what linguists call isoglosses, and maps showing the isoglosses are some of linguists’ favourite things.

Same word, different thing

Americans occasionally run up against the fact that pants and fanny mean less publicly acceptable things in British English, and Americans are likely to know that in England and Australia mate refers to a friend rather than a romantic partner.

They’re less likely to know that hotel can mean a restaurant in India; that South Africans call a traffic light a robot; that in India you don’t graduate, you pass out; that tea can be a full meal in England; that a torchlight in Nigeria is a torch in England and a flashlight in the US; that I understand you in the US is I hear you in Nigeria; or that South Africans say shame when they are shown a cute baby or told of happy news such as an engagement.

Americans may not even know what someone from a different part of the US means by boulevard (a grassy strip between sidewalk and street or a wide avenue with a green strip in the middle?).

Turns of phrase

The lexical differences also extend to idiomatic turns of phrase. Where an American might write Main Street on Friday is different from a suburb on the weekend, a Brit would have The High Street on Friday is different to a suburb at the weekend.

A person from England might say I’ll knock you up to mean I’ll drop by and might tell you to keep your chin up by saying Keep your pecker up, but if the hearer is from North America, the results could be… awkward.

Some differences are points of pride: New Yorkers make waiting on line rather than waiting in line a kind of local shibboleth, and for New Zealanders, a phrase like Kiwi as (as in This food is Kiwi as) is, well, as Kiwi as… as what? They expect you to fill in the blank.

Grammatical niceties

There is also the matter of things that are correct usage in one variety but terrible errors in another. I dreamed I dove into a lake may be fine in the US, but I dreamt I dived into a lake is necessary in England. I casted my vote yesterday is terrible in some countries but absolutely correct in Nigeria. I’ll call you when I reach is normal in India rather than I’ll call you when I arrive.

Cultural references

Words and grammar aren’t the only things that vary from place to place though. English-speaking culture is obviously far from uniform, and some baseline assumptions just don’t work the moment you cross a border. Food is different, and passing references can quickly be opaque: not everywhere has food trucks or pretzel carts or chaiwallahs; not everyone can order poutine or grinders or bangers.

And while any Canadian will know what another Canadian means by toque and parka, most other people in the world won’t.

Americanizing and Canadianizing texts is a large and expensive business, and the spellings are the least of the issue. I remember a Canadian colleague once discovering a number of instances of underprovinciald in a converted document; it turned out that someone had done a replace-all from state to provincial without checking. But when a guide to a health care topic starts talking about insurance, no amount of word replacement will fix the disparity between the US and Canada – or, really, between the US and anywhere else.
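Here’s a minimal sketch of why that happens – my own illustration in Python, with an invented sample sentence, nothing from that actual project. A blind replace-all rewrites the letters s-t-a-t-e wherever they occur, even inside other words; matching on word boundaries avoids at least that class of error (though a human still has to review each change in context):

import re

text = "The state health plan understated the costs."

# Blind replace-all: rewrites "state" even inside "understated".
naive = text.replace("state", "provincial")
# -> "The provincial health plan underprovinciald the costs."

# Word-boundary match: only the standalone word is changed.
careful = re.sub(r"\bstate\b", "provincial", text)
# -> "The provincial health plan understated the costs."

print(naive)
print(careful)

And even the careful version, of course, does nothing about content that genuinely differs between countries – which is the larger point.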

Houses and other buildings can be different, including what’s called the first floor (ground floor in the US and Canada, the floor above ground in most of the rest of the world).

There are also regional differences. In Canada, for instance, if you talk about a condo in Ontario, you probably mean a high-rise apartment; in Alberta, a condo is more likely to mean a townhouse, possibly a vacation property. What you mean by the word bungalow can vary quite a bit depending on where you are in the US. And in some cities, a duplex is typically side-by-side residences with one common wall, while in others, it’s a house with one residence on the upper level and the other on the lower – meaning that a reference to the people in the other half banging on the wall may be confusing.

Global varieties

How many kinds of English are there? Hmm, get a book of paint colors from a hardware store and tell me how many kinds of white, or blue, or black there are. Get another book and count again. English has national standard varieties, regional varieties within countries, local variants, socially divided varieties (often people from the same social group in different cities will sound more like each other than like people from other social groups in their respective cities). 

And don’t forget that the status of English is not the same in every country where it’s spoken – it’s the historical main language in some, the language of a colonizing class in others, and a lingua franca in still others. 

But in every country where texts are published in English, someone needs to make sure that that English doesn’t seem strange. And that someone may be you. The one thing you can be sure of is that while one variety of English may be comprehensible to speakers of another, it may alienate them – and may give rise to significant misunderstandings.

No exceptions?

Do I see a hand in the back? …Yes? …Labels on boxes? And short warnings and things like that? Yes, it’s true that you can produce some short passages that look local to anyone anywhere. But that’s not a global variety of English; it’s a snippet, and many other similar snippets will not seem so universal. 

It’s like going up to a rail ticket office in a European country and knowing enough of the local language to buy a ticket without their noticing that you’re not a native speaker: it doesn’t mean you’re fluent. You couldn’t carry on a conversation without being smoked out. You sure couldn’t write an article – let alone a book – that would be smoothly idiomatic. 

The same is true with using English from one part of the world in another part of the world. Oh, they’ll understand you, probably. But they’ll know you’re not from there, and there will be extra friction and effort in the communication and comprehension. You may not realise it, but the little differences to what you’re expecting colour your reception. And editing means understanding, appreciating, and working with these subtleties.

In effect, localizing English is like translating from one language into another, just subtler. You should only localize into a variety you have native fluency in – if you try to adapt a text into the English of a country you’re not from, you will eventually make an embarrassing mistake. But you also need to know the variety you’re converting from well enough to understand the local points of usage and cultural assumptions, so you don’t think a Canadian’s toque is a chef’s hat, don’t believe that a South African at a robot is watching an android, or don’t get what the big deal is about jumping out a first-floor window.

Which, in my view, seems like an excellent excuse to do some international traveling… when you can.

One of those questions that are often asked

A friend passed on to me one of those grammar questions that are often asked and often opined on:

In a sentence like “She is one of those people who are always late,” I learned to cross out prepositional phrases when linking subject to verb, so I would cross-out “of those people” and link “she” with “is” instead of “are.” Isn’t “of those people” modifying “one” (which acts as a complement to “she”) and not acting as the actual subject?

The problem with just crossing out prepositional phrases is that you sometimes miss where the phrase ends – or doesn’t end! There are a few ways to look at it. The bracket way is short but benefits from further explanation:

She is one [of those people {who are always late}].

What that means is that there are people who are always late, and she is one of them. Yes, “of those people” is modifying “one,” but “who are always late” is modifying “of those people.”

A person could object (as many do) that it could equally be

She is one [of those people] [who is always late].

In other words, of those people, she is one who is always late. The problem with that is only in part that “She is one who is always late” is a bit odd; after all, “She is one” is a bit odd by itself too, but we’re not saying it by itself. The issue is really with “of those people.” For one thing, if the “always late” isn’t there to describe the set of “those people” of which she’s a member, it’s not specified who “those people” are. Who are they? And why are we mentioning them at all? Let’s look at a similar structure:

She is an eater of those hot dogs that have fallen on the floor.

She is an eater of those hot dogs that has fallen on the floor.

The difference is plain enough: in the first, the hot dogs have fallen; in the second, she has. And we have to assume that which hot dogs “those hot dogs” are has been established or can be inferred contextually; if not, it may be perplexing.

She eats those hot dogs. She has fallen on the floor.

Umm… tell me which hot dogs.

Returning to the example in question, the “is” version means this:

She is one of those people. Specifically, she is one who is always late.

If you’re in a context where you know who “those people” are, OK; but otherwise you have to specify them, or why are you mentioning them? And if your answer to “Who are they?” is “People who are always late,” you have shown why you really want to say “those people who are always late.” If she is one of them, then yes, she is one who is always late (as are they all), but if you go with the “is” version then you haven’t actually specified who they are; in fact, you’ve implied they’re not all like her in this respect. It’s like saying

It’s one of those hot dogs that is delicious.

You can see that the implication is that not all of those hot dogs are delicious; otherwise, why would you be singling that one out? Or if you say

He’s an editor who is popular at parties.

you know that it implies that not all editors are! And likewise, if she is one of those people who is always late, by implication others of those people are not. On the other hand, if you say “one of those people who are” and she is one of them, then she is covered.

That’s the logical analysis, and it’s the one I go with as an editor. In casual speech, I admit that I sometimes say “who is” in similar instances before I can catch myself, just because the structure of the sentence is so analogous to others where “is” would be appropriate; “one of those people” is a noun phrase like “a member of the club,” and we would most likely say “She is a member of the club who is always late.” (Unless it’s a club of people who are always late. Which is, in fact, what we mean in this case!) But when I’m editing, it’s more important to make it stand up to analysis. And it sounds good to me.

Made-up rules are what get on my nerves

What many word lovers love most are books. But what some word lovers love most is, apparently, a tidy bookshelf. Everything in its place. A single possible spot for any book. And, similarly, some language lovers love a nice tidy grammar, one where there’s only one option at any given juncture.

I understand the inclination. I’m an editor, and I know that tidiness is valuable. But I also know that it needs to serve effectiveness. If your drive for tidiness reduces the expressive potential of the language and proscribes something that people do with good effect, I do not think you are doing the good work.

I’ve harped on this in many of my articles on grammar. Lately I’ve encountered yet another instance of forced tidiness that I don’t think serves a good purpose. On a couple of occasions, people have said that they learned that what as a relative pronoun subject always takes a singular verb. In other words, Good gin and a little dry vermouth are what makes a good martini is correct and, according to them, Good gin and a little dry vermouth are what make a good martini is not.

When to Use Bad English

Here’s my presentation at the 2019 ACES conference in Providence on when and how to use “bad” English (not just swearwords but nonstandard grammar and other things some people look down on).

There is to be no overthinking and no false agreement

A colleague asked me about a grammatical judgement someone had questioned her on: a sentence of the type “There is to be no swinging the legs back, no leaning forward, no pushing down on the feet.” Surely it should be “There are to be…” said the person, because there are three things named. My colleague knew well that it’s is – if you use your native-speaker reflex, that’s the choice you’ll make unless you second-guess yourself – but there’s always the matter of explaining why.

Well, here’s a quick analysis of why. It has to do with no and the number it negates. Have a look at some sentences that most native speakers would find idiomatic (they all work without the to be as well):

“There are to be no flowers.” → negating plural

“There is to be no gardener.” → negating singular countable

“There is to be no water.” → negating mass object, which is treated as singular because it’s not plural (singular is the default in English and plural is the “marked” option)

“There is to be no watering the flowers.” → negating gerund representation of action, which is inflectionally the same as a mass object because it’s not plural

“There is to be no water and no wine.” → negating mass and mass, which is still mass and thus still singular (absence of mass is absence of mass; nothing plus nothing is still nothing)

“There is to be no watering the flowers and no drinking the wine.” → as in the previous one, singular because unmarked (equivalent to mass objects – no specification of plural number)

“There is to be no gardener and no bartender.” → distributively negating non-plural objects; compare “There are to be no gardener and no bartender” or “There are no gardener and no bartender,” which may sound not quite right

“There are to be no flowers and no water.” → may seem weird because it’s conflicting in number

“There is to be no water and no flowers.” → also weird, but possibly more acceptable because we default to the singular on existential predicates (why we often say “There’s flowers on the table” when formally it’s “There are flowers on the table”)

So negation of a mass object is a mass negation, and as such takes the singular, and negation of multiple gerunds is also by default singular because it doesn’t specify plural and because in any case it would get the distributive singular. It only gets plural if it is specified to plural (“There are to be no swingings back of the legs”).

The “There are to be…” thought is clearly an example of overthinking. It’s false agreement, because although there are multiple noun phrases, the agreement is with not the quantity of noun phrases but the quantity signified by them. A native speaker’s ear will normally by reflex give the singular, but we override that reflex if we overthink. It’s like thinking too hard about the muscles used in standing up: swinging the legs back, leaning forward, pushing down on the feet… you may end up stuck in your chair until you stop overanalyzing it.

If you’re interested in more on there is versus there are, by the way, I’ve covered the topic a couple of times, once on this site in “There’s a couple of things about this…” and once for The Week in “There’s a number of reasons the grammar of this headline could infuriate you” (their title!).