Monthly Archives: March 2025

deem, redeem

Redeem appears in a couple of contexts: (1) saving souls and (2) savings bonds (and other similar coupons and fiscal tokens). In the first instance, your saviour redeems you – pays for your sins, preempting your punishment at the final reckoning and getting you an exemption from the dumpster of doom. In the second case, you redeem your savings bond or coupon – or is it that your bank (or chosen emporium) redeems it? Anyway, you bring it in and hand it over and get your money for it. The obligation is discharged. Either way, it’s redemption! And you come out ahead.

So the question is, if that’s re-deeming, what is deeming? 

Well, what do you deem deeming to be? That’s up to your judgement, in a way. If you deem something worthy, are you making it worthy or just estimating that it already is worthy? What kind of reckoning is it? 

In fact, we use deem to mean judge in both senses, ‘appraise’ and ‘pass sentence’ – and both senses date all the way back to Old English.

OK, but then, if deeming is judging, why is redeeming re-deeming? Is it a second estimation? A revaluing? Is it from a sense of deem meaning ‘ascertain value’? And also, does that mean that any act of deeming is demption?

If I were you, I would think twice before buying any of that, but, you know, caveat emptor – “buyer beware.” 

Oh, do you see that emptor? That’s the word that means ‘buyer’; it comes from Latin emo ‘I buy’, and not because shopping is an emotional experience (emotion is not related – it’s from e- ‘out’, a variant of ex-, plus the same root as motion, and the original sense was to do with stirring or disturbing). This emo – em-o, the root is em- – appears in emporium, and the empt version (which is unrelated to empty) shows up in English words such as exempt (from Latin roots meaning ‘buy out’ – because ex- is ‘out’), preempt (backformed from preemption, ‘buying first’ – i.e., before someone else can), and redemption.

Yes indeedy! But then if emption is buying, what is red-? Does redemption mean ‘buying with something red (e.g., blood)’? 

You know it doesn’t. Redemption means ‘buy back’ – because re- is ‘back’ (it doesn’t always mean ‘again’). The d shows up just because re- is red- before vowels (like e- vs. ex-).

So, yes, when your eternal soul is redeemed, it is because your debts incurred by your misdeeds have been paid – bought back. And when you redeem a bond, it’s… well, originally it’s that the bank is the one redeeming, technically: they’re buying it back. But since the transaction is redemption, the sense acquired a reversal of direction.

And where does that leave our verb deem? Out of the question altogether. It’s from an old Germanic root for ‘judge, decide, believe’ – originally applied to literal judgement done by literal judges, who were there to sentence – to deem – but not to redeem. They were the deemsters! Or, with shortening of the vowel and an added p thanks to voicing pre-assimilation, the dempsters.

Yes, that’s right, it’s the dempster who dooms you to the dumpster of death. And yes, doom is from the same root as deem and dempster – the heavy eyes ee of the judge who may deem are swapped for the popped eyes oo of the person facing doom… unless they are redeemed. Which may be possible, but is not etymological.

pernickety, persnickety

There are some pervasive misconceptions of editors. The humble word midwife is pictured as a kind of sneering, snickering, prickly, pernicious nitpicker, peering down every grammatical snicket in search of perfidy with a prurient, almost pornographic concupiscence for impertinent solecisms, enforcing pure lexical trickery purely for the purpose of putting poor scribes in the nick. In short, editors are thought of as persnickety.

Or should that be pernickety?

Hmm.

First of all, I must protest that it’s not true: a good editor is not a grammar numpty. A decent verbal massage therapist is in no way a textual Torquemada. Good editors are kind people who want the author to do as well as possible. (Yes, there are bad editors, but they are very much the minority. And those people who freelance unsolicited with markers on signage are rank amateurs, and pretentious creeps at that.)

Now let’s move on to the word – or words – of the day. Perhaps you have only ever seen one of the pair, or perhaps you have seen both; you more likely than not have at least a preference. But which one is the right one?

Ha. Both are established and accepted. I will brook no pernicketiness or persnicketiness about this. If you accept that we have both person and parson, both vermin and varmint, both further and farther, then you can accept that we have both pernickety and persnickety.

Which came first? That we do seem to know: pernickety was seen in print as early as 1808, according to the Oxford English Dictionary, while persnickety is known just since 1885. Also, pernickety started in Scotland and still prevails in Britain, whereas persnickety seems to be predominantly North American.

It’s not entirely clear why the s appeared. Some have suggested influence from snicket, though the first sighting of snicket as such was not until 1898 (it’s a narrow passage between buildings, in case you’re not sure – and it is not obligately lemony, either). Others have suggested that it’s just a substitution of snick for nick, which, yes, OK, but what does that mean? Well, snick could be an alteration of sneck, which is a Scottish word for – not snake! – ‘cut, notch’… which, as you may connect, nick also means.

Of course, nick means a number of things. You could be in good nick with Saint Nick, or down in the nick with Old Nick. But while Nick comes from Nicholas (which comes from Greek for ‘victory’ plus ‘people’), the other nicks come ultimately from the same nick, which, as noted, has to do with a notch, possibly (you’d think) related to nock as in the place at the back of an arrow where the bowstring goes. That in turn is… apparently not related to various similar-sounding words in various other languages, such as German nicken and Swedish nicka, which both mean ‘to nod’.

OK, but what about the per- and that rickety -ety ending? As far as we can tell, this is the same per- as in perfidious and pernicious and perseverant and even perhaps: a Latin root meaning ‘through’ that has gotten all through our language, even mixing with non-Latin roots (like hap). It seems perfect for the task. And the -ety certainly has a somethingness to it, but to some extent this word is responsible for that. It’s suggested that the -et- comes from -ed, as in the past participial suffix. The -y is the usual adjectival suffix, as in funny and happy (oh, hi, hap, what luck to see you again).

And it all comes together nicely enough. And frankly, the sound of it happens (happens!) to work with it too, in its polysyllabic way: per(s)nickety is to fussy what discombobulate and absquatulate and copacetic are to break and scram and fine – it expends needless additional energy in a punctilious display of painstaking conscientiousness, like some wanton nitpicker seeking to make a victorious mark on public signage. It presents itself as fastidious by being slow and tedious. But it’s plenty of fun to say.

calamity

What do you get when calm amity is alarmed by calumny and a call to military arms? Why, calamity, Jane.

Calamity names a bad thing – just about the worst – but it sure has an appropriate sound. To me it’s like a metal pot and lid falling to the floor, or perhaps an alarm bell on the wall in the hall ringing us all to panicked action.

But what is a calamity? If a house burns down, is the calamity the fire, or the loss of house and home? Or was it the match and the wooden timbers awaiting ignition? Per the Oxford English Dictionary, in English, at least, calamity was the effect first, and after that the cause: by 1490 calamity meant “the state or condition of grievous affliction or adversity; deep distress, trouble, or misery, arising from some adverse circumstance or event”; by 1552 it also meant “a grievous disaster, an event or circumstance causing loss or misery; a distressing misfortune.” So the loss of home is a calamity, and the fire that causes it is a calamity; but then we could also say the fire-prone conditions in the presence of loose matches were a calamity, since they were the cause of the fire.

And, perhaps, so on. “Fortune is not satisfied with inflicting one calamity,” as Publilius Syrus is often quoted. This is not to say that bad luck comes in threes, but at least it’s either none or more than one. But can you separate cause from effect? Does not one carry within itself the seeds of the other, and the other on its branches bear the seeding fruit of the one? Thought, word, and deed come in order, but deeds lead to more thoughts, and so to words… Once you start the cycle, it keeps going – enough is never enough. Better to break the cycle… if you can. 

Can you? And how? Hamlet had thoughts:

There’s the respect
That makes calamity of so long life.
For who would bear the whips and scorns of time,
Th’ oppressor’s wrong, the proud man’s contumely,
The pangs of despis’d love, the law’s delay,
The insolence of office, and the spurns
That patient merit of th’ unworthy takes,
When he himself might his quietus make
With a bare bodkin?

That seems piercingly drastic, though. Why not simply elect to say enough is enough? If you go into one undiscovered country, after all, there may be more to follow. Laozi (Lao Tzu) – if there was such a person; he may just be a convenient fiction for the assembly of truisms – in his Dao de jing (Tao Te Ching; number 46) wrote,

禍莫大於不知足;
咎莫大於欲得。
故知足之足,常足矣。

Which can be translated variously, but Mary Barnard rendered it this way:

There is no calamity greater than lavish desires.
There is no greater guilt than discontentment.
And there is no greater disaster than greed.
He who is contented with contentment is always contented.

And John C.H. Wu made it this:

There is no calamity like not knowing what is enough.
There is no evil like covetousness.
Only he who knows what is enough will always have enough.

Calamity in both translates 禍, huò, which can also be rendered as disaster or catastrophe; 禍 is formed from a radical 示 that, to quote L. Wieger’s Chinese Characters, has the sense of “influx coming from heaven, auspicious or inauspicious signs, by which the will of heaven is known to mankind” – it was formed from two horizontal lines signifying heaven and three vertical lines representing what is hanging from heaven (the sun, the moon, and the stars). The other translations of 禍 give us some pictures: disaster is from Latin for ‘bad star’ (like Romeo and Juliet, the star-crossed lovers – as Friar Laurence said to Romeo, “Thou art wedded to calamity”); catastrophe is from Greek for ‘down-stroke’ or ‘overturn’. But calamity…

There’s the respect that makes calamity of etymology. For, you see, calamity comes from calamitas, which means ‘loss, damage, harm, disaster, misfortune, et cetera’, but we’re not sure what calamitas descended from. Latin writers seemed to think it had something to do with calamus ‘straw, cornstalk’, but their explanations were a bit of a shipwreck. More modern scholars have reckoned it comes from calamis ‘damaged’, which seems right, but the problem is that it’s really *calamis – a deductive reconstruction of a word that has not actually been seen in historical sources.

Meaning it came from somewhere, but, as with many a calamity, we’re not entirely sure where. The chaos of linguistic history is like the chaos of climate or of myriad other things: a butterfly flapping its wings – or a cornstalk breaking – might set in action a chain of events that lead to history-altering calamities. Or, on the other hand, it might simply be absorbed in the quotidian noise. And who knows which will eventuate?

Perhaps fortune does. …wherever fortune comes from. And as Darius Lyman’s version of Publilius Syrus’s Sententiæ says, “Fortune is not satisfied with inflicting one calamity.” The Latin original for which is…

…nonexistent. Sorry, you can (as I did) go at length through the original, searching and searching, and you won’t find a Latin equivalent of that. It turns out that Lyman was, hmm, fortune’s fool, or anyway fooling with fortune. The point is that he managed to include various verses in his version that can’t be traced to the source. They’re just convenient fictions, it seems, spontaneously generated.

Well, at least they’re true. Or are they? They’re sententious, but, you know, “words, words, words…”

Or, as the Duchess of York (the woman who gave birth to Richard III) said in Shakespeare’s Richard III, “Why should calamity be full of words?” And, I suppose, for the sake of conversation, why the converse as well?

canny, canty, uncanny

You know uncanny, of course. It’s sort of like what you experience when your grip on reality is tested – when your even-canning factory is offline so you just can’t even. But can you say what canny is? And, for that matter, do you know what canty is? Allow me to descant on this triad.

You probably haven’t encountered canty, though if you say it’s the opposite of canny, you’ll oddly be about right. Naturally, you would expect the opposite of canny to be uncanny, and at one time that was true – though not any longer – but it does not follow at all that canty is a synonym for uncanny. In fact, there is a clear line that can be drawn between the two (unless one is uncannily canty, which would be a real edge case).

Let’s start with canny, can we? The can in canny is not beyond our ken; it is and is not the same can as in Yes, we can. It is not, in that can is now a modal auxiliary conveying ability (that other can, the container, is a whole other can of worms, etymologically unrelated); it is, in that the auxiliary can comes from the same source as canny: the Old English verb cunnan, ‘know how, be able to’. If you know German, you know cunnan’s cousin kennen, which has the same meaning – which also reminds us that ken is from the same (d’ye ken? Oh, and the name Ken is unrelated; it’s short for Kenneth, as you may know, and Kenneth is from a Celtic name that has to do with fire and old flames, perhaps from someone’s Barbie).

Anyway, canny can have the sense ‘knowing, astute’, and it’s from that that we get the sense ‘prudent, cautious’, which is the more common usage now (meanwhile, there’s a Scots use of it to mean ‘friendly, pleasant’ – “a canny lad” is a nice fellow, not a cagey one). But the negation, uncanny, has come to mean not ‘unknowing’ or (as it once did) ‘incautious, careless’ but rather ‘unknown’, i.e., ‘beyond ken’ – in that eldritch realm of impoverished knowledge (and so also an uncanny, weird person – or, for that matter, a robot from the uncanny valley – is untrustworthy, opposite to a canny one). Something odd, not right, probably best left alone, even.

And how about canty, then? Well, that’s not just cant (cant meaning ‘slang’ comes via French from Latin canto ‘sing’, as does descant). But it’s also not just can’t – o, turn away from that apostrophe! It comes instead from the adjective cant meaning ‘bold, courageous, lively, hale’, and in Scotland also ‘merry, cheerful’ (meaning that a Scot may be both canny and canty – don’t say it cannae be so). This adjective in turn comes from a German and Dutch word kant meaning ‘edge, line, border’ that, purely reasonably I’m sure (and according to a manual), came from Latin canthus ‘wheel edge’. The route from kant to cant appears to be via senses of ‘neat’ and ‘sharp’. (And we are inclined to think it is also related to cant meaning ‘tilt, bevel’.)

And yet somehow a person who is canty is not edgy, but someone who is canny is! It’s just uncanny how language can do such things, you know?

Whither English?

Once again this week I guested in the editing class my friend teaches online at a local university. And this time, along with the usual questions about specific points of usage, one student asked what I think will change in English usage, and what changes editors should resist.

Which is a really interesting question! Predicting language change is fun and occasionally one gets it right, but there are always innovations that you just can’t predict – and social and technological changes, too. When you look at how things have changed in the past, it gives some sense of the usual forces of change. As I said in one presentation on the topic (more than a decade ago now), we tend to change language for four general reasons:

  • to make life easier
  • to feel better
  • to control
  • things slip

Fads that become accepted are common. The shifts in the pronunciation of the letter r – and their shifts in social status (for example, the advent of r-dropping in England, its adoption in America as a sign of higher status, its shift over time towards more of a working-class signifier in America but not in England) – are emblematic of this, as I wrote about in an article for the BBC. A lot of it has to do with signifying various kinds of social group belonging.

On the other hand, sometimes changes are invented and propagated – such as the ideas that you can’t split an infinitive (I’ve written about this more than once) and can’t end a sentence with a preposition, and the prescribed distinction between less and fewer. A few of these “rules” have become undisputed standard English now (such as the proscriptions of double negatives and double superlatives); others (such as the ones I just mentioned) are often waved around as rules but aren’t universally accepted, and serve mainly to license social aggression (as I wrote about in another BBC article). I did a whole presentation on when “errors” aren’t some years ago, and a bit more recently on when to use “bad” English.

None of which yet answers the question. Let’s see… 

  • I think that social media will continue to be a good vector for the rapid spread of new usages and references (everything is citational, after all), but of course I can’t predict which ones. 
  • I think that punctuation will get to be used more and more variably for subtle significations (after all, the presence or absence of a period at the end of a message can convey tone, sometimes importantly). 
  • I think emoji will keep getting used, including as nouns, verbs, and adjectives, not just interjections, but how far into formal writing they will spread I don’t know. 
  • I think capitalization will continue to be basically haywire, because it’s weird and complicated in English anyway (here’s something I ghostwrote for PerfectIt’s blog about it).
  • I’ve noticed what seems to be a shift (yet another!) in the pronunciation of r among younger people, at least partly under the influence of pop singers who are avoiding the retroflex sound in favour of something closer to a mid-high mid-front vowel. I’m not sure where that’s going, but keep an eye on it.
  • I suspect we will, at length, start using they-all or something similar to convey that we’re speaking of a group of people, rather than a single person of unspecified or neutral gender. I am very much on board with singular they – if you have an hour, watch this presentation I gave on gender in language, including the vaunted history of singular they and the deliberate reactionary imposition of the idea that he is the natural generic default. But singular they can bring the complication that we aren’t always sure of the number of people signified. When we started using you for all second persons rather than distinguishing between singular thou and plural you, various people in various places innovated y’all, youse, yiz, yinz, and so on. So why not the same with they?
  • And, because identity is important to people, especially when threatened, and because language is a key means of conveying that identity, I think Canadian usages and in particular Canadian spellings (centre, colour, you know), which have been slipping a bit in general Canadian usage, will come to be increasingly emphasized in response to threats to Canadian sovereignty. That’s not a change so much as a revitalization. But keep an eye out for innovation of Canadian signifiers too!

And as to the question of what changes to accept and what to resist, as I said in my “when does wrong become right” presentation, there are five questions we should ask when evaluating a change:

  1. What is the change? Really? (Sometimes the “change” is the original form and the “traditional” usage was invented and propagated more recently.)
  2. Where did it come from? When?
  3. Where is it used? By whom?
  4. Who is your text for? (Usages that annoy one audience may charm another.)
  5. What are the gains and losses – what does the change add in expressive value and clarity, and what does it take away?

Oh, and I am definitely in favour of being pragmatic to the point of deviousness in our choices. As Machiavelli said, “consider the results.”

If you were to use the subjunctive…

It’s March fourth. Happy Grammar Day! Today is a day when certain people who like to loudly declare their love for grammar put extra energy and volume into promulgating their favourite rules. Which is kind of a giveaway about their motivations: It’s not grammar itself that they love (since “bad grammar” is also grammar, adhering to a coherent underlying set of rules, just not the rules that they prefer), it’s security in an imposed order. It’s authority, as long as they get to be the authority. It’s like if someone were to say “I love flowers!” but simply could not stand the disorder of a meadow of wild flowers and had to have the tidy order of a strictly planted garden, with no flower out of place.

But there is an important difference here: Many neat grammar rules do have an organic basis in the language, and the imposed rule is intended to keep usage from drifting away from that. (This is not true of all grammar rules, mind you; for example, we know exactly when the strict distinction between less and fewer was invented, and we do not in fact owe allegiance to its inventor.)

But usage does drift. For most English speakers, for example, whom is effectively a foreign word; they have no natural feel for its usage, and so they use it in places where it’s inappropriate according to the rules they’re attempting to preserve. So a bit of freshening up on the established rules, for those who want to follow them, is not unreasonable. And – to get to my subject for this Grammar Day – for many English speakers, the subjunctive is also a strange thing that, even if they use it sometimes, they don’t altogether “get.”

Which isn’t that big a problem in most contexts. But if you were to use the subjunctive, you would need to know not just how to use it but when to use it. And the available guidelines for it are sometimes so detailed as to be confusing. Wikipedia, for example, says, “Subjunctive forms of verbs are typically used to express various states of unreality, such as wish, emotion, possibility, judgment, opinion, obligation, or action, that has not yet occurred.”

Part of the problem is that these do not all require the subjunctive, but they are things it can be used for. Another part is that people get confused about what’s real versus unreal and what you can and can’t use the subjunctive for. So – as it is Grammar Day (or, if you are reading this on another day, imagine it were Grammar Day) – let me give you the quick and easy way of thinking about the subjunctive mood: It can be thought of simply as a hypothetical mood. Note that I say “mood” – it’s not a tense; it’s a perspective that can be applied to any tense, just like the indicative mood (which is the usual mood, talking about things that definitely do or don’t exist). 

And this is where some people get confused, because hypotheses operate differently in the past and present than they do in the future. When we’re talking about things in the past or the present, something that’s hypothetical hasn’t happened and isn’t happening, whereas something that’s indicative has happened or is happening. To use Wikipedia’s term, in the present and the past, the unreal is known to be unreal. But when we talk about the future, it’s all hypothetical; none of it has happened yet, even when we’re using the indicative. None of it is real yet. Which means that the effect of the subjunctive in the future is not the same as in the present and the past. 

Let’s look at some examples:

Past:

Subjunctive: “If you had helped me, I would have been grateful.” (You didn’t, and I wasn’t.)
Indicative: “If you helped me, I was grateful.” (You might have helped me; I just can’t remember. If you did, I was grateful.)

Present:

Subjunctive: “If you were helping me, I would be grateful.” (You aren’t, and I’m not.)
Indicative: “If you are helping me, I am grateful.” (I’m not sure if you’re helping me; if you are, I’m grateful.)

Future:

Subjunctive: “If you were to help me, I would be grateful.” (I’m proposing that you help me, but I’m doing so indirectly, so as to make it clear that it is not expected but merely possible at your discretion.)
Indicative: “If you help me, I will be grateful.” (Just a straightforward conditional, laying out a possible course of action and a consequence of it.)

You can see that both ways of speaking of the future are possible, and both refer to the same case, but one is using the hypothetical framing to put in more distance so as to disavow any air of expectation or transaction – in other words, it’s being more passive and polite – whereas the other is simple and direct.

And this is where we see that choices of grammar are not just about what is technically correct; they are also about negotiations between people. Everything we say, we say to produce an effect, and part of that effect is a negotiation of status and expectations between us and the person(s) we’re speaking to. (Unsolicited corrections of other people’s grammar are an exemplary case and their intended effect is left as an exercise to the reader.) In the case of my example, “If you were to help me, I would be grateful,” the subjunctive is used to make a suggestion or implied request, or wish – none of which, by the way, asserts or implies that the thing is outside the realm of possibility; it simply uses the hypothetical framing to emphasize that it is not a certainty, and it does that so as not to impose or make a claim on the other person.

One more thing, though: All of this is just if you use the subjunctive. You don’t, in fact, have to; there is a version of English that simply doesn’t use distinct forms for the subjunctive. In it, you never say “if I were you”; it’s “if I was you,” even though I have never been you. This version is more common and more accepted in England than in North America, but it’s available everywhere… though it does have a less literary air to it, and it allows the occasional ambiguity, though that’s usually resolved in the next clause with the choice of tense. For example:

Subjunctive user speaking hypothetically: “If I were finished, I would stop writing.”

Subjunctive non-user speaking hypothetically: “If I was finished, I would stop writing.”

Subjunctive user or non-user using indicative: “If I was finished, then obviously I stopped writing.”

drift

Snow White was a drifter.

OK, you say “ha ha,” but there are reasons I say that beyond the obvious pun. How did she end up with those dwarves? It was a blow of the winds of fate. And snow drifts when the wind blows. But where does it drift? Not so much on the plain, where things are smooth and crisp and even. It drifts against high points and it drifts in low places and it drifts at the edge of shelter; it does not decide on its own, even if it takes a fence. Snow that’s on a peak can be blown off a cornice and land just where it catches enough interference to stop – which may well be a very humble location indeed, such as among the dopey, sleepy, and grumpy. And when we say snow is blown, what that means is it’s driven. Driven by the wind, yes, but driven as surely as if it had been in a car on a highway. 

Snow White was driven out by the queen and driven on by fate, but in more modern times she also could have been driven on a highway on her way to where she would settle down: She could have been a hitch-hiker. And hitch-hikers are by definition drifters. I’ll explain in a moment, though I doubt you doubt me.

Snow White, you may object, was hardly what we think of when we think of a drifter. She was pure as the driven snow!

But exactly. What is the driven snow? It is the snow in drifts.

Drift, you see, is the secret twin of driven. Both are past-tense forms of drive (or of an older form of drive). Drive originally meant (and in some uses still means) ‘send forth, push forward, cause to move forward’. The noun drift names something that is driven – as in pushed forth by the wind. A drift is made of snow or sand that has been caused to move forward – by the wind – and displaced from where it first lay to a new formation, typically at a change in the landscape. We’ve lost sight of that sense when we talk of driving a car because we envision a sort of cybernetic relationship between the steerer and the steered, but you can see it when we speak of a cattle drive – yes, including steers, as in bulls that have had their drive cut, if you catch my drift. Oh, and yes, “my drift” means what I’m driving at – where I’m guiding the sense to (and it’s a turn of phrase we’ve had for half a millennium now).

So yes, the driven snow is the drifting snow. And anything that drifts is driven. Which means that drifters are people who are driven, not by their own inward forces but by the winds of chance and change. And hitch-hikers are, of course, driven. The fact that the first ‘driven’ refers to the way the wind blows and the second ‘driven’ refers to the easy come, easy go riding in a car that is controlled by another person doesn’t really matter (to me).

And Snow White, who was, we all are sure, pure as the driven snow, and who was driven out and driven into cohabitation with miners, at both a low point and a high point in her life (hi-ho!), was plainly a drifter. Snow doubt about it.