BIBBO

Back in high school in Banff, in the early 1980s, I had a math and computer science teacher named John P. Stutz. He was quite the character (still is; in the years since, he has been mayor of Banff, and these days he performs weddings for a living). One of his lessons I remember best is the classic computer science axiom: GIGO – garbage in, garbage out. In other words, no matter how good a program is, if you feed it garbage data, you will get garbage results. The fact that it’s a well-designed program on a good computer doesn’t automatically transmute rubbish into gold.

Back then, machine learning and artificial intelligence were still in early days. The sophistication I now get from my iPhone – or Google, for that matter – would have blown my head clean off back then. Stutz’s too, probably. But these days, you can seriously propose and produce things that would have been pie in the sky at the time: say, having a computer read job applications and automatically filter candidates, or getting writing advice from a computer application that had learned from a large number of published articles, or having software use security cameras to scan faces and flag people whose appearance matched faces with a higher-than-average likelihood of interaction with police. And when you train that kind of thing, GIGO matters. But something else matters even more: BIBBO.

BIBBO? That stands for bias in, bigger bias out.

Why not just BIBO? Well, here’s the thing. Garbage is garbage, but bias is scalable, and repeated bias compounds. When computers learn biased things from biased data and then put those biased things in the real world, that has real-world effects that feed back and increase the bias in the model. And if bias is going in, it’s because that bias can be found in the real world, and the machine’s biased output will confirm and strengthen that real-world bias. Not just the machine but the people who use the machine will have bigger bias.

Consider the job application filtering program I mentioned. The machine learning will look and see that certain kinds of applicants have been more likely to get hired, and will filter those kinds of applicants in and other applicants out. Seems fine? Not if the hiring choices in the original set were influenced by factors such as race, gender, religion, prestige address, prestige school, etc. For most jobs, none of these factors has a direct bearing on ability to do the job. But the machine will see a certain kind of name and address and so on and downgrade the applicant on the basis of other similar people not getting hired previously. And then, as such applicants are hired less and less, the machine’s bias is reinforced – as is the bias of those doing the hiring.
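If you want to watch that compounding happen, here’s a minimal sketch in Python – made-up numbers and a deliberately cartoonish model, not anyone’s actual product. Both groups have identical ability; group B just starts with a slightly worse hiring history, which the model has learned as part of its scoring, and each year’s hires become the next year’s training data:

```python
# A toy BIBBO loop (all numbers hypothetical): merit is uniform on [0, 1]
# and identically distributed for both groups; the "model" hires anyone
# whose blend of merit (weight 0.4) and group hire history (weight 0.6)
# clears a 0.5 bar. Each year it retrains on its own output.
def next_rate(rate):
    threshold = (0.5 - 0.6 * rate) / 0.4        # merit needed to clear the bar
    return min(1.0, max(0.0, 1.0 - threshold))  # fraction of the group hired

rates = {"A": 0.50, "B": 0.40}  # the biased historical data: a 10-point gap
for year in range(1, 6):
    rates = {g: next_rate(r) for g, r in rates.items()}  # retrain on output
    print(f"Year {year}: A {rates['A']:.0%}, B {rates['B']:.0%}, "
          f"gap {rates['A'] - rates['B']:+.0%}")
```

Group A holds steady at 50%; group B slides from 40% to zero by year four. Nobody’s ability changed – the output just kept feeding the input. Bias in, bigger bias out.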

Consider the writing advice idea. An application that has looked at thousands of academic journal articles will have identified various stylistic features of academic writing. What it won’t know is that many of those features are actually functionally bad: they obscure key details and bury essential points in circumlocution and uncommon terms. Many editors are working to undo these practices, but it’s an uphill struggle, as academic authors often think that if text sounds too clear it’s not erudite enough, and if it takes into account the author’s specific role and position it’s not objective. Throw in software that rewards these habits and you get academic authors having their prejudices reinforced and being told to do more of what the editors want them to do less of.

And consider the security cameras. If the data from the cameras is used to advise police on who to do a stop-and-check with, and each stop-and-check is counted as an interaction with the police, then obviously it produces a feedback effect. If in one month people with red hair happen to have interactions with police at twice the rate of people with brown hair – statistical anomalies do happen – and that data is fed back into the system, in the next month people with red hair may be more likely to be stopped and checked, which will increase their interaction statistics. And even if the system only counts actual arrests, not interactions, anything that increases the likelihood of someone being stopped by police also increases the likelihood of their being arrested, assuming they have the same likelihood as anyone else of happening to be doing or possessing something illegal, and the same likelihood of not responding well to being stopped by police for no obvious reason. And then the system becomes more biased, and so may the police officers – and perhaps society in general.
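The same loop is easy to sketch for the cameras, too – again with invented numbers, and with one simplifying assumption I’m making up for illustration: that patrol attention is concentrated (more than proportionally) on whichever group the database flags as riskier. Both groups offend at exactly the same rate; red-haired people just had that one anomalous month:

```python
# Another toy (all numbers hypothetical): both groups offend at the same
# true rate, but a fluke month leaves red-haired people with twice the
# recorded interactions. Attention is then concentrated on the group the
# data flags - modelled here by weighting each group by the square of its
# share of past records - and every stop is logged as a new interaction,
# which feeds the next month's allocation.
records = {"red": 200, "brown": 100}  # the anomalous first month
OFFENSE_RATE = 0.05                   # identical for both groups, by design

for month in range(1, 7):
    total = sum(records.values())
    weights = {g: (n / total) ** 2 for g, n in records.items()}
    wsum = sum(weights.values())
    for g in records:
        stops = round(1000 * weights[g] / wsum)  # stops follow the data
        arrests = round(stops * OFFENSE_RATE)    # same true rate for all
        records[g] += stops + arrests            # and back into the database
    share = records["red"] / sum(records.values())
    print(f"Month {month}: red-haired share of all records: {share:.0%}")
```

Two groups behaving identically, and within a few months the database “shows” red-haired people accounting for the overwhelming majority of police interactions – which the officers reading it will take as evidence.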

These aren’t made-up examples, either. They’re taken from the real world, from products and applications promoted by software companies. I’m not naming names just because it’s tiring (and occasionally expensive) to deal with angry techbros.

There are ways to correct for these biases, of course. You can work on evening out the training data; you can correct for biases in the data and the output. Above all, though, you need to know what to watch out for, and how to deal with it. You need to know BIBBO. Because if there’s bias in the system, bias is the system.

It will naturally help if you yourself, as a designer of the machine learning, do not also have uninspected and uncorrected biases. A problem we face today with many of these applications is the idea that if it’s large amounts of real-world data processed by sophisticated programs, then it is objective and not subject to human biases. This is false, of course – it’s programmed by humans and the data is taken from humans in a society with its own biases – but there are people in the field who do not seem to see that it is false, because they have an ideology that science (including math and computer science) is hard and strong and intelligent and objective, while things that study humans – sociology, philosophy, etc. – are soft and weak and wishy-washy and tendentious. They come through education with this bias, and they use it to filter the information they get, and they design computer applications with that bias. And so you get these things that reinforce bias. All because they thought they could avoid bias by avoiding inspecting bias. But BIBBO.

widthy

You need not be lengthy if you can be widthy.

Have you seen this word widthy before? You probably have not. Have you considered why that might be?

Widthy seems, plainly, to be a natural counterpart to lengthy. Spatial dimensions include both length and width, after all. And yet.

Is it that the counterpart to lengthy is in fact breadthy? After all, broad – though not as commonly used here and now by us as such – is the true old direct counterpart to long; look at modern German: for ‘long’ it has lang, but for ‘wide’ it has breit, which is related to broad. And broad, via umlaut, is the source of breadth just as long is the source of length.

But breadthy is not in use either. And, like widthy, it really never has been. Until now, that is.

So, first of all, why lengthy? Why not just long?

For the sake of mild euphemism, it seems. Length is, after all, not always so good. Consider some of the Oxford English Dictionary’s earliest citations for it. John Adams, 1759: “I grow too minute and lengthy.” Benjamin Franklin, 1773: “An unwillingness to read any thing about them if it appears a little lengthy.” Thomas Paine, 1795: “In the mean time the lengthy and drowsy writer of the pieces signed Camillus held himself in reserve to vindicate every thing.”

Did you notice another thing? Yes, all three authors I’ve quoted were prominent figures in the American Revolution. The word lengthy first appeared in America in their time, and was for a long time (and still to some extent is) considered an Americanism. And all three seem to have prized brevity in speech and writing, or at least to have allowed that cogency is valuable. But when apologizing or deprecating, rather than use the blunt long, they use the, uh, lengthier (and thus slightly more cushioned) lengthy.

It’s not that lengthy is always negative, especially lately. But consider whether you’d use it in place of long for things that are undoubtedly better for being longer. “A lengthy baguette”? Well, that sounds silly because we more often associate lengthy with less physical characteristics, perhaps (notwithstanding “a lengthy journey,” which could refer to a long time as much as a long distance). But “a lengthy slow dance with my darling”? Ouch… sounds like you’re impatient to move on from dancing.

So is this why we don’t (or didn’t) have widthy and breadthy? I suspect that has something to do with it. Width and breadth are not so often apologized for. One can have a great breadth of knowledge and education, for instance, or of perspective. A store or a library may have a breadthy selection. Breadthiness could be intoxicating; it could leave a person breathy – or breathless.

And why, by the way, do we have width when we have breadth? In truth, as I’ve pointed out, breadth is the more natural original counterpart to length; width referred originally – and still does, in some contexts – to expanse or extent in any and all directions. You travel far and wide, for instance, over the whole wide world. If your eyes are open wide, they are open top to bottom as much as side to side. Certainly wide has always had an available sense referring specifically to side-to-side dimension, and that has increasingly come to dominate, but it still has a, uh, widthier range.

And anyway, when you see widthy, you know what it means, don’t you? If you look at breadthy, you might misread it as breathy, or perhaps you might get stuck on the bread part. So I like widthy. It seems that we didn’t get the word 250 years ago because no American statesman ever felt the need to apologize for going too wide (hmm, perhaps if they had instead been sportsball players…). 

And it’s true: don’t apologize for being widthy. Just do.

ain’t

“Ain’t ain’t a word.”

Obviously, that’s functionally false, and the speaker knows it: if ain’t really weren’t an understandable lexical unit, the sentence would make no more sense than “Zcvny zcvny a word.” But what some of us miss – though the people who declare the unwordness of ain’t (and other words) know it at least implicitly – is that they don’t mean “not usable as a word.” They mean that it’s not a word in roughly the same way as someone in, say, 1850 might say that an obvious human adult was “not a person.”

It’s not that the human couldn’t speak, eat, run, or do other things that any human could do. It’s not even that the human wasn’t, in the broader and more common sense, a person. It’s that the human was not legally a person: she or he couldn’t vote. The human was not of the right sort. The human did not belong in certain places, and could not fill certain functions, that were open only to those who were duly enfranchised.

The question of which humans are legally persons has not been so contentious since all adult human citizens regardless of gender or race (though not necessarily regardless of certain other statuses, such as criminal or mental) have been eligible to vote. But the question of what words are words has not gone away, not least because it’s not a question for courts to decide, nor for dictionaries, and especially not for linguists (if you assigned the task to linguists, they would refuse it, run away and hide, or arm up and fight you).

It ain’t for dictionaries to decide? Nope. And I say that not just because dictionaries are field guides, not legislation (you don’t say something that just flew past is “not a bird” just because it’s not in your pocket guide); I say that because even the people who appeal to the authority of dictionaries reject that authority when they don’t like what they find. Such as “ain’t contraction 1 : am not : are not : is not 2 : have not : has not.”

Ain’t is not legally disenfranchised, no (though I suspect its ingenuous use in legal documents would be frowned on). But it is pointedly socially “not our sort, dear.” It is a word that “the better people” want it to be understood they would not consort with. It would not be invited to society weddings. But it would work in the kitchen with the caterers.

And as it worked there, those in the wedding party would studiously avoid seeing or acknowledging it, just as they would any fallen poor relation. “Do not say that Uncle Frederick is working in the kitchen. I won’t have it! That man is just Freddy, a local ne’er-do-well to whom we try to give a bit of charity work from time to time. And he should be kept away from the guests.” Never mind that Freddy, the erstwhile cadet of the family, is doing quite well and in fact the wedding is entirely relying on his skill as a saucier.

Erstwhile cadet? By that I mean younger brother of the heir. But younger brothers, as louche as they may be, are still normally permitted entrance to society, and so was ain’t, at first.

You might think that ain’t was illegitimate, since it doesn’t match anything clearly: not am not, not are not, not is not, not have not, not has not. But if you spend a little more time with the matter, I think you won’t be of that mind for too long. Contractions can change form. Am not became a’n’t for some people for some time, as did are not, and have not and has not became ha’n’t and even ’a’n’t (with varying numbers of apostrophes). And then, with shifts in vowel, that lengthy a came to be a “long a” – the sound that is represented by the ai in ain’t. We also know that respected writers and assorted rich persons were using it in the late 1600s and into the 1700s. The debate has not been concluded as to which sense of it came first, or exactly how it came to cover so many different senses; it may have arisen independently for multiple forms and merged. But its ejection from polite society came as a result of several transgressions against the rigid and fragile roles and rules of privilege.

For one thing, it simply wouldn’t stay in its place, or even know its place. It covered too many senses. This was a problem for reasons of ambiguity, perhaps, though in truth only rarely, since its use for hasn’t and haven’t is only for the auxiliary: “I ain’t a dog” can’t mean “I haven’t a dog.” It was a bigger problem for reasons of flagrant promiscuity, which is frowned on. And – to put it plainly – it was too easy. Which is a terrible sin in English. All of the worst mistakes, my darling, come from trying to make a spelling or inflection too easy. “I goed”? Wretchedly childish. “I been”? Sloppy and lazy. And simplified spelling? Beyond disgusting. It also ran up against an increasing prejudice against contractions, which – starting not too far into the 1700s – were increasingly seen as too informal and lazy, making one syllable where our illustrious forebears had seen fit to make the effort of saying two.

And then it started being associated with the wrong sort of people, which is absolutely death, darling, death. It was heard on the tongues of those rural sorts from the farther reaches of the countryside, and those lower-class sorts from the poorer neighbourhoods of the city – those unpleasant people who sold fish and made deliveries and took away rubbish and cleaned gutters and, in short, did all the essential work without which all the fashionable people would be wallowing starving in the muck – and then it was done. No decent person could be heard to use it.

Except when slumming, of course. Your school teacher, socially vulnerable, might studiously avoid association with the lowlifes, but the assorted lords and barons could afford to consort slyly on the side with the riff-raff if they were the fun or useful sort of riff-raff. And ain’t has become the classic slumming word. With this one word, you can shift the tone and attitude of a whole sentence – “Sir Peter? He ain’t here, darling, so off with you” – or even set the tone for a song in the title – “Ain’t Misbehavin’,” “It Ain’t Necessarily So.” It is, in short, an expert saucier. With its fall from grace came an ability to season a sentence as quickly and effectively as any pepper or aged cheese.

And that is a role it is happy to fill. In fact, it has far more effect and power than any of its more respectable siblings and cousins. It’s not just that it can instantly set the tone as casual, folksy, and thus (thanks to our ideologies around class and language) more honest; it’s that it does not shrink from respectable companions, but they can be frightened by it – one incursion of ain’t into the wrong place could be like a fly in the pudding: “In submitting this update, we acknowledge that we ain’t achieved our goals yet, but we hope that with further funding we will be able to provide conclusive results.” In short, ain’t is misbehaving, and that’s the point.

So I am not making an impassioned plea for the acceptance of ain’t into formal discourse. That would take away its power. It would be telling the best saucier in town that he must rejoin his starchy family and spend the afternoons discussing bank drafts and society weddings and never cook again. But I am saying to stop saying that it’s not a word. A word that is casual is still a word, and it does not demean or degrade anyone to use casual language when the situation calls for it. Our language is capable of almost infinite variety and nuance in tone; let’s make use of it unashamedly. And wave hi to Uncle Freddy in the kitchen.

gonna

I’m gonna lay down a three-part fact here:

Eye dialect is hypocritical, handy, and hazardous.

What’s eye dialect? It’s when you spell something “incorrectly” the way pretty much anyone would say it rather than the way it’s officially spelled, to indicate something about the speaker to whom it’s attributed and/or the context in which it’s presented. And by “something” I mean typically a lack of education, or at least a very informal, “folksy” context, which just puts a positive tinge on the same lower social position.

So if, for instance, a character in a book says “I seen my reflekshun,” the “I seen” is nonstandard grammar, but the “reflekshun” is eye dialect: it’s exactly the way everyone says it, so the implication is just that the speaker would spell it that way if they wrote it down because they’re, you know, [makes deprecatory hand gesture].

Among the most common – and consequently least negatively toned – bits of eye dialect are woulda, coulda, shoulda, and, of course, gonna.

Everyone says going to as “gonna” most of the time when it’s used as a modal auxiliary. For one thing, frequent and unstressed grammatical widgets are usually uttered with the minimum necessary effort – heck, I often really say “I’n a” rather than even “I’m gonna”; for another, it allows us to differentiate between auxiliary and main-verb uses, for example between “Are you gonna get that now?” and “Are you going to get that now?” (the latter, spoken with full value, meaning “Are you going now to get that?”). You wouldn’t say or write “I’m gonna the store.”

But, because this is English and we just love showing where things came from and how they’re put together, and – more importantly – we love using spelling as a torture test and badge of virtue, we still insist on the “correct” (socially valued) spelling being going to – and would have, could have, should have – even when we say it in the reduced way.

So I think it’s plain why I say eye dialect is hypocritical: we use it to look down on people for doing exactly what we – and everyone we consider the “right sort” – do on the regular. (Do you protest? OK, tell me what your reaction is when you see that someone has written “I would of done it if I’d of known.” And then tell me the difference between how you would pronounce that and how you normally pronounce “I would have done it if I’d have known.” If you see “would of” in a novel, it’s because it’s attributed to a character who would write it that way.)

Why do I say eye dialect is handy? Ah, because that very class connotation – the one that is arrantly hypocritical when we use it to look down on others – lets us establish tone when we’re using it in our own voice: we can present ourselves as “casual,” “folksy,” “honest” (honesty is a virtue typically viewed as inversely correlated with sophistication – yes, it’s been studied: we tend to see it that way; and yes, we’re wrong about that: in reality there’s no correlation one way or the other). 

Yes, it’s still hypocritical, maybe even doubly so because we’re using it to avail ourselves positively of a distinction we otherwise wield negatively: when other people do it they’re unintelligent, but when we do it we’re folksy and honest. But ya know what? The more we use the spelling gonna generally as a colloquial usage, the more it loses the “unintelligent” connotation, so I’m not opposed to it. Which is fine, because everyone’s gonna use it anyway.

OK, so why do I say eye dialect is hazardous? I don’t mean as a further elaboration on the class distinctions. I mean for people learning English as a second (or later) language. I’ve known people who learned English as adolescents or adults who hadn’t quite processed that gonna is informal when written and relaxed when spoken. A professor I had would use it in comments and letters written in otherwise academic English. A co-worker always said it (in her slight German accent) with very clear and deliberate enunciation: “Are you gun na do that?” – which sounded more odd and awkward than if she had just fully enounced the formal version, “Are you going to do that?”

So how long, by the way, have we been doing this?

That’s a two-pronged question, and the answer to the first prong – how long we’ve been reducing “going to” to “gonna” in speech – is that I have no way of knowing for sure, but the odds are good that it’s just about as long as we’ve been using going to as a modal auxiliary. There are four very common phonological processes involved:

  1. place assimilation, wherein the /ŋ/ moves to the front of the mouth and is realized as [n] because it’s between a front vowel [ɪ] and a stop at the tip of the tongue [t] – either one could be enough to move it forward, as we see from the common and long-established practice of saying -ing as [ɪn]; 
  2. assimilation and deletion, wherein the [t] just gets Borged right into that [n] and disappears – we do tend to reduce /t/ very often, turning it into a flap as in [bʌɾɹ̩] for butter or into a glottal stop as in [bʌʔn̩] for button, and this deletion is just the ultimate reduction;
  3. deletion again, in this case the [ɪ] before the [n]; and 
  4. reduction, when we make the minimum effort in pronouncing the o and it comes out just as [ə] (an argument could be made that the deletion of the [ɪ] is part of this reduction).
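Chain those four together and the path is, roughly: going to [ˈɡoʊɪŋ tə] → (1) [ˈɡoʊɪn tə] → (2) [ˈɡoʊɪnə] → (3) [ˈɡoʊnə] → (4) [ˈɡʌnə]. Gonna.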

Basically, we say it as “gonna” because we naturally conserve effort when speaking – there’s a trade-off between conserving the effort of articulating the word and conserving the effort of being understood, and with modal auxiliaries, the effort of being understood is usually the lesser problem.

The answer to the second prong – how long we’ve been writing it as gonna – is just over a century in North America, but about a century longer than that in Scotland, if the available published citations are to be believed. Eye dialect did have a bit of a vogue in the US in the late 1800s and early 1900s, and this spelling was likely encouraged by that.

So there you have it. One of the most common bits of “wrong” spelling, so entrenched that in some contexts these days you’re making more of a point if you spell it the “right” way: picture Janet Jackson’s “What’s It Gonna Be” as “What’s It Going to Be,” or Led Zeppelin’s “We’re Gonna Groove” as “We’re Going to Groove” (and then why go halfway? why not “What Is It Going to Be” and “We Are Going to Groove”?). Eventually it might even qualify just as nonstandard spelling, not eye dialect. But my points about eye dialect are still gonna stand…

unkempt

Ambrose Bierce would surely have said that he liked English to be well-kept and not untidy. But he would never have said that he did not want it unkempt.

Do you know who Ambrose Bierce was? His best-known writing these days is probably the book most often called The Devil’s Dictionary, a lexicon of cynical definitions; I’ve quoted it in my tastings of ambsace and jape. He was, in his time (the late 1800s and very early 1900s), a noted newspaper columnist and short-story writer in San Francisco. He was known for his caustic wit. And, as Theodore M. Bernstein writes, “at the age of seventy-one, he departed for Mexico, where he mysteriously disappeared.” But a few years before he did, he wrote Write It Right: A Little Blacklist of Literary Faults, which Bernstein handily included at the back of his book Miss Thistlebottom’s Hobgoblins (a book greatly beloved of those editors who know it, and one that runs contrary to the spirit of Bierce’s prescriptions).

Bierce’s little work is, as the subtitle says, a blacklist of usages that were untidy, careless, thoughtless, et cetera. He dives in boldly: the very first entry is flatly wrong – he inveighs against “a hotel” and “a heroic man,” insisting that before unstressed “h” one should use “an.” The rest of the list is a nearly unrelenting jeremiad of similar twaddle. But many of his bugbears have survived to be bugs in the ears of more recent cranks – for instance, his objection to “a healthy climate” (“only a living thing can be healthy,” he writes), his insistence on using “fewer” rather than “less” in referring to numbers, his fussiness about the placement of “only” in a sentence, and his hatred for “very unique.”

And then there’s unkempt. Few people fuss about this word these days, but I’m sure a survey of the undersides of mossy rocks in the vicinity of libraries might turn up one or two who do. The objection? As fierce Mister Bierce says, “Unkempt means uncombed, and can be said of nothing but the hair.”

Oh deeaaarrrr.

Just as genetics are not culture (discovering that a previously unknown ancestor was Irish does not instantly make you Irish, for instance), etymology is not definition. But just like in those ads where someone discovers a bit of their genetic history (because they paid to give a company their genetic data) and suddenly is enthusiastic about a culture to which they had previously been indifferent, many a person, discovering the etymology of a word, is suddenly aflame with a passion for the word having to mean exactly literally that thing – and of course this combines with the rule-seeking behaviour common among those who see tidiness as among the greatest virtues. Dilapidated means the stones are falling apart? Then you must not use it for wooden buildings! Decrepit comes from Latin meaning ‘creaky’ as in a person’s joints? Then it cannot be used for a stone building! And unkempt is the negative of kempt, which is the past participle of kemb, which is an alternate form of the verb for comb (splitting in Old Germanic from kamb, which evolved to comb)? Then if you are not speaking of hair, things are about to get hairy!

I’m not entirely sure which is Bierce’s greater sin in this instance: the etymological fallacy or a disregard of metaphor. I’ve written at length about the etymological fallacy (including one of my favourite pieces, first published on the blog of Merriam-Webster), so I need not flog that horse here (dead or alive). But even if we want to allow the word unkempt still to have the literal meaning ‘uncombed’, who says we must always use words in their literal sense?

Not Edmund Spenser, just for starts. In his renowned 1590 Faerie Queene, he used the word unkempt to refer to uncouth words. Others after him were similarly liberal with it. If words can be unkempt, why not other things that are also not hair? 

Of course, Bierce knew quite well the use of metaphors, just as he knew that his list of no-nos was really at most a list of maybe not-maybe nots. You see both facts evidenced in his introduction, where he writes, “In neither taste nor precision is any man’s practice a court of last appeal, for writers all, both great and small, are habitual sinners against the light” – Bierce himself included.

That said, I do think he’s chosen the weaker flame to turn his eyes towards. Certainly, tidiness has its value – a fact that, in household matters, my wife reminds me of frequently, and a fact that, in literary matters, earns me an important part of my income. But the world of words is not a brush-cut or even a decent perm. It is fabulously unkempt, and if you comb it too hard, it will tangle with you.

booger

Does this word make you giggle?

It makes me giggle, even though I still use it on a quasi-regular basis, because, due to habits from my youth, one of my usual terms for what other people call tissues or Kleenices (that’s the plural of Kleenex, of course) is booger rags.

Booger is as close to a bad word as you can get without actually being a bad word. It’s… well, it’s not vulgar, not exactly; it just names something gross. Super gross. In spite of the fact that we all have boogers and deal with them fairly regularly. (I mean, same with shit, but that’s worse.)

For some people, in fact, booger actually is a bad word, or at least intolerably indecorous, which is about the same. Those of us who loved (and even still love) WKRP in Cincinnati may recall its prominence in the show’s opening episode. Andy Travis, the new station manager of a failing music station in Cincinnati, arrives to discover that one of its deejays, Johnny Caravella, used to be a major radio star in LA. Johnny is stuck at this nowhere station playing moldy oldies because he got fired from his headline job for saying “booger” on the air.

On the other hand, booger is not so outré that you can’t play it in Scrabble, even in the current version of the app with its excessively bowdlerized word list. Just a couple of days ago, starting a new game with a friend who is also an editor and linguist, I looked down and saw the letters in my rack to open the game by laying down BOOGERS, for 76 points. (Do not say “Well, you could have played GOOBERS.” First of all, that would have been worth only 74 points because of the location of the double letter score; second, I play Scrabble to have fun, do you mind?)

Sure, booger is a childish word. Giggling at the idea of socially inappropriate bodily excretions is childish. But this word has a certain power to it to make grown adults momentarily childish again (and I don’t trust anyone who just looks down their nose at it) (“down their nose,” heh). This power needs to be used for good effect: like a little cherry bomb in just the right place (not like a booger in just the right place, that’s gross, stop). It signals to your readers that you are not as starchy as all that. And it’s somehow more gleeful than, say, fart.

I’m sure that the sound of it has some effect on how we receive it. There’s something goofy and ugly about those two voiced stops, that “oo” vowel, and that final syllabic retroflex “er.” Goober is a goofier name for a peanut (and sometimes for a booger); boggart and hobgoblin are bug-ugly beasties; bogeyman is menacing but perhaps more prone to going “Boo!” than to cutting you.

And booger may be related to some of those. It was used as a term for a goblin or bogeyman before it meant snot, and it seems to be descended from bogy or bogey (which in fact is also a British equivalent to booger) and boggart or boggard, all of which have some relation to either bog (as in where the evil spirits dwell) or bug (in the sense of ‘evil creature’, and we don’t just mean a mosquito or wasp). The trail is hard to trace for sure, just because words like this one don’t mix in polite company and so don’t show up in print as much. But we do know that its use for that thing in your nose showed up first in print in the US in the late 1800s.

And its first use on television? Well, I’m not sure about first, but its moment of greatest fame was definitely first broadcast on September 18, 1978, in the opening episode of WKRP in Cincinnati, when Andy Travis told Johnny that the station was going to a rock format and gave him free rein.

impact

Have you ever had trouble remembering where to use affect and where to use effect?

Have you ever had trouble remembering how to spell impact?

I don’t have actual evidence, but I have long suspected that those two questions reveal much of the cause for the common use of impact where effect or affect would serve.

Of course that’s not the only reason. Impact is a more vivid word: it presents an action, something not only specific but forceful. In its sound it’s even reminiscent of something impacting – for instance, a fist into a palm or, hmm, a space fighter into a planet surface. And it doesn’t have any schoolroom fussiness flavouring it. No wonder high-powered businesspeople (and those who want to seem high-powered but aren’t really) seem to like it.

But that leads us to a reason so many people dislike it, at least in those uses where affect or effect would serve as well (if not as impactfully): it’s associated with business jargon. Which is associated with businesspeople of the sort who blather endlessly and think far too highly of themselves. Because when someone hates a word, it’s nearly always really because of who they associate the word with – those undesirable, inferior, ignorant and/or too-big-for-their-britches kinds of people. And this forms a sort of reinforcing circle: you dislike the word because of who you imagine using it, and you dislike who you imagine using it because of these dreadful words they use.

There is another factor, of course: conversion. Nouning and verbing. Although converting nouns to verbs and vice versa (and often to and from adjectives and even adverbs as well) is one of the glories of the English language, not everyone glories in it, and some people insist they will never be converted to converting, even though many of the words they process without a second thought are products of the process. They see conversion as something barbarians do (it is a barbarism because barbarians do it, and they are barbarians because they do this barbarism). And it plainly comes into the question with impact.

Impact, as we all know, has been converted. It entered the language and was used clearly and simply (impactfully, even), and then some people got their hands on it and started using it a different way. And this is true… but not quite in the way most people think.

Impact entered English as a verb by the early 1600s, used especially in the past participle, to refer to packing in or pressing close. Within the century, impact on was also in use to mean ‘impress upon’. It was not until nearly two centuries later, in the late 1700s, that it was first used as a noun, to mean what we usually use it to mean literally now: “the striking of one body against another,” as the OED puts it. By the early 1800s it was in use figuratively in contexts where effect could equally be used (if less effectfully). By the early 1900s this more action-oriented sense was established for the verb as well, literally and figuratively, with and without on following it. Also before World War II we got the word impactful.

So it’s not altogether wrong to say that the least liked uses of this word are modern, if by “modern” you mean dating to a century ago (or two, depending on the usage). But still, it is interesting to see it get such bad press, while a word that could have earned the same bad press slips through.

I mean compact – which English knew as a verb in the early 1500s, and also as a noun by the 1600s (with various uses being added over the years – the makeup sense dates to the 1920s), but which first joined our language as an adjective in the late 1300s. Imagine, an adjective turned into a verb, and when we already had suitable words such as press and pack available! And then into a noun!

It will not have escaped your attention that impact and compact are related. (Well, the sense relating to packing is related; the sense related to pacts and agreements is not.) The only difference is im versus com, which is really in versus con but the p causes the n to assimilate. The pact root is a past participle of Latin pangere, which also shows up in impinge (and impinging can lead to impacting). So you might think we could have an adjective of impact as well. But I dare you to try “this is very impact.” You might get attact.

The impact of this word is certainly varied and variable. It also behooves those of us who work with words to be aware of the contexts in which it is welcomed and those in which it may be shunned. Nothing forces you to use it when you don’t want to, of course. But its use in place of effect and affect is becoming ever more impacted in our language, and you won’t be able to root it out of others’ mouths like a wisdom tooth.

nother, nuther

OK, here’s a whole nother one. Or is that a whole nuther one?

Well, those who say it don’t so often write it. And in part that’s because the moment they go to set it down – and perhaps well before that – they are aware that it’s “wrong.”

Oops. Did I use scare quotes? I did. Yes, yes, we know, another is from an plus other – we write it as one word, just as we don’t with asingle, only sometimes do with anymore, and never ever ever do with alot. (Well, hardly ever.) But anyone who wants to get all high-horsey about how this means that it absolutely must be split as an other should look out, lest their high horse be buzzed by Ned in his orange gyrocopter.

By who in his what, now? Well, if the stork brings babies, Ned in his orange gyrocopter brings lessons in historical word redivision. The lesson he teaches is that we are much more attentive to how words seem to divide on the basis of sound patterns than we are to their actual historical divisions.

That gyrocopter is a prime example. The word gyrocopter is (obviously) from gyro plus copter, and we know what both of those things are. But many of us don’t know that helicopter, from which we get copter, is not made from heli plus copter; it’s from helico (i.e., helical, which means spiral) plus pter (Greek for ‘wing’ – a pterodactyl is literally a ‘wing-finger’). Splitting helicopter at pter is illegal in English phonology. But even in cases where a split is allowable, we don’t always go with history; consider chocoholic, which blends chocolate (historically a one-piece word) with alcohol (where the historical divide is after al, which means ‘the’ in the Arabic source).

And then there’s Ned. Ned, like many Neds, is officially Edward. There was a time in the history of English when we would say mine rather than my before vowels, just as we say an rather than a, and mine Edward (a way of addressing an Edward with whom one is familiar), shortened to mine Ed, became my Ned – at first probably consciously, in a jokey way, but later on not so much. That happened because even though mine and Ed are two words, phonologically we tend to want a consonant at the start of a syllable more than at the end of one. So “my Ned.”

However, we are also often aware of that habit, and we mentally correct for it – and sometimes overcorrect for it. That’s why a napron became an apron, a nadder became an adder, and a norange became an orange. (And the adjective orange in its turn came from the noun.)

So we have plenty of historical precedent for shifting word divisions. Indeed, if the word other had become nother, we would have a justification for that. And in fact for some people at some times, it did. There’s a goodly historical record of its use stretching back to the 1300s. But it never took over from other, and over time it came to be associated with nonstandard varieties of English – regional dialects, the speech of people who had neither the money nor the standing to learn the kind of English that gets you into marble hallways.

It also came increasingly to be used in just one context: a whole nother. Every nother place we might use it (any nother dog, the nother shoe, a distinctly nother place, a – now that you mention it – nother reason), we use other. (And, in truth, we also use a whole other more than a whole nother.) 

And, yes, because it’s officially “wrong,” it’s considered “uneducated” even though many people with plenty of education use it… and so some writers have gotten into the habit of using what’s called “eye dialect” when setting it down in the mouths of fictional characters: spelling it more phonetically with the imputation that the speaker would spell it thus (or, to put that in eye dialect as though some yokel were saying it, “spellin’ it moar foneticly with thee impewtashun that thuh speeker wud spel it thuss”). Which means nuther, as in a whole nuther.

And, I suppose, fair game. I have it on good authority that plenty of people grew up spelling it that way without negative imputation. After all, nother looks like it rhymes with bother. (But on the other hand, nuther doesn’t rhyme with Luther.) This is all good reason to be skittish about putting it down on paper – English spelling is a whole nother thing.

ahold

Sorry if you were trying to get ahold of me. I was… What? Did you say something? No? OK. Well, as I was saying, I was trying to get ahold of some parts for my… Yes? I’m sure you said something. Well, you look like you’re about to say something. Come on, grab ahold of yourself. I…

What’s not a word?

Are you kidding me?

Look, I grew up with the phrase get ahold of. Yes, one word; I’ve seen it in print enough times.

Get hold of? I mean, yes, I guess people say that too, but what’s your point? People say come around and come round, and no one is getting all twisted up about it or saying the English language isn’t big enough for both of them. Because, come on, the English language is big enough for anything. It’s as capacious as a suburban American big-box store parking lot. If we don’t have more than one word for something, it’s weird.

Well, I’m not going to force you to use ahold. But I wouldn’t say that hold is an exact equivalent. Sure, there are places where ahold can be replaced by hold – as in grab hold of this or grab ahold of this – and the main difference is just the rhythm and that handy extra grip of the a-, like a thumb adding to the four fingers of hold. But that a- also has a sort of prepositional sense to it – there’s a sense of a- that means ‘to’ (seen in a-hunting we will go and in well-known words such as ahead) – and so can have more of an implication of motion and of presence. And, on the other hand, because of patterns of usage, it might tend more to bring to mind interpersonal contact.

Consider this sentence, from the Journal of Biblical Literature in 1934 (thanks, Google Books!):

Haggai had no further interest in the nations than to get hold of their money.

What if it were

Haggai had no further interest in the nations than to get ahold of their money.

If it’s not making you think of trying to phone their cash, it at least might more clearly posit that there definitely is money, and a defined amount of it.

How about this sentence, from the 2019 book Forever Alpha by John K. Balor:

In Earthbound, he is willing to kill all the Kaldorians, to get ahold of their ship.

How about without the a?

In Earthbound, he is willing to kill all the Kaldorians, to get hold of their ship.

OK, yes, the differences are subtle. And they’re not lexicalized – a dictionary won’t tell you that one means one thing and the other means another. It’s not even like further and farther, where there is an acknowledged tendency for one to be used figuratively more than the other. This is partly because many subtle differences in usage aren’t reflected in dictionary definitions. And it’s partly because your dictionary might not have ahold – but then again, it might. 

The Oxford English Dictionary has it, though it notes that it’s generally “colloquial” and “nonstandard.” Its earliest citation is 1850. I can show you a use of it by Walt Whitman in the 1892 version of “Song of Myself”: 

Lads ahold of fire-engines and hook-and-ladder ropes no less to me than the gods of the antique wars

But if you check Google Ngrams, you will get a chart showing evidence that it really got into regular published use in the 20th century, and especially the later 20th century, and has not overtaken hold in the contexts it’s used in. But, then, so did quite a lot of other words we have no issue with.

If you’ve clicked on that Ngrams link, you’ll notice a third line, for get a hold of. Of course that makes sense; in fact, if you’re not familiar with English idioms, get a hold of makes more sense than either of get hold of or get ahold of. Notwithstanding that, in my youth I assumed that get a hold of was an error for get ahold of. (I didn’t think get hold of was an error; I thought it was just like come round versus come around.) I don’t think it’s an error now. It’s just the third option! 

And why not? English likes to have all the words and usages it can get ahold of. Some people find our language untidy, but that’s just because it is. (You want tidy? Try Esperanto. Or maybe Finnish.) We have as many words as we can get hold of. Adding a word to English makes me think of that scene at the end of Raiders of the Lost Ark, after Indiana Jones has managed to get ahold of the Ark of the Covenant and it’s being added to a vast storeroom of treasures looted from around the world.

But ahold isn’t a word we pilfered from anywhere else. It’s home grown. And I grew up with it, and I still use it and like it. So there.

fugu

I’m finally going to taste fugu. I’ve been wanting to taste it for a while, and today’s the day.

What? Ha ha, no! I’m not tasting fugu, the puffer fish. I’m tasting fugu, the word. These are word tasting notes, remember?

It’s not that I wouldn’t enjoy having fugu. It’s just that I’m not in Japan and I’m not going to spend a couple hundred bucks on sashimi.

I’m tempted to say “Also, I don’t want to die.” But these days you have a higher chance of being killed by undercooked turkey. Sure, half a century ago up to a hundred people a year died from eating fugu in Japan. Quite famously, in 1975 it killed one of Japan’s best-known kabuki actors, Bandō Mitsugorō VIII. But these days they’re much better at preparing it. Also, there are low-poison versions available. So the annual deaths are in the low single digits.

That still doesn’t sound inviting, though, does it? It sounds like, uh, bad marketing.

Except when it’s very good marketing. You narrow down your target market, sure. But you can charge a lot for the product. Fugu is a luxury food in Japan, and there are hundreds of restaurants that specialize in it – every one of them with a chef specially trained and licensed in the art of not killing you. (The people who die from eating fugu these days are pretty much always people who tried to prepare it themselves or had an untrained chef prepare it.)

So, you know, you almost certainly won’t die. But you have the idea that you could. It may even give you a little tingling in your tongue and lips as you eat it, just as a reminder that you’re eating trace amounts of an extremely potent neurotoxin. Fugu? Yolo! It’s like an FU to death (or, I suppose, as they say in Italy, fanculo!).

Anyway, the word fugu is fun in the mouth. It may not give your lips and tongue a light tingle, but it does feel like it might be risky. It also makes a good vocal gesture: a little puff of air through the teeth, and then blowing through a tunnel (“oo”) with a little echo knock at the back of the tongue. It could be good for blowing out a candle (out, out, brief candle) – slightly less so in Japanese, by the way, because they don’t round the lips for the vowel.

It’s even more fun in Japanese writing. Just as preparing fugu requires special knowledge, so does reading the kanji for it. Japanese has multiple writing systems used in parallel, and while the hiragana and katakana systems are phonetic (and fugu is most typically written with katakana, as フグ), the kanji system is borrowed from Chinese, and the relation between what you see and what you say has to be learned carefully. And the kanji for fugu is a good example of this.

In kanji, fugu is 河豚. Now, if you read those characters one at a time, you will say “kawa buta,” which means ‘river pig’ (which is what the Chinese name of the fugu means; in the pinyin representation of Mandarin it’s hétún). If you say them as though they’re one word rather than two, you’ll say “katon,” which is the other way of saying the name of the puffer fish – if you call it that, you can reasonably expect to be understood. But normally, when you see 河豚, you say “fugu,” which is the usual Japanese name for the fish – it’s probably derived from the Japanese word for ‘blow’. It’s kind of like if in English we wrote aubergine but said it as “eggplant.”

So when you come to fugu, you have to be prepared. And when you come to fugu, you have to be prepared, and so does the fugu.

What happens if you catch a fugu unprepared? It’s a puffer fish, so it inflates and gets all spiky. It doesn’t do it in an instant, like in the cartoons; it takes several seconds. But it’s what makes these fish famous, even more than their toxicity. I’ve often said that if you say certain things to me or raise certain topics I will turn into a puffer fish; in Japanese, a normal sense of fugu is ‘someone with a quick temper’.

And puffing up is typically the last thing a fugu does, because the chef fishes it out of the tank live right before preparing it. It gets pulled out of the water and carried to a cutting board, so yeah, you can expect a reaction. And then, gradually, it de-puffs. (You might want to think it makes a sound like “fuu… guu…” as it does that, but no.)

As to what follows, well, there are plenty of videos on YouTube of chefs preparing fugu, if you feel like seeing someone cut up a very newly dead fish. You think eating fugu takes guts? Cutting up one takes lots of guts… out of the fish. And, by the way, they’re all extremely toxic. Don’t cut yourself. The liver is especially full of tetrodotoxin. You should never eat it. Not even if you’re Bandō Mitsugorō VIII and sure you’re immune to it. Because no, you’re not.

And then the flesh is sliced into many very thin slices with a very sharp knife and served on a plate in a pattern like flower petals. You dip them in ponzu (a sauce made with soy, citrus fruits, and a few other things) and you eat them raw. You can also have them in a hot pot. What does it taste like? According to my friend Daniel, who had it several times while living in Japan, “The closest I can think of is hirame (flounder). Very delicate. You savor this. It’s exquisite.”

Oh, and also there’s that tingling on the lips and tongue. Apparently you can’t count on that; if it’s been very carefully prepared, you might not get any noticeable amount. But why pay all that dough if not for at least a little taste of death, a memento mori?

If fugu hasn’t been carefully prepared, it’s still edible… but only once. It doesn’t kill you instantly; it takes several hours. Your whole body is gradually paralyzed, and you die of respiratory failure while fully conscious (not even in a state of fugue) and unable to communicate. That’s how that kabuki guy snuffed it. Oh, and there’s no antidote.

Yeah, yolo, You Only Live Once, but you know, I’d rather make it last as long as possible. And, come to think of it, my bank account too. Because if you’re gone fugu, you’re gone for good.