
objective

The word objective has several senses, and what you mean by it depends on where you are and what you want. It’s all a matter of perspective.

The usage that comes perhaps most readily to mind these days is to signify ‘purely factual, unbiased, without any subjective element’. We use it to mean that what we are talking about is absolutely reliably true and indisputable and has no element of personal position or choice to it. Like 2 + 2 = 4.

Approximately 100% of the time any of us use objective in this sense, we are doing so to foreclose further discussion or inspection of the matter. In so doing, we are leaving out or misrepresenting relevant details. In other words, nearly every time anyone says something is “objective,” they’re lying to you, themselves, or both. They want you to take their personal position as unquestionable.

I’ll get back to this.

That sense is not the original sense; indeed, it’s not really all that old – only about two centuries. The more classical sense that is the source of it is ‘external to the self’ – the converse of subjective. Things that are internal to you are subjective: your emotions, your tastes, your thoughts; you are the subject of everything, and everything that starts in your head is subjective. You are the subject of your world, after all, the conjugated noun of every sentence in your own narrative. Whatever is not you is an object of your perception and action, and the world of things external to you is the objective world. (The word object comes from Latin for ‘thrown against’ – perhaps like a wall or an enemy. Subject is from ‘thrown under’; no bus is specified.) 

Now, as Immanuel Kant spelled out for everyone (though not everyone agrees anymore), the real world is out there, away from you: we are in a world of objects, literally every thing you encounter and move in. That is reality. If you bump your foot into a metal bucket, your senses convey to you the impact, and so you know it. Light bounces off it and enters your eyes, and so you see it; sound comes from its vibrations from the impact and enters your ears; you know all these not-you, non-subject things, from what your senses tell you, and all of those inputs originate outside of you. The existence of that hard, bucket-looking, bucket-sounding thing is objective.

In line with this sense we also get the grammatical sense: objective is the noun case that is also called accusative (it is sometimes extended to the form called dative, which has to do with giving, not with calendars). This word accusative makes it sound like you’re placing the blame on the object of a sentence (in “I blame him,” him is in the accusative – it’s the object), and that’s what the Latin means, but the Latin was a mistranslation of the Greek original, αἰτιᾱτική, which really just meant ‘expressing an effect’ – that is, what is in the accusative case is affected. It’s the object of our affecting.

Which adds a new perspective: the object isn’t the thing giving input to us; it’s the thing we act on. And that’s also a reasonable view of the objective in the real world. We know the bucket is there because of our act. We receive sound and light because we put ourselves in the way of the waves and receive just the ones that are coming to our position. And we interpret them, both consciously and automatically. Anyone who skis knows what it’s like coming inside after a few hours outside, taking off the goggles, and seeing everything with an unusual colour cast for a while until our eyes adjust. Anyone who has ever gone outside on a dark night – or lived through a nighttime power failure – knows that the eyes take a bit of time to adjust to the darkness, so eventually you see things that at first were invisible (or looked like monsters). And if you’re walking around your dark house at night, with its various normal sounds (fridge, HVAC, etc.), you know that an almost inaudible floorboard creak or breathing sound can be extremely prominent. You don’t choose what you hear, but you choose what you listen to.

Taking the sense of ‘acted on’ further, we have the noun sense of objective meaning ‘goal’ – as in thing to be achieved (learning objectives) or possessed or even destroyed (military objectives). Yet again, we have a thing external to us that is viewed in terms of our actions and our desires. We know that it is an object because we are the subject, and we subject it to our actions (unless, or even if, it objects). In all of this, the objective exists as objective precisely because we are subjective. Your objective world is the other end of the sentence with you as the subject. And there is also a whole world of other possible objects out there that you have nothing to do with, either by choice or by ignorance.

But there is one more sense of objective that really puts this all in perspective. To me, this usage always felt ill-fitting, but to native speakers of many other languages it’s absolutely natural. When in Italian they say obiettivo, in French objectif, in German Objektiv, in English we usually say lens, as in what is on the front of your camera. It’s what allows the subject of your camera (the film or sensor) to record the external, objective world. The reason it is called the objective is that originally the term referred, in telescopes and microscopes, to the lens on the end towards the object of observation, as opposed to the ocular lens, which is on the end towards your eye. But in photography, the entire multi-element lens is also called the objective – the ocular would be the eyepiece you look through on the top of your camera body.

In a way, this seems quite apt. Photography, after all, has always been the “objective” art form, not involving the interpretation of brush strokes and paint choice and framing and composition and so on. You just point the camera, click, and what you get is objective reality.

And, in fact, it is objective, but not in the currently most popular sense. In all the other senses. In a sense that reveals the truth of the objective.

Let’s say you’re on the street and something happens: one person rushes forward and pushes another person. You have a camera in your hand and you click the shutter. Hey presto, objective reality, a record of the objective fact of the occurrence at that instant.

A record of one specific moment (maybe 1/500 of a second – and if it’s a focal plane shutter, not the same 1/500 of a second all the way across the frame, but I’ll leave that aside or you’ll glaze over) cut out of all the moments before and after. Facial expressions frozen in time, in a way we never see them or interpret them in real life. One moment in a chain of physical actions, not showing how it started or how it ended (did the person shoving trip? run from a distance? interact before the shove? did the person shoved fall down? stumble in front of a car? fight back? just keep going?). A frame that includes everything in it seen from a certain angle but nothing outside its bounds (were there other people watching? shouting things? also running to shove? was there a car heading towards the person? were they being shoved into its way or out of its way?). A frame that shows only what light came to an exact specific position, from which the majority of everything that shed light could not be seen (the other sides of the people, what was behind them, how even things visible from the one angle appeared different from another angle). A frame taken with a specific angle of view (what kind of lens – wide, normal, long? how close to the action?). Captured with a certain exposure and a certain colour balance (was it really that dark or bright? was that dress bright red or really a more subdued orange?) – or perhaps even in black and white, which by choice removes all the colour information. 

Black-and-white photos are a particularly good example of the objective. We often talk and think of “black and white” as pure objective fact, but it’s objective in that it’s been acted on. The world of light, after all, is not monochrome. The choice of what shade of grey each colour comes out as is up to the film manufacturer, or the software that converts the image, or the person controlling the software. When someone presents a black-and-white photo as objective, they are expecting you to accept something with important available details removed; they are expecting you to take it as absolutely true, as though less information were more truth.
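To see just how much of a choice it is, here’s a minimal sketch in Python (the BT.601 luminance weights are a real video-conversion standard; the pixel value and the red-filter-style weights are my own illustrative assumptions). The same red comes out as two quite different greys depending on the mix:

```python
# A minimal sketch: the grey a colour becomes depends on the channel weights,
# and those weights are a choice – the film's, the software's, or yours.

def to_grey(r, g, b, weights):
    """Map one RGB pixel to a single grey value using the given channel weights."""
    wr, wg, wb = weights
    return round(wr * r + wg * g + wb * b)

bright_red = (220, 40, 40)  # a hypothetical "bright red dress" pixel

# ITU-R BT.601 luminance weights, a common default for conversion
print(to_grey(*bright_red, (0.299, 0.587, 0.114)))  # 94 – a mid-dark grey

# A red-filter-style mix (weights invented here for illustration)
print(to_grey(*bright_red, (0.90, 0.05, 0.05)))     # 202 – a pale grey
```

Same light, same scene, two quite different “objective” records.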

And, whenever we say something is “objective,” that is, almost without exception, what we are doing. We can say that 2 + 2 = 4 is objective fact – but so very often we’re really dealing with a 2 that is rounded down from 2.4, and another 2 that is rounded down from 2.3, and if we had added the originals together we would have gotten 4.7, which would round up to 5. Even when we teach “objective facts” in schools, we are choosing which facts to teach and which not to teach, and how much emphasis to give each, and how to present them and their implications. And so very, very often, we are taking personal judgments as unalterable facts (any statement of the aesthetic quality of some work of art is strongly affected by cultural background along with personal preferences, and yet we love to talk about “good” and “bad” art as though it were intrinsically and constantly good or bad).
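In code, the difference is just rounding before versus after the addition (a toy restatement of that example, nothing more):

```python
# Round first, then add: the tidy "objective fact"
print(round(2.4) + round(2.3))  # 4

# Add first, then round: the fuller picture
print(round(2.4 + 2.3))         # 5, because 2.4 + 2.3 = 4.7
```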

It does not follow from this that “there is no truth” or “you can say whatever you want” or any such thing. Statements presented as objective may be verifiably true within their constraints – any person in the same position will encounter the same inputs. But they’re inevitably incomplete, and things that are left out may be important to understanding the matter. 

Even when scientists conduct experiments, they’re choosing what to study and not to study (they choose things that are interesting to them, that will probably get results, and that there is funding available to study), and how to measure it and interpret the findings; the facts they set down are objective in that they are (or are supposed to be) reproducible results, but they are also objective in that they are the result of subjective desires and actions, and they approach the matter from a specific angle – they answer the questions they were set up to ask.

And, likewise, a reporter covering a news story may get all the quotes right, but there are countless uninspected choices: what to include and what not to include, what to describe and how, when and where to tell it, what counts as newsworthy and why… And the only way you can have a useful, reliable understanding of that is to know the position of the reporter and the publication. Everyone has a position; everyone is a subject, making choices of object. When they say they’re objective, they are implying that they made no choices and that their position is the default from which all others are deviations. But you can’t see why the people in the picture were there doing that, you can’t see that person just out of the frame pointing a gun, and you can’t – ever, by definition – see one thing everyone else there saw and reacted to: the person with the camera taking the picture.

decart

At the end of our shopping hour, we have willed our representations into the cart, and expense will thus follow. But whatever our haggling has determined the price as being, we can’t reach the transcendental ideal; with a moue, we at last admit the emptiness, and come to a gradual awakening with the dog-ends of the day. What we have seen displayed and have carded for payment – ah, the plasticity of currency – we sooner or later card from our lives and discard, taking out of play. Descartes said Je pense, donc je suis, but Decart says Je dépense, donc j’essuie: I spend, so I wipe away. What I have thought into being I now unthink out of being and time. And so it goes from carted to decarted.

And so boxes both small and big end up empty, and even the tired is retired. Decamp? Decart. And what is decart? According to Oxford, it means (a) ‘discard’ and (b) ‘turn out of, dislodge or expel from’. The first sense is directly related to discard – which, by the way, is a literal reference to card games: if you take a card out of play, you discard it (not display it). The second sense appears to be taken from cart, the kind of wheeled conveyance, which is from an old Germanic root.

But both senses of this word are empty now, decarted, their roots still grounded but their branches decorticated. And so it goes: nothing stays as it was; everything moves on. And as it passes, we follow. Il passe, donc je suis. There will be something new. There always is.

comme ci, comme ça, see-saw

“How are you doing?”

“Oh, comme ci, comme ça. You know, like a see-saw: up and down.”

“Wait. Is that where that comes from?”

“Which?”

“See-saw. Does it come from comme ci, comme ça?”

“Nah, but that’s neither here nor there.”

“Did you look it up?”

“Yeah, I tracked it down. I went to Wiktionary and typed in see-saw and it said ‘Alternate spelling of seesaw.’ But when I went to the Oxford English Dictionary and typed in seesaw, it took me to the entry for see-saw.”

“Well, that’s kinda back and forth.”

“Yeah. But the etymologies were about the same. Wiktionary said it’s probably imitative reduplication, like teeter-totter, flip-flop, ping-pong, et cetera. Oxford also said ‘A reduplicating formation symbolic of alternating movement.’”

“Alternating. Well, so why isn’t it saw-see?”

“Same reason it’s not totter-teeter, flop-flip, and pong-ping. We have a long tradition of using front-back rather than back-front vowels to signify the alternation.”

“Hmm… I feel kinda so-so about that.”

“Well, there’s always this and that. If you find an exception, it may seem like a big hoo-hah, but I wouldn’t shout ‘Whoo-whee.’ In general, it seems to start near and go to far, if the front of the mouth is near and the back is far. Just like this and that, neither here nor there, ins and outs, and of course, in French, comme ci, comme ça: ‘like this, like that.’”

“Oh, right. That’s why it’s not ‘come see, came saw.’”

“Well, I’m glad you’ve conquered that one.”

“Yee-haw.”

hoar

Hoar is one of the legends of the fall.

As the autumn ages, and leaves turn and let go and settle together on the soil, the temperatures fall, too, and the donzerly dew crystallizes into a beard. The day is greyer. We are on the downslope of the year, when all that has grown and produced is ready to nourish people and animals and then turn back to nourish the earth itself. And you know the fall begins by the turning of the weather; and you know it continues by the turning of the leaves; and you know it grows old by the morning hoarfrost; and you know it has fallen all the way by the shadows of the shortest day. These are the legends that tell you the time of the fall.

We think we know the fall of people that way, too: the passions cool; the green freshness is gone; the fringe goes grey; and at last there is the shortest day and the slide into the shadows, when even the legends must fall. But it is not as simple as all that, for age can bring depth and understanding as well. We may lose our literal scotopia, our night vision, but we can see better in darker times; and our scotagons, our unseen nemeses known only through struggle, come to reveal their shapes to us. It is true that the course is not the same for all – some people grow in foolishness, some in complacency, some in selfishness, some in hostility – but grey hair is at least a sign of having had the time to grow in mastery and even nobility. Not all that has hoar is hoary.

This word, hoar (and its derivatives hoary and hoarfrost), is not new, that’s for sure. And we know where it comes from. In Old English it (in its old form har) meant ‘grey’ or ‘old’; it came from a Proto-Germanic root reconstructed as *hairaz meaning ‘grey’, and that in turn came from a Proto-Indo-European root *(s)ḱeh₃- ‘grey, dark’. If we follow the descending lines of these roots, we find many relatives, including Ancient Greek σκότος (skótos, ‘darkness’, seen in words such as scotomata and scotopia), English shadow, and, more closely related to hoar, German hehr ‘noble’ and Herr ‘lord’.

And so it is. In the inexorable cycle of the seasons, the leaves turn and fall, and the frost forecasts a full shutdown. But in human life, although we also have our seasons, even with grey hair we are not necessarily dimming; we can still be fresh and may even turn over a new leaf.

hamesucken

Many of us are stuck at home, and it’s sucking. It’s not so much working from home as living at work.* Imagine waking up and finding that your boss – and even your clients – have snuck in through the window… or through your computer screen. It’s mental assault, a kind of theft of personal time and space, a violation of the one place where a person should be able to feel not just comfortable but insulated, able to let in others only at will and very carefully. “WFH” has become hamesucken.

You may not know this word hamesucken – also spelled hamesoken and hamsoken – but in medieval times it was an important crime, and it remains on the books as such in Scotland. It’s a compound word: its first part is hame, Scots for home (both come from Old English hám), and its second is sucken or soken, from Old English sócn ‘visit, attack, assault’, from Old Norse sókn ‘attack’. So you could waggishly say that hamesucken refers to any visit to your home, but technically it’s assault on you in your home, or even just breaking into your home with intent to assault you – what is often these days called home invasion, though it could also just be (and most often has been) someone you know coming over to your place to kick your ass.

It may seem a bit much to say that having to deal with outsiders via the screen of your computer is like having a medieval English squire (or a modern Scottish one) kick your door open and serve a pottage of whoopass unto you. And indeed, when they are reaching through your computer screen, you can disengage – though perhaps under pain of derailing your career prospects (which may take longer to heal than physical bruises) – and there is no penalty to those who violate your inner sanctum (if there were, telemarketing would have been outlawed long ago). 

But what makes home home is not just the physical box of it; it is the protected space in your mental and emotional landscape. How can you be in control of your own space if you are required to cede control of some important part of it to bosses and strangers? And if you’re about to say “Simply set up a space in that spare room and make it your separate office,” consider that very many people simply do not have “that spare room” and aren’t going to go to the personal fiscal, psychological, and emotional expense of relocating to a new residence just to get one (talk about putting the cart before the horse!).

And so, while some people seem to think that WFH will be the wave of the future, there are many others who see things much more like those most famous Men At Work: 

Who can it be knocking at my door?
Go ’way, don’t come ’round here no more
Can’t you see that it’s late at night?
I’m very tired, and I’m not feeling right
All I wish is to be alone
Stay away, don’t you invade my home
Best off if you hang outside
Don’t come in, I’ll only run and hide

*Those of us who are our own bosses, independent, freelance, have a different position in all this; we have a level of control over this that those subject to the command chain of an organization do not, and we are also more used to mentally compartmentalizing. And even then, for at least some of us, it is still important to leave the home and go do work elsewhere in the wide world.

iopterous

We cannot live a perfect life without a little flash of purple flying past.

In the things of ordinary days – walking, working, shopping, dining – our lives are filled with browns, greys, greens, blues, and whites, yellows at times, and the reds that make us turn our heads. And this is all well and good in the mean quotidian, but the long slender fingers of the day deserve the occasional amethyst ring; the plain prose page of labour and leisure is ornamented best with the flash of an iopterous lexeme.

Iopterous! This is not a linguistic term, not usually – it is better fitted for entomology than for etymology. But I can tell you what ancient wings fluttered to give it to us. You may recognize pter, from helicopter and pterodactyl (and archaeopteryx and lepidoptera and a few others); it comes from Greek πτερόν pterón ‘wing’. But io? Is it from Io, the cow-horned maiden of Greek myth, now the namesake of a Jovian moon? No. Or Ios, an island in the Cyclades? No. It comes from ἴον íon ‘violet’. What is iopterous is violet-winged.

And what is iopterous? A violet-crowned woodnymph, and a varied bunting, and a violet-backed starling. And more kinds of insects than I can count – butterflies, dragonflies, moths, wasps, mantises, beetles, stick insects, and who knows what all else: they all have varieties with purple wings, humble little bugs soaring the air on flitting flashes of the royal colour.

And, of course, many a little – or not-so-little – word, a lexical rarity drawn from the jewel-box to set in a sentence as its most precious thing. Use them rarely and use them wisely and they will glint and glimmer and flicker and flash in the dark letters and white background of your page; use them to excess and you have a swarm of bugs, and you will need a bunting or starling to feast them away.

asymptomatic, asymptotic

Come closer. I’d like to touch on something that has come around recently: two words that at first may seem the same but that have an important difference – asymptomatic and asymptotic. Like concupiscent adolescents, they appear to be kept apart by ma. But that’s not quite it…

Asymptomatic means ‘not having symptoms’; it comes from a-, ‘not’, plus symptom, plus -atic, and symptom comes from Greek σύμπτωμα súmptōma, ‘happening, accident, effect of a disease’, which is from συν- sun-, ‘together’, and πίπτω píptō ‘I fall’. So when you have symptoms and it seems like things are falling apart, well, they may be, but the Greek roots mean they’re falling together.

Asymptotic means, to quote the Penguin Dictionary of Mathematics (edited by John Daintith and R.D. Nelson), “Describing a curve (or surface) that has a line (or plane) as an asymptote.” More to the point, it’s the adjective of asymptote. And what is an asymptote? “A line related to a given curve such that the distance from the line to a point on the curve approaches zero as the distance of the point from an origin increases without bound. In other words, the line gets closer and closer to the curve but does not touch it. See hyperbola.” A further point, provided by the Oxford English Dictionary, is that “a rectilinear asymptote may be considered as a tangent to the curve when produced to an infinite distance.” In other words, an asymptotic curve will never touch the line… until infinity. But once the distance between them is small enough, it rounds off to zero.
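A worked instance (my example, not the dictionary’s): take the curve y = 1/x and the line y = 0. The vertical distance between them is

```latex
% Distance between the curve y = 1/x and its asymptote y = 0:
\left| \frac{1}{x} - 0 \right| = \frac{1}{x} \longrightarrow 0
\quad \text{as } x \to \infty,
\qquad \text{yet} \quad \frac{1}{x} \neq 0 \text{ for any finite } x > 0.
```

The gap shrinks forever without ever closing: ever nearer, never touching.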

And where does asymptote come from? Greek ἀ- a- ‘not’ plus σύν- sun- ‘together’ plus πτωτός ptōtós ‘apt to fall’, and πτωτός in turn comes from πίπτω píptō ‘I fall’.

Oh. Well. That kind of… fell together, didn’t it, at long last? In the fullness of historical time, asymptotic and asymptomatic are the same thing. In our time, however, they touch on different subject areas… one is mathematic while the other has that thematic ma.

All of which made me want to write a poem. So here.

Asymptotic

Sometimes things fall together
Sometimes things fall apart
You turn me like the weather
But I can’t touch your heart

I’m having chills and sweating
My fever’s running high
These symptoms I am getting
Whenever you come by

I’m straight but you are curving
I won’t admit defeat
Love’s labours make deserving
In finity we’ll meet

My plight has made me plangent
That’s no hyperbole
Let me go on a tangent
And reach you verbally

I cannot be neurotic
About this number stuff
If you are asymptotic
I’ll just say “Close enough.”

ærwacol

“Early awake.”

That’s the definition of ærwacol. It comes from ær ‘early’ and wacol ‘watchful, vigilant’. It’s not a Modern English word; it’s Old English, which is to say Anglo-Saxon, and you can find it in an Anglo-Saxon dictionary. You can even find it in Old English texts (or else how would the lexicographers know of it?) – for example, “Đa cwæð se cyngc, ‘Leofe dohtor, for hwi eart ðu þus ærwacol?’” (“Then the king said, ‘Dear daughter, why are you awake so early?’”). But if you Google it, you will first find several pages of references to one thing: a play by Sean Dixon.

And that is how I encountered the word. I saw the play Aerwacol in 2000, staged on a railway handcar on disused tracks at Cherry Street and Mill Street (now when I go by there I cross streetcar tracks, pass a construction site, and transect a burgeoning neighbourhood). I watched it, I am happy to say, at 7 p.m., not at some horrid early hour. There is early rising in the play, if memory serves, but in particular Aerwacol is the name of a bird sanctuary that the characters find themselves in. (From personal experience of staying in rural areas where birds flock, I can tell you there is probably no surer place on earth to be ærwacol unless absolutely nothing wakes you up.)

I was reminded of ærwacol (the thing, not the play, though of course the play follows in memory) more recently when somewhere or other on the web I was offered an article on how to become an early riser.

Now, why the hell would I want to do that.

No, no, you don’t have to tell me, I know: “Early to bed and early to rise, makes a man healthy, wealthy, and wise,” as Ben Franklin said (“Or a milkman,” as someone else said, and then some kid said “A what?”). It is typically presented with the air of a personal habit of the utmost moral superiority. Productivity “gurus” are often quite fond of encouraging early rising. Get a head start on the day! Get those extra hours in! Be productive in the calm peace of the morning! Meet the day with things already done! Don’t be some indolent late-sleeper! Oh, and then make sure to get to bed early so you can meet the next day well rested!

Or I could sleep exactly the same number of hours, arise well rested when the sun is already about its business, set to tidying up all the messes that other people have made already (the early bird may get the worm, but the second mouse gets the cheese), fill my mind with the thoughts and observations of the day, and then, after most people have already sloped off into slumber, take advantage of the calm peace of the later evening to synthesize and produce and have it all ready before even the earliest early riser gets up. I also get to stay to the best part of parties and catch late-night TV if I want (don’t even try to tell me early-morning TV is better). And, as an added bonus, I don’t get to come off like I’m a superior person for doing it that way.

Look, I know my brain. I schedule my work day around its phases. It’s optimal for creativity in the evening, analytical work in the afternoon, and basic reactive application in the morning. There are many people who love writing first thing in the morning, and that’s great for them; when I do that, I generally don’t like the results. People are different, eh?

But think of all the famous early risers! Michelle Obama! Tim Cook! Richard Branson! Benjamin Franklin! Rachael Ray! Napoleon Bonaparte! Ernest Hemingway!

OK, but think of all the famous night owls. Barack Obama. Winston Churchill. Charles Darwin. Carl Jung. Keith Richards. Elvis Presley. Fran Lebowitz. James Joyce. Marcel Proust. J.R.R. Tolkien.

It’s true that there seem to be more CEOs and motivational speakers among the early risers, and I can see where being up in time to set the day’s agenda can be useful for people in some lines of work. But on the other hand, overall I much more enjoy the insights and output of the night owls. So you might as well be who you are, eh?

Still, I appreciate occasionally being ærwacol, when it’s of my own choosing. To run a race, for instance (on my 51st birthday I got up early-ish, ran a half marathon, and then headed straight from the finish line to the spa for a massage and a champagne brunch, and let me tell you, I’d be happy to do that again). Or to go skiing (only because otherwise you only get half a day on the slopes). Or to travel somewhere (so there’s still day left when we arrive). I even occasionally think about getting up and out just to take pictures in the golden hour after dawn when the light is on all those places that it’s not on during the other golden hour, before sunset.

I think about it. I haven’t done it yet. I’m sure it’s nice, or something.

And I’m not so fond of involuntarily being ærwacol, due to stress, or fire alarms, or eating the wrong food for dinner, or noisy birds. I do not feel happy or ready in those circumstances. Night owls are often associated with depressive tendencies, but let me introduce you to another Anglo-Saxon word: uhtcearu. From a famous poem called “The Wife’s Lament”: “hæfde ic uhtceare hwær min leodfruma londes wære” (“I had grief before dawn about where in the land my leader of men might be”). You can’t have uhtcearu without being ærwacol. And, at least for me, grief, like other stresses, is more tolerable later in the day.

presenteeism

We all know what absenteeism is: not being at work when you’re supposed to be (or in class, or in church, or wherever you’re supposed to be). Bosses often fret about absenteeism. “Calling in ‘sick’? That costs me money! Get your butt in that chair!”

But if there’s absenteeism, then of course there’s presenteeism, right?

Yes, right. I didn’t make this up.

And what is presenteeism?

You may think it’s being where you’re supposed to be when you’re supposed to be there, or even going the extra mile and being there longer (and therefore being more productive) than required. And indeed it has been used in those senses in the past. But that’s not the prevailing current use. Today, as you will find if you Google it, presenteeism refers to showing up for work (or class, or whatever) when you really shouldn’t – because you’re not well enough, physically, mentally, or both.

And why would people come to work when they’re too sick to work? See above about what bosses say. See also our general cultural attitude towards productivity, or at least seeming to be productive. And also see your employer’s sick day policy… if there is one. Many people would be risking their jobs to take as many sick days as they truly need. Many more would be risking their career advancement.

So we show up. We surrender to the boxtickery of attendance counting. We know that business is not charity, it’s ass-in-chair-ity. If, on any given day, we just can’t even, then we switch off the even-canning machine for a while and try to look busy. And if that’s not an option, we work ten hours to get six hours’ worth of work done, we collapse at home with a glass of wine and a goblet of headache, and tomorrow we go back, Jack, and do it again. Because that is how we display our virtue.

And if we have a cold, we tough it out, because we don’t want to waste a sick day on that. So what if coworkers would get it – they’ll all get colds anyway. So what if it takes longer to get over it – I mean, do you know that for sure? Just take some medication. And if we have a flu, we try to tough it out because we have work that we’ll still have to do even if we take the day off. And if we have, say, Covid… um, could I lose my job if I call in and say I have it? If I get tested and find out I have it, I’m off work and locked down for 14 days, right, and…

Presenteeism, notably including forced presenteeism (of the “show up or you’re fired” kind), is known to be a factor in the spread of contagious illnesses. But even when it’s just a case of people not being in top form, it can be a bigger problem than most people want to admit. Even if you just look at dollar values in productivity – as though people are made for the economy, rather than the economy for people – research indicates that presenteeism costs about ten times as much in lost productivity every year as absenteeism. Presenteeism has been studied extensively, and the results consistently show that it’s a significant problem. But bosses tend to like simple, easy measures, and things that they can see, and butts in chairs at desks are a much more visible and straightforward measure than relative losses in productivity due to working while unwell. Also there’s that whole Protestant Work Ethic thing and the valorization of “toughing it out” in popular entertainments.

We’ve had this word presenteeism at least since the 1930s, though the current sense that features the downside has been prevalent only in the past few decades. It’s formed, as you’d guess, in contrast to absenteeism, rather than being made from a word presentee that is in turn made from present plus ee. Absenteeism was formed around 1820 from absentee, which has been in English since the 1500s, borrowed from Norman French abscenté and referring first of all to someone who owned an estate but, contrary to expectation or requirement, didn’t reside on it.

For years, when I was on salary and got four sick days per year, it was a point of personal pride for me to take only one or none each year. A cold? No problem – I’ll get through it. I finally realized that I would get through the cold sooner, do more and better work when I was at work, and not get my coworkers sick if I took a day off to stay home and drink lots of fluids and so on. If I went to work, I was just borrowing on the future anyway.

So if you’re in less-than-top form, if you at all can, call in sick for a day (or more) and take care of yourself. There’s no time like the present not to be a presentee. (And if you’re a boss, stop assuming your employees are just out to cheat you, and for heaven’s sake don’t do asinine things like requiring a doctor’s note for every sick day, which just costs the system, takes time from the person’s recovery, and probably forces them to go to a doctor’s office, where they will be surrounded by sick people and might pick up yet another illness to spread around your office…)

cullion, cullionry

There are many popular idioms that equate testicles and their related substances, including testosterone, with virtue, valour, substance, courage, fortitude, and so on. “Do you have the balls to do it?” “I think this individual is lacking in testicular fortitude.” “What you need, man, is cojones.” “Grow a pair.”

There seem to me to be fewer that are honest about the fact that men intoxicated with testosterone and the dictates of the contents of their nutsacks have a record of doing appallingly stupid things, making amazing messes, wreaking wanton destruction, and stomping through the world oblivious to their own inanity. Fewer idioms, but not none at all. Now and then we see an honest reference to cullionry.

Cullionry? It sounds… culinary, doesn’t it? And so it may be, if you mean prairie oysters. Huevos, you know, and not necessarily rancheros. Cullionry is the conduct of a cullion as roguery is the conduct of a rogue, and a cullion is… literally, a testicle; figuratively, a, um, dickhead. Loser. Jagoff. One who thinks himself among the lions but is more fit to be culled. I mean, just imagine if the world were run by 15-year-old boys trying to impress and dominate their peers and the objects of their attraction. (Alas, it’s not that great a stretch of the imagination.)

Cullion comes from French couillon; it’s related to Spanish cojón (you see it now, don’t you) and comes from Latin culleus (‘sack’ or ‘testicle’), which in turn comes from Greek κόλεος koleos (‘sheath’). It’s a well-formed word that could stand for anything, really; look at galleon, scallion, mullion, bullion, and billion and you will be reminded of the usually arbitrary basis of the relation between a word’s form and meaning. Cullion just happens to refer to the orchids (I mean figuratively, but also literally: it can be used to refer to the finicky flowering plants, and why not, they’re named after testicles – Greek ὄρχις orchis means both ‘orchid’ and ‘testicle’).

And cullion just happens to be used figuratively to mean ‘lowlife’. Somehow it seems not unreasonable that, in the time of Shakespeare, you might call a scoundrel or rascal a testicle (“Away, base cullions!” —Henry VI, part 2, I.iii.43), even as you would in other contexts highly value the same bits as jewels of manhood and emblems of fortitude.

The ambivalence has ever been with us; after all, a popular vulgarity referring to destruction and catastrophe also literally refers to the act that the same speakers most ardently wish to engage in as often as possible. And, of course, the organ so often used as an emblem of fortitude is famously the most vulnerable and sensitive – kick someone in the cullions and see for yourself. (And why would you do that? Because of their cullionry, of course.) But such is the contradictory nature of the classic cullion: both hyperaggressive and hypersensitive.