Mumbling Madness

Photo of a robin against the sky.
There’s never a muffled or slurred note from this fine fellow.

Circumstances determined that I would never become a movie buff. In my childhood, movies were forbidden. That created both suspicion and heightened desire, of course, so in my young adult years, I indulged in binges of movie-watching, all the while maintaining moral scruples about content. But then came marriage and children and a limited budget, followed by years of limited time; movies thus remained in the category of the special, a rare treat, like going to see live theatre. Actually, the latter occurred more often than the former. I did anticipate that in my retirement I would finally catch up on all the movies I’d heard about and wanted to watch.

Who would have guessed that in the meantime, mumbled lines would have become the standard of excellence? I will acknowledge that sorting out desirable conversation sounds from background noise has become more difficult for me; otherwise, though, I still have good hearing. So why is it nearly impossible to understand dialogue? It’s hard to get involved in a storyline when none of the characters speak clearly. Who cares about a romantic exchange muttered into the pillows? Heated arguments make no sense when characters speak over each other or otherwise muffle and slur their words.

            According to a recent opinion piece in our local newspaper, such inarticulate dialogue has become de rigueur in the name of realism. A lot of people mumble in real life, and so, the argument goes, to be authentic, actors should also speak without moving lips or jaws or bothering to face the camera.

            I readily agree that there’s lots of mumbling going on these days. Too many messages left on my phone remain unintelligible even after repeated replays; I’ve overheard conversations that might as well have been spoken in another language; lyrics of popular songs are often incoherent except perhaps for one or two phrases endlessly repeated. What’s much worse—one can select one’s entertainment, after all—I’ve struggled to understand health professionals who couldn’t be bothered to enunciate their words. In some cases, I was convinced that even if they removed their masks, I still wouldn’t have had a clue what they were saying.  

The consequences of all this muttering and slurring of words should give us pause, I think. In the entertainment world, it means that the tales told are “full of sound and fury, / Signifying nothing” (Shakespeare, Macbeth). Of necessity, storylines have to be conveyed through action, which might explain the ever-increasing violence and general aggression in dramas and ever-more dramatic silliness in comedies. Characters can hardly be developed in any depth if they never say much that can be grasped.

In the real world, in the various situations that we all negotiate—and this is my real concern—mumbling has the effect of humiliating and excluding far too many people. It is acutely embarrassing to have to ask people to repeat themselves, only to hear the same inarticulate sounds, just with greater volume. Socializing becomes difficult, and the important conversations we need to understand in order to buy a license, get a prescription, hear a public speaker (minister, politician, emcee at a wedding or concert), or respond to an incipient quarrel seem too daunting to attempt.

            When we are also asked to cope with background music at foreground levels or urban noise or the hubbub of a larger crowd, the work of translating verbal codes into meaning is finally exhausting. Shouted mumbling is no better than quiet mumbling. The frustration of not being able to decipher the words of even those who are trying to be helpful leads inevitably to helplessness and increased withdrawal. Why bother going to the church service or the concert or the memorial or the family gathering only to sit on the sidelines, unable to participate?  

            Much has been said in recent years about inclusivity and the importance of meeting the needs of the vulnerable among us. I agree with that emphasis. We should be encouraged to be kind, to include, to make sure that all people have what they need and can access necessary services. So how can we begin to turn mumbling into an embarrassment for the speaker instead of for the baffled hearer?

Children pick up their speech habits from their caregivers, of course, and are unlikely to alter those without focused effort, both by the children themselves and by their teachers. Which translates into yet another obligation placed on our educators. That seems unfair. The movie industry can continue to churn out inarticulate dialogue because closed captioning is available (and is increasingly being used by ever younger populations). In the really important conversations among people who matter to us, subtitles aren’t an option. Lip-reading a mumbler isn’t a successful enterprise, either. Hearing aids amplify sounds, of course, and newer technology can help block out extraneous sounds, but no hearing aid, and no technician at the sound board of a public address system, can fix distorted sounds created by a lazy mouth.

There’s nothing else to do here, friends, other than to give our lips and tongues some much-needed exercise and learn to enunciate our words. It’s not that hard, and even if it were, wouldn’t it be worth the effort to avoid shutting people out?

Photo of a blue jay, peanut in beak, sitting on a bird feeder attached to a window.
The photo may be blurred but there’s nothing out of focus in this blue jay’s shriek when there aren’t enough peanuts in the feeder!

Learning a Gentler Discourse

Photo of a pond in a city garden with trees reflected in the water, stones in the foreground. Colors are muted since it was an overcast day. It's a scene made for reflection.

Who would have imagined, a mere twenty years ago, that it would become possible to speak English in two ways so different that the speaker of one would be almost incomprehensible to the speaker of the other? I’m not talking about pronunciation differences here or even the accumulation of words borrowed from other languages. Those are common processes as people from different cultures interact with one another and influence one another’s languages.

No, what I’m referring to is a difference marked not by how words sound or which words are chosen. The current, unprecedented divide between two socio-political discourses has more to do with over-arching worldviews than with language, although words and their meanings have definitely been a casualty along the way.

As I’ve tried to understand what has been going on, I was reminded of experiences in my graduate student days (1989 – 1996).  Going back into the classroom after an absence of some 15 years had been traumatic for me. The vocabulary of my beloved discipline of English literature had changed. The way I had once read and written about literature, almost without thought about the process, had now gained a name – “formal criticism” – and been dismissed as naïve and uninformed. Entire clans of “isms” had come to life instead: new criticism, structuralism, deconstruction, psychoanalytical criticism, feminist criticism, queer studies, Marxist criticism, historicism and cultural studies, postcolonial and race studies, and reader response.  I had no choice but to learn what felt like an entirely new language along with its assumptions about the way that the world works and a certain attitude toward writing itself.

One particular class impressed itself on my mind: two students argued heatedly about how to interpret Kurt Vonnegut’s Slaughterhouse-Five, a satirical novel about WW2. They offered radically different perspectives (neither of which I now recall – which seems instructive) on the Allied assault on Germany in the last stages of the war. In the manner of grad students, they wielded theoretical relativisms with great verbal skill but little discernible wisdom. The professor, tilted back in his chair as usual, listened until his patience ran out. With a crash, he righted his chair and leaned forward to declare, “But Dresden did burn!” There was silence, and then sheepish acknowledgements that yes, that fact was beyond dispute. Therefore, any reading that ignored the burning of the city was invalid.

At the time, I had been appalled at what I saw as an assault on the very notion of truth. But back in that innocent time, we still had facts to fall back on, still had agreed-upon sources of information that were duly vetted before publication. We were learning to acknowledge differing perspectives; we understood that pure objectivity was an ideal only, yet we still believed that some objectivity could be approximated, given sufficient checks and balances.

That is no longer the case. Language seems as fractured and as impossible to mend as it was back in the biblical story of the Tower of Babel. Rebuilding the Tower is hardly an option in a world in which a brick for one side is seen as a sword by the other.

I owe that image of the Tower to an outstanding article on political and social polarization that I would strongly recommend: Jonathan Haidt’s “After Babel: Why the Past Ten Years of American Life Have Been Uniquely Stupid” (The Atlantic, May 2022). Haidt’s analysis is about as balanced as it is possible to be these days; he demonstrates clearly how both the left and right sides of the divide have contributed to the current mess, and then offers suggestions for rebuilding the institutions that make civil politics workable. The tone of the article is gentler than its title might lead you to think. Haidt is interested in rebuilding democracy for the future, not in raising ire.

The story of Babel is the best metaphor I have found for what happened to America in the 2010s, and for the fractured country we now inhabit. Something went terribly wrong, very suddenly. We are disoriented, unable to speak the same language or recognize the same truth. We are cut off from one another and from the past.

Jonathan Haidt

My second recommendation is entirely different: make time to visit gardens, repeatedly.

Photo of a carefully planned and tended vegetable garden surrounded by hedges and trees.
Vegetable gardens, forest gardens, tidy gardens, unplanned gardens – let them all be visited and loved.

Recently, after listening to yet another disturbing newscast, I fled outside and went to sit quietly on a rock beside a haskap bush. As my body relaxed, I began to hear a magnificent chorus of bees, each visiting one small blossom after another. Neither bees nor blossoms required words. There was no animosity. Indeed, without their togetherness, there would be no berries for my future breakfasts, or for the birds.

The discourse of loveliness is fruitful and wonderful. It refuses to take sides and responds to no outrage. All that is necessary for us is to look and listen. Visit a public garden or two, either alone or with other visitors.

An elaborately designed, large city garden with mowed grass, small flower plots, ponds and fountains, and a wide walkway for pedestrians.

Walk through your neighbourhood and stop to chat with a gardener or two. Open your heart to color and scent and design. Remember that the astonishing beauty of flowers and grasses and trees is temporary, yet everlasting. Each flower will die and leave behind a seed or a thousand to become new flowers.

So too we humans will die and leave behind memories, traces of who we once were. Let our gift to the world be gentler words and quiet caring.

Photo of a cemetery with plenty of trees and shrubs and flowers. One might describe it as overgrown.
Visit cemetery gardens – they help us think about what matters, in the end.

Despite our human tendency to read selectively and interpret according to personal assumptions, I remain hopeful that listening to one another’s stories will help us move on from the Tower of Babel in whose wreckage we are now living. It is advisable to choose a variety of story-tellers, and we will need to be prepared to listen to stories we might not like at first. If we can privilege the stories we hear in person from story-tellers whose context we can observe and whose voice we can hear directly without the noise of social media, maybe we can salvage enough bricks to begin building institutions that bring us together.  Along the way, we should never forget to grow gardens.

Close-up photo of a single pink rose.

We Need a New Word

Words are slippery.

They mean what they mean, yes, but they mean always within a context, and contexts change.

As a child of the Protestant Reformation, a descendant of Mennonites (a radical branch of that Protestant Reformation), and a wordsmith, I’ve been thinking a lot lately about our political language. What on earth has happened to protest since the 1500s? Why can I be proud of my religious heritage, yet feel so much on edge and unhappy now?

The verb “protest” has become more noun than verb. One doesn’t pro-tést these days – one joins a pró-test, and that changes more than just pronunciation and grammatical function.

So: protest as a verb. It differs from object and from disagree. To disagree means, according to the Oxford Dictionary, to “hold a different opinion.” In other words, you and I don’t think the same way about some idea or some thing or some action: concrete is a better surface for an urban driveway than asphalt. There are good reasons on either side of that disagreement (cost, labor, durability), but moral implications are absent.

To object means, again according to Oxford, to express or feel opposition or disapproval or reluctance. That’s stronger than to disagree because emotion is involved. Whatever happens in the discussion, the one who is objecting feels hurt or offended or even appalled. That would be the distinction that my editing self would make. When my late father used to introduce me to his acquaintances as the “baby of the family,” never mind that I was already an adult with children of my own, I objected strenuously. It felt belittling to me, although I’m willing to concede now that he meant it as affection. We disagreed on the meaning of “baby” and I objected to his application of it.  

But to protest is to bring in not only emotion but moral judgment. Here I’m reaching back in time to try to recover the meaning of the word before it became a noun that means an official demonstration against government or some other powerful institution or leader. That’s the primary meaning now. Even in that noun form, perhaps especially in that form, the word carries the weight of moral judgment. A protest (noun) occurs because enough people judge some action morally wrong. It’s deemed unjust, unfair.

If we’re talking about unfairness or injustice, it follows that the protester is in a position of less power than the person or institution against which the protest has been made. The protester may be a direct recipient of the unjust action or maybe not. Many protests have been launched on behalf of those who had no voice or influence. The common thread is the moral judgment. This or that action is just wrong; it violates a law or some accepted standard of behaviour.

 There is something else about the verb “protest” that we seem, as a nation, to have forgotten entirely: it is intended to persuade. The very fact that the objection raised is morally justified assumes that the one who protests and the one against whom the protest is made share (or should share) a common ethical standard. The concept of injustice makes no sense without an accepted definition of justice.  Martin Luther, who inadvertently began the Protestant Reformation, appealed to the standard of the Bible and the tenets of Christianity when he protested against several actions of Roman Catholic clergy. His initial intention was first open discussion, then persuasion, based on a common faith.  

(Generally speaking, it is, of course, possible that the objection has been made in bad faith and is not morally justified; equally possible is that those whose behaviour has been objectionable do not have any ethical standards to which one can appeal. Neither case invalidates the protest’s initial purpose of persuasion. I insist that the ideal not be forgotten.) 

By this point, given the current political climate, my readers are doubtless taking all sorts of righteous stances, not to mention disagreeing fervently with my definition of “protest.”

 So I will retreat temporarily into a simple illustration taken from my teaching years. A student was unhappy with an assigned grade; she felt certain that I had marked her paper unfairly because I was prejudiced against her. That is a moral problem. While some subjectivity is always a factor in marking essays, outright unfairness is unacceptable, not only to students but also to university administrators and department heads.   

As long as my student expressed her opinion courteously and presented evidence for her accusation, she was completely within her rights and could hope to be persuasive. My role was either to offer a reasonable explanation of the grade or to acknowledge her point and re-evaluate the paper (and/or ask a colleague to evaluate it). Either way, we should have been able to end the discussion with our dignity intact. Indeed, it could have been the beginning of an improved relationship.  

 However, if she had insulted me as a person and added threats of character assassination or even worse, she would have crossed a line between protest and blackmail—“you do this or I will ruin you.”  That is not yet physical violence, but it is violence. Her protest would have given up the moral high ground and become intimidation, thus turning the interaction into a power struggle, which leaves no one’s dignity intact, and makes an improved relationship very difficult, indeed. 

When Martin Luther King, Jr., and Gandhi before him, insisted that any and all protests should remain non-violent, in language and in action, they were aiming at persuasion, which seeks to make clear what the relevant moral principles are and appeals to both a common humanity and a common acceptance of those moral principles. This is not to say that protests against long-standing evils such as slavery and its aftermath are easy. By no means. Many, perhaps most, of those who upheld that order saw the protest marches as intolerable uppity behaviour by people whom God, they believed, had made to serve them. As long as the marchers refused to turn their protest into rebellion, they kept the moral high ground and underlined the principle of a common humanity, something their oppressors had consistently denied.

I indicated earlier that I am a descendant of Mennonites, first known as Anabaptists, who refused to bear arms and developed a strong code of pacifism. Other groups like the Quakers have also chosen non-violence. That does not rule out protest. To speak up against unfairness and injustice, even oppression, is a moral obligation, especially if the speaking up is not for oneself but for those who cannot speak up.

But the way of peace refuses violence in all its forms, and seeks reconciliation. That is the ideal. I cannot speak for Quakers but I know that Mennonites have not always avoided violence, either on the national stage or in their own families. The teaching remains, though, challenging us to seek actively to make peace.

I confess that I am congenitally disposed to avoid even legitimate protest. I will write letters to my elected representatives (not very often), but I do not march or carry signs. My preference is to “guard each man’s dignity and save each man’s pride,” to quote from a 1970s Christian worship song. Other cultures value the “saving of face,” which is simply a different metaphor for the kind of agreement that allows for gracious exits from the conflict.

Is that always possible? I don’t know. Some situations do present themselves as inherently impossible, yet I have read many inspiring stories of people who have suffered much rather than use violence and have ultimately brought about lasting change. Stephan A. Schwartz argues that social changes attempted through revolution and violence generally do not last as long as those created through non-violent means. He lists several examples, including universal education, the abolition of slavery in countries such as Britain, and universal health care. Remember the old saying, “a man convinced against his will is of the same opinion still”?

As Stephen Berg wrote in “Deer in the Mist,” “insisting on angels drives angels away.” Or as I heard in a sermon many decades ago, the way of spiritual grace is always a matter of “gift,” not “grasp.”

Everything real, happens first,
out of sight, in the far away furnaces of courage
which are fueled, not by passion, but love.
(Stephen Berg)

Photo of a single clematis vine climbing up a wall with seemingly nothing to cling to. There is one lovely mauve flower.

Thinking About Report Cards

Photo of a vase of a dozen coral roses.

A prowl through a file cabinet drawer, long untouched, revealed a collection of report cards with my name on them (Grades 1 – 12). Oh, my. There were some blunt comments from teachers about my hopeless handwriting—that mattered in those days—and inconsistent work habits, and one anomalous observation on the Grade 3 report card that perhaps as I grew older I would take part more in outdoor sports.

Photo of my report cards from Grades 1 – 4.

Remember those report cards, and the trauma of taking them home? Those were the days when children could fail their grade and be asked to repeat it. I was never seriously concerned about that possibility, yet still anxious about what I might have to take home to be signed. Would the report card be good enough that I wouldn’t get any reprimands? My siblings and I were expected to do well in our studies and to conform to strict standards of behaviour. And where there is a clear expectation, there is also the possibility of failing to meet it.  

Which raises two questions, I suppose, with wide application: How clear and reasonable is the expectation? How fair and appropriate is the evaluation? That bygone teacher who bemoaned my lack of participation in softball knew nothing about the daily hours I spent outdoors walking, exploring, doing farm chores, playing with animals, helping in the garden, even reading in secret places in the nearby bushes. She could not have known that for me solitude in the natural world felt infinitely safer than the ball diamond.

And I began thinking about the edginess in our societies these days.  I use the plural form of “society” because ever-present social media have created separate cultural groups whose component parts span continents, and because the pandemic has encouraged the creation of very small sub-societies along with huge online silos of rigidly held opinions. No longer do the report cards, in whatever form they take, come only once a year.  

 We live now with evaluations all the time: some are formal, such as work performance reviews, grades on particular projects, peer reviews of publishable articles, demotions or promotions, professional degrees, trade certifications; some are informal, such as the disappointment or delight on someone’s face, a welcome invitation to a social occasion or utter silence from former friends, thousands of likes or brutal online bullying, a stunning bouquet delivered at the door or a package of dog poop left on the porch, acceptance or rejection. There is not much point in railing about the unfairness of evaluation itself—who can ever really grasp everything about someone else’s circumstances or motives?—because we simply cannot manage without evaluations, both great and small.  To be realistic here, I should admit that we have always been living with evaluations; they are nothing new.

Do we not get quotes for prospective building projects or home renovations? Each business that submits a quote will be evaluated. Do we not develop friendships with former strangers on the basis of our judgment of their trustworthiness and compatibility? Do we not evaluate the politicians who present themselves for office and call for our votes? It’s important that we take time to decide whether trust is justified or not. Will we listen to the cold call we just got on the phone, or slam the receiver on yet another bogus message about credit cards? (It is really too bad that cell phones have no slam option.) Will we respond warmly to the chatty clerk or resist what feels like too much sales pressure?

There are degrees of judgmentalism, of course. Some of us are suspicious, automatically assuming that others’ motives must be nefarious at worst, self-interested at best; some of us are more open, assuming that others are well-meaning until we are clearly proved wrong. I am using the first-person plural “we” and “us” rather freely here to underline the fact that none of us is entirely one kind of person or the other. Our motives are not consistent; our behaviour is not consistent; our tolerance of risk varies; our ability to learn and change is always there.

 Herein lies the importance of report cards. They do not function only to regulate who is allowed to proceed and who is not qualified for some task (and I know of no society that does not have some such structure for organizing itself). For now, think instead of the personal value for the recipient of the report card, whether it be an actual document with an official seal on it or not.  

The image that comes to my mind is Canadian novelist Adele Wiseman’s description of Abraham, the key character in The Sacrifice. He has visualized himself as a very important man in his small Jewish community; he may be just the local butcher but he’s also a keen student of Torah, a master story-teller, a man of wisdom who “knows” that God has a special role for him. He is, after all, Abraham (and Wiseman gives him no surname). But there comes a moment in a terrible family conflict when the angry words of his daughter-in-law become a “mirror flipped up in his face and he himself stood revealed as he was to another, a stranger . . .” (The Sacrifice 316, emphasis mine).

  That is the function of evaluations. How can we know ourselves without the reactions of others? Child psychologists speak of the importance of parents mirroring the infant’s efforts to communicate. The return smile and the verbal echoes tell the little one that she/he matters. Ditto for the clapping games and the singing and the hugging. The babe is busy discovering a self through parental affection—a process that remains mysterious, despite all the books and much documented experience.  

 This discovering of a self, shaping a self? I understand far too little to hold forth on it with any wisdom. What I do know is that, necessary as unflattering report cards are now and then, equally necessary, in far greater measure, is affirmation of the various selves that we live out in our daily lives—affirmation that is needed in both the giving and the receiving.  

 In these days of way too much judgment and far too many anonymous “report cards” circulating online like some virus worse than COVID, perhaps the best thing we can do is to flip up a gentler mirror that reflects respect: “I see you, and you are a human being of great worth.”

I wish I could show you

when you are lonely

or in darkness

the astonishing light

of your own being.

            (Hafiz)

Photo of a single coral rose.

What We Can Choose – Part Two

Photo of a trail leading to a rickety wooden bridge over a creek in the forest.

This reflection will not be obvious. It considers not the what, but the how and the why and the what happens next. Those are often not obvious at all, partly because our culture has cast the language of choice in the individual mode. I am convinced that this framing can be misleading. There is no such thing as an entirely “personal” choice.

Shelves of packages of candies, taken in a London Drugs store.

Let’s start with the trivial: which candy I choose to spend my dimes on (oops, not dimes—dollars!) can hardly matter in the grand scheme of human endeavour. The world seems indifferent to such a choice, even to whether I choose candy at all or potato chips (much more likely – I dislike candy). Yet as soon as we back away from a single bag of candy, the scene changes.

Store owners stock only those candies that sell; the more often I and others opt for lemon drops, the more likely it is that stores will stock them. That then determines what factories produce, and if making lemon drops has deleterious effects on the health of factory workers, then my utterly trivial choice matters. The more candy I eat, the more likely it is that the sugar overdose will affect my health, beginning with my teeth. My health, as it happens, is important to more people than just me.

I could also talk about what I choose to do with the now empty wrapper. Does it end up in the ditch at the roadside? Or on the sidewalk beside a park? What was that wrapper made of? What was its overall cost?

Even in the most trivial choices, I am in the midst of a whole web of connections with other human beings.

Shelves of different breakfast cereals, also taken in London Drugs.

Consider another seemingly simple choice: what shall I have for breakfast? Someday, archaeologists will draw conclusions about our culture based on packaging debris that survives beneath the rubble of centuries. Be it Frosted Flakes, or granola, or bacon and eggs, or smoothies with startling ingredients, every selection affects which business grows and which does not, which animals and plants are grown and which are not, which divisions of our health care institutions are overworked and which are not, which tracts of land are cared for adequately and which are not (see Michael Pollan’s The Omnivore’s Dilemma).

 The how and why of all our small choices together reveal our tastes, our values, even the causes for which we’ll be prepared to march in the streets. All of those choices have been created in the crucible of our multiple contexts, some of which have been given (perhaps most) and some of which we have chosen, each choice determining to some degree what follows.

Some choices are made without thought, the variables having been sorted out long ago: I need no conscious decision to walk by the candy store without pausing; I will, however, linger by the camping supply store and linger even longer by the book store window.

 Other choices are far more difficult. Why did I leave one church and eventually settle on a different one? Indeed, why have I chosen to continue to identify myself as Christian? (The initial identification as such was hardly a genuine choice, not where I grew up.) To answer those questions would require long stories, which call for a different venue than this blog.

The point I want to make here is that the choice was not personal except in the sense that I was the one who had to make it. In the end, my choice to leave a church I’d been part of for decades was the result of the influence of people (and some books) who invited me into different perspectives and other people who made it increasingly difficult to remain. No doubt my choice likewise affected others. Just how many or how much, I don’t know beyond the fact that some friendships ended.  

 Since we cannot know all the intricate ways in which our smallest choices might affect so many other people, the least we can do is to remain aware that our choices are both personal and not personal. That is, we do have to choose, many, many times a day even; I am the person whose foot pushes down on the brake or the accelerator—no one else does that for me. At the same time, every choice I make is not only the result of all the overlapping circumstances of my life but will then also affect later choices of mine and of others. Every effect becomes itself a cause.

 In our current climate of anxiety over the pandemic and dire political and climatic circumstances, perhaps two principles could and should be kept in mind. One is that sooner or later our choices (even the trivial ones) will enter the territory of values; they will become moral choices. As C.S. Lewis once insisted, all of our decisions, both trivial and momentous, will make us more of a certain kind of person, and who we become matters a great deal.

“I’ve been considering the phrase ‘all my relations’ for some time now. . . . It points to the truth that we are all related, that we are all connected, that we all belong to each other. . . . ALL my relations. That means every person, just as it means every rock, mineral, blade of grass, and creature. We live because everything else does.”

Richard Wagamese

 The other principle is connected to the previous one: the well-being of others should come first. That is such a huge statement that it has already filled libraries with books as philosophers and theologians and thinkers of all kinds have struggled to work out the relationship between our instinctive—and necessary—care for ourselves and our equally necessary care for others.

If we look out only for Number One, the society around us is likely to become, or at least seem, more hostile. When unchecked selfishness is pursued in high office, the entire country becomes a less liveable place. Jesus once said, “Do unto others as you would have others do unto you” and also “those who would save their own souls must first give them away.” Other religions base their rules of conduct on the same principle, albeit worded in slightly different ways.

 If religious reasoning is not your preference, then scientific analysis will lead you to a similar conclusion. It turns out that human infants do not thrive without love (nor, for that matter, do adults), and societies in which altruistic behaviour is encouraged offer better and more satisfying living conditions.

Robert Frost’s famous poem “The Road Not Taken” (“Two roads diverged in a yellow wood”) concludes with “I took the one less traveled by / And that has made all the difference.” Generations of schoolchildren absorbed the lesson that we should be brave individuals and choose to be non-conformists. I would argue that had the narrator chosen the more travelled road, it would still have made all the difference. Choices do that.

A mountain trail, but it's narrow and half overgrown. Only a small sign beside it reassures the hiker that this is an actual trail through the forest.

Oh, I kept the first for another day!

Yet knowing how way leads on to way,

I doubted if I should ever come back.

Robert Frost

What We Can Choose – an exercise in the obvious


Landscape with ocean and mountains very much in the background. In the foreground is a high bluff with dry grasses, one lone small crooked tree, and a wooden fence that angles from the bottom left-hand corner to the middle of the right-hand edge. The photo is a combination of wide vistas and a fence that draws a clear boundary between dried-up lawn and wild grasses.

To begin at the beginning—and I said this would be obvious—we did not choose to be born. Or to be born as a human being. However you view the world that you know, whatever framework of meaning you might use to contemplate your momentous birth, you were most definitely not the one who decided that you would be a human and not a tadpole or a poodle or a grizzly bear.

It follows that you also did not decide which chromosomes and hormones would govern the microscopic wiggly something that was your first shape. So one of the first pieces of your identity, the one that usually determines the kind of name you get, was not your choice. Okay, changing names is an option; even changing gender is now possible. What is not possible to change is what you came into the world with in the first place.

Ditto for your parents and your surroundings. You did not choose the year of your birth or the location. You did not choose the economic situation of your mother (or her relationship to your father), the color of your skin, your genetic make-up, your biological relatives, your first language, the culture in which you practiced that language, your first notions of spirituality. None of those momentous determiners, out of which comes so much of what makes you who you are, were chosen by you. Not one.

So we cannot logically claim credit for any of those momentous determiners of our identity. Nor can we blame ourselves, or anyone else, for what was never chosen, by us or by them.

 Am I belabouring the obvious here? Yes, I am. Because too many discussions—in our public squares, in our courts, in our governments, in our living rooms—ignore the obvious. Should a child born in a refugee camp or in city slums be despised for being poor? No. She did not choose poverty. Should the child with millions in her bank account before she can count to ten be respected for that very fact? No. She did not choose it or earn it. Should the dark-skinned individual be blamed for her skin? Or be made into a curiosity because of her kinky hair? No, absolutely not.  

Let me be specific and personal. I do not deserve praise or blame for being a woman or being light-skinned or even for being born into a family and culture that valued hard work and education. Whatever advantages were granted to me simply because of where and when I was born were indeed mine to use or not to use, but I need to remember two facts. One is that not everyone comes into the world with similar choices available; the other is that I actually had considerably less choice in many ways than I once imagined. I could not, for example, as a teenager in a small Mennonite town, have chosen to become Muslim—that was not within the range of possibility for me until I was in my thirties or forties, probably, once I had actually met Muslims and learned something about Islam.

 We tend to treat religion and sometimes politics as well as if those stances can be freely chosen from a wide spectrum of offerings. Not so. It would, for example, have been actually impossible for someone living in Shakespeare’s time to become an atheist. The very concept had not yet taken shape. One could be Catholic or Protestant—that choice had become available, probably within Shakespeare’s living memory. Mostly, though, one was what one had been born to.  

 To imagine that it is readily possible to choose from many religions is a modern idea not often sufficiently qualified by the fact that our initial worldview, through which we view all subsequent options, is given to us before we are old enough to choose anything. I would argue that “choosing” our political views is equally contingent upon the culture in which we have first learned to think politically and the political surroundings to which we have been subsequently exposed. Surely that fact should temper any impulses we might have to label the “other” party as the enemy or to see ourselves as supremely righteous and clever for belonging to “our” party. Not that changing a political stance is impossible, nor that converting to another faith is impossible. Clearly not. As Viktor Frankl wrote, “Between stimulus and response there is a space. In that space is our power to choose our response.” Nevertheless, an acknowledgement of contingencies that shape habitual responses could help to defuse tense conversations.

Between stimulus and response there is a space. In that space is our power to choose our response.

Viktor Frankl

Popular metaphors regarding the philosophically fraught business of choosing include the well-used image of “playing the hand that we were dealt.” When playing poker or bridge or even solitaire, our ability to choose is severely limited, not only by the cards that are actually in our hand for each round of play but also by the rules of the game. Moreover, how we play our cards will depend on who else is sitting around the table (are they highly competitive, poor losers, cheaters, family members, strangers?) and what the stakes might be (are we playing for peanuts, or laughs, or hundred-dollar bills?).

 Theoretical speculation and playful metaphors aside, may I ask as politely as I can, what is going on in the current intensity of political and racial language, all amid an insistence on “freedom of choice”? As if everyone has available all manner of choices.  

Let me try to illustrate: it is highly unlikely that I will choose my response to a police officer on my front step from an infinite list of options because the very fact that I have a front step on which the officer can stand already rules out quite a few possibilities, such as an immediate fear of eviction. The additional fact that it is highly likely the officer will have the same color of skin as I do rules out more possibilities. I will still probably feel real fear, but it will be fear that someone I love has been hurt in an accident, not fear that I’m about to be arrested for something I may or may not have done. In other words, I enter a particular event out of my own context, shaped by various givens, and by the experiences I have lived through before that moment, only some of which I could have chosen.   

I belong to the Boomer generation; that means that my economic opportunities will have been different from those of my parents, and different again from those of my children and of my grandchildren. My parents were immigrants, so it’s no surprise that I learned the virtues of hard work and education. Then again, my parents were Mennonite and I was a girl, which means that the value of hard work applied but the value of education would have been tempered by certain assumptions about women’s place in the world. Could I have, as a teen, decided I was going to be a Christian missionary? Yes, definitely. That option was endorsed by pretty well everyone I knew. Could I have decided to become a politician and hoped to become premier of the province? Not in my wildest dreams. Could such options have opened up for me later in my adult years? Possibly, but with great difficulty.

Buried beneath the obvious limits set by culture and religion and language and economic opportunity is the shaping of the individual personality, which unfolds in a mysterious symbiotic process of givens and choices, each of which exercises influence on future choices and even on the terms in which memories are recalled. Psychologists have studied these variables since psychology became a recognized science. Parents, though, were agonizing over causes and effects long before that, ever since Adam and Eve somehow ended up with a devout and biddable shepherd and a jealous gardener turned murderer.

In other words, we do all have choices to make, important choices, which we make within a range of possibilities, choices for which we are responsible. I’m not arguing for complete determinism, just pointing out the inevitable limits of free will – limits that should curb our judgmental impulses and intemperate rhetoric.

Photo of forest on Vancouver Island, but the trees are low except for one scrubby evergreen bent by prevailing winds to a 45-degree angle. In the lower right-hand corner is a path.

The good news, as I see it, is that if we choose to, we can expand our range of possibilities. While it’s true that we were all gifted with the worldview through which we first tried to make sense of who we were, we can choose to widen that worldview, just by letting ourselves hear other stories. I can dismiss as nonsense your belief that houses should be always immaculate, for instance, or I can ask to hear your story about how that belief came to be yours. In the process of telling and listening, both of us could adjust our perspectives.

I admit that our capacity to absorb new information is limited. It is not possible to know everything and to hear everyone’s story with a sympathetic mind. The first action is limited by the sheer abundance of stuff to know, and the second is limited by one’s emotional and imaginative capacity, which has not been developed equally in all children. Nevertheless, each story I listen to with as open a mind as I can manage will exercise my imaginative faculties and enlarge my perspective.

 Then perhaps I can learn to defer judgment or animosity until I have heard more of the story. That’s a choice that becomes ever more available as I practice it.     

To be continued . . .

Late afternoon sun on the ocean in the background. In the foreground, the author leans against a bench, staring through the trees at the ocean.