Waiting for Gestalt

By Gene Wilburn

Gestalt (ge STALT). A word meaning, roughly, the moment when the brain perceives with clarity that the whole of a system is greater than the sum of its parts, and everything clicks into one awareness. One can have a gestalten moment. But can one achieve a gestalten existence?

When I was coming of age intellectually at university in the early to mid 1960s, there were a number of explorations of the mind making the rounds. Existentialism, the sometimes bleak philosophy that arose strongly in Paris after the Nazi occupation at the end of World War II, was alive and well. Sartre, Beauvoir, and Camus were still publishing, and there was something compelling in the message that you’re responsible for who you become, creating a personal integrity in the face of the meaninglessness and absurdity of the universe. This is, of course, an oversimplification.

Along with the primary existential philosophers came “Theatre of the Absurd,” a literary form of existentialism, perhaps best seen in the play by Samuel Beckett, Waiting for Godot, in which “logical construction and argument give way to irrational and illogical speech and its ultimate conclusion, silence.” [Wikipedia, “Theatre of the Absurd”]

Another prevailing line of thought came from the field of psychology, in the form of Abraham Maslow’s “Hierarchy of Needs” with “self-actualization” at the top of the pyramid. In its wake people were self-actualizing all over the place, or at least that’s what they professed. It certainly launched a full-blown pop psychology business and fuelled New-Age-style thinking before “New Age” had even become a term.

A different branch of psychology, from Germany, had earlier in the century introduced Gestalt Theory, a holistic psychology that seemed to imply that if you could attain a gestalt with yourself and your environment, you could flow through it with understanding, and perhaps appreciation, in the way that listening to a symphony is an experience that transcends the individual notes of the musical score.

Looking back on this fifty years later, I think existentialism has held up rather well, especially when augmented with a generous helping of late Roman-style stoicism. Maslow’s hierarchy of needs still has a sound feel to it, though there is a sense that Western society, as a whole, has slipped down the pyramid a bit in this era of anti-enlightenment, anti-science populism.

But the one that still teases my mind is gestalt theory. At the turning of each decade I’ve been waiting for that gestalten moment when everything would click into place and I would reach an understanding — “Because something is happening here / But you don’t know what it is / Do you, Mister Jones?” [Bob Dylan, “Ballad of a Thin Man”]

The problem is, how does one achieve gestalt when everything keeps changing?

The Impact of the 1960s

I emerged from the 1950s like most boys who had reached their teens by the start of the 1960s, interested in cars, playing basketball, grooving to the week’s Top-10 radio, and thinking about going to university after high school. In other words, I was as cookie-cutter naive as one could be.

It was the folk music era which, in my relative isolation, I took to be the music of the Kingston Trio, the Limeliters, the Chad Mitchell Trio, Burl Ives, and that new group on the radio, Peter, Paul and Mary. It was when I heard Joan Baez sing a couple of old ballads like “Barbara Allen” that I began to perceive a different kind of folk music that was less slick and more personal. Back then it was just music I liked. Later it would change me.

My intellectual life began when I went to university, where I first majored in engineering. It was a tough study, but I was getting by, being moderately good at math and logic. There was, however, a problem. I enjoyed learning folk music more than studying STEM subjects, and the lyrics of Bob Dylan and Phil Ochs left me questioning what I was doing. I bought a guitar, learned a fistful of chords, and taught myself to sing and play the songs that were haunting me.

My taste in folk music had also led me to discover the Weavers, Pete Seeger, Woody Guthrie, Cisco Houston, and a rich vein of black blues singers from Big Bill Broonzy and the Rev. Gary Davis to Mississippi John Hurt. I loved all these voices of the people.

I couldn’t square my study of engineering with my awareness of what was happening. The civil rights movement in the American South highlighted the unjust treatment of black people. President Kennedy had been assassinated, then Martin Luther King, then Robert Kennedy. There was a strange, unpopular war being waged in Vietnam.

Things were changing, blowing in the wind, as it were, and the gestalt of the time was changing with it. I switched my major to English and my minor to French, and began studying literature with its plays, novels, poems, and essays. In French classes, we frequently read the existentialists Sartre and Camus. I studied philosophy, social history, and art history. I met and became friends with dozens of like-minded individuals, some male, some female, some straight, some gay, a few who were black or Hispanic, all of whom shared a passion for literature, art, philosophy, and music. I had found my people.

Something happens to your mind when you embrace the Humanities — something that comes as a series of epiphanies that raises your consciousness into new realms of thought and feeling resulting from contact with the great writers, poets, playwrights, philosophers, artists, and musicians of all eras. It’s intoxicating and exhilarating and, as Thomas Wolfe proclaimed in the title of his novel, You Can’t Go Home Again. You’re changed.

You reach for a higher kind of gestalt, the gestalt of the modestly well-educated. You begin to read the New York Times, The New Yorker, The New York Review of Books, Le Monde, The Times (London), The Guardian, Harper’s, Atlantic Monthly, The Globe and Mail, and university quarterlies. You listen to folk music, cool jazz, classical music, and opera. You see Verdi in the same tradition as Shakespeare, and taste the richness of Old English in Beowulf and the delightful Middle English of Geoffrey Chaucer.

It’s a heady experience, all in all, but the question always arises: what are you going to do with all this when you head out into the “real” world?

One Pill Makes You Larger, and One Pill Makes You Small

For one gestalten period it seemed as if the world had changed. The war in Vietnam was vigorously opposed, campus radicalism was on the rise, and hair got longer. The folk music I’d grown up with was woven into a new kind of rock music and the voices of Joni Mitchell, Grace Slick, Janis Joplin, and Crosby, Stills, Nash, and Young filled the airwaves, along with new bands like the Doors, Led Zeppelin, Grateful Dead, Jefferson Airplane, Santana, and Frank Zappa.

Alan Watts taught us about Zen, the tarot deck came back into fashion, and decorated VW vans filled with flower children with headbands, victory signs, peace medallions, and bloodshot eyes were common sights.

Among the reading favourites were One Flew Over the Cuckoo’s Nest, Been Down So Long It Looks Like Up to Me, Catch-22, The Vedas and The Upanishads, The Teachings of Don Juan, The I Ching, and The Whole Earth Catalog.

Everyone was for “getting back to nature,” and many communes were started, most ending in failure. From the Broadway musical Hair to massive rock concerts, it was assumed that the Age of Aquarius was upon us. The Mexican poet Octavio Paz described it as an “explosion of consciousness.”

It’s sometimes said that if you remember the 60s, you weren’t really there. My own memory of the time is patchy, with psychedelically coloured gaps and an enduring sense of mysticism. But, like many, I didn’t see how it was sustainable. In the words of the Jefferson Airplane, “You are the Crown of Creation / And you have no place to go.”

The Origin of Species

The flower-power era couldn’t last, of course, because someone has to pay the bills. I trimmed my hair, picked up a degree in library science, and took a job. Through sheer good fortune I ended up as Head Librarian at the Royal Ontario Museum, in Toronto. It was there that I began hanging out with ornithologists, palaeontologists, mammalogists, geologists, mineralogists, ichthyologists, and entomologists, as well as archaeologists. It has shaped my thinking to this day. I had encountered the gestalt of scientific thinking and research.

One of the curators, a palynologist (one who studies modern and ancient pollens), challenged me with the question: “Have you read Darwin’s Origin of Species?” Being a lit major, I hadn’t, so I decided to give it a go.

What surprised me the most was how clear Darwin’s Victorian prose was. I was mesmerized by the concept of “descent with modification” or, as it came to be known, “evolution.” Shortly after reading Origin, a new volume by Stephen Jay Gould passed through the library — a collection of essays entitled Ever Since Darwin. I gave this a read and subsequently read every book of essays Gould produced, culled from his monthly column in Natural History.

As a newly minted amateur naturalist and birder I became hooked on reading science books written for the general public. The ’60s mantra “all is one” took on a philosophically material interpretation when I studied how the universe started, how suns ignited and planets formed, and how, on this one we call Earth, life sparked and evolved, going through great periods of diversity, extinction, more diversity, more extinction, and so on, leading eventually to a group of suddenly sapient simians. As Carl Sagan pointed out, we are made from the remnants of stardust, and every living thing on the planet is related.

My readings in science and science history led me to reaffirm the existentialist theme that life can be heaven or hell, but human beings mean very little in the face of the universe. I shed any last remnants of religion. Materially, we are bodies that live and die, each of us randomly sorted into different situations, different cultures, different countries, and it’s these things that shape our sense of who we are.

There are people for whom science is enough. To paraphrase Darwin, there’s a grandeur to this concept of life and its descent with modification through time and its tangled branches and the sudden bursts of evolution that Gould referred to as “punctuated equilibrium.” This is a gestalt that most naturalists come to feel through their observation of life’s many remarkable species.

But is science alone enough to sustain the human spirit, or psyche, that je ne sais quoi that some people call a “soul”? Perhaps, and perhaps not, depending on the individual. What science does, for me, is to throw into relief all the amazing works of mankind, from art, history, philosophy, literature, and music to the increasing technological achievements that accompanied the industrial revolution.

By the time I had begun to assimilate this naturalistic view, information technology was picking up the pace. Television, radio, newspapers and other media shaped us and moulded us in ways that perhaps only Marshall McLuhan could sort out. But that was merely a preface to things to come: the computer revolution.

Bits, Bytes, and Qubits

From the late 70s onward the computer revolution picked up momentum until it reached nearly Biblical proportions: “And in that time a great change came across the land” [my paraphrase]. Computing became personal, portable, and profoundly ubiquitous.

Like others, I joined the revolution, pivoting my career from librarianship to Information Technology (IT). From the earliest whimsical days, which included an ad in Byte Magazine for dBASE II entitled “dBASE II vs The Bilge Pump,” to the corporate adoption of personal computers as strategic tools in the workplace, to the computer (aka smartphone) in one’s pocket or purse, a virtual Pandora’s box of consequences was opened.

My work involved setting up workstations, email servers, database servers, storage servers, web servers, and firewalls, with a little programming tossed in for spice. I enjoyed decades of computing projects, and by the time I retired, in 2006, the industry had progressed from 8-bit personal computers such as the Apple II to 64-bit powerhouses running Microsoft Windows, macOS, Linux, iOS, Android, and a few dozen lesser-known operating systems. Smartphones and tablets had become almost a birthright.

Computing begat digital photography, streaming audio and video, automobile electronics, appliance electronics, social networks, and, with lesser success, self-driving cars. I now listen to streaming music, watch streaming videos, and get my news and opinion pages from the Internet.

On another level, machine learning (ML) has grown and penetrated the Internet to such a degree that one can examine a product on Amazon and see ads for it within hours on Facebook. Privacy has suffered. The Internet, invented for the purpose of sharing scientific information, developed a dark side, the extent of which is still being assessed — surveillance, phishing attacks, the hacking of personal information, and possibly enough manipulation to sway elections.

The pace is still swift, and the increasingly successful bids to harness quantum computing (whose basic unit of information is called a qubit) will likely bring unforeseen changes. Nothing stands still.

End Game

“You can’t stop the future. You can’t rewind the past. The only way to learn the secret, is to press play” ~ Jay Asher, Thirteen Reasons Why

In my retirement, I’ve once again become a student. I read incessantly, both fiction and nonfiction, I take the occasional online course, and I think, if not profoundly, at least genuinely. It aids thinking to have a philosophical framework to compare one’s thoughts to, and I continue to find the challenge of existentialism worthwhile for this. It’s an honest philosophy, derived from the human spirit looking at an irrational, uncaring, absurd universe and deciding to carve out a personal meaning for being human. It’s a difficult challenge (never underestimate existential angst) but it’s more open and honest than clinging to a set of values, liberal or conservative, derived from those around us.

I’m beginning to understand why Camus used the story of Sisyphus to highlight the challenge. In the Greek myth, Sisyphus was condemned to roll a huge boulder to the top of a hill. Every time he reached the top, the boulder would roll back to the bottom and he was required to repeat the procedure, for eternity. “Camus claims that when Sisyphus acknowledges the futility of his task and the certainty of his fate, he is freed to realize the absurdity of his situation and to reach a state of contented acceptance. With a nod to the similarly cursed Greek hero Oedipus, Camus concludes that ‘all is well,’ indeed, that ‘one must imagine Sisyphus happy.’” [Wikipedia, “The Myth of Sisyphus”]

It would be neat and tidy, at this final stage of my life, to wrap up my thoughts with a pretty bow attached, but I’m unable to do so. There have always been random elements in our story that change the story itself: a colliding meteor, a world war, an economic depression, climate change, the overthrowing of the monarchy and aristocracy, the re-establishment of a wealthy set of plutocrats, the place you were born, the family you emerged from, the schools you attended, the number of freedoms, or lack thereof, of the prevailing government, and, not least, who you fall in love with. It is difficult to piece all this together into a holistic understanding. I am, in my final years, still waiting — waiting for gestalt.

 

On My 72nd Year: My Ten Fundamental Beliefs

By Gene Wilburn

“I just dropped in to see what condition my condition was in” ~ Kenny Rogers

Every year around birthday time (June 10 for me), I like to take stock of what I believe in. Where do I fit with the cosmos? What are my bedrock, fundamental assumptions? This year’s thinking mirrors very closely what I’ve thought for several years, but age has perhaps lent these beliefs more clarity.

Let’s start at the beginning. As Terry Pratchett once wrote, “In the beginning, there was nothing, which exploded.” For each of us our cosmology starts somewhere, and for me it starts with the Big Bang, which I’m told was not really so much an explosion as an expansion — a very dramatic expansion in which a primordial soup of plasma emerged that was so hot not even atoms could form. As it expanded it created space and time. The universe was born. About 13.7 billion years ago, if our measurements are correct.

If it helps you to believe that this was a result of God breathing across the waters, so be it — we each have our favourite narratives. The question of how something can come of nothing is a profound one, and physicists have some thoughts about this: that there really is no such thing as nothing. Particles and antiparticles evidently come into and out of existence billions of times per second and usually annihilate one another, but, at least once, it is possible that particles accumulated faster than antiparticles, forming a singularity and, well, boom!

So, that’s belief number one: the universe came into existence. This had implications. As the plasma cooled, quarks bound together into protons and neutrons, and later, hydrogen atoms formed. Chemistry was born. Concentrate enough hydrogen atoms together and what do you get? Fusion. The birth of stars. Then, over time, some stars die in a spectacular explosion called a supernova, in which most of the rest of the naturally occurring chemical elements are created, spewed forth as stellar dust and ice. As these clouds of stellar material concentrate and condense, new stars form, with planets around them. One of these we call the sun, and the planet we live on, which we call Earth, formed about 4.5 billion years ago, give or take a million or two.

Which brings me to belief number two: out of inorganic matter, life began. How is still unsolved, but researchers are exploring the tantalizing possibilities of RNA and other life-critical molecules developing in places like deep-sea vents and evolving into a self-replicating thing that we might call first life, or even protolife — ancestors to the prokaryotes, or single-celled organisms without a nucleus, like bacteria and archaea. Somewhere along the line a bit of luck (for us) happened: two prokaryotes combined to form nucleated cells, which we call eukaryotes.

Which brings me to belief number three: life evolved. Over great periods of time, eukaryotes developed along plant and animal lines, oxygenating the atmosphere, and eventually some pioneering life forms ventured from the oceans onto the land to colonize the barren geology of Earth, turning it into large organic ecosystems.

Belief number four is that, for the most part, the universe is random. It is not willed, or fated, or progressive, though randomness can lead to increased complexities. Using some basic structural parts, nature evolved through random genetic changes into extraordinarily rich organic landscapes and seascapes, filled with the plants and animals of its day. The Cambrian Explosion, various extinctions, and random events, such as comets crashing into the Earth, diverted the path of life several times, until, after eons, the great age of reptiles was over and land mammals had the chance to fill the empty ecological niches.

Belief number five is that the emergence of human beings, in the form of Homo sapiens, was not preordained. We’re a branch of primates that evolved in particular ways to adjust to our environment, and we had cousin species, Neanderthals, Denisovans, etc., who did the same. We’ve only been on the planet a short while, in geological terms, but we’ve become a new force. After nearly going extinct ourselves, we made it, and we developed a complex brain that would allow us to discover agriculture, mathematics, and art. Not to mention learning how to sew reindeer hide into warm clothing for the Arctic.

Belief number six is that we originated in and emigrated from Africa. Down deep, we’re all Africans and, living in a very warm climate, we were all probably dark skinned because extra melanin in the skin protects against overpowering UV radiation. Those of us who migrated to colder climates lost some of our melanin because lighter skin absorbs more of the weaker sunlight, which the body needs to synthesize vitamin D, and not as much protection against UV radiation is required.

Belief number seven is that we developed into a language-oriented species that loves narratives. We acquired strong imaginations to accompany the impressive encyclopedic knowledge of our environment we learned through hunting and gathering. Tales around the campfire, stories of our ancestors, legends, myths, and, of course, gods. We’re a species that wants to participate in its own narrative, even if that narrative is unreliable.

Belief number eight is that this narrative of human history is important to study in all its facets, including science, the humanities, and the arts and music. Physically, humans haven’t evolved much in the past 100,000 years or so, but mentally we’ve evolved through many great civilizations in ways that are fascinating and that contributed to our rise as a species.

Belief number nine is that, mentally, we went through our ‘teen’ years between Galileo and Einstein. We began to mature toward mental adulthood by rigorously questioning, observing, measuring, and testing our premises. We bent the planet to our wants and needs with an industrial and scientific revolution.

Belief number ten is that we’ve reached, borrowing from an Arthur C. Clarke title, Childhood’s End. We now possess the ability to destroy the planet for mankind, as well as other species. As adults we must learn to be stewards of our planet and treat it with respect.

I realize I’ve said little about the human condition itself. That is left to explore and think about in the context of the first ten beliefs, but I suspect it will always remain a personal, and sometimes communal, journey for each of us. This is why I write essays — to see what I think about life. We are all part of the overall narrative of mankind and each of us expresses it in a personal way.

To be sure, if we reach mental adulthood intact, the greatest history of Homo sapiens could just be starting. As we look at our current state, we see automation trending ever forward, with artificial intelligence waiting in the wings. We may create a new kind of life form. We may, if wise, develop eco-friendly attitudes about our home planet and change how we obtain energy and food. Our journey as mental adults has just begun. We may even prosper, as long as a random comet doesn’t smash into us again, and as long as we don’t self-destruct. Fingers crossed, humankind!

A Brief Meditation on Solitaire

By Gene Wilburn

“Patience’s [Solitaire’s] design flaw became obvious for the first time in my life: the outcome is decided not during the course of play but when the cards are shuffled, before the game even begins. How pointless is that?” ~ David Mitchell, Cloud Atlas

Napoleon Bonaparte and I have very few things in common. He was short, I am tall. He was an extrovert, I’m an introvert. He was a brilliant military strategist, commanding great armies. I’m a quiet essayist with, maybe, six regular readers. He, evidently, ate arsenic. As far as I know I’m arsenic free, but we do have one shared passion: we both love to play Solitaire, or, as he called it, Patience, with a French accent.

Where the game originated is in question, but it seems to have emerged either in France or in the Balkans sometime in the 18th century. Called Patience, with an English accent, in England, it spread to Canada and the United States and became famous during the Klondike Gold Rush in the Canadian Yukon Territory as a game called, unsurprisingly, Klondike. One imagines lone miners stuck in their huts, whiling away the hours with the cards, or maybe two miners betting on the games, listening to the gales of the Arctic winds dampen their enthusiasm for finding riches. The rules of Klondike proved popular and are the basic rules of the game most of us now know as Solitaire.

Solitaire enjoyed its biggest boost ever by being included as a free computer game in early Microsoft Windows. As a consequence it is now played worldwide by millions of devotees, not to mention bored office workers, but nowhere is it as enjoyable as on an iPad or other tablet, where it has the feel of real cards without all the bother of shuffling, moving around, and occasionally dropping, small pieces of laminated cardboard.

For me, Solitaire is not so much a game as a centring process. When the world is too much with me, late and soon, as Wordsworth said, I turn to Solitaire for a time out and an active, mindful meditation. I don’t so much play to win as to observe what different patterns of randomness can do. In order to make this more interesting I tweaked the rules of the game: I redefined what a Solitaire “win” means. Getting all the cards up to the top is the standard definition of a “win,” but if you play Solitaire with the three-card draw option, as I do, you very seldom win, or even come close to winning, by these rules.

The Solitaire game I play on my iPad is called Real Solitaire HD, by EdgeRift. It keeps a running numeric score of the game as you play it, and through playing hundreds of games I settled on a score of 150 as “beating the odds.” Under my rules, if you beat the odds, you’ve “won.” Of course you may do much better than that, and any game in which you score 200 points or more is a very decisive victory. (Your game of Solitaire may score points differently, so you may have to do your own calibration to determine a reasonable score for winning.)
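
For the programmers among my six regular readers, these house rules boil down to a tiny decision table. Below is a minimal sketch in Python, assuming the thresholds described above; the function name and labels are my own invention, not anything supplied by the app.

    def classify_game(score: int) -> str:
        """Classify a finished game of Solitaire under my house rules.

        Thresholds are calibrated to Real Solitaire HD's scoring;
        other apps score differently and need their own calibration.
        """
        if score >= 200:
            return "decisive victory"
        if score >= 150:
            return "won (beat the odds)"
        if score >= 145:
            return "agonizingly close"  # just shy of the magic mark
        return "lost to the shuffle"

    # A few sample finishes:
    for final_score in (210, 162, 145, 90):
        print(final_score, "->", classify_game(final_score))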

David Mitchell’s complaint in the opening quote about Solitaire being decided at the moment of the shuffle is mainly true, but there are nuances. Part of the fun, for me, is looking at the initial deal and, based on lots of play experience, gauging the odds of beating the odds. If you’re dealt a set of cards that are all red or all black, you can just about kiss your chances of winning goodbye. I play the game anyway because there can be surprises lurking hidden in the deck. It doesn’t happen often, but once in a while an unexpected chain of sequences occurs that changes the odds during play. It’s like watching an unseeded underdog win a tournament. Exciting. I should probably note that I’m easily amused.
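
Just how doomed is an all-red or all-black start? Taking the seven face-up tableau cards as the visible opening deal (an assumption on my part; the face-down cards matter too), the arithmetic works out to roughly one deal in a hundred, and a quick Python simulation agrees:

    import random
    from math import comb

    # Exact probability that 7 cards drawn from a 52-card deck
    # are all one colour: 2 * C(26,7) / C(52,7).
    exact = 2 * comb(26, 7) / comb(52, 7)

    # Monte Carlo check: deal 7 "cards" (0 = red, 1 = black) many times.
    deck = [0] * 26 + [1] * 26
    trials = 200_000
    hits = sum(
        1 for _ in range(trials)
        if sum(random.sample(deck, 7)) in (0, 7)  # all red or all black
    )

    print(f"exact: {exact:.3%}  simulated: {hits / trials:.3%}")
    # Both land near 1%: rare enough to feel like a curse when it happens.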

There are various patterns of Solitaire games that I’ve noted. The ones that are totally doomed from the beginning, with no useful cards turning up, are almost as interesting as wins, in terms of the odds. Then there are those that start well but die before reaching the magic 150-point mark. And the exciting ones that look doomed, but come through with a burst at the last minute that carries them to victory. Or the agony of games that end up at 145, just 5 points shy of winning.

In a way, it’s like the odds in life. People may start well, emerging from a good family and educational background, yet fail to achieve their full potential. Others may have a hard start, but through perseverance pull through, beating the odds. Some are golden — only good things seem to happen and they reach a “perfect” game. And then there are those who never had a good start and never achieved a second chance.

Like life, Solitaire is all about randomness. And randomness can be streaky, with long runs of good games and bad games. Just like life. You play the cards you’ve been dealt. The odds are difficult to predict, or as Winston Churchill said, “It is always wise to look ahead, but difficult to look further than you can see.” Yes, life is very much like the odds of Solitaire: unknown, but hopeful.