Beginnings: Setting a Story into Motion

In July, I shared a video essay by screenwriter Michael Arndt on insanely great story endings. The 90-minute presentation is a brilliant excavation of how narrative works, and of how crisis and catharsis interweave to create an ‘insanely great’ ending.

The Oscar-winning screenwriter of ‘Little Miss Sunshine’ and ‘Toy Story 3’ has also created a shorter lecture on story beginnings.


Here are the key steps Arndt identifies to set up a good story beginning:

Step 1: Show Your Hero Doing What They Love Most

The first step in setting up a story is to identify the protagonist or hero’s ‘grand passion’. This is their defining trait, the centre of their universe. As you introduce the character and the universe they live in, you show your hero doing the thing they love most.

Step 2: Add a Flaw

The character’s grand passion, however, contains a flaw. Usually it is the dark side of their natural love: a good thing taken too far, a fear or a weakness. What is key, however, is that the hero’s flaw is tied to their deepest love and desires.

Step 3: Add a Storm

In the early stages of the story, usually around page 10 of a screenplay, you want to establish ‘storm clouds’ on the horizon of the main character’s world. Your character is walking down the road of life on a nice bright sunny day, and then BABOOM! ~ something comes along and totally blows their joyous life apart, irrevocably changing the path they are on.

Step 4: Add Insult to Injury

This bolt from the blue not only interferes with your character’s life but skewers them, through their grand passion, to their deepest flaw. This wound changes their whole sense of what their future is going to be. To raise the stakes at this early point in the story, add insult to injury, making the whole world seem a little bit beyond unfair.

Step 5: Make Your Character Pick the Unhealthy Choice

All of this sets up your protagonist’s character journey for the rest of the story. Your hero’s grand passion has been taken away, the world is revealed to be unfair, and they come to a fork in the road where they must choose how to deal with their new reality.

There is a high road to take, a healthy responsible choice, or a low road to take. As the audience, we are barracking for the hero to do the unhealthy, irresponsible thing, because we feel his or her pain.

Bring It Home

To put everything right, your character must make a journey, and that journey is the rest of the story. By the end of it, hopefully, they will not only get back what they lost but also heal the flaw that was tied to their deep passion and desire.

The key point of Arndt’s analysis is that the essence of your story comes out of your character’s deepest desires and darkest fears. The thing they love gets stolen away from them, and the world is revealed to be unfair. Their journey to reclaim their lost passion heals their deepest fears, their wound and flaw, and re-establishes equilibrium and peace.

And this is what makes for insanely great story beginnings.

Our Brains Tell Stories So We Can Live

This article was written by Robert A. Burton and published in Nautilus on August 8th, 2019. The original article can be read here.

————————————————

Without inner narratives we would be lost in a chaotic world.

We are all storytellers; we make sense out of the world by telling stories. And science is a great source of stories.

Not so, you might argue. Science is an objective collection and interpretation of data. I completely agree. At the level of the study of purely physical phenomena, science is the only reliable method for establishing the facts of the world.

But when we use data of the physical world to explain phenomena that cannot be reduced to physical facts, or when we extend incomplete data to draw general conclusions, we are telling stories. Knowing the atomic weight of carbon and oxygen cannot tell us what life is. There are no naked facts that completely explain why animals sacrifice themselves for the good of their kin, why we fall in love, the meaning and purpose of existence, or why we kill each other.

Science is not at fault. On the contrary, science can save us from false stories. It is an irreplaceable means of understanding our world. But despite the verities of science, many of our most important questions compel us to tell stories that venture beyond the facts. For all of the sophisticated methodologies in science, we have not moved beyond the story as the primary way that we make sense of our lives.

To see where science and story meet, let’s take a look at how story is created in the brain. Let’s begin with an utterly simple example of a story, offered by E. M. Forster in his classic book on writing, Aspects of the Novel:

The king died and then the queen died.

It is nearly impossible to read this juxtaposition of events without wondering why the queen died. Even with a minimum of description, the construction of the sentence makes us guess at a pattern. Why would the author mention both events in the same sentence if he didn’t mean to imply a causal relationship?

Once a relationship has been suggested, we feel obliged to come up with an explanation. This makes us turn to what we know, to our storehouse of facts. It is general knowledge that a spouse can die of grief. Did the queen then die of heartbreak? This possibility draws on the science of human behavior, which competes with other, more traditional narratives. A high school student who has been studying Hamlet, for instance, might read the story as a microsynopsis of the play.

Despite the verities of science, we are compelled to tell stories that venture beyond the facts.

The pleasurable feeling that our explanation is the right one—ranging from a modest sense of familiarity to the powerful and sublime “a-ha!”—is meted out by the same reward system in the brain integral to drug, alcohol, and gambling addictions. The reward system extends from the limbic area of the brain, vital to the expression of emotion, to the prefrontal cortex, critical to executive thought. Though still imperfectly understood, it is generally thought that the reward system plays a central role in the promotion and reinforcement of learning. Key to the system, and found primarily within its brain cells, is dopamine, a neurotransmitter that carries and modulates signals among brain cells. Studies consistently show that feeling rewarded is accompanied by a rise in dopamine levels.

This reward system was first noted in the 1950s by two McGill University researchers, James Olds and Peter Milner. Stimulating electrodes were placed in presumed brain reward areas of rats. When allowed full unrestricted access to a lever that, when depressed, would cause the electrodes to fire, the rats quickly learned to repeatedly depress the lever, often to the exclusion of food and water. Realizing that our brains are capable of producing feelings so intense that we choose to ignore such basic drives as hunger and thirst was a first step toward understanding the enormous power of the brain’s reward circuitry.

Critical to understanding how stories spark the brain’s reward system is the theory known as pattern recognition—the brain’s way of piecing together a number of separate components of an image into a coherent picture. The first time you see a lion, for instance, you have to figure out what you’re seeing. At least 30 separate areas of the brain’s visual cortex pitch in, each processing an aspect of the overall image—from the detection of motion and edges, to the register of color and facial features. Collectively they form an overall image of a lion.

Each subsequent exposure to a lion enhances your neural circuitry; the connections among processing regions become more robust and efficient. (This theory, based on the research of Canadian psychologist Donald O. Hebb, a pioneer in studying how people learn, is often stated as “cells that fire together wire together.”) Soon, less input is necessary to recognize the lion. A fleeting glimpse of a partial picture is sufficient for recognition, which occurs via positive feedback from your reward system. Yes, you are assured by your brain, that is a lion.

An efficient pattern recognition of a lion makes perfect evolutionary sense. If you see a large feline shape moving in some nearby brush, it is unwise to wait until you see the yellows of the lion’s eyes before starting to run up the nearest tree. You need a brain that quickly detects entire shapes from fragments of the total picture and provides you with a powerful sense of the accuracy of this recognition.

One need only think of the recognition of a new pattern that is so profound that it triggers an involuntary “a-ha!” to understand the degree of pleasure that can be associated with learning. It’s no wonder that once a particular pattern-recognition-reward relationship is well grooved into our circuitry, it is hard to shake. In general—outside of addiction, that is—this “stickiness” of a correlation is a good thing. It is through repetition and the sense of familiarity and “rightness” of a correlation that we learn to navigate our way in the world.


Science is in the business of making up stories called hypotheses and testing them, then trying its best to make up better ones. Thought-experiments can be compared to storytelling exercises using well-known characters. What would Sherlock Holmes do if he found a body suspended in a tree with a note strapped to its ankle? What would a light ray being bounced between two mirrors look like to an observer sitting on a train? Once done with their story, scientists go to the lab to test it; writers call editors to see if they will buy it.

People and science are like bread and butter. We are hardwired to need stories; science has storytelling buried deep in its nature. But there is also a problem. We can get our dopamine reward, and walk away with a story in hand, before science has finished testing it. This problem is exacerbated by the fact that the brain, hungry for its pattern-matching dopamine reward, overlooks contradictory or conflicting information whenever possible. A fundamental prerequisite for pattern recognition is the ability to quickly distinguish between similar but not identical inputs. Not being able to pigeonhole an event or idea makes it much more difficult for the brain to label and store it as a discrete memory. Neat and tidy promotes learning; loose ends lead to the “yes, but” of indecision and inability to draw a precise conclusion.

When we make and take incomplete stories from science, there are moral consequences.

Just as proper pattern recognition results in the reward of an increased release of dopamine, faulty pattern recognition is associated with decreased dopamine release. In monkeys, the failure to make a successful prediction (correlation between expected and actual outcome) characteristically diminishes dopamine release exactly at the time that the predicted event is anticipated but fails to occur. Just as accurate correlations are pleasurable, lack of correlation produces the neurotransmitter equivalent of thwarted expectation (or worse).

Once we see that stories are the narrative equivalent of correlation, it is easy to understand why our brains seek out stories (patterns) whenever and wherever possible. You may have read or heard about the famous experiment in which University of Illinois psychology professor Daniel Simons asked subjects to watch a video and count the number of times a ball is dribbled by a basketball team. When focused on counting, the majority of viewers failed to see a woman in a gorilla suit walk across the playing area. In effect, well-oiled patterns of observation encourage our brains to compose a story that we expect to hear.

Because we are compelled to make stories, we are often compelled to take incomplete stories and run with them. With a half-story from science in our minds, we earn a dopamine “reward” every time it helps us understand something in our world—even if that explanation is incomplete or wrong.

Following the Newtown massacre, some experts commented on the killer having Asperger’s syndrome, as though that might at least partially explain his behavior. Though Asperger’s syndrome feels like a specific diagnosis, it is, by definition, nothing more than a constellation of symptoms common to a group of people. In the 1940s, Austrian pediatrician Hans Asperger noted that a number of patients had similar problems with social skills, eccentric or repetitive actions, unusual preoccupation rituals, and communication difficulties, including lack of eye contact and trouble understanding facial expressions and gestures. The 2013 decision by the American Psychiatric Association to remove the diagnosis of Asperger’s syndrome from its guidebook for clinicians, the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), for failing to conform to any specific neuropathology, underscores the all-too-common problem of accepting a clustering of symptoms as synonymous with a specific disease. Syndromes are stories in search of underlying causes.

Similarly, studies of psychopaths have shown a diminished volume of gray matter in specific regions of the prefrontal cortex. But these findings aren’t the sole explanation for violent acts. Because it is impossible to stimulate a specific brain region to produce complex and premeditated acts, we are left to conclude that while certain brain conditions can be correlated with a complex act, they are not necessarily causing it. Likewise, brain scans that reveal abnormalities in mass murderers may help us understand what might have contributed to their behavior. But the abnormalities are no more the sole explanation for violence than childhood neglect or poor nutrition are. They are stories, albeit with a detailed neurophysiological component, but stories nonetheless.

When we make and take incomplete stories from science, there are often moral consequences. How much personal responsibility should we assign to an individual with a damaged or malfunctioning brain? What is the appropriate punishment and possibility of rehabilitation for such a person? Only when we openly acknowledge the degree to which science is presenting its observations in the form of story can we address this moral dimension. We must each work out our own guidelines for when we think scientific data has exceeded its bounds and has morphed into the agenda and bias of story. Of course this is always going to be a challenge in the absence of a full array of scientific data.

But we can begin by being aware of the various ways that storytelling can insinuate itself into the presentation and interpretation of data. Good science is a combination of meticulously obtained and analyzed data, a restriction of the conclusions to those interpretations that are explicitly reflected in the data, and an honest and humble recognition of the limits of what this data can say about the world.

Loose ends lead to the “yes, but” of indecision and inability to draw a precise conclusion.

As members of the public, we need to ensure that any science we accept as truth has passed through the peer-review process. We should also understand that even peer-reviewed data is not always accurate. In 2011, Nature reported that published retractions had increased by a factor of 10 over the last 10 years, while the number of papers published rose only 44 percent. Also in Nature, scientists C. Glenn Begley and Lee M. Ellis wrote that their colleagues at the biotechnology firm Amgen could reproduce only six of 53 landmark hematology and oncology studies from the scientific literature. Similarly, scientists from Bayer reported in 2011 that they could not consistently reproduce about two-thirds of oncology studies relevant to their work.

When reading science reports, we should also search for information on the limits of the data. Were assumptions made? What do the “error bars,” or graphic representations of variable data, say? We may not always understand the data limits, but we should be worried when some discussion of them is completely absent.

In the end, scientists have the tools, language, and experience to tell us informed, engaging, and powerful stories. In turn, we should judge their studies in the same light in which we judge other artistic forms. Like a literary critic, we should assess the preciseness of language, the tightness of structure, the clarity and originality of vision, the overall elegance and grace of the study, the restraint with which they present moral issues, how they place their studies in historical, cultural, and personal context, and their willingness to entertain alternative opinions and interpretations.

The methodology of science remains one of the great advances of humankind. Its stories, properly told, are epic poems in progress, and deserve to stand alongside the great stories of history.


Agatha Christie: Queen of Crime

Agatha Christie was an English writer, known for her 66 detective novels and 14 short story collections. More than thirty feature films have been based on her work, and her novels have sold around two billion copies.

Her success as a novelist is an example of how an author can develop a truly recognisable voice and brand, hers being the Queen of Crime or the Queen of Mystery. Despite criticism of her populist style, her prolific output of novels featuring recognisable protagonists such as Hercule Poirot and Miss Marple creates a universe to which readers can return time and again to enjoy new episodes.

She was born Agatha Mary Clarissa Miller on 15th September 1890 into a wealthy upper-middle-class family. Christie described her childhood as “very happy”. Her time was spent between her home in Devon, a family house in West London, and parts of Southern Europe, where her family would stay during the winter.

Agatha received a home education and was a voracious reader from an early age. At age 11, after her father’s early death, she was sent to receive a formal education, and later to Paris, where she attended finishing school. In 1910, Christie and her mother Clara moved to Cairo to enjoy the warmer climate, and Christie attended many social functions in search of a husband.

Christie wrote her first short story, a 6,000-word piece on the topic of “madness and dreams”. Other stories followed; however, magazines rejected all her early submissions. Christie set her first novel, Snow Upon the Desert, in Cairo, drawing on her recent experiences there. Still rejected by publishers, she was given an introduction by a family friend and published writer to his own literary agent, who, despite rejecting the novel, suggested she write a second.

Agatha met Archibald Christie at a dance near Torquay in 1913. He was an army officer, and they married on Christmas Eve 1914 while Archie was on home leave from the War. Christie volunteered at home, attending to wounded soldiers at a hospital in Torquay, where she qualified as an “apothecaries’ assistant”. After the war, Agatha and Archie settled in a flat in London and in 1919 welcomed a daughter, Rosalind Margaret Hicks.

Agatha kept writing, and having long been a fan of detective novels, including Sir Arthur Conan Doyle’s early Sherlock Holmes stories, she wrote The Mysterious Affair at Styles. This novel first featured Hercule Poirot, a former Belgian police officer inspired by the Belgian soldiers whom she helped to treat as a volunteer during the War. Her manuscript was again rejected by many publishers; however, after several months, The Bodley Head offered to accept it with revisions. It was finally published in 1920, when Christie was 30 years old.

Her second novel, The Secret Adversary (1922), featured a new detective couple, Tommy and Tuppence. Again published by The Bodley Head, it earned her £50. A third novel, The Murder on the Links (1923), again featured Poirot, as did more short stories. As Agatha kept writing, the popularity of her work grew.

Around this time, the Christies toured the world promoting the British Empire Exhibition, leaving their daughter Rosalind with Agatha’s mother and elder sister. In late 1926, however, Archie asked Agatha for a divorce; he had fallen in love with a woman he had met on the promotional tour. On the evening of 3 December 1926, after Archie had left their house to see his mistress, Christie disappeared, causing a public outcry and a nationwide manhunt. Her disappearance was featured on the front page of The New York Times. She was discovered safe eleven days later, and her global fame was secured.

The couple divorced in 1928, and Archie married his mistress. Agatha retained custody of their daughter Rosalind and kept the Christie surname for her writing. The same year, she left England for Istanbul, and subsequently for Baghdad, on the Orient Express. Late in this trip, in 1930, she met a young archaeologist, Max Mallowan, whom she married. Their marriage was happy and lasted until Christie’s death 45 years later.

During the Second World War, Christie worked in the pharmacy at University College Hospital, London, where she acquired a knowledge of poisons that she featured in her post-war crime novels. For example, so accurate was her description of thallium poisoning that on at least one occasion it helped solve a real case. Also during the war, Christie wrote Curtain and Sleeping Murder, the last cases of her great detectives, Hercule Poirot and Miss Marple. Both books were sealed in a bank vault until near the end of her life.

Christie often accompanied Mallowan on his archaeological expeditions, and her travels with him contributed background to several of her novels set in the Middle East. Her 1934 novel Murder on the Orient Express was written in the Pera Palace Hotel in Istanbul, Turkey; the archaeological temple site of Abu Simbel is depicted in Death on the Nile; and life at the dig site appears in Murder in Mesopotamia. Their extensive travelling also meant that transportation often plays a part in her murderers’ schemes.

From 1971 to 1974, Christie’s health began to fail, although she continued to write. She died on 12 January 1976, at age 85, of natural causes. She remains the most-translated individual author, published in at least 103 languages, and her novel And Then There Were None, with some 100 million copies sold, is one of the best-selling books of all time. Her stage play The Mousetrap holds the world record for the longest run, opening in 1952 and still running today in the West End after more than 27,000 performances.

Whatever one thinks of Agatha Christie one cannot but admire the enormous impact she has made on world literature.

Endings: the good, the bad and the insanely great!

In this short video, screenwriter Michael Arndt outlines the ingredients of what he believes makes a great film ending. Winner of the Oscar for Best Original Screenplay for ‘Little Miss Sunshine’ [2006] and nominated for Best Adapted Screenplay for ‘Toy Story 3’ [2010], Arndt also worked on the script for ‘Star Wars: The Force Awakens’ [2015] and is a true veteran of the craft.

He emphasises several times throughout the presentation that he does NOT intend to suggest that storytelling is formulaic. His analysis is merely an attempt to understand how great stories work by taking the viewer to the point of emotional catharsis.

Arndt points out that many scripts fail to deliver on their endings. While the girl gets the boy, or the hero wins the prize, the emotional catharsis of the story’s resolution is often sorely lacking. After much reflection, he has identified three important ingredients of a great story, which when resolved create a great ending: a personal stake, an external stake and a philosophical stake.

To illustrate what he calls ‘insanely great’ endings, Arndt uses ‘Star Wars: A New Hope’ [1977], ‘The Graduate’ [1967] and his own ‘Little Miss Sunshine’ [2006], each of which, within only two short minutes, brings a resounding emotional climax and catharsis for the viewer.

Watch the full video presentation via the link below:

Endings: The Good, the Bad, and the Insanely Great from Pandemonium on Vimeo.

Brutalism

Brutalist architecture, or Brutalism, is an architectural style which emerged in the mid-20th century. It is characterized by simple, block-like structures and bare building materials such as exposed concrete and brick.

The term “Brutalism” was coined in association with béton brut, French for raw concrete. The style descended from the modernist architecture of the turn of the century and embodied an architectural philosophy often associated with a socialist utopian ideology: a desire to improve the condition of every member of society by peaceful means and endeavour, and by small experiments.

Close to home for me, the Queensland Art Gallery is an example of the Brutalist style; more famous global icons include the Barbican Centre and the National Theatre in London, UK, and Boston City Hall, USA.

Brutalism gained momentum in the United Kingdom during the 1950s as economically depressed, World War II-ravaged communities sought inexpensive construction and design for housing, shopping centres and government buildings. However, the movement as a whole has drawn a range of criticism, including from Charles, Prince of Wales, who denounced Brutalist structures as,

“piles of concrete”.  

Indeed, the style is often found unappealing due to its “cold” appearance and the association of its buildings with urban decay. The forms can project an atmosphere of totalitarianism, while the concrete easily becomes streaked with water stains, moss and lichens, and rust stains from the steel reinforcing bars. Cladding can be applied to improve the appearance of the exterior; however, it has increased fire risks, as exemplified by the 2017 Grenfell Tower fire disaster.

How can architecture modelled on a philosophy of utopian desire to improve society become so dystopian?

In his essay ‘The Feeling of Things: Towards an Architecture of Emotions‘, Peter St. John writes,

The choice of a building’s construction, its material and its structure, has a direct effect on the emotional character of its spaces. Although discussions of construction often centre on issues of performance and technique, ultimately construction is about appearance. 

Brutalist architecture is an interesting study of the intersection between philosophy, politics, history, economics and art. We humans are affected by the buildings we inhabit, which make up our towns and our cities, by the stories and ideologies they embody, and by the emotion and character of their spaces.

When we decide what is right and wrong …[spoilers within].

“You will not certainly die,” the serpent said to the woman. “For God knows that when you eat from it your eyes will be opened, and you will be like God, knowing good and evil.” When the woman saw that the fruit of the tree was good for food and pleasing to the eye, and also desirable for gaining wisdom, she took some and ate it.

Genesis 3: 4-6

In May 2019, the epic HBO TV series Game of Thrones came to an end. The 8-season, 73-episode series first aired on April 17, 2011, and the finale drew a staggering 17 million viewers worldwide [not including illegal downloads]. Despite controversy and fan protest over its conclusion, the show shattered records as one of the most watched TV series of all time.

The now famously controversial final season was reduced from the usual episode count to only 6 intense episodes full of battle scenes and special effects. At approximately $5 million–$10 million in production budget per episode, the final season was ‘epic’ indeed.

In a poetic soliloquy summing up the epic series, Tyrion Lannister declares:

What unites people? Armies? Gold? Flags?

…. Stories. There’s nothing in the world more powerful than a good story. Nothing can stop it. No enemy can defeat it.

And an epic story it is. In an earlier post, Game of Faiths, I discussed the series and analysed its rich world of spiritual and religious ideas. Jon Snow is styled by George R. R. Martin as an epic hero of mythic narrative, a Christ-like figure of messianic proportions.

It is Jon Snow who demonstrates he is a true leader, one worthy of this cosmic battle. He sacrifices for his men and gains their loyalty and trust. He is betrayed at the hands of his friends and murdered, but he returns from the clutches of death, prompting the Priestess of Light to declare him Azor Ahai, the one prophesied to bring balance between light and dark and to end the Great Battle with the forces of darkness and death.

https://bearskin.org/tag/game-of-thrones/

However, the final series falls shy of such predictions. Jon does not kill the Night King, ending the long winter, nor does he take the Iron Throne to rule Westeros in peace. Instead he stands by and watches the demise of his love, Daenerys, maddened by grief and power-lust.

She falls prey to the same fate as her Targaryen ancestors, becoming a ‘mad queen’, mercilessly torching the city that should be hers, and beckoning Jon to join her in creating a new future world, styled in her version of ‘goodness’.

With imagery allusive of post-World War II destruction, Daenerys looks over a destroyed city covered in a layer of white ash, including the incinerated flesh of its people. Unrepentant of such necessary evil, she summons Jon to join her to ‘break the wheel’ of tyranny and rule a new world together.

Daenerys ~ ‘It’s not easy to see something that has never been before. A good world.’

Jon ~ ‘How do you know? How do you know it’ll be good?’

Daenerys ~ ‘Because I know what is good. And so do you.’

Jon ~ ‘No I don’t’.

Daenerys ~ ‘You do. You do, you have always known. ‘

Jon – ‘What about everyone else? All the other people who think they know what is good?’

Daenerys ~ ‘They don’t get to choose.’

Daenerys’ words hearken back to one of the oldest stories in human history, a narrative in which humans first fall when they wish to decide for themselves what is good and what is evil.

Alongside Nazi Germany and many other of history’s horrible despots, Daenerys goes the way of wicked men and women whose power consumes them and their humanity when they decide their standard of goodness is unique and superior.

Jon Snow does not sit on any throne, but instead honours a greater standard of good, serving his family and his nation sacrificially.

Whatever you think of the final series of Game of Thrones, the 8-season epic drama has truly set new standards for television’s epic fantasy storytelling.

For a good examination of why the final season so disappointed fans of the series, read this excellent article in Scientific American, by Zeynep Tufekci.

Out of Africa

“I had a farm in Africa, at the foot of the Ngong Hills,”

And so, in the lilting Danish accent of Meryl Streep, opens Out of Africa, a 1985 film directed by Sydney Pollack.

With the sweeping plains of East Africa in view, an attractive cast including Streep and Robert Redford, and a beautiful musical score by John Barry, ‘Out of Africa‘ went on to win 7 Academy Awards and earn over USD $227 million at the box office.

Based on the memoir of the same title by Danish author Karen Blixen [writing as Isak Dinesen], the original book was first published in 1937 and recounts the seventeen years when Blixen made her home in Kenya, then called British East Africa. The film script was adapted with additional material from Dinesen’s book Shadows on the Grass and other sources.

The book’s title is probably an abbreviation of the famous ancient Latin adage,

Ex Africa semper aliquid novi.

Pliny the Elder

Out of Africa, always something new.

The book and film are a lyrical meditation on Blixen’s life on her coffee plantation, as well as a tribute to some of the people who touched her life there. It provides a vivid snapshot of African colonial life in the last decades of the British Empire.

The book is noted for its melancholy, nostalgic and elegiac style; biographer Judith Thurman describes Out of Africa using an African tribal phrase:

clear darkness.

The tale covers the deaths of at least five of the important people in Blixen’s life, and is a meditation on her feelings of loss and nostalgia. She describes her failed business, and comments wryly on her mixture of despair and denial, of the sadness she faces there. She is a brave and hard-working woman for whom almost nothing flows smoothly: marriage, love, business, health. Everything is challenging, even crushing.

Why, then, is such a story, so sad and so melancholy, so enduringly popular among moviegoers and readers?

Perhaps, in true modernist and existentialist style, Blixen captures the feeling of living: the sights, smells, and sensations of a foreign land and the strange and diverse people she meets there. The bittersweetness of existence is shared with us through her experience, marked by love, loss, desire, knowing, holding and surrendering.

Blixen was admired by her contemporaries, including Ernest Hemingway, who is reported to have said on winning his own Nobel Prize in 1954,

I would have been happy – happier – today if the prize had been given to that beautiful writer Isak Dinesen.

Big Little Lies

I recently attended a debate in central London hosted by Intelligence Squared entitled ‘Identity Politics is Tearing Society Apart‘. The panel boasted the editorial director of BBC News, Kamal Ahmed, and novelist Lionel Shriver, among others.

Identity politics is defined in Oxford Bibliographies

as a tool to frame political claims, promote political ideologies, or stimulate and orientate social and political action, usually in a larger context of inequality or injustice and with the aim of asserting group distinctiveness and belonging and gaining power and recognition.

Vasiliki Neofotistos (2013). “Identity Politics”. Oxford Bibliographies. Oxford University Press. Archived from the original on 27 October 2018. Retrieved 9 June 2019.

Arguments in favour of the motion focused on the claim that identity politics has fueled a backlash of populism, bringing alt-right figures to the fore, destroying society’s broad sense of the common good, and increasing antagonism and fragmentation in our society.

Upon entry and upon exit, the audience was polled for agreement or disagreement with the debate title, and a 55% majority left the debate agreeing that identity politics was indeed tearing society apart.

I, however, did not agree.

Recently I completed the 7-episode first season of ‘Big Little Lies‘, an HBO original series produced by and starring Nicole Kidman and Reese Witherspoon. The American drama television series, based on the novel by Australian author Liane Moriarty, premiered on February 19, 2017, and follows the lives and relationships of four women in Monterey, California. The women are united around their children, who share a grade one class at the local school.

Their community is socially and economically homogeneous. The women are white Americans, upper middle class, heterosexual, well educated, and nice people. While there is an African American character, she is a vegan yoga instructor who is socially and economically their equal. One character is single and working class, but she is soon brought into the fold by the other women through shared experience. Under the surface of this idyllic beach-side life, where women share expansive homes with their handsome, domesticated husbands, lie violence, lies, betrayal and hatred. Each character has layers, motives, jealousies and wounds which drive them through the story arc, Shakespearean at times in range and depth. It’s clear that this society is being torn apart, yet identity politics plays no part in the pain and violence which exist.

Surely there is something deeper than identity that tears our society apart?

Judeo-Christian theology, upon which our western society is based, teaches radical love and service to the ‘other’, most emphatically the ‘other’ who is powerless, stateless, and voiceless. As such, duty bearers and power-holders have a mandate to identify with and support the recognition of the group who would otherwise be excluded from rights and privileges. Judeo-Christian theology is the very basis of ‘identity politics’.

So why do good, moral people feel identity politics has gotten out of hand, tearing at the fabric of society? Why does identity politics take the fall for the violence and dissolution of society?

A quick perusal of any history text shows that every generation of society has been riven by racial, geographical, class and religious wars – each tearing society apart in different ways. The 18th and 19th centuries were defined by class political wars, and the 16th and 17th centuries were defined by religious political wars. Earlier centuries were marked by ethnic wars and indeed the annals of history stretch back into time immemorial to tell of countless epochs of bloodshed.

It begs the question whether it is in fact something deeper, something more human, which is the enemy of human peace.

If it were identity which bred violence, one solution for humanity might lie in what the Buddhists teach: the denial of identity, the dissolution of any ego-attachment to self or otherness, and the blissful nirvana of non-being. It is captured in the lyrics of the late, great John Lennon’s ‘Imagine’: a world where no countries, religion, or possessions exist, where humans live in peace and ‘as one’.

The challenge with such a philosophy is that it negates love, which, from the ground of self, engages the ‘other’ and gives of self to the other.

In ‘Big Little Lies’ no one escapes the narrative to be ‘good’ or ‘ethical’. Everyone has their story, their motives, their depths. It was Aleksandr Solzhenitsyn who wrote:

But the line dividing good and evil cuts through the heart of every human being. 

It is not the negation of self that brings about peace, nor is it the eradication of ‘identity politics’ which will be the solution to our social ills or the healing of our social fabric. It is only when we address the violence that exists in the human heart that we can begin to find true and lasting peace.

Why Nations Fail

As a follow-on to the Bear Skin blog post of several weeks ago titled ‘What would Machiavelli Do?‘ comes this short comment on the book “Why Nations Fail” by Daron Acemoglu and James Robinson.

While Niccolò Machiavelli gave a very well thought out treatise on what princes, or individuals of power, should do to maintain a stable state, Acemoglu and Robinson give a very well thought out treatise on how complex political and economic systems contribute to the prosperity [or failure] of a state.

In brief, their book puts emphasis on the need for centralised power in much the same way Machiavelli does. Their argument is that prosperity is generated by investment and innovation. Without centralised power, there is disorder, which is anathema to investment.

However, for investment and innovation to flourish, entrepreneurs and inventors must have good reason to believe that, if successful, they will not be plundered by the powerful. If the institutions of power enable the elite to serve its own interest – a structure the authors term “extractive institutions” – those interests ultimately undermine the very innovation and investment necessary for prosperity.

Numerous case studies are given of both ‘inclusive’ and ‘extractive’ systems of government creating ‘virtuous circles’ and ‘vicious circles’ of national prosperity or decline. Botswana is lauded as a contemporary example of a nation which has prospered under good leadership. At the critical juncture of independence from colonial rule, wise Botswanan leaders such as its first president, Seretse Khama, [see A United Kingdom] and his Botswana Democratic Party chose democracy over dictatorship and the public interest over private greed. Botswana holds regular elections, has not since had a civil war and enforces property rights. When diamonds were discovered, a far-sighted law ensured that the newfound riches were shared for the national good, not elite gain.

What is of most startling interest when contrasting the two works of political theory and philosophy is that Machiavelli eschewed ‘morality’ and what ‘should’ be done in favour of what is most politically expedient, while Acemoglu and Robinson seem to be pointing us back to ancient wisdom. They argue for leadership that cedes short-term power and gain for long-term national good, and which promotes public interest over private greed. Yet they argue for this on economic rather than morally grounded reasons.

This begs the question: do ancient moral codes derive their wisdom from systems thinking? And are they less divinely illuminated and more beholden to insight taken from the consequences of decisions across generations, rather than within the lifespan of any individual?

One could counter Machiavelli on his point:

He who neglects what is done for what ought to be done, sooner effects his ruin than his preservation.

… with the counter-wisdom that one [a ruler] who neglects what ought to be done sooner effects the ruin of future generations. This alone should give any leader pause to consider their decisions, lest their short-term success indeed bring about ruin for those who follow.