The Globe and Mail’s July 21, 1969, front page was intoxicating. Bold, green, three-inch-high print announced MAN ON MOON. It reported 35,000 people breathlessly glued to a big TV screen in Toronto’s Nathan Phillips Square who cheered at 10:56 pm when Neil Armstrong stepped from the lunar module. Mayor Dennison delivered a brief speech calling it “the greatest day in human history.” He may have been right. What he couldn’t know, and the Globe missed, were the important lessons contained in the paper that day, lessons that resonate today.
The moon adventure was the culmination of an effort begun by President John F. Kennedy on May 25, 1961. He had just returned from meetings with Soviet premier Nikita Khrushchev. While Kennedy negotiated, Khrushchev had hectored. Kennedy became convinced that the Cold War was about to turn hot.
Upon his return, he called a special meeting of Congress and asked for a whopping $1.6 billion increase in military aid for allies and $60 million to restructure the American military. He called for a tripling of civil defense spending to help Americans build bomb shelters for a nuclear holocaust that, he warned, was a real possibility. The president also said: “I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth.” His popularity surged.
It was daring and presumptuous. The Soviets were far ahead of the United States in space exploration. But that day, and later, Kennedy expressed the courageous new effort in soaring rhetoric that appealed to America’s sense of exceptionalism and its Cold War fears. When cheers arose from public squares and living rooms only eight years later, and everyone that night instinctively looked up, it was the culmination of Kennedy’s dream for the world and his challenge to America.
Kennedy did not micro-manage the NASA project. He set the vision and got out of the way. He did not badger the agency regarding tactics or berate it over temporary failures. He didn’t question the intelligence or patriotism of those who politically opposed his ambitious goal. Rather, he met with them, listened, and tried to convince them of the value of ambition. He gave NASA the money it needed then trusted the scientists and engineers to act as the professionals they were. His vision and leadership spurred the team and survived his death.
The Globe and Mail’s July 21 front page declaring his vision’s realization did not mention President Kennedy. However, a smaller headline at the bottom noted, “Woman dies in crash, police seek to charge Kennedy.” The story explained that Senator Edward Kennedy, the president’s brother, would be prosecuted for leaving the scene of an accident.
On July 18, with the Apollo astronauts approaching the moon and their rendezvous with destiny, Senator Kennedy had attended a party on Chappaquiddick Island for six women and two men who had worked on his brother Bobby’s doomed 1968 presidential campaign. While driving 28-year-old Mary Jo Kopechne back to her hotel, he took a wrong turn, then missed a slight curve on an unlit road and drove off a bridge and into eight feet of water.
Kennedy managed to escape the submerged car and later spoke of diving “seven or eight times” but failing to free Kopechne. He walked back to the party and was driven home. That night he consulted with advisors and then, eight hours after the accident, called the police. A coroner reported that an air pocket probably allowed Kopechne to survive for three or four hours before drowning. A quicker call for help, he concluded, would have saved her life.
In the 1990s, Edward Kennedy would become the “Lion of the Senate,” guardian of the Democratic Party’s progressive wing, and a model of bipartisanship. However, when he ran for his party’s nomination for president against the incumbent Jimmy Carter in 1980, many saw not a lion but a liar, and not a politician but a playboy. Chappaquiddick appeared to reflect a belief that ethics, morality, and the rule of law applied only to others. Voters punished his conceit by withholding support.
It was all there in the Globe and Mail, nearly 50 years ago this week. We have the legacy of one brother who, despite his personal flaws, understood the nature, power, and potential of leadership. He knew what it took to be an effective president. And we have the other brother who seemed, at that point, to understand only the arrogance of privilege, the hubris to believe that he was above the law, ethics, morality, and decency. They are lessons of the moon and the bridge.
And now, as we cringe through our inability to tear ourselves from the tragedy unfolding in Washington, as we watch political leaders displaying the characteristics of one Kennedy brother or the other, we wonder if the lessons of the moon and bridge have been learned.
If you enjoyed this column, please share it with others on Facebook, Twitter, or your social media of choice and consider leaving a comment.
In June 1816, Mary Shelley and her husband were enjoying a dinner party with a group of friends. They talked of books and poetry and swapped German ghost stories. The dinner led Shelley to write a short story that she later turned into her 1818 novel, Frankenstein. The book was a cautionary tale of a research scientist who successfully assembled a living being from corpses, only to have his creation turn on him and wreak havoc on the community. The book asks us to be aware of the Frankensteins of unintended consequences all around us. Let’s consider one.
One day in 1991, researchers working in England for the pharmaceutical giant Pfizer were taken by surprise. They had been toiling away to develop a chemical compound to treat heart problems. They had come up with Sildenafil. It looked promising but then, during clinical trials, older men who had been taking the compound reported rock-hard erections lasting more than an hour. Those in the placebo-taking control groups reported no such effects. The Pfizer heart research project took a quick turn. More tests were done, the discovery was deemed sound, and so a method of mass producing the compound as a pill in the proper dosage was quickly established. The research team had inadvertently invented Viagra.
Patents were obtained. Observers wryly noted the unusual lightning speed with which the predominantly middle-aged men in charge of so many of the world’s government approval processes allowed the little blue pill to machete its way through red tape. Within six months of its American approval in March 1998, 7 million prescriptions had been written, making it the country’s most popular medication.
Pfizer’s future changed and its stock and profits rose dramatically. Commercials changed acceptable public conversations by dragging discussions of impotence, or erectile dysfunction, as it was renamed, from the shadows. The research changed the lives of millions of men and couples for whom impotence had been a problem. All was well.
But then, retirement homes and senior-dominated communities began reporting skyrocketing numbers of cases of sexually transmitted diseases. Arizona’s Pima and Maricopa counties, for instance, have unusually large senior populations. From 2005 to 2009 the number of people older than 55 who contracted syphilis and chlamydia for the first time in their lives rose by 87%. As is the case with most corporate, applied research, Pfizer never released the names of those who created Viagra so we don’t know their reaction to the good and bad changes their work brought about. But Mary Shelley would have smiled.
What other research and inventions bring about Frankenstein change? What small decisions have we made that were big ones in disguise, putting us on roads we had hoped never to travel? How many political decisions made for expedient or partisan reasons have helped some but hurt many? Can we rise up as the torch-bearing villagers did in Shelley’s novel and defeat our Frankensteins? Let’s first identify them in our lives and our communities. Then, let’s light the torches.
If you enjoyed this column, consider sharing it with others by reposting on Facebook or your social media of choice or checking my other work at http://www.johnboyko.com
The crowd hushed, cameras snapped, and Senators sat respectfully still as the slight, pale woman limped slowly to the big table then, painfully, took her seat. It was June 4, 1963, and Rachel Carson was 56 but looked much older. She was dying. Cancer had fractured her pelvis, taken a breast and, hidden by a dark wig, her hair.
Carson had worked as a United States Fish and Wildlife Service marine biologist and written articles for a number of magazines. She had turned her love of the sea and outrage with what was happening to rivers, lakes, and oceans into three best-selling books: The Edge of the Sea, Under the Sea-Wind, and The Sea Around Us. Each presented disturbing ideas and scientifically sophisticated arguments without jargon, preaching, or rancour. She married her knowledge, passion, and writing and investigative skills in the creation of her next book: Silent Spring.
While researching the book, Carson had served on the Natural Resources Committee of the Democratic Advisory Council, where she became aware that Massachusetts Senator John F. Kennedy had initiated the Cape Cod National Seashore Act. Kennedy had read her books on the sea and then the committee report, and so when he sought his party’s nomination for president, he invited Carson to join the Women’s Committee for New Frontiers.
Photo: Rachel Carson Council
As president, Kennedy read Silent Spring pre-publication excerpts in the New Yorker magazine. He was moved by Carson’s detailing the devastating effects of pesticide use on animal and human health and invited her to attend a White House conference on conservation. The conference led to Kennedy announcing that, because of Carson’s work, he was ordering the Department of Agriculture and the Public Health Service to investigate the dangers of pesticide use and the establishment of the President’s Science Advisory Committee to study links between pesticides and health.
Silent Spring became an instant bestseller when published in September 1962. It explained that DDT had been around since 1874, that the American army had used it in the Second World War to delouse soldiers, and that Paul Hermann Müller had won the 1948 Nobel Prize for determining its effectiveness in killing mosquitoes and other pests. Carson’s book explained how DDT was also killing fish, birds, and people. Her title warned of the day that birds would be gone and skies without song. Most shockingly, Silent Spring told of how the government, scientific community, and the companies making and selling pesticides knew of their harmful effects. But there was money to be made. And so, the evidence was ignored, hidden, and denied. Carson asked an essential question: “How could intelligent beings seek to control a few unwanted species by a method that contaminated the whole environment and brought the threat of disease and death even to their own kind?”
Pesticide manufacturing companies Cyanamid, Monsanto, and Velsicol were outraged. They attacked. Velsicol threatened to sue Carson, her publisher, and the New Yorker. They even tried to stop the publication of an article about the book in Audubon magazine. The companies paid scientists to write editorials and articles that belittled Carson and her conclusions. The National Agricultural Chemicals Association published a booklet, Fact and Fancy, that savaged Kennedy and Carson. It argued that Americans would suffer a food shortage without DDT.
In May 1963, the President’s Science Advisory Committee released a 46-page report, Use of Pesticides. With point after well-supported point, it said the companies were wrong and Carson was right. It stated, “Until the publication of Silent Spring by Rachel Carson, people were generally unaware of the toxicity of pesticides…The Government should present this information to the public in a way that will make it aware of the dangers while recognizing the value of pesticides.”
A month later, as part of that public education process, the sick, fragile, and wan Carson took her seat before the Senate subcommittee. She briefly summarized Silent Spring’s findings and then listed specific recommendations. The government should ban aerial spraying without the permission of landowners. Citizens should enjoy guaranteed security against poisons used by companies, governments, and private individuals. Corporations making pesticides, and all those using them, should be strictly regulated. She advocated the outright banning of DDT. The government should fund and support grassroots citizen organizations and non-government organizations to encourage awareness of environmental issues.
The environmental movement was born. American companies sold 90,000 tonnes of DDT in 1963, but production decreased the next year and every year after that. It took a while, but in 1972, DDT use was banned in the United States. Carson’s name was invoked and Silent Spring was read by those advocating and then celebrating President Nixon’s Clean Air and Water Acts, the National Environmental Policy Act, the Endangered Species Act, and, in 1970, his establishment of the Environmental Protection Agency.
In her 1996 book, Our Stolen Future, Dr. Theo Colborn wrote about chemicals that interfere with our bodies’ hormonal systems, called endocrine disrupters. She credited Silent Spring with awakening her and other scientists and researchers to the dangers of manmade chemicals and noted how it was still inspiring discoveries and environmental advocacy.
Breast cancer took Rachel Carson in 1964. But her voice still echoes, for Silent Spring is still read. It still inspires. It still exasperates. Silent Spring is still discussed around the world every Earth Day.
Books that matter always educate and infuriate and important authors, like important ideas, are always ignored, then mocked, then attacked, and, in time, celebrated. Books measure how far we have come and how far remains to go. As the American government appears ready to deregulate corporations and eviscerate environmental regulations, and women are leading the charge to fight the turning back of the clock on this and other issues, perhaps Silent Spring is more important now than ever.
Rachel Carson’s Silent Spring helped change the world. It may need to change it again.
Is every child my child? Does ideology end at the bedside of a sick child? I ponder those questions every day when I watch the bravest person I know – my granddaughter. Consider this:
A healthy, happy little boy was suddenly insatiably thirsty. He began urinating a lot and often and feeling increasingly tired. His skin became thin and dry. No matter how much he ate, he continued to lose weight. A few months later he was weak, gray, and skeletal. His eyesight weakened and then his retinas detached rendering him blind. Within nine months, the now bedridden child gasped for air. Less than a year after falling sick, he slipped into a coma and, mercifully, died.
The sad part of this tragic tale is that it was not rare. Ancient Egyptians, Greeks, Chinese, and Indians saw children and adults die in this horrible, mysterious fashion. A first-century Greek physician, Aretaeus of Cappadocia, described the disease as “the melting down of flesh and limbs into urine.” He used the Greek word for “passing through” or “siphon” to name it: diabetes.
For hundreds of years, researchers were stymied. It was suggested that diabetics eat things that the body would have to fight to turn to urine, such as almonds and broken bits of coral. It didn’t work. Seventeenth-century Scottish researchers developed a diet treatment in which patients ate nothing but blood puddings, fat, and rancid meat. It didn’t work. In the 1800s, doctors bled diabetics; every day for a week or so, a vein would be opened and pints of supposedly bad blood were drained. It didn’t work. In the early 1900s, diabetic children were hospitalized and fed only 450 calories a day. They were starved to death. German scientists found that eating carbohydrates was linked to symptoms and so they locked up diabetic children and force-fed them oatmeal. Nothing worked.
An important step came when German researchers used autopsy studies to link diabetes to the pancreas. The pancreas is a small seahorse-shaped gland that lies between the stomach and spine. You can locate it by pressing your right thumb and little finger together, keeping your other fingers straight and together, and then placing your thumb at the centre of your stomach, even with your lowest rib. Your three extended fingers now approximate the location and size of your pancreas.
German researcher Paul Langerhans advanced learning by postulating that the pancreas produces two types of secretions. One is released into the small intestine and aids with digestion; he called it external. The other is released into the bloodstream to regulate glucose levels; he dubbed it internal (produced by cell clusters later named the islets of Langerhans). It was postulated that without the internal secretion, sugars could not be metabolized from food, and so sugar entered the bloodstream and gathered in increasingly high levels as the body could no longer clean and flush it out. Then the awful symptoms began.
It was a breakthrough, but for decades afterward researchers tried and failed to find a way to utilize the new understanding by doing artificially what a dead pancreas could not – extracting the internal secretion from a healthy pancreas and injecting it into a diabetic patient. People continued to die.
Photo: Queen’s University
Frederick Banting grew up on a small Ontario farm. He undertook medical training at the University of Toronto. After service as part of Canada’s First World War Army Medical Corps, and becoming both wounded and decorated, he became a surgeon in Toronto. He later opened a small practice in London, Ontario. The 29-year-old was barely eking out a living.
In the middle of a sleepless night, he was reading a medical journal about diabetes research when he experienced a eureka moment. It appeared clear to him that when extracting secretions from the pancreas, researchers were missing the possibility that external secretions were damaging the internal secretions. The two had to be separated, he thought, and then a serum could be developed using only the internal secretions.
The next weekend, he arrived without an appointment at the office of the University of Toronto’s professor of physiology, J. J. R. Macleod, who was famous for his work on the metabolism of carbohydrates. Macleod listened patiently but was unimpressed by the young man with little knowledge of current diabetes research, no Ph.D., and no clinical research experience. After several more visits, Banting was about to give up when he saw the professor lean back and close his eyes. But then Macleod leaned forward, smiled, and said the idea just might work.
In April 1921, Banting arrived at Macleod’s small lab. He met fourth-year student Charles Best, who would assist. They used dogs. Banting removed the pancreas of some to induce diabetes. He removed part of the pancreas from others and then, with blood vessels still in place, sewed the severed portion just below the skin of the abdomen. He then tied off, or ligated, the grafted portion and waited for the external cells to die. Internal cell clusters were then extracted, purified, and processed, using water at first and, as they learned more, alcohol. They then injected the extract into depancreatized dogs. Some showed slightly positive reactions but most didn’t. Many died. The determined Banting and Best slaved away in the smelly, sweltering lab, painstakingly honing the process of removing impurities from the extracts.
In July, after a number of revisions and failed experiments, they injected a depancreatized white terrier with duct-ligated extract. Blood sugar levels dropped from dangerous highs to near normal levels. With their extract in its body, the dog was metabolizing sugar as if its pancreas was still there. But because they could not yet estimate the amount of extract necessary, the dog died. They learned. They injected another dog that had fallen into a diabetic coma with new extract and marveled as it awoke, wobbled to its feet, and then walked about the room. Banting and Best were ecstatic. They called their extract Isletin.
A month later, shortly after Macleod’s return from an extended absence overseas, Banting stormed into the professor’s office with a list of demands including a salary, more assistance, and changes to the lab. A young man was hired to tend to the dogs, biochemistry professor James Bertram Collip joined the research team, a bigger lab was found, back pay for Banting and Best was paid, and a university lecturing job was found for Banting, who at that point was just a few dollars from destitution.
Research moved more quickly when Banting began using the pancreases of unborn calves that he procured from local abattoirs. The diabetic dogs began responding better and living longer. Finally, it was time.
His name was Leonard Thompson. He was 14 years old. He was from a poor family and so was a public ward patient at the Toronto General Hospital. His diabetes had been diagnosed nearly two years before. He was emaciated and near death. He weighed only 65 pounds. His skin was gray, he could no longer walk, and had trouble focussing and even staying conscious. Banting explained the extract trial to Thompson’s father who quickly consented.
On January 11, 1922, two doses of Isletin extract were injected into young Thompson’s backside. Thompson was too ill to even flinch. The sugar in his blood and urine dropped by 25%. It was good but not great. The disappointing results were deemed the result of impurities in the extract and so they went back to work, with Collip whipping up batches like a chef trying new recipes.
Two weeks later they walked back across the street to Toronto General Hospital’s H Ward. Leonard’s condition had worsened. He was now fading in and out of a coma. The boy was given two injections that afternoon and one the next morning. It worked. Miraculously, he sat up. He smiled. The fog that had haunted his eyes for so long suddenly cleared. He asked for food. Leonard was Lazarus.
Banting opposed patenting what they were now calling insulin. He insisted that medical advances belonged to all and were for the good of mankind. A patent was eventually applied for in the names of Best and Collip, with the direction that it would be assigned to the University of Toronto. It was written so that anyone could use their process to manufacture insulin but no one else could patent the process. It thereby prevented anyone from stopping anyone else from manufacturing insulin. American legalities later led to Banting’s name being added to the patent.
True to Banting’s principles, the Indiana-based Eli Lilly and Company was afforded an exclusive deal to manufacture insulin in the United States but for the first year it had to be distributed free of charge. Toronto’s Connaught Laboratories manufactured and distributed free insulin in Canada. It was also agreed that the university would happily send the formula to any researcher in the world for free, in return for a promise that insulin would not be produced for sale.
By the end of 1923, diabetes patients in Canada, the United States, and parts of Europe were receiving insulin injections. Each represented an inspiring and heartrending story of recovery as they stepped back from death’s door. The 1923 Nobel Prize for Physiology or Medicine was awarded to Banting and Macleod. Macleod shared his prize money with Collip and Banting shared his with Best.
Among the millions of lives that have been saved by the work of Banting and his Toronto colleagues, and those upon whose shoulders they stood, is my granddaughter. She’s eight years old. For three years now she has pricked her thumb to draw then test blood six to ten times a day. It hurts every time. Trust me, I’ve done it, and it hurts. She now injects herself with insulin six or more times a day. She watches what she eats and her Mom counts every carbohydrate consumed to adjust insulin dosages. It’s an awful disease but it doesn’t define her. Before the work of Banting, Best, and the others, though, it would have killed her.
We know now that type two diabetes develops mostly in adults and is largely due to lifestyle factors. But type one attacks children. No one knows why. For some reason, a virus that gives some kids a cold kills the pancreas of others. Today, over 420 million people around the world and about 10% of Canadians have diabetes. Most have type two. About 26,000 Canadian children have type one.
And so we are back to our initial question. God bless the determined researchers who are working in labs every day, uncelebrated, and often underfunded and underpaid. And God bless those who support the idea that our circle of community involves devoting charitable giving and a sliver of our tax money for research. We are helping people we’ll never meet. We are making all children ours. We are saying where ideological arguments should die so that fewer children will: at the bedside of a sick child.
Someday the cure for type one diabetes will be found. Banting and Best will be remembered. And on that day, I will stand with my granddaughter, and we will cheer.
If you enjoyed this column, please share it with others, consider leaving a comment, and checking out my other columns at http://www.johnboyko.com
Last fall, after recalling some obscure lyric, I said to my friend Chris, “I’ll miss my memory when it’s gone.” Chris is a witty guy. He said, “No you won’t.” Sadly, he was right. This week has led to my considering memory over and over again and it’s left me humbled.
My little band was performing its once-a-month gig at the local pub, the Canoe and Paddle. As I began to count in a song, I realized that I didn’t have a clue as to its first line. I have cheat sheets for some songs but not for this one and, suddenly, Billy Joel’s It’s Still Rock and Roll to Me was gone.
I began playing the thumping guitar part, moved a bit and smiled as if my playing it so long was just part of the show, and then, in a flash, the first line appeared as if in skywriting. If I can get the first line then everything else – the lyrics, chords, guitar parts, arrangement – all click into place. And it happened. But how did it happen? And what happens, I thought, when one day it fails to happen?
It occurred again with a speech I delivered this week about my new book. Like always, I never want to bore an audience with reading so I had no notes. I was fighting a cold and was feeling awful. During the introduction I shivered with sudden chills and then felt drips of sweat. As I stood, I felt dizzy and had to concentrate on smiling and not falling. No part of me was thinking of what to say as I placed a hand firmly on the table that, thank goodness, was close by. Then, from out of nowhere, came the stories, jokes, names, dates, and everything I needed for the 30-minute talk. Where is this nowhere? Again, what happens the first time that it fails to produce?
Like every week, I enjoyed time with my one-year-old granddaughter. She is a beautiful marvel, but what else would you expect me to say? Her walking and talking are akin to those of a hopelessly charming drunken sailor. Her smiles, peek-a-boo and ball-rolling games, and warm cuddles send my heart soaring. But while crunching my knees on the hardwood and melting with her giggles, I considered how much of all this she’ll remember – nothing.
My great grandparents’ Port Dover farm had a bench that encircled a big tree. The corn stalks across the road were as tall as mountains and the chickens in the dark, old barn were scarier than the wicked witch’s flying monkeys. And then there was the big kitchen, and my great grandfather’s stubble, and the big red swing. The farm was sold when I was six but the shards of memories remain. But for things that happened when I was one – nothing. I know things that happened before I can recall them affected and helped shape me as things are now shaping my granddaughter but my actual memories are, and with her will be, an empty well.
Like every week, I also spent time with my father, seventy-nine years older than my granddaughter. We discussed the impending doctor’s appointment and what might have to happen. Then it did. He has all the coping mechanisms in place with a day timer always in his pocket, a wall calendar, and numbers written by the phone. The scaffolding is there with people cleaning the house and shovelling the snow. But this was one more blow, a devastating blow. Taking cabs from now on is not the end of the world but it is certainly another step in a journey that is proceeding far too quickly. He’s always been a good man and still is. But one important person in my life is growing toward her memory while another is growing out of his.
Scientists define memory as electrical brain impulses that encode, file, and retrieve information. Poets write and sing of misty places beyond the bounds of time and where people and places and smells and smiles are clearest when our minds are calmest. Who is right rests upon who we are, the machines or the ghosts within them. The scientists and poets are both right and both wrong.
This week I was forced to consider how much of what I love is dependent upon memory. I was forced to consider how much of who I love is dependent upon memory. I will never forget this week, but then again.
If you enjoyed this column, please consider sharing it with others on Facebook or your social media of choice.
As a historian, my job is to urge greater understanding of where we are through offering fresh perspectives on where we’ve been. My humble efforts constantly have me discovering things I never knew while challenging myself to reconsider things I thought I knew for sure. The curiosity quest has led to more questions than answers, which, I think, is as it should be. The following are among those issues and queries currently furrowing my brow.
Science: In grade 4, Miss Haney taught me that Man Very Early Made Jars Stand Up Nearly Perpendicular. The mnemonic device allowed me to remember the nine planets and their order from the sun. Look back at the first letters and see what I mean; I’ll wait.
All was well until 2006, when scientists demoted Pluto to dwarf planet status because it was unable to “dominate its neighbourhood” by clearing its orbit of other objects. Then, thanks largely to new space telescopes, it was discovered that beyond our solar system there are perhaps a trillion planets. I don’t really know what a trillion is but it’s a lot more than eight. These new facts laid waste to Miss Haney’s old facts and ruined her perfectly charming memorization trick.
So, is science based not on facts but our best guess at the moment? If that is true, then what of mathematics, economics or anything else resting upon quantifiable truths?
Music: I used to sneak a small transistor radio into my bed every night. From beneath my pillow, so my parents couldn’t hear, I nodded off to a Buffalo radio station that skipped the latest rock ‘n’ roll across Lake Ontario just for me. I was ripe for the Monkees. I bought the records and every week enjoyed their TV show.
Although an enamoured nine-year-old, I noticed that what I was hearing did not match what they were playing; especially Micky, the drummer. It turns out that the Monkees sang but the music was played by a group of crack LA studio musicians called the Wrecking Crew. They were the same talented group we really heard when listening to the Byrds, the Mamas and the Papas, the Beach Boys, the Association, the Partridge Family, the Grass Roots, Paul Revere and the Raiders, and many more.
So, can music be enjoyed while accepting deceit in its creation? If so, does the same acceptance apply to other forms of artistic endeavour? If we accept deception in art, then where else will we wink at irony tilting toward lies – perhaps business and governance?
Bible: Until we stopped going to church for some reason, I attended Sunday school. Every week I fidgeted with the adults before we kids were led downstairs for a snack and lesson that we could actually understand. The rather violent portrayal of Jesus upstairs and the equally gruesome representation in the basement frightened me. The stories of God were thankfully reassuring as we were encouraged to consider Him as an old man who not only looked like Santa Claus but also acted a lot like him. Both had lists of naughty and nice and both meted out rewards and punishments, although God seemed quicker to anger and a whole lot more spiteful and violent. I recall being shaken by the thought that I was apparently under constant surveillance.
I later enjoyed a university World Religions course, read a great deal, and, over the years, I have re-read the Bible four times. I learned to accept that Jesus was likely not the fair-skinned, blue-eyed, blond man with whom I’d grown up. I learned that crucifixion was the Romans’ chosen form of capital punishment. So wearing a cross as jewellery then would be like wearing an electric chair now. Further, I learned that God is no more a man than Santa but, rather, a concept.
All this was fine, but I was more troubled to find myself cherry-picking from the Bible. I read that Leviticus 18:22 says, “Thou shalt not lie with mankind, as with womankind: it is abomination.” Ok, I disagreed, but it was clearly stated that homosexuality is a sin. But wait: Leviticus 25:44 says, “Your male and female slaves are to come from the nations around you; from them you may buy slaves.” So slavery, alternatively, is not a sin but, in fact, encouraged. It must be so because Exodus 21:7 says, “If a man sells his daughter as a servant, she is not to go free as menservants do.”
So, can we accept the good things a religion proffers while ignoring the questionable stuff? Can we use a part of the Bible to justify a particular belief while ignoring other parts? Can we treat the Bible as a smorgasbord without cheapening or even rejecting its core message?
Wealth: I once worked at a school for teenagers who were damaged, learning disabled, culture shocked, lost in the criminal justice system, or just lost. Later, I worked in a private school where those of means could buy their children’s peers and opportunities no longer available in the ideologically besieged and fiscally starved public system. I found about the same percentage of happy and unhappy kids in both schools.
Happiness, it turns out, has little to do with money. Last year, San Francisco State University psychology professor Ryan Howell determined that buying more stuff, having more clothes and cars, and living in bigger houses do not make people happier. His findings supported a 2010 Princeton study showing that happiness rises until income hits about $75,000. After that, happiness goes up not one whit even if one’s income soars higher than poor old Pluto.
So, was John Lennon right? Is love really all we need? If the studies are true then should we re-examine the meaning of success, the efficacy of ambition, and the value of materialism?
There are folks I know who are deeply offended by questions that invite an exploration of opinions that they have hardened into facts. The questions should nonetheless be asked. I believe that we owe it to ourselves to ask questions of ourselves, even if the answers are difficult, elusive, or impossible.
If you enjoyed this column please share it with others and consider pushing the blue button to follow my weekly blog.