A 10-Minute Walk Took 300 Years

Canada is a large and diverse country and so someone who is well known in one region may be a stranger elsewhere. Such is the case with Wayne Adams. Mr. Adams is a Canadian we should all know.

Adams was born in Halifax. His father died when he was 13. His teen years were shaped by a number of positive role models including his mother, uncles, and church and community leader Reverend W. P. Oliver. All inspired him to be industrious, consider others, and work hard to achieve his goals.

Adams’s first full-time employment was at a Halifax Chevrolet dealership. His diligence and initiative led to his becoming the service sales manager and then Halifax’s first African-Canadian new car salesman and, later, used car manager. He then became the manager of the province’s first indoor service station. With the opening of his Shell station in Lower Sackville, Adams became Nova Scotia’s first African-Canadian service station owner-operator.

Always interested in the news and current affairs, Adams became a broadcast journalist. He became widely known in 1969 for his reporting on Canada’s first Summer Games, held on the campus of Halifax’s Saint Mary’s University. Adams created the Black Journal in 1972 which, until its demise in 1978, reported on news and ideas from an African-Canadian perspective.

Politics

Adams had shown an interest in politics when he was elected to the Students’ Council at Halifax Vocational High School. In 1979, his concern with environmental and economic issues and the manner in which the needs of Halifax’s African-Canadians were being ignored led to his running for municipal office. He understood the challenges facing an African-Canadian in local politics because the city had elected its first African-Nova Scotian, Graham Downey, only five years before. He won a seat on the municipal council of what was then Halifax County. His popularity and hard work led to his being re-elected five times and serving for fifteen years. From 1982 to 1983, Adams was Halifax’s Deputy Mayor.

In late 1992, Adams announced his intention to run for a seat in the Nova Scotia legislature as a member of the Liberal Party. He was enthusiastically supported by many people but he also confronted blatantly racist insults and incidents. He later said, “That kind of negative reaction just exhilarated my efforts to go on and run and win.”

On May 25, 1993, Wayne Adams was elected to represent the overwhelmingly Black riding of Preston and became the first African-Canadian elected to the Nova Scotia legislature. He received letters of congratulations from across Canada. Premier John Savage understood the significance of his election. He quipped that Adams lived only a ten-minute walk from the legislature building but it had taken him 300 years to get there. Adams became the first African-Canadian in Nova Scotia’s cabinet when he was appointed the minister responsible for the Emergency Measures Act, the minister responsible for the Nova Scotia Boxing Authority, and, his most challenging and rewarding portfolio, minister of the environment.

Among his accomplishments was the development of Canada’s first Solid Waste Management Strategy. Implemented in 1995, within five years it had diverted 50% of waste from landfills through a number of initiatives including a recycling program that banned landfills from accepting items such as tin and glass food and beverage containers, corrugated cardboard, compostable organics, and hazardous materials. The strategy also created the Resource Recovery Fund Board, waste management regions, enviro-depots, and a centralized composting system. Related legislation reduced the number of landfills by 75% and introduced stricter guidelines for those remaining that significantly reduced the pollution of adjacent rivers and streams.

Adams also introduced important amendments to the Protected Spaces Act that preserved nearly 8,000 acres of environmentally significant land by bringing it under public control. He led the re-engagement of old trade agreements between Nova Scotia and Caribbean nations, which saw delegations from Canadian environmental industries making deals in Port of Spain, Trinidad, and in Barbados.

While Adams was accomplishing a great deal, the government became increasingly unpopular. As a result, many Liberals lost their seats in the 1998 provincial election, including Adams.

Continuing Community Engagement

Adams remained active and influential in the Halifax Board of Trade and Lions Club. He served as an elder in his church, an executive member of the Atlantic Baptist Convention, and was active with the Nova Scotia African Baptist Association. He served as the director of the Halifax Citadel Amateur Boxing Club and chair of the Nova Scotia Home for Coloured Children. In 2011, he was invited to take part in the United Nations’ International Year for People of African Descent. He told reporters, “There’s strength when you come together…There has to be a mass education, and that comes when you have policy in the corporate sector, as well as the government.”

Wayne Adams

Adams’ ongoing dedication to environmental issues was demonstrated by his becoming the founding president of Chebucto Windfields, a company focused on creating power through wind generation. Adams also became president of the Nova Scotia Environmental Industries Association. The not-for-profit organization promotes environmental services and products while linking the federal and provincial governments, universities, and businesses to promote progress in matters such as hazardous materials management, fish and wildlife habitat preservation, and environmental research.

In 2003, Adams founded and became CEO of the Adams Consulting and Management Group. It brings together governments, businesses, and interested parties to advance initiatives that address community economic development, renewable energy systems, and product development while promoting business opportunities for Atlantic-Canadian entrepreneurs. Adams is also the Special Project Coordinator with Perennia Food and Agriculture Inc. where he oversees the inventory of agriculture and fishery businesses owned by or located in Nova Scotia’s Black communities while advocating for entrepreneurs in those communities.

Among Adams’ many awards is the Order of Canada. At his May 2004 investiture, it was stated, “As a volunteer, businessman, and politician, Wayne Adams has paved the way for generations of young people.” In a 2004 CBC Radio interview, Adams summed up the principle that guides his life, saying, “It is all of our tasks to make the world a better place. The 300-year walk was worth every step.”

A Candle in New Zealand

Democracy, the rule of law, and even the truth are under attack. The bedrock of assumptions once thought immutable has turned to sand. And yet, despite troubles and the deafening drumbeats of negativism, idealism is still not naïve, hope remains wholesome, and hard work is still rewarded. We know that even a small candle can conquer darkness. Light, like love, always wins. Think about it – always. As a measure of that audacious notion, I offer New Zealand.

New Zealand is not a place that often, or ever, crosses our minds. But there it is, a nation of 5 million people spread across two main islands, about 1,500 km south-east of Australia. It earned independence in 1947, and its nominal head of state remains Britain’s monarch while real power rests with parliament and the prime minister. New Zealand’s prime minister is Jacinda Ardern. She is a candle.

Having graduated from university in 2001, Ardern became a member of parliament in 2008 and, in August 2017, was chosen as Labour Party leader. In a general election held just a month later, her party increased its seat count by 14 and, through negotiations with New Zealand First, a coalition government was formed with Ardern as prime minister. She became New Zealand’s third female prime minister and, at 37, its youngest.


She had campaigned on a promise of “relentless positivity” and that’s how she is governing. Ardern is a progressive. She believes that the state has no right to dictate who people may love and, therefore, supported laws allowing same-sex marriage. She believes that abortions have always occurred but that they are safer when legal, and so she supported removing abortion from the Crimes Act. She believes that people’s health and safety come first and so she has supported efforts to combat climate change.

Last January, Ardern and her partner, who hosts a television fishing show, stood together to announce that she was pregnant. She explained that after giving birth this June, she will take a six-week maternity leave, during which time deputy prime minister Winston Peters will serve as acting PM. She will then return to office with her partner assuming full-time caregiver responsibilities.

Ardern was attacked by those who did the math and said that she must have known she was pregnant while negotiating the coalition that made her prime minister. But is being pregnant a disqualifying condition for a position of power; or any position; or anything? She was criticized for thinking she could meet her responsibilities while pregnant. But are men not applauded for courageously carrying on despite health issues that are less natural and less temporary? She was savaged for not resigning to take care of her child. But are men asked to surrender jobs or ambitions when they become fathers?

Ardern met critics with grace. She said, “It is a woman’s decision about when they choose to have children, and it should not predetermine whether or not they are given a job or have job opportunities…I am not the first woman to multitask. I am not the first woman to work and have a baby.” She tweeted: “We thought 2017 was a big year! This year we’ll join the many parents who wear two hats. I’ll be PM & a mum while Clarke will be “first man of fishing” & stay at home dad. There will be lots of questions (I can assure you we have a plan all ready to go!) but for now bring on 2018.”

As always, mud-slingers were left with more of the stuff on them than on their target. They revealed more about themselves and their latent, or perhaps blatant, dinosaur misogyny than about their prime minister. Supporters quickly overwhelmed naysayers. Their thoughts were summarized by a message from Scotland’s First Minister Nicola Sturgeon: “This is first and foremost a personal moment for her — but it also helps demonstrate to young women that holding leadership positions needn’t be a barrier to having children (if you want to).”

Ardern is helping to illuminate a path forward for girls and women everywhere who challenge the darkness of people, laws, and attitudes that shame, limit, deny, and disparage. The path is being lit one candle at a time. Emma González is a Florida high school student helping to shine a light on leaders more concerned with campaign donations than children’s safety. Malala Yousafzai was shot by the Taliban for promoting the education of young women but, after a painful recovery, she resumed her fight. Chrystia Freeland is Canada’s foreign affairs minister and Jane Philpott its minister of Indigenous Services. They are among Canada’s most powerful political leaders. Freeland is working to modernize and stabilize Canada’s economy by renegotiating the North American Free Trade Agreement and Philpott to right generations of wrongs by bringing justice to a relationship that has never known the concept.

There are candles like Ardern and the others in your community and, if you are lucky, in your home. Let us not curse the darkness but celebrate their light.

If you liked this column, please share it with others, consider leaving a comment, and check out my other columns at http://www.johnboyko.com

Viagra, Frankenstein, and Us

In June 1816, Mary Shelley and her future husband, the poet Percy Shelley, were enjoying a dinner party with a group of friends. They talked of books and poetry and swapped German ghost stories. The dinner led Shelley to write a short story that she later turned into her 1818 novel, Frankenstein. The book was a cautionary tale of a research scientist who successfully assembled a living being from corpses, only to have his creation turn on him and wreak havoc on the community. The book asks us to be aware of the Frankensteins of unintended consequences all around us. Let’s consider one.

Viagra and Frankenstein

One day in 1991, researchers working in England for the pharmaceutical giant, Pfizer, were taken by surprise. They had been toiling away to develop a chemical compound to treat heart problems. They had come up with Sildenafil. It looked promising but then, during clinical trials, older men who had been taking the compound reported rock-hard erections lasting more than an hour. Those in the placebo-taking control groups reported no such effects. The Pfizer heart research project took a quick turn. More tests were done, the discovery was deemed sound, and so a method of mass producing the compound as a pill in the proper dosage was quickly established. The research team had inadvertently invented Viagra.

Patents were obtained. Observers wryly noted the unusual lightning speed with which the predominately middle-aged men in charge of so many of the world’s government approval processes allowed the little blue pill to machete its way through red tape. Within six months of its American approval, in March 1998, 7 million prescriptions were written, rendering it the country’s most popular medication.


Pfizer’s future changed and its stock and profits rose dramatically. Commercials changed acceptable public conversations by dragging discussions of impotence, or erectile dysfunction, as it was renamed, from the shadows. The research changed the lives of millions of men and couples for whom impotence had been a problem. All was well.

But then, retirement homes and senior-dominated communities began reporting skyrocketing numbers of cases of sexually transmitted diseases. Arizona’s Pima and Maricopa counties, for instance, have unusually large senior populations. From 2005 to 2009 the number of people older than 55 who contracted syphilis and chlamydia for the first time in their lives rose by 87%. As is the case with most corporate, applied research, Pfizer never released the names of those who created Viagra so we don’t know their reaction to the good and bad changes their work brought about. But Mary Shelley would have smiled.

What other research and inventions bring about Frankenstein change? What small decisions, big ones in disguise, have put us on roads we had hoped never to travel? How many political decisions made for expedient or partisan reasons have helped some but hurt many? Can we rise up as the torch-bearing villagers did in Shelley’s novel and defeat our Frankensteins? Let’s first identify them in our lives and our communities. Then, let’s light the torches.

If you enjoyed this column, consider sharing it with others by reposting on Facebook or your social media of choice or checking my other work at http://www.johnboyko.com

The People Will Always Be Heard – Luddite Lessons For Today

People affected by change need a way to express their concerns. Even if those concerns are not significantly addressed, they at least need to know they’ve been heard. The results of being ignored can be unpredictable when change beyond their control, led by complex forces outside their comprehension, alters all they once thought was certain. A people scorned by change will bring about even more change.

In 2016, we saw the connection between change and people’s response to being ignored when British voters chose to leave Europe and, in electing Donald Trump, Americans chose to leave the world. Those bringing change about and benefitting from it had become the enemy. The silenced and disparaged, who had been negatively affected by change, reacted in the most positive way they could. We are all now reaping the effects of the great unheard’s determination to be heard. It is not the first time.

English workers in the early 19th century felt as mistreated and ignored as did the 21st century American and British working class. They didn’t have the ballot to express their rage against change and so, like people always do, they found another means.

In the Nottinghamshire village of Arnold, a group of framework knitters took pride in their work. The artisans complained to their overseers that their skills were being debased by the company’s use of substandard material and by “colts”, young workers who had not completed the seven-year apprenticeship. Further, the big loom machines were producing more product, but of an inferior quality. The machines also meant that, because their skills were less important, their wages had been cut. Things had been made worse when the war with France led to the issuing of the Prince Regent’s Orders in Council, which affected jobs and production by cutting textile exports to France and its allies. There had been layoffs and slowdowns. Each time the workers raised complaints, they were told to get back to work. On March 11, 1811, the unheard and frustrated workers destroyed their machines.

Workmen take out their anger on the machines

(Image: Look and Learn Picture Library)

This was not the first time that English workers had protested in this way. In fact, in 1727, the British parliament had passed legislation that made wrecking the tools of work a capital felony offence. But the old law had been ignored. News of the Nottinghamshire violence spread. It presented other disgruntled workers with a hero. Ned Ludd was applauded as the apprentice who began it all by snapping his needles in defiance of his strict boss. Those who followed his lead were called Luddites. Ludd was a myth. There was no such man. But it didn’t matter. The Luddite movement was born.

Over the next two months, textile loom-frame machines were smashed in a number of surrounding villages. There were no arrests. How do you arrest a whole village? But there were also no negotiations between mill owners and workers. Violence erupted again in November and the winter saw sporadic attacks on mills and machines in Nottinghamshire, Derbyshire, and Leicestershire. The military was dispatched to a number of towns to help police. Mill owners hired armed guards. The Luddite movement nonetheless spread, first to the cotton-weaving industry in and around Manchester.

In April, a number of protesters turned their violence directly against mill owners and many were beaten up. Grand homes were burned. Elected officials were threatened. Mill owner William Horsfall was murdered. Some Luddite agitators were arrested but the workers stuck together and refused to give up friends who had been responsible for specific acts of sabotage or violence.

In an 1812 speech to the House of Lords regarding the proposed Frame Breaking Act, Lord Byron demonstrated his understanding of the situation. He knew that responsible leaders don’t react to the symptoms of problems but rather, address a problem’s root cause. Byron said, “had the grievances of these men and their masters (for they also have had their grievances) been fairly weighed and justly examined, I do think that means might have been devised to restore these workmen to their avocations, and tranquility to the country…These men never destroyed their looms till they were become useless, worse than useless; till they were become actual impediments to their exertions in obtaining their daily bread.”

Byron went on to speak of the danger inherent in dismissing the protesters as a mob to be arrested and tamed. The mob, he said, was the people. The people served in the military and mills and made the country work. It is the people, he told the Lords, to whom they were responsible. It is the people being dismissed as a mob who are responsible for Britain’s growing power and wealth. Byron understood that in commodifying people and valuing them less than the machines they ran, the people were in danger of becoming not partners in the country’s progress but its victims, and thus, its enemies. It is a shame that, over the last decade, the United States and Britain did not have more Lord Byrons.

The government and mill owners eventually responded. Wages were raised a little and work conditions were slightly improved. Food was subsidized and prices dropped. Napoleon’s defeat reopened European markets. The machines remained and continued to change how people lived and worked but the workers most directly affected by change had, at least, been heard. By 1816, the Luddite movement had subsided.

The Luddites were never a unified group advocating a package of political reforms nor, as the word has been passed down through the generations, were they simply about resistance to new technology. The movement represented people’s reaction to change. It reflected a new class consciousness among a group that the invention of steam power and the industrial revolution had helped to create. They were the class that the invention of the assembly line would help to build and the invention of robots would help to destroy.

The Luddites offer lessons regarding the importance of seeing the role that technology plays in spurring change but also in looking past immediate economic benefits to acknowledge and manage change’s costs. I’m betting that even Donald Trump knows that technology and not immigrants or Mexicans or Muslims is responsible for today’s job losses and economic dislocation. I’m hoping that responsible leaders will act responsibly to manage current changes for the benefit of the many and not just the few. I hope those leaders understand that one way or another, people affected by change will always be heard. Always.

If you enjoyed this column, please share it with others and consider leaving a comment.

Are We The 5-Year-Old Us?

I am currently reading Bobby Kennedy: The Making of a Liberal Icon by Larry Tye. It’s the latest of many books I have read about the man who was a childhood hero of mine and for whom I still have a great deal of respect. Among the things Kennedy taught me, when my Mom used to say I was too young to be thinking about such things, was existentialism. He spoke of being an existentialist and so I looked it up and thought it was a tremendous philosophy. I told myself that I was one too. An essential notion is that we are in control of our own destiny and able to create and recreate ourselves regardless of both nature and nurture. This new book, which is very good by the way, had me thinking about that notion again. But it also reminded me of an event whose anniversary is approaching that made me wonder if I should throw existentialism into the ditch. It involved a report card.

You see, about this time last year, my three younger brothers and I were cleaning out my father’s house. My Mom had been gone for some time and it was time for my Dad to be where he could be happier, healthier, and safer. So there we were, with a dumpster in the driveway, in what had been our home but had suddenly become just a house. What had been family treasures had become bothersome stuff. “Why take this,” my one brother said, “only to have my son throw it out thirty years from now?” He was right. Furniture and kitchenware went to a Syrian refugee family and more went to a local charity re-use centre, but a lot was going straight into the steel bin of sin. But then we were stopped cold.

My Mom had saved a box full of our old report cards. We stood together, laughing as we read comments from the days when teachers were allowed to be honest and communicate in English. I found my kindergarten final report card which said, “Johnny likes to sing songs and write stories.” Well, so much for Bobby Kennedy and existentialism.

I still like to sing songs. I learned to play guitar when I was nine and sang in a band in high school, then in coffee houses and bars with a friend and later alone. I recorded three songs that I had written as singles and still write a song every month or so to prove to myself that I still can. I play in a little band. We love working out new songs and playing the occasional gig. It is a rare day that I do not pick up the guitar and enjoy time singing and playing; it slows me down and slow is good.

I still like to write stories. I am writing one now. I also write newspaper editorials, magazine articles, book reviews, entries in the Canadian Encyclopedia, and am now writing my eighth book. There is a warm satisfaction earned by composing a well-constructed sentence or in weaving a lucid argument. The muse can occasionally be kind.

So the report card led me to wonder if I have really been living the existential life that I thought I had been living for all these years. Have I really been rediscovering and reinventing myself or was I set at kindergarten?

Consider yourself at age 5 and whether you are significantly different now. How have you changed, or not changed, since high school? When together with old friends, is everyone looking a little older but essentially the same? I wonder whether, despite the buffeting winds of change, the moments of celebration and chagrin, and the years that colour our hair and idealism, we are really that different from the five-year-old us.

Bobby Kennedy was assassinated 49 years ago last week at age 42. It was just weeks before he would have won the Democratic Party’s nomination and gone on to defeat Richard Nixon to become president in January 1969. Think about that. Vietnam would have ended earlier with thousands of lives spared. There would have been no Watergate. He most likely would have been president until 1976. God, he may have even stopped disco – ok, perhaps I’m stretching it.

Robert Kennedy

The point is that if Kennedy had lived, policies would have been different, the media would have been different, America and the world would have been different and, perhaps most significantly of all, we may have been spared the cynicism born of his having been killed so shortly after his brother and Martin Luther King. The existentialism in which he believed would have been writ large through his example and legacy.

Of course, last year I would have still found the old report card that inspired both a smile and furrowed brow. Even Bobby Kennedy could not have changed that.

If you enjoyed this column, please consider sharing with others and perhaps leaving a comment.

Statler and Waldorf and the Gift of Now

This is a confession. I have become Statler and Waldorf. Those of a certain age will recall that Statler and Waldorf were Muppets. Watching the show on stage from their private box in the Muppet theatre, they were constantly critical, harrumphing and grumping away. I felt like that last Saturday, but with a twist. My band was playing a gig and I was channelling my Muppet friends, an old fart observing, but this time from the stage watching the audience. I’d seen it before, of course, as we all have, but this time, right in the middle of singing and playing Peaceful Easy Feeling, and with only half my brain on the lyrics, melody, and guitar lines, it struck me.

You see, the crowd was good, with a lineup at the door. Everyone looked like they were enjoying a good time. The band sounded tight and, as usual, we were having more fun than should be legal for grown men in public. The Canoe and Paddle pub is a gift to our community, run by great folks; it’s a gathering place for neighbours and friends and those who soon will be. But then, near the end of the first set, I noticed it.

Statler and Waldorf

At one table were two couples and all four were staring into phones, swiping the screens. I scanned the room. There was another young couple ignoring each other and the fun of the room, tip-tapping away. At a table of six friends, four were staring at phones. I counted four other people ignoring friends or spouses, intently concentrating on Steve Jobs’ gift to us all.

Why?

Are we information addicts? Is it not interesting that we can be out with friends or family, with good food and drink before us and engulfed in music and laughter, and yet be distracted by a vibration, buzz, or ding? When we tap the button to investigate are we not saying, “I have no idea who or what this is, perhaps a friend who just posted a picture of her dinner, or maybe a bomb blew up in Caracas, but whoever or whatever it is, and I have no idea, I already find it more interesting than you and so I am going to ignore you now and check this out.” It seems to me that, unless there is a babysitter back home or teenage children out on the town, nothing can possibly be more important than the people with whom you have chosen to share this sliver of time.

Are we public diarists? Diaries used to have locks. Now they have megaphones. Psychologists often recommend that people keep diaries, or journals, to slow the pace and allow the rich rewards of reflection. Facebook, Instagram, and the rest, on the other hand, invite us to reflect by turning the mirror on our lives outward. We post what used to be private to the whole world. We then keep track of how many noticed and liked our latest entry and, indirectly, how many people like us. Psychologists have found that those who regularly post and read Facebook are more likely to experience angst and depression because they compare the ordinary moments of their lives with the highlights of others’. And there at the pub on Saturday were all those good folks more concerned with recording and sharing what was happening than truly immersing themselves in what was happening.

Do we need a witness? American soldiers moving through Italy and Europe often stopped to paint a crude cartoon of a man peering over a fence and wrote, “Kilroy Was Here”. A drive just north of our community takes you through the stunning Canadian Shield with its tremendous sheared rock faces. It is tough to drive long without seeing that someone has spray-painted their name, usually along with that of their true love. When our life ends, we have our name more permanently recorded, this time carved in stone. All three practices seem to be about the same thing: we have a need to let others know we are here. Our phones allow us to instantly summon witnesses to our existence without fighting a war, climbing a cliff, or dying. All those people on their phones last Saturday, while I was singing an Eagles song, were like the Whos on the clover held aloft by Horton the elephant yelling, “We are here! We are here! We are here!”

The song ended. Lots of fine folks applauded. I said thank you and glanced at those on phones. Three had put them down and were smiling and laughing with others. Good. But I noticed three new victims of our times ignoring the now. The now is a gift. That’s why it’s called the present. I may be a Statler and Waldorf grump from the wrong generation but it seems to me that the present is something that won’t last and so it’s worth savouring, for just a moment, without distraction.

If you enjoyed this column, please share it with others and consider checking my others at http://www.johnboyko.com

The Future Arrived and We Missed It

In 1957, Stockholm hosted the St. Erik International Trade Fair on Automation. The fair was a dazzling display of inventions that included new gadgets called robots. They were essentially tools that could do simple, multi-step tasks. The word robot came from a 1921 Czechoslovakian dystopian play in which machines, called robota, replaced humans. Robota is Czech for forced labour.

Inventor George Devol Jr. met physicist Joseph Engelberger at a cocktail party. They discovered a shared interest in electronics and robotics and the potential of the recent invention of the integrated circuit. Shortly afterward, they formed a company, Unimation, and created a robotic arm that synthesized all the current work going on in university and government labs. By 1961, General Motors had purchased the robotic arm and it was hard at work on one of their New Jersey assembly lines. It took red-hot pieces of metal from a die casting machine and placed them in neat piles. The robot saved money by improving the line’s efficiency and replacing expensive workers. GM then bought and employed several Unimation robot welders.

General Motors’ successful use of robots inspired others until, by the 1970s, nearly every thriving manufacturing company in the world had robots on their lines. Production increased and profits rose as labour costs fell. By the 1990s, robots had become so sophisticated that they were even doing jobs that required decision-making and complex thought. A giant leap was taken when robots began using algorithms to design better versions of themselves.


(Photo: Business Insider)

India, China, Mexico and others adapted robots to their assembly lines while also offering multinational corporations cheap labour, lax health and environmental regulations, and low taxes. Because corporations are beholden to shareholders, and not to workers or a particular country, they jumped. American, British, and Canadian factories that had provided employment for generations either shrank or closed. Empty, rusting factories, and the shuttered businesses that had once supplied and serviced them, stood as mocking monuments to broken dreams and an era’s end, haunting souls and hollowing cities. The plants that survived did so by trading workers for robots that never erred, stopped to eat or pee, or went on strike.

Robots helped break capitalism’s cycle in which production boosted wages and increased spending, which, in turn, demanded more production. It threatened the concept of consumer capitalism and, in fact, capitalism itself. In 2010, American permanent job losses were compared to new job creation and it was discovered that the 21st century’s first decade had created not a single net new job. This was unprecedented and frightening.

The changes robots brought about gave rise to populist politicians who spoke to the frustration of those whose dreams of better for themselves and their children were as shattered as their once-gleaming but now disintegrating cities. People were told that others, and the “other”, were to blame. But apportioning blame is not the same as presenting a solution and anger and fear are not strategies. Those who asked the next question knew that India, Mexico, and China could close every one of their manufacturing plants and western countries could slam shut their borders to every immigrant and refugee, and it would change very little. The robots have the jobs and they are not giving them back.

In February 2017, Dominic Barton was the bearer of bad news. As the head of Canadian Prime Minister Justin Trudeau’s Advisory Council on Economic Growth, he had been studying the effects of robots and automation on the job market. He reported that, due to the increasing automation of jobs in every sector of the Canadian economy, within ten years about 40% of all jobs currently in existence will be gone. Barton’s estimate was close to that of the American consulting firm McKinsey and Company, which reported in 2016 that 45% of all jobs currently done by American workers will be automated within ten years.

The Canadian and American reports mirrored findings in other countries. Driverless vehicles will replace truck and taxi drivers. Automated check-in and check-out devices will continue to replace grocery store clerks, bank tellers, fast food order-takers, and hotel desk attendants. Automated and online purchasing will continue to replace independent store owners and retail sales staff. Automated robots will replace more agricultural workers as they plant seed, pick fruit, prune trees, and milk cows. Automated calculators will replace more accountants and automated tutors will replace more teachers while automated drones will replace couriers and on, and on, and on. If the Barton and McKinsey reports are correct, by the year 2030, the unemployment rate in countries like Canada, the United States, Germany, and Britain will reach about 47%. That is a staggering number. Consider that at the height of the Great Depression, that catastrophic collapse that threatened capitalism and democracy and abetted the rise of tyrants like Adolf Hitler, the unemployment rate peaked at about 30%.

The changes brought about by the invention of robots will continue to change our world in ways that fundamentally change how we live and work and measure success. Capitalism and democracy will change. And the robots won’t care.

If you enjoyed this column, please share it with others and consider checking more of my columns at http://www.johnboyko.com

One-Sentence Lives and a Challenge

Long-time Toronto Blue Jays announcer Tom Cheek once said that every baseball season begins as a story, turns to a paragraph, and ends as a sentence. “Boston breaks the Bambino curse.” “Carter hits the walk-off homer.”

I believe that what is true of baseball is also true of people’s lives. It was this thought that helped me to complete a writing commission in which I was asked to write one-sentence biographies of all 23 Canadian prime ministers. The thought also helped me to reflect on a birthday of note; one of those ending in a zero that moved me into a new decade.

I offer one of the one-sentence biographies and then my own. They are, I confess, run-on sentences that would have my editor’s red pen flying and old English teachers’ fingers wagging, but one sentence nonetheless. Then comes the challenge.


Sir John A. Macdonald: As the most prominent voice at the Confederation conferences, Macdonald was instrumental in creating Canada with its constitution placing dominant power with the federal parliament, essential in building Canada when, as our first prime minister, he added enormously to Canada’s size by purchasing Rupert’s Land and welcoming new provinces, and with his National Policy that allowed the country to grow on steel rails and behind tariff walls, and he was then key in saving Canada at the Washington Treaty negotiations that kept us from American annexation while winning recognition as a sovereign state, and, so, despite some tragic and wrong-headed policies, such as those involving Aboriginal nations, Macdonald was Canada’s indispensable man whose echo reverberates to this day.

And now for me: John Boyko is a walking talking advertisement for the power of existentialism for he has been a teacher, administrator, politician, musician, and author, whose insatiable curiosity, confidence in one’s ability to reinvent oneself, and belief in seeking motive in challenge rather than comfort, and value in experience over things, have informed his life, while through it all he has been a loyal if sometimes annoying friend, and, in the most important part of his life, a devoted but sometimes flawed husband, father, and grandfather.

Our lives are write-your-own-adventure stories. There are so many more books to be read, places to explore, ideas to consider, challenges to be accepted, and warm moments to build and share.

And so now the challenge. I challenge you to write your one-sentence biography. If unhappy with the sentence as written, I sincerely believe we can write ourselves a better tomorrow. Our greatest fear is not that we don’t have enough power to change but that we have more than enough.

If you enjoyed this column, please share it with others and consider checking more of my thoughts at http://www.johnboyko.com or even my books, available online at Chapters and Amazon and bookstores (if you can still find one).

 

First World War’s Last Battle was Last Week

President Trump didn’t send Navy SEALs to intentionally kill an 8-year-old girl. But they did. When the president spoke of the January 29th Yemen raid, he mentioned the death of an American soldier and suspected terrorists but not the girl. Presidents often shade the truth. We do too. For instance, we teach our kids that the First World War ended in 1918. It didn’t. Not really. Its latest battle was Trump’s raid. The little girl was the First World War’s latest casualty.

The First World War senselessly murdered a generation and brought about transformational changes. It led to women earning the right to vote. It enabled the birth of the first Communist state that ravaged its people, conquered its neighbours, exported revolution, and contributed to the Cold War proliferation of nuclear weapons. The manner in which the First World War was settled led to the century’s second global war by making Germans susceptible to the rantings of a narcissist lunatic who promised to make Germany great again.

But the First World War spurred more than just those changes that shaped the past. To see how it affects us today, we need to go back, way back.

From the 14th to 17th centuries, the Ottoman Empire grew to rule swaths of land in north Africa, the Greek peninsula, nearly all of what we now consider the middle east, and southeast Europe all the way to Vienna. It was the world’s most advanced civilization. The multi-cultural but predominantly Islamic empire made stunning progress in mathematics, chemistry, art, and business. It rescued antiquity’s ideas by saving its libraries. The empire’s power sputtered, however, when it failed to adjust to Europe’s industrial revolution. Then, in 1914, came the war.

Germany promised to respect the Ottoman empire’s borders and so an alliance was formed. In 1915, Britain promised to protect the holy city of Mecca and support Arab independence if Sharif Hussein of Mecca would lead a revolt against the Ottoman Turks. A year later, British and French diplomats Mark Sykes and Francois Georges-Picot negotiated an agreement whereby their nations would help conquer and then split the Ottoman empire between them. British and empire troops were taken from the western front to attack. Rebel groups were funded and armed. More money and support flowed to the effort when Britain offered Zionists a Jewish homeland in what was then Palestine. The monarchy collapsed and the empire fell.

The Versailles victors’ conference rubber-stamped the Sykes-Picot Agreement. While French, British, and American leaders spoke of people ruling themselves – self-determination – they ignored the principle when it suited their interests. They ignored it in the middle east. Nations, ethnicities, religious sects, and tribal groups within the sprawling, complex but now crushed Ottoman empire were ignored. The men in Paris simply drew arbitrary lines on a map. They invented countries from nothing, foisted leaders of their choosing upon them, and lumped competing groups within them. Syria was created. Lebanon was invented. So were Iraq and Transjordan and more. Meanwhile, national groups such as the Kurds were left stateless, split among several of the new and existing countries.

The anger was immediate but protest was crushed. British and French, and later, American money protected the protectorates with blind eyes turned to whatever their chosen leaders chose to do to their people. The flowing oil enriched multinational corporations, western economies, and the tiny local, governing elites. People raged at the harsh, corrupt, secular, westernized governments. For decades, the rage burned underground.

Anger turned to action with an Iranian teacher of Islamic philosophy. The Ayatollah Ruhollah Khomeini was exiled in 1964 for criticizing Iran’s puppet regime, which was disparaging Islamic religious scholars opposed to the ongoing secularization and westernization. From Paris, Khomeini smuggled cassette tapes back to his homeland. They contained speeches explaining that the Ottoman empire had once been the most powerful in the world but God had turned His back on its people because they had rejected Him. Allah would renew power, happiness, and sovereignty, he said, if the region’s Islamic people again lived according to His wishes. Iran’s people must first adopt orthodox Muslim lifestyles. Then they could overthrow Iran’s leader, the Shah, and create an Islamic state where religious and temporal law were one. In 1979, it happened.

The new Iranian state did as Khomeini pledged and implemented Sharia law. A similar state arose from the carnage of the Soviet war in Afghanistan. Taliban leaders used different words but sought the same goals for the same reasons. But the other middle eastern states invented by the First World War remained propped up and powerful. More action was needed.

On August 11, 1988, in Peshawar, Pakistan, the son of a Saudi millionaire, Osama Bin Laden, met with Egyptian medical doctor Ayman Mohammed Rabie al-Zawahiri and Egyptian political philosopher Sayyed Imam Al-Sharif, who is often called Dr. Fadl. They agreed that Khomeini’s vision and goal were correct. They established a new organization and plotted new tactics to pursue it. They would poke the west. They would poke it again and again until it finally reacted by attacking the middle east. Those attacks would bring the long-simmering, underground rage to the streets. The pan-Arab idea would win by not losing. That is, the west would be defeated by wearing it down, as happened with the Soviets in Afghanistan and the Americans in Vietnam. The corrupt, secular middle eastern governments would then be replaced by leaders professing Sharia law. The old empire would return. It would be like the First World War had never happened. They called their new organization Al-Qaeda.

The poking began with two westerners killed at Aden’s Gold Mihor hotel in 1992. Two months later, Al-Qaeda operatives detonated a 500 kg bomb at New York’s World Trade Centre. Americans screamed but did nothing. It would take more. In August 1998, American embassies in Kenya and Tanzania were simultaneously attacked and 223 people were killed. The Americans blew up some Al-Qaeda bases. It wasn’t enough. The USS Cole was bombed and 17 sailors were killed. The Americans blew up a few empty tents in the desert. It still wasn’t enough. In September 2001, Al-Qaeda hijackers turned planes into weapons and flew them into the World Trade Centre and the Pentagon, while a fourth plane, on its way to Washington, crashed into a Pennsylvania field. That was enough.

The Americans finally did what Bin Laden and his partners had been hoping all along and attacked Afghanistan and then Iraq. It was perfect. The Americans and their allies brought western armies to Muslim countries and killed Muslims. They desecrated the land of the holy cities of Mecca and Medina by flying missions from Saudi Arabia. Just as Bin Laden had hoped, the Americans and the west were now, more than ever, the devil to be rejected along with their devilish western ways.

It took longer than the First World War itself but eventually, the Taliban was crushed, Al-Qaeda was broken, and Bin Laden was killed. But Al-Qaeda morphed into a hundred smaller organizations and pockets of resistance without a headquarters to bomb or an army to defeat.

The Islamic State of Iraq and Syria (ISIS) became the most powerful of the angry lot. Its stated goal was familiar: to create a caliphate, one state comprising nearly all of the middle east and united under Sharia law. In June 2014, ISIS bulldozers flattened desert berms that had demarcated the Syrian-Iraqi border. ISIS leaders said they were erasing the line created by the First World War’s Sykes-Picot Agreement and Treaty of Versailles. Every western pledge to defeat ISIS was another promise to keep the old, imperial, unprincipled and artificial First World War borders in place.

Historians say the First World War resulted in the deaths of 7 million civilians and 11 million soldiers. They are wrong. Mr. Trump’s botched Yemen raid on an Al-Qaeda-held village killed an American Navy SEAL, 14 suspected militants, and 10 women and children. One of the children was an 8-year-old girl, an American citizen, born in the United States. Her name was Nawar al-Awlaki. She was shot in the neck.


Nawar al-Awlaki (Photo: Middle East Monitor)

We should add her and the others to the First World War’s staggering statistics for the lives that ended last week are the latest casualties in a war that has yet to end.

If you enjoyed this column, please send it to someone and consider checking out others at http://www.johnboyko.com or even checking out my books that are available at Chapters, Amazon, and bookstores everywhere.

The Real Change and Our Real Decision

A fundamental change that is marking our era and determining our future is upon us. We have a decision to make. We need to make it now.

We are living the consequences of two crashes: 9-11 and the Great Recession. The American-led, western world’s response to the 2001 attacks saw troops, including ours, fighting impossible missions and too often in self-defeating ways. The middle east and then the world were destabilized as new terrorist organizations grew and impressionable youth were radicalized. Explosions in Boston, London, Paris, and elsewhere solidified the belief that fear is justified, there’s an enemy among us, and governments are unable to help.


Billions were borrowed, and economic fundamentals teetered, to fund the permanent war against a tactic and expensive domestic security measures that protected us from the last but not the next attack. The economic and existential strains, along with the greed of a few bankers and financiers whom deregulation had freed to wallow in avarice, contributed to the 2008 economic crash. Governments were seen borrowing more money but giving it to those who had caused the crisis. Governments seemed unable or unwilling to provide a playing field sufficiently level to reward obeying the law, paying taxes, and honest, hard work. Corporations valued the loyalty of neither their workers nor their customers. The millions of middle and working class people who lost jobs, homes, and dreams, and were still removing shoes in airports and seeing things explode on TV, could be forgiven for seeking someone, anyone, to blame.

In a world where long-established rules and assumptions no longer applied, demagogues who would normally have been dismissed found their messages resonating. Those supporting Britain’s leaving the European Union, Brexit, said Britain first. In his inauguration speech, Donald Trump clenched his fist and shouted America first – twice. France’s National Front leader and presidential candidate Marine Le Pen watches her popularity rise as she demands white French nationals first. They are not the change. They are the symptoms. They are the arbiters.

The two crashes led to the collapse of the western, liberal consensus that has informed progress and policy since the end of the Second World War. After liberalism and communism allied to defeat fascism, it was determined that we are all in this together. Multilateral, cooperative efforts would save us from another Auschwitz, Nanking, and Hiroshima. We would talk things out at the United Nations, have each other’s back through the North Atlantic Treaty Organization, keep each other stable through the International Monetary Fund, and buy each other’s stuff through trade agreements. The thought was that we were no longer in separate boats, racing through choppy waters for unique destinations. Rather, we’re in one big boat, squabbling like children, but together. We were united in our efforts to create more peace, equality, wealth, health, and democracy for all.

But now, forget the European Union, denigrate the UN, defund NGOs, end trade treaties, call NATO archaic, withdraw from or ignore global climate change initiatives, stifle immigration, throw up tariffs, and build that wall. Mr. Trump’s wall is not yet a reality but already an apt metaphor for our times. Russia knows it. China knows it. They’re loving it.

Canada punched above its weight in helping to create and maintain the post-war liberal-western consensus. Through his commitment to Syrian refugees, the Paris global climate change initiative, and more, Prime Minister Trudeau has demonstrated that he still supports it. Some Conservative party leadership candidates, on the other hand, seem eager to join Trump and Le Pen in smashing it. Canada has a decision to make. We must join one side of history or the other. We must fight to protect what has protected us and others for so long or flip to the other side. Our decision will determine our future for generations.

The Chinese have a curse: “May you live in interesting times.” We do. Buckle up.

If you enjoyed this column, please send it to others and consider checking out my other columns at http://www.johnboyko.com

The Woman Who Changed the World

The crowd hushed, cameras snapped, and Senators sat respectfully still as the slight, pale woman limped slowly to the big table then, painfully, took her seat. It was June 4, 1963, and Rachel Carson was 56 but looked much older. She was dying. Cancer had fractured her pelvis, taken a breast and, hidden by a dark wig, her hair.

Carson had worked as a United States Fish and Wildlife Service marine biologist and written articles for a number of magazines. She had turned her love of the sea and outrage with what was happening to rivers, lakes, and oceans into three best-selling books: The Edge of the Sea, Under the Sea-Wind, and The Sea Around Us. Each presented disturbing ideas and scientifically sophisticated arguments without jargon, preaching, or rancour. She married her knowledge, passion, and writing and investigative skills in the creation of her next book: Silent Spring.

While researching the book, Carson had served on the Natural Resources Committee of the Democratic Advisory Council where she became aware of Massachusetts Senator John F. Kennedy having initiated the Cape Cod National Seashore Act. Kennedy had read her books on the sea and then the committee report and so when he sought his party’s nomination for president, he invited Carson to join the Women’s Committee for New Frontiers.


Photo: Rachel Carson Council

As president, Kennedy read Silent Spring pre-publication excerpts in the New Yorker magazine. He was moved by Carson’s detailing of the devastating effects of pesticide use on animal and human health and invited her to attend a White House conference on conservation. The conference led to Kennedy announcing that, because of Carson’s work, he was ordering the Department of Agriculture and the Public Health Service to investigate the dangers of pesticide use and directing the President’s Science Advisory Committee to study links between pesticides and health.

Silent Spring became an instant bestseller when published in September 1962. It explained that DDT had been synthesized as early as 1874, that the American army had used it during the Second World War to delouse soldiers, and that Paul Hermann Müller had won the 1948 Nobel Prize for determining its effectiveness in killing mosquitoes and other pests. Carson’s book explained how DDT was also killing fish, birds, and people. Her title warned of the day that birds would be gone and skies without song. Most shockingly, Silent Spring told of how the government, scientific community, and the companies making and selling pesticides knew of their harmful effects. But there was money to be made. And so, the evidence was ignored, hidden, and denied. Carson asked an essential question: “How could intelligent beings seek to control a few unwanted species by a method that contaminated the whole environment and brought the threat of disease and death even to their own kind?”


Pesticide manufacturing companies Cyanamid, Monsanto, and Velsicol were outraged. They attacked. Velsicol threatened to sue Carson, her publisher, and the New Yorker. They even tried to stop the publication of an article about the book in Audubon magazine. The companies paid scientists to write editorials and articles that belittled Carson and her conclusions. The National Agricultural Chemicals Association published a booklet, Fact and Fancy, that savaged Kennedy and Carson. It was argued that Americans would suffer a food shortage without DDT.

In May 1963, the President’s Science Advisory Committee released a 46-page report, Use of Pesticides. With point after well-supported point, it said the companies were wrong and Carson was right. It stated, “Until the publication of Silent Spring by Rachel Carson, people were generally unaware of the toxicity of pesticides…The Government should present this information to the public in a way that will make it aware of the dangers while recognizing the value of pesticides.”

A month later, as part of that public education process, the sick, fragile, and wan Carson took her seat before the Senate subcommittee. She briefly summarized Silent Spring’s findings and then listed specific recommendations. The government should ban aerial spraying without the permission of landowners. Citizens should enjoy guaranteed security against poisons used by companies, governments, and private individuals. Corporations making pesticides, and all those using them, should be strictly regulated. She advocated the outright banning of DDT. The government should fund and support grass roots citizen organizations and non-government organizations to encourage awareness of environmental issues.

The environmental movement was born. American companies sold 90,000 tonnes of DDT in 1963 but production decreased the next year and every year after that. It took a while, but in 1972, the use of DDT in the United States was banned. Carson’s name was raised and Silent Spring was read by those advocating and then celebrating President Nixon’s Clean Air and Water Acts, the National Environmental Policy Act, the Endangered Species Act, and, in 1970, his establishment of the Environmental Protection Agency.

In her 1996 book, Our Stolen Future, Dr. Theo Colborn wrote about chemicals, called endocrine disrupters, that interfere with the body’s hormonal system. She credits Silent Spring with awakening her and other scientists and researchers to the dangers of manmade chemicals and notes how it is still inspiring discoveries and environmental advocacy.

Breast cancer took Rachel Carson in 1964. But her voice still echoes, for Silent Spring is still read. It still inspires. It still exasperates. Silent Spring is still discussed around the world every Earth Day.

Books that matter always educate and infuriate, and important authors, like important ideas, are always ignored, then mocked, then attacked, and, in time, celebrated. Books measure how far we have come and how far remains to go. As the American government appears ready to deregulate corporations and eviscerate environmental regulations, and as women lead the charge against turning back the clock on this and other issues, perhaps Silent Spring is more important now than ever.

Rachel Carson’s Silent Spring helped change the world. It may need to change it again.

If you enjoyed this column, please share it with others and consider checking my other work at http://www.johnboyko.com

The Shameful Power of Lies

I refuse to believe that the truth no longer matters. I refuse to believe that the truth is simply what I choose to believe. I’m loath to admit it, but a clear-eyed look at world politics today and examples from the past suggests I’m wrong. Too many lies have been casually accepted as truth and too many lies have sparked monumentally consequential change.

A young George Washington never cut down a cherry tree or confessed with the line we all know: “I cannot tell a lie.” Biographer Mason Locke Weems made no mention of the tale in the first five editions of The Life of George Washington but the incident suddenly appeared in the sixth. Weems made it up. Similarly, there was no gift-horse, filled with soldiers, with which the Greeks duped the Trojans. Nero did not play the violin as Rome burned. When leaving the room, Galileo did not mumble, “But it does move.” Newton’s work on gravitation was not inspired by a falling apple. Benjamin Franklin never flew a kite in a lightning storm. I could go on.

Lies such as these have been repeated as fact by so many and for so long that they’ve become accepted as true. Joseph Goebbels would understand. As Hitler’s propaganda minister, he said a lie becomes truth when forcefully presented and repeated. Donald Trump certainly understands.

Politico.com studied Mr. Trump’s 2016 campaign speeches and determined that, on average, he lied once every five minutes, sometimes twice in a single rambling, non-sequitur-littered sentence. He lied about having seen thousands of Muslims in New Jersey celebrating the 9-11 attack. He lied about MSNBC distorting his views by editing his statement on abortion. He repeatedly lied about America’s crime rate being higher than ever, about GDP growth being zero for the previous two quarters, and about the United States having the world’s highest corporate taxes. All the lies were shown to be lies but it didn’t seem to matter. Mr. Trump won the presidency. He continues to lie. He recently said there are 96 million unemployed Americans, but that figure counts retirees and students.

Do the lies that inform so much of what we think we know about our past and Mr. Trump’s successfully lying his way to the White House prove that we don’t care about the truth? We should. Because sometimes lies bring about changes that are enormously consequential. Consider two examples.

President Truman said he approved the dropping of atomic bombs on Japan to save the lives of American soldiers who were preparing to invade the home islands. With each subsequent interview, Truman’s estimate of the number of men saved went up. He couldn’t quantify it because his justification was a lie. Truman had been advised by the scientists who created the bomb that its use would be immoral. A number of generals and military advisors, including future president General Dwight D. Eisenhower, said it was unnecessary. Japan was on the verge of collapse. All its major cities had been incinerated. The Soviet Union had declared war and was moving on Japan. Japanese leaders were preparing to surrender and Truman knew it.

But the bomb was not really about Japan. Truman agreed with Secretary of State James Byrnes and other advisors that the bomb had to be dropped to brandish its power, especially to the Soviet Union, which they had decided to turn from ally to enemy. They had to demonstrate that America would dominate the post-war world. And so the bombs fell. Months before, Japanese leaders had offered to stop fighting on the condition that Emperor Hirohito stay in place, but the Americans refused, insisting on unconditional surrender. With the atomic bombs suitably displayed, Truman accepted the surrender terms that had been unacceptable before. Hirohito remained. The war ended. But Truman’s lie unnecessarily murdered 150,000 people in Hiroshima and 75,000 in Nagasaki, with hundreds of thousands more suffering life-altering wounds and horrifying birth defects.

While Truman’s lie involved the end of a war, other lies have started them. The Iraq War was based on the lie that Saddam Hussein had weapons of mass destruction. He didn’t. In 1964, Congress gave President Lyndon Johnson unrestricted power to wage war in Vietnam after a reported attack on the American destroyer USS Maddox in the Gulf of Tonkin. But the attack didn’t really happen. The lies are disturbing but sadly, tragically, not rare.

At 9:40 in the evening on February 15, 1898, a tremendous explosion sent a fireball into the sky above Havana’s harbour. The American battleship Maine, which had been anchored there as an expression of American power, had exploded. The ship was destroyed. Its burning, shredded hulk sank, and 266 Americans lost their lives.

Cubans had been rebelling against their Spanish colonial masters in a low-level guerilla war. Thousands of Cuban refugees had been working from new homes in Florida and New York to entice America to intervene on their behalf. After all, they argued, the Monroe Doctrine said that the United States considered the western hemisphere its back yard and would take action to keep countries stable and Europe out.

Powerful newspaper owners had joined their fight. The New York Journal’s William Randolph Hearst and the New York World’s Joseph Pulitzer were in a circulation war and both saw a Cuban war as their ticket to victory. Both had reporters in Cuba before the explosion writing articles that urged President William McKinley to take military action. Two days after the Maine explosion, Hearst’s Journal ran the headline: “Destruction of the warship Maine was the work of the enemy.” The next day, an article quoted unnamed naval men as believing that a Spanish mine had caused the explosion. Hearst offered $50,000 to anyone who turned in those responsible for the mine. Readership soared.

Thousands of Americans wrote to their president demanding a war of revenge with Spain. Militia groups formed and volunteered to leave immediately. Men yelled “Remember the Maine and to Hell with Spain!” as they swamped recruitment offices. Congressmen joined the jingoist parade, declaring that American honour had to be respected. A March 28 Naval Court of Inquiry moved with lightning speed to conclude that the Maine had indeed been downed by a mine. President McKinley was suspicious of the evidence but the mounting political pressure was enormous. He acquiesced. In April, the United States declared war on Spain.

The war lasted only ten weeks. The most famous battle was the taking of San Juan Hill by the Rough Riders, a ragtag group of cowboys, college students, and ex-convicts organized by Assistant Secretary of the Navy Theodore Roosevelt, who had quit his post to join the fight. The war was won when the American navy destroyed Spain’s Pacific fleet in the Philippines’ Manila Bay and its Atlantic fleet off Santiago de Cuba. About 2,000 Americans died in the war, all but 385 of disease. About 60,000 Spanish and Cuban soldiers and civilians died. America’s victory led to the Treaty of Paris, which gave Cuba its independence and ceded the Philippines, Guam, and Puerto Rico to the United States.

The war’s second phase began when Filipino nationalists insisted on independence rather than trading one colonial master for another. When rebuffed, they shouldered rifles. The fighting lasted three years and took the lives of another 4,200 Americans and over 20,000 Filipino combatants. The war also saw about 200,000 civilians die from war-related famine, violence, and disease.

The Maine attack and wars that followed entered American civic understanding alongside Washington’s hatchet and Franklin’s kite. They were true because they were believed to be true. But the truth is stubborn.

In its rush not to investigate but simply to confirm the mining of the Maine, the US Naval Court of Inquiry had refused to hear from a number of experts. Among them was Navy ordnance professional Philip R. Alger. He told the Washington Star that the explosion’s power and the ship’s wreckage suggested that the blast had originated with a fire aboard the Maine that ignited its magazine, the room where ammunition and gunpowder were stored. In fact, another naval inquiry had reported only a month before that designers of ships such as the Maine had put magazines too close to coal bunkers. This was alarming because coal bunker fires were a regular problem on naval ships at the time and it had been found that ships carrying bituminous coal, like the Maine, were far more likely to suffer spontaneous bunker fires than those carrying anthracite coal. Those salivating for war knew all this but ignored it as they silenced Alger.

In 1974, Admiral Hyman G. Rickover initiated an inquiry into the Maine’s sinking. American, Spanish, and Cuban records were scoured and experts on ship explosions were interviewed. The study concluded that “without a doubt” the Maine had been sunk by a spontaneous combustion fire in a coal bunker that ignited the adjacent magazine. The Spanish had nothing to do with it. Wars had been fought in Cuba and the Philippines, thousands had died, the Spanish empire had shrunk, the American empire had grown, and Roosevelt’s political career had taken flight, all because of a lie. It was a lie the American media helped create and then exploit and that the American people were too willing to believe.

Today, in the revered Arlington National Cemetery, just across the Potomac from Washington, lie the remains of over 14,000 American veterans. On a hilltop near the Tomb of the Unknown Soldier towers a gleaming white mast. It is the Maine’s mast. In 1915 it was salvaged and erected atop a large concrete base resembling a ship’s turret. The mast throws a shadow over the respected dead lying nearby while serving as a monument to the power of lies.


Maine Memorial (Photo: Arlington National Cemetery)

Lies led to the dropping of the world’s worst weapons, were cynically employed to elect a president, and were used to start unnecessary wars. Lies ended lives and changed the world. It is said that we live in a post-truth era. No. No! We can’t afford that luxury, that embarrassment, that threat. Ignorance is not bliss, it’s dangerous. Ask those resting in American military cemeteries laid there by lies or the ghosts haunting Cuba, the Philippines, Japan, Iraq, and Vietnam. Ask Joseph Goebbels.

The media has an awesome responsibility as the citizens’ eyes, ears, and conscience. It must question and say no to power and not be its poodle. Rewriting press releases is not journalism. The media cannot, as Hearst did, and as Fox and others do, report lies or fashion lies of their own for ratings, clicks, and sales while making us dumber and less safe. We must join the media in robbing lies of their power by calling them what they are and calling out those who either don’t speak the truth, don’t seem to care, or don’t know the difference. We deserve the truth. We can handle the truth. We must demand it.

If you enjoyed this column, please share it with others and consider leaving a comment.

Inventing Change: Why We Do the Things We Do

Consider when you showed up at work this morning and the consequences if you were late. How do you measure the power of your car and the light bulbs in your home? Consider your notions of a healthy environment, how your children are educated, and why most of us live where we do.

In that consideration, pay mind to the fact that at the Crofton Pump Station in Wiltshire, south of Birmingham, England, a steam-driven pump is pushing about twelve tons of water a minute to operate the locks along the Kennet and Avon canal. The same pump has been operating efficiently since it was installed in 1812. More than that, the pump’s core technology, and the notion that led to its invention, changed your world and is affecting you today in ways you seldom stop to think about. Change, you see, is sneaky.

Photo: feelgrafix.com

In 17th-century Britain, coal had replaced wood as a source of energy. The need for more coal led to deeper mines, which had a tendency to flood. At first, horses walked in endless circles to power the pumps that drained the mines. Then, drawing on principles first explored by Hero of Alexandria in antiquity, Newcomen engines were developed. They burned coal to boil water and create steam; when the steam inside the engine’s big cylinder was condensed, the resulting vacuum let atmospheric pressure push a piston down, and the piston’s up-and-down motion worked the pump that drained the water. In 1763, an enterprising young Scottish craftsman named James Watt was asked by the University of Glasgow to fix a broken Newcomen steam engine. He did more than that. He undertook a ten-year journey to solve the pump’s inadequacies. He even learned to read Italian and German to study current research.

Watt eventually invented a separate condenser that allowed the cylinder to maintain a constant temperature and made the engine enormously more efficient. He then formed a partnership with businessman Matthew Boulton. With Boulton’s financial backing and the use of his company’s precision tools and machinery, Watt built an entirely new steam engine that paired his separate condenser with rotary motion, delivered through a system of gears, so that the piston’s stroke could turn a wheel. It was powerful, efficient, reliable, and allowed an operator to control its heat and speed.

(For CBC TV fans, Watt’s brilliant assistant who ingeniously developed new tools and ways of doing things was named William Murdoch.)

To sell his engines, Watt calculated that a mill horse could lift about 33,000 pounds of grain one foot in a minute. His engine, however, could move 200 times that amount of grain per minute. He boasted, therefore, that his engine had the equivalent power of 200 horses. A unit of measure had been invented that anyone could easily understand. Watt’s company could barely meet the demand for his 200-horsepower engines.
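For readers who want to see the arithmetic spelled out, here is a minimal sketch in Python, offered purely as an illustration: the 33,000 foot-pounds-per-minute figure is Watt’s own estimate of a horse’s rate of work, the joules-per-foot-pound constant is the modern conversion, and the 200-fold multiple simply follows the column’s own claim.

# A rough sketch of Watt's horsepower arithmetic, using illustrative figures only.
FOOT_POUND_IN_JOULES = 1.3558          # modern value: energy to lift one pound one foot
HORSE_RATE_FT_LB_PER_MIN = 33_000      # Watt's estimate of one mill horse's rate of work

def horsepower(ft_lb_per_min: float) -> float:
    """Convert a rate of work in foot-pounds per minute into Watt's horsepower."""
    return ft_lb_per_min / HORSE_RATE_FT_LB_PER_MIN

def watts(ft_lb_per_min: float) -> float:
    """Convert the same rate of work into watts, that is, joules per second."""
    return ft_lb_per_min * FOOT_POUND_IN_JOULES / 60

# An engine doing 200 times the work of a single horse, as the column describes:
engine_rate = 200 * HORSE_RATE_FT_LB_PER_MIN
print(horsepower(engine_rate))          # -> 200.0 horsepower
print(watts(HORSE_RATE_FT_LB_PER_MIN))  # -> roughly 745.7, one horsepower expressed in watts

The last line also shows why, as noted later in this column, the modern unit of power carries Watt’s name: one horsepower works out to about 746 watts.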

Boulton-Watt steam engines were soon pumping water from every mine in the country. More coal was extracted than ever before. Brewers used the engine to grind ingredients. Steam engines were soon powering cotton-spinning textile factories and flint mills. Giant steam-powered bellows allowed manufacturers to smelt more refined iron than had been previously imaginable. Steam-powered rolling mills produced better quality steel which was used to make better machinery, tools, and buildings. Every industry that switched from water and horses to steam saw its productivity explode.

It was not long before an English inventor, Richard Trevithick, adapted the steam engine to move wheels and, in so doing, created the first locomotive. In 1829, George Stephenson unveiled the Rocket, the world’s fastest and most powerful locomotive, and it was soon moving what had previously been considered unbelievable amounts of freight at unfathomable speeds, up to 36 miles per hour. The world’s first inter-city railway linked Manchester mills to Liverpool’s docks. From there, newly developed steam-powered ocean-going ships made with steel from steam-powered foundries linked those docks to the world.

Britain’s economy boomed. In the first fifty years of the nineteenth century, it became the world’s leading manufacturer and exporter of steel, iron, textiles, and coal. Iron alone increased its production by an astounding 2,500%. A circle was created where colonies provided raw materials and then the markets for finished products. With its far-flung colonies and secure trade routes all protected by its enormous navy, the steam engine and the industrial revolution it had unleashed saw Britain become the richest and most powerful empire of all time.

As in all revolutions, the industrial revolution had winners and losers. The few, the less than one percent, grew enormously wealthy by controlling the import of sugar, cotton, and more from the colonies. Others owned or invested in the railways and shipping lines. A few owned or controlled the mills or, as Marx would call them, the means of production.

And those growing mills, factories, ports, trains, and ships needed workers. Thousands left farms and obsolete village cottage industries. Former farm workers now built the very machines that had replaced them in the first place. Rapid urbanization saw cities swell. London became the economic and cultural capital of the world, its population doubling in only fifty years to 2.7 million. People left relatively independent, self-sufficient lives to live in deplorable conditions and, at work, act like cogs in the machines they serviced. Author Charlotte Brontë wrote in Shirley: A Tale, “Misery generates hate: these sufferers hated the machines which they believed took their bread from them: they hated the buildings which contained those machines; they hated the manufacturers who owned those buildings.”

People living in Africa, Asia, and the Middle East, often against their own will, became underpaid or sometimes unpaid workers who fed British wealth. The need for more textile material led southern American cotton plantation owners to buy more slaves and become so wealthy that, eventually, they thought they could split from the northern states they never liked and create their own country. The ensuing Civil War killed 600,000 Americans.

Back in England, and in every other country that followed its lead into the industrial era, people cared about time for the first time. Farmers had followed the sun and the seasons. But factories didn’t obey nature, they conquered it. Nature’s time was defeated as workers had to show up at a particular hour and were paid by the hour. There were regulated times for breaks, lunch, and going home. Trains had to run on time too, and so schedules were created. The tallest feature in many cities and towns ceased to be the church spire and became the town clock. For a long while, cities set their clocks according to the sun, making schedules impossible to maintain, until a Canadian, Sir Sandford Fleming, reworked the most fundamental part of our existence so that the new society steam had created would work: he mapped out time zones and standardized time.

An education system was created to mimic factory hours and rules. The schools taught the factory mentality of rote learning and obedience to the boss. School was considered practical only if it rendered one better able to work. It was industrial revolution teaching for a determined purpose and not, as the Greeks had envisioned, learning to become a wiser person.

But most kids didn’t attend. Children had worked before but with the massive movement of people and the new, insatiable need for labour, more children than ever came to know 16-hour shifts in the harshest of conditions. The 1832 Sadler Committee Report described parents often being separated from their kids for months or even years at a time and children being denied education, suffering workplace physical and sexual abuse, and sustaining more injuries than adult colleagues due to chronic fatigue. The report said that it was impossible to accurately state the number of children under 10 who died every year on the job.

The burning of so much coal to operate the factories and heat the new homes in the growing cities blackened the sky. It filled lungs with soot and brought disease and death. The rich escaped to big estates outside the cities and far from what radical Christian William Blake called in his poem Jerusalem, “dark satanic mills.” Ironically, many schools, those relics of industrial age educational organization, still maintain Jerusalem as their school song.

The world’s first seismic change, the agrarian revolution, began about twelve thousand years ago when it was discovered that one could grow food instead of chasing it. Farming made land the world’s most valuable resource and so the world’s richest people were those with the most of the stuff. They were called different things in different societies but in Britain, Lords controlled the land and the King, who owned the most land, controlled the Lords. The industrial revolution meant that the richest people were suddenly those who didn’t own the land but controlled the factories. American steel magnate Andrew Carnegie, one of the richest people of the industrial age, in fact, one of the richest people ever, understood the change and how it had happened. He tipped his hat to James Watt by writing a biography of the Scottish inventor.

The world’s scientists understood too. Watt’s enduring influence in having created a new way of harnessing power is remembered each time you turn on a light or power up nearly anything. A unit of power equal to one joule per second is called a watt.

A number of factors cause change and one of the most significant can be a single invention. Inventions are not discoveries. To discover something is impressive but is essentially noticing what already existed. To have noticed black holes in space was not to invent them. James Watt invented the modern steam engine and what that invention wrought changed the world. Although the industrial revolution is over, having given way to the new information age sparked by new inventions, its effects remain with us today in ways we seldom even think about.

I bet you showed up on time this morning. And meanwhile, in Crofton, the pump keeps right on pumping.

If you enjoyed this column, please send it to others and consider checking my other columns at http://www.johnboyko.com

Place and Change: Memphis Changes the World

A shy, skinny, eighteen-year-old truck driver walked into a tiny recording studio and asked to make a record for his mother’s birthday. The receptionist, Marion Keisker, asked if he was a singer. He looked down and mumbled that he was. She asked who he sounded like and he glanced up, grinned, and said, “I don’t sound like nobody.” And he was right. The world was about to change.

The ramshackle recording studio was in Memphis, Tennessee and that mattered. It mattered because place matters. Place has always been a catalyst of change. Memphis had become the continent’s largest inland port a hundred years before because it lay at the intersection of the mighty Mississippi, which flowed from Minnesota, past Memphis, to the Gulf of Mexico, and the Illinois Central Railroad, which tied the city to Chicago and New Orleans. Its role as a vortex for people chasing a buck and a dream was made even more significant by the building of Highway 61 from New Orleans through Memphis to Canada. The river, rail, and road fed and consumed post-WWII prosperity with a vibrancy that could be felt and, even more, heard. A new, angry, joyful, scary music raged as if the place inhaled the surrounding sound and then exhaled a hurricane.


The cotton fields that had ringed Memphis ensured that generations knew of the music African-American slaves sang to pass their sunup-to-sundown workdays. Their songs were mournful melodies, chants, or call-and-response rousers that bled spirituality while expressing justifiable despair and inexplicable hope. From slave songs, field hollers, Negro spirituals, and country gospel came the blues. In 1912, Memphis songwriter W. C. Handy was commissioned to pen a tune for a corrupt Memphis mayor and he called it Memphis Blues. He wrote a number of similar songs and, despite others claiming the title, became the father of the blues.

African-American Memphis businessman Robert Church Sr. purchased land and supported the building of clubs, bars, and the Church Park and Auditorium along what became Beale Street. It offered every known vice and a few it made up. Beale Street became home to a number of African-American-owned businesses and to the clubs where bands and singers played the blues. It attracted performers from Chicago, St. Louis, and New Orleans and every time they came they taught, learned, and went home to spread the news.

The music industry was as segregated as the city. White record shops would not stock “race” music and white radio stations wouldn’t play it. By 1949, Billboard magazine writer Jerry Wexler had developed an appreciation for the new African-American music and decided that instead of “race” music, he would call it rhythm and blues (R&B). It worked. The new name seemed to make it less offensive to white audiences and some white radio stations began to play it. In popularizing the new sounds, Memphis radio stations joined Beale Street clubs where the law was broken and Highway 61 honky-tonks and juke joints where it was ignored altogether.

White society could segregate everything but radio proved that the air didn’t care. White and black folks in Memphis could hear the Grand Ole Opry out of Nashville, with its lively bluegrass, Appalachian folk ballads, and proud and corny country and western based on three chords and the truth. On other stations, they could hear blaring big bands playing quick-tempo jump and swing along with smooth pop epitomized by Frank Sinatra and Dean Martin. But at the same time, Memphis radio station WDIA was among America’s first to risk playing R&B records and it even hired African-American disc jockeys to spin them, including young blues singer Rufus Thomas and Riley King, an exceptional blues guitarist whom everyone called B.B. At WHBQ, Dewey Phillips was the city’s most popular disc jockey. While he was white, his nine-to-midnight Red, Hot, and Blue show played black and white music to a black and white audience. The air over Memphis was desegregating sensibilities below.

Among the R&B records played were 1948’s Good Rockin’ Tonight by Wynonie Harris and Rockin’ At Midnight by Roy Brown. Everyone understood that rock and rockin’ were thinly veiled euphemisms for sex. Sex was absolutely taboo in a society where pregnant teenagers were exiled, sex education was unthinkable, and birth control could not even be purchased by married women. Pile atop that the racist terror of oversexed black men with designs on white women, and the sexed-up “race” music, no matter what it was called, along with all the radio stations, clubs, and honky-tonks popularizing it, seemed both degenerate and dangerous. But it was as unstoppable as the Mississippi.

Among those attracted to the growing Memphis music scene was Alabama disc jockey Sam Phillips. Phillips moved to Memphis in June 1945. His Saturday afternoon WREC radio show became as daring as Dewey Phillips’s (no relation) in mixing black and white records. While working for the radio station at big band shows at the swanky Peabody Hotel, he spoke with white musicians who claimed they played differently when they came to Memphis and had to convert back when they left. He was told of black musicians who played Beale Street bars as well as Highway 61 juke joints and honky-tonks and who also played and sang differently when in or near Memphis.

Phillips saw that the supply of R&B records was unable to meet demand and recognized an opportunity. He rented an old radiator shop in downtown Memphis at 706 Union Avenue and had it renovated. In January 1950, he opened the Memphis Recording Service. With primitive equipment, he recorded anyone with the money to rent time. Most left with nothing but their wax souvenir. Those with a unique song or style, though, found themselves signed to a deal that had Phillips license recordings to established companies that manufactured and distributed them. Through Phillips, independent companies along the rail, road, and river lines in St. Louis, New Orleans, and, most importantly, Chicago, home of Chess Records, began spreading the Memphis sound.

Among those Phillips recorded was B. B. King. King played a version of the blues that wrenched emotion from lyrics and, while still developing his style, defined songs with crisp guitar runs and riffs. Following King into the Memphis studio were bluesmen who honed their talents on Beale Street and whose music bled the amalgam of styles for which the city was becoming known: James Cotton, Rufus Thomas, Junior Parker, Walter Horton, and the man who would become as legendary as B. B. King, Howlin’ Wolf.

A Clarksdale, Mississippi disc jockey, Ike Turner, heard that Phillips was recording black singers. He gathered his band and headed north. At first hearing, Phillips knew he had something special. Saxophonist Jackie Brenston sang the lead on a Turner composition called Rocket 88. The lyrics reveled in double entendre, equating a fast car with faster sex. The drums were relentless and the sax inventive. An amp had fallen off the car’s roof on the trip to Memphis and the resulting damage distorted the guitar, making it growl menacingly.

The 8-bar blues with its driving backbeat sat perfectly at the core of the Venn diagram linking the pop, R&B, country, and blues that Memphis musicians inhabited and traveling bands imitated. Phillips licensed the record to Chess Records and within weeks it was number one on the nation’s R&B charts, with many pop stations and even country stations daring to play it. Rocket 88 was the world’s first rock ‘n’ roll record.

The success of Rocket 88 and other licensed recordings encouraged Phillips to launch his own record company. He called it Sun Records. Starting in February 1952, Sun enjoyed moderate success but Phillips grew increasingly frustrated by the persistent, racist resistance to R&B and blues records. He said to Marion Keisker, “If I could find a white man who had the Negro sound and the Negro feel, I could make a million dollars.” A little while later, on Saturday, June 26, 1954, the shy, skinny Memphis truck driver walked through his door to make his mama’s record. His name was Elvis Presley.

Phillips did not hear Elvis that day, or a few months later when he returned to pay another four dollars to record again. When Phillips was again complaining about not being able to find the right singer to blend black and white, Keisker suggested the kid with the sideburns. Elvis was called and he ran to the studio, arriving panting for breath while Keisker was still on the line. Phillips had a couple of talented session players, guitarist Scotty Moore and stand-up bass player Bill Black, work with the kid. But that rehearsal and then a recording session revealed nothing particularly impressive. They were on a break when Presley spontaneously launched into an Arthur “Big Boy” Crudup R&B song called That’s All Right, Mama. Black and Moore jumped in, all three laughing at the loose-limbed, ragged sound they were making. But Phillips heard what he’d been searching for.

That’s All Right, Mama was quickly pressed and a copy taken to Dewey Phillips at WHBQ. A couple of spins brought phone calls asking to hear it again and again. The record was played on Memphis radio stations and its local and then regional success put Presley on the road. He bought his clothes at Lansky Brothers, the Beale Street shop that dressed the street’s black performers. His on-stage gyrations were variations on those of the black performers he had seen in Beale Street clubs. He sang, and would soon record, more black R&B songs. But with equal conviction, he wore his hair and sideburns in a defiant, white-trash truck driver style and also sang white ballads, gospel, pop, and the country numbers he loved. He was, in short, the embodiment of Memphis, the meeting place, with its new music absorbing influences from the lines that connected it to the world, synthesizing them, and sending them back with the challenge to question the barriers of class, race, age, and gender, and the concepts of right and wrong, fun and indecent.

Presley’s growing success afforded even more allure to Memphis. Carl Perkins grew up in grinding, rural Tennessee poverty. He took his guitar and dream to Memphis where he consummated the marriage of country and rock ‘n’ roll in a new variant called rockabilly. His second Sun Records release, Blue Suede Shoes, became a national hit for him and then Elvis. Hoping to become a gospel singer, Johnny Cash moved from Arkansas to Memphis where Sam Phillips encouraged him to sing his own compositions including his second Sun release, Folsom Prison Blues. It contains music’s nastiest line: “I shot a man in Reno, just to watch him die.” Roy Orbison was enjoying little success in his native Texas but knew of the musical mecca that Memphis had become. He impressed Sam Phillips with his three-octave range, was signed to Sun, and soon Ooby Dooby was a national hit. Jerry Lee Lewis attacked a piano more than he played it. He was drawn to Memphis from Louisiana and, after a stint as a Sun Records session player, recorded Crazy Arms and then the blatantly sexual Whole Lotta Shakin’ Going On and Great Balls of Fire.


The Million Dollar Quartet: Lewis, Perkins, Cash, and Presley, Sun Records, December 1956. (Photo: The Commercial Appeal)

By 1956-’57, the new music that Memphis had been central to creating was topping national charts, being heard on TV, and filling juke boxes, theatres, and arenas. Parents were yelling upstairs to turn that noise down. Rock ‘n’ roll had become a central element in the transformation of first America and then the western world from old to new. It provided an impetus and soundtrack for the move from the white, patriarchal, sexually repressed world of segregated people and ideas to what would become the more liberal, modern era. Rock ‘n’ roll was the voice of the baby boom, the gigantic demographic whose power was its numbers and a determination to be heard its creed. Rock ‘n’ roll was the notification that the generation that had survived the Depression and war and now yearned for things to be calm, controlled, and predictable, was losing its existential battle for cultural supremacy. It was the bridge from the composed assurance of Eisenhower to the audacious vibrancy of Kennedy.

Memphis was the place of change and the change could not be contained. Up Highway 61, in Hibbing, Minnesota, Bob Zimmerman heard the news and would soon change his name to Dylan and immortalize the highway in song. Across the Atlantic, sailors smuggled American records into Liverpool and Manchester where kids named John, Paul, Mick, and Keith studied them and then helped England lead rock ‘n’ roll’s second wave and, with it, inaugurate a new phase in the generational revolution. Place would matter again in causing change. And the change began in Memphis.

If you enjoyed this, please send it to others on your social media platform of choice and consider checking my other columns at http://www.johnboyko.com