A Reflection on Marriage and the Family

(Note: An abridged version of this essay was published by the Newport Daily News on June 13, 2020.)

Fifty years ago this weekend, my wife, Geri, and I were married on a sun-drenched day at the Catholic church at West Point, overlooking the picturesque Hudson River Valley.

Reaching such a milestone is at once quite celebratory and sobering; it is also a good occasion for reflection.

Sadly, that time, as today, was a period of great strife and turmoil in our country. The late 1960s were fraught with political and social unrest, spurred on by several movements: civil rights, anti-Vietnam War, feminism, and the counterculture. Six weeks before our wedding on June 14, 1970, Ohio National Guardsmen shot and killed four college students and wounded nine others at Kent State University during a protest against the recent U.S. military incursion into Cambodia. On May 15, state police shot into a dormitory at Jackson State College in Mississippi, killing two students and wounding 12 others.

Despite our progress in civil rights since that period, a resilient racism persists in America, and we have paid for it this past month. Writing recently in the Wall Street Journal, journalist William A. Galston stated: “I fear, as never before, for the future of my country.”

Living four blocks apart in the same hometown in New Jersey, Geri and I have known each other since the age of five. Marriage, unmarred by divorce, was the rule in both our extended families.

On my side, in addition to my own immediate family, I had the example of 14 sets of aunts and uncles, with only two divorces among them. I learned that marriage was commitment.

With all these relatives, most living within a 30-minute drive, my immediate family of five shared Sundays, holidays, weddings, funerals, and other special events. They became important and reliable sources of happy times, dependability, and mutual support.

When the furnace died, Uncle Mike showed up. When godfather Uncle John could not make it, older Cousin George filled in as my sponsor at Confirmation. Uncle Frank showed up with his camera at special family events; he took my Confirmation picture. When the kitchen ceiling was in need of repair, Uncle Gus helped my father fix it. When my mother started to fail, her many sisters went to the local Shop-Rite for the groceries. I learned service to family and dependability. When in need, family shows up.

As early baby-boomers from middle-class, suburban America, both raised in an American-Italian culture, Geri and I searched for mates in our college years. To me personally, it was another essential preparatory step—a rite of passage—to launch into true adulthood. We shared two presumptions: first, that our marriage would be a life-long commitment; second, that we would have several children early. Without children, we would not have a genuine “family.”

The young tend to define love as physical attraction and sexual passion, giddily wonderful and enthralling. It is this; however, whether in six months or six years, the passion eventually wanes. The honeymoon euphoria evolves into the unglamorous everyday—cutting the lawn, packing lunches, emptying the house gutters, completing the tax forms, changing the diapers, driving the kids to soccer practice. Over our 50 years, I have learned that love is also a decision—a decision which one makes often.

In its special report on marriage, “A Looser Knot,” the influential British magazine The Economist looked at marriage worldwide and made three major observations: First, marriage decisions are increasingly made by the young people getting married, not their older relatives. Second, “marriage has changed from a rite of passage to a celebration of love and commitment—a sign that two people who already live together are ready to commit themselves further.” Third, there is a growing acceptance of divorce.

In his essay “The Nuclear Family Was a Mistake” (The Atlantic, March 2020), David Brooks explains that historically, extended families, with several generations living together, were the rule. He argues that the nuclear family, two parents and their children, flourished in America for only a short period, 1950-1965. The successful family of that period has been replaced “by the stressed family of every decade since.”

He states that the pressures on the family have been mostly cultural. The “self,” privacy, and autonomy have become more important than the family. The women’s movement has given women more freedoms and choices. Much more so today, marriage has come to be about individual fulfillment. In my case, my children and grandchildren came to be a part of my fulfillment rather than hindrances to it.

He ends his essay on a positive note, indicating that the stressed nuclear family is giving way to larger “chosen families” and “forged families,” composed of family and friends, which offer the same kinds of benefits the extended family once gave.

My immediate and extended families, coupled with my “forged family” of lifelong friends, have been sources of great happiness, comfort, security, and practical help, as they continue to be. During a downturn in my health years ago, my wife and family pulled me—at times dragged me—through it, sometimes making decisions for me when I could not make them myself. We all pulled each other through the passing of my son, Tom, last year. On a practical note, my twin grandsons, Anthony and Vincent, just finished helping me with back-breaking yardwork.

On the level of American society, I believe the family, along with community, is the best “incubator” of good citizens, teaching important values needed not only for family life but for citizenship: commitment to something larger than self, responsibility, rules and order, the limits of individual freedom, civility and manners, along with other important intangibles, such as faith and trust. Families and communities, rather than schools or government, are the best developers of character and the best inculcators of virtues, such as patience, civility, and empathy, sorely needed today.

In his farewell address to the nation in 1989, President Ronald Reagan said: “All great change in America begins at the dinner table.” I wonder if our great challenge of anti-racism really begins at our family dinner tables.

Of course, I lament the negative trends for the American family I have witnessed in my lifetime: increased divorce rates and increased births to unwed mothers. Since the Great Recession of 2008, the U.S. fertility rate has dropped; last year it hit a record low of 1.7, well below the rate of 2.1 needed to sustain a population.

With all the restrictions and isolation forced upon us during the pandemic, I did observe some benefits. Instead of working out alone at the local fitness center, I frequently took walks with Geri. On our walks on Water Street here in Portsmouth, we encountered whole families walking and biking together, rather than the solitary walkers and bikers before the pandemic. On our walks we greeted more people unknown to us than ever before. My hard-working son-in-law, Marc, was able to spend quality time with his children, my grandchildren. Geri has finally begun to paint for pleasure with our granddaughter, Mary Jane. And after 20 years, without my wife begging me, I finally painted the cellar stairs.

Fred Zilian (zilianblog.com; Twitter: @FredZilian) is an adjunct professor of history and politics at Salve Regina University and a regular columnist.


Adamy, Janet. “U.S. Birthrates Fall to Record-Low Level.” The Wall Street Journal, May 20, 2020.

Brooks, David. “The Nuclear Family Was a Mistake.” The Atlantic, March 2020, 55-69.

Galston, William A. “I’ve Never Been So Afraid for America.” The Wall Street Journal, June 3, 2020.

“A Looser Knot.” Special Report: Marriage. The Economist, November 25, 2017, 1-5.


Little Richard Helped to Birth Rock ’n’ Roll

(Note: This essay was originally published by the Newport Daily News on May 25, 2020.)

The passing of Little Richard earlier this month was a good occasion for me to take an easy stroll down the memory lane of music to the beginning of Rock ’n’ Roll. I dove deep into my music vault to find his first big hit, “Tutti Frutti,” released in September 1955, but I was able to find only “Good Golly, Miss Molly.” Sitting back and listening, I was not disappointed—wild, loud, electrifying exuberance and energy.

Sixty-five years ago, Richard Penniman, known as Little Richard, was there at the dawn of Rock ’n’ Roll. He joined the other originators and shapers, including Chuck Berry, Fats Domino, Bo Diddley, and Elvis Presley and groups like the Platters (“Only You,” “The Great Pretender”), the Penguins (“Earth Angel”), the El Dorados (“At My Front Door”), and the Spaniels (“Goodnite, Sweetheart, Goodnite”).

Speaking of Little Richard, rock historian Richie Unterberger said, “He was crucial in upping the voltage from high-powered R & B into the similar, yet different, guise of rock ’n’ roll.” Richard would succeed in having ten Top-40 hits before he turned to God and gospel music in 1959.


In that transformative year of 1955, I was just a young lad of seven; however, I was lucky enough to have a sister six years older who played this new sound of rock ’n’ roll on the family radio and on her portable phonograph. Sister Diana and girlfriend Arlene sat on the floor in the corner of our living room, clad in slacks or skirts and bobby socks, and swooned to Elvis’ big hits in 1956: “Heartbreak Hotel”; “I Want You, I Need You, I Love You”; “Don’t Be Cruel” with flip side “Hound Dog”; and “Love Me Tender.”

In addition to this music, two movies were released in 1955 which helped to articulate, shape, and propel the raucous and rebellious teenage culture. In March, “The Blackboard Jungle” was released, starring Glenn Ford as a high school history teacher in an inner-city school and Sidney Poitier as a rebellious and musically talented student. Bill Haley and the Comets had released their song, “Rock Around the Clock,” the year before, without great success. However, when the group sang it at the beginning of the movie, teenagers in theaters across the country danced in the aisles. Sixty-five years ago this month, it entered the Top 40 charts, rose to No. 1, and remained there for eight weeks, becoming an iconic song of the birth of Rock ’n’ Roll.


The second significant movie was “Rebel Without a Cause,” starring a young, hard-edged, 1950s-cool James Dean. The film focused on the decay of youth morality, parenting, and generational conflict. Dean died in a car accident one month before its release. Like Sylvester Stallone’s hooded look in “Rocky” twenty years later, James Dean’s look and sense of alienation came to permeate youth culture.

Rock ’n’ Roll and youth culture blasted off in the mid-1950s with Little Richard and these other artists and songs that I remember in particular: In July 1955, Fats Domino released “Ain’t That a Shame,” which hit No. 10 and remained on the charts for 13 weeks. Fats went on to have 37 Top-40 hits between 1955 and 1963.

In August 1955, Chuck Berry released “Maybellene,” which rose to No. 5 and stayed on the charts for 11 weeks. He followed it with “Roll Over Beethoven” in 1956, and “School Day” and “Rock and Roll Music” in 1957.

First recording for Sun Records in Memphis, Tennessee, in 1954, 19-year-old Elvis Presley signed with RCA Records in November 1955. His first big hit on the pop chart, “Heartbreak Hotel,” entered the Top 40 in March 1956. It rose to No. 1 and remained there for eight weeks. In that year alone, Elvis had five No. 1 hits, including “Don’t Be Cruel”/“Hound Dog.” With 11 weeks at the top spot, this hit held the record for most consecutive weeks at No. 1 for decades, until Whitney Houston’s “I Will Always Love You” (14 weeks at No. 1).

The emerging rock ’n’ roll culture brought on a wave of criticism and condemnation from worried pastors, parents, and commentators who branded it “devil’s music.” After all, the up-tempo songs and the dancing at times could exude such sensuality and sexuality.

Some hoped that it was just a passing flash. In “Rock of Ages: The Rolling Stone History of Rock and Roll,” authors Ed Ward, Geoffrey Stokes, and Ken Tucker state that some people thought it “was nothing but a momentary craze, something the teens would grow out of, like the Davy Crockett fad that had swept the nation in 1955.”

But it would be neither tamed nor terminated. As Danny and the Juniors sang in early 1958: “Rock ’n’ roll is here to stay/It will never die/It was meant to be that way/Though I don’t know why/I don’t care what people say/Rock ’n’ roll is here to stay.”

“Closet DJ” Fred Zilian (zilianblog.com; Twitter: @FredZilian) is an adjunct professor of history and politics at Salve Regina University and a regular columnist.


Kot, Greg. “Rock and Roll.” Accessed May 19, 2020.

Ward, Ed, Geoffrey Stokes, and Ken Tucker. Rock of Ages: The Rolling Stone History of Rock & Roll. NY: Rolling Stone Press, 1986.

Weiner, Tim. “Little Richard, Flamboyant Wild Man of Rock ’n’ Roll, Dies at 87.” The New York Times, May 9, 2020. https://www.nytimes.com/2020/05/09/arts/music/little-richard-dead.html. Accessed May 10, 2020.

Whitburn, Joel. The Billboard Book of Top 40 Hits. NY: Billboard Books, 1992.


Coronavirus May Generate New Civic Spirit

April 13, 2020

(Note: This essay was originally published at thehill.com on April 23, 2020, as “Could the coronavirus create a new civic spirit in America?”)

While the coronavirus does unutterable harm to Americans, it also presents us with a great opportunity to think and act once again more as citizens, rather than consumers, techno-internet citizens, and hyphenated Americans, as many of us have become during the last decades.

As a human body is only as healthy as its individual cells, so a civilization or modern state is only as healthy as its individual citizens, a word we draw from the Latin “civis,” meaning citizen. From it we also derive “civic,” “civilized,” and “civilization.” The related Latin word “civilitas” could mean civility or politeness, but to the ancient Romans it could also refer to civic unity or civicism. Mary Beard, the author of “SPQR,” explains it as: “we are all citizens together.”

The ancient Greek concept of citizenship was crucial to the identity and functioning of the Greek city-state. Citizens together took responsibility for the functioning of city government and defense, and for maintaining the proper relationship with the gods. In return, they shared in the city-state’s successes. As the concept evolved, it acquired the meaning of shared ownership of the common good: not just a legal status, but the sense that the citizen was actively involved in the affairs of the city and contributed to its welfare.

To the ancient Greek philosopher Aristotle, a human could reach his full potential only through the city-state. Involvement in public affairs was part of the essence of being a human being. Our word “idiot” stems from the Greek word “idiotes,” used for someone who put private pleasures before public affairs.

During the ancient Roman Republic (509-27 BCE), the idea of shared citizenship among Romans likewise evolved into a key concept. In his book Rubicon, Tom Holland argues that to a Roman, nothing was more sacred or cherished. He states that in republican Rome “to place personal honor above the interests of the entire community was the behavior of a barbarian—or worse yet, a king.”

Roman culture socialized the citizen to place the common good before personal ambition. Indeed, historian Jackson Spielvogel states that the highest Roman virtue was pietas, “the dutiful execution of one’s obligations to one’s fellow citizens, to the gods, and to the state.”

For two decades after World War II, American citizenship, in my experience, hewed fairly closely to that of ancient Greece and republican Rome. I heard my uncles talk of their roles in World War II, TV shows glorified American valor and victory on land and sea, and we had a clear and present danger—the Soviet Union.

My classmates and I practiced scurrying to our battle stations under desks or in protected hallways during nuclear war drills. I was nine when, in 1957, the stark and arresting news came of the Soviets’ success in placing Sputnik into orbit around the Earth. In the succeeding years of the Cold War, we witnessed the arms race, Cuba going communist under Castro only 90 miles off our coast, and then the tense thirteen days of the Cuban Missile Crisis in 1962.

I marched in the annual Memorial Day parades and can remember the ceremonial reading of the list of names of the fallen. At the ripe and receptive age of 14, I heard John F. Kennedy’s famous words which actuated my generation: “And so, my fellow Americans: ask not what your country can do for you—ask what you can do for your country.”

John F. Kennedy at his inauguration


The united early-1960s eventually gave way to the divisive latter-1960s. The 9/11 patriotic moment notwithstanding, the last five decades have seen a shift in many an American mind from the American as citizen to the American dominated by the SELF and by sub-cultures based on such things as race, gender, sexual orientation, and now even technological virtuosity.

The coronavirus pandemic has challenged these mindsets. Since the second week of March, the “I,” “you,” and “them” of American society and culture have been displaced dramatically by the collective “we” and “us.” Vice President Mike Pence: “We will get through this together.” Dr. Deborah Birx: “We shall move through this together in solidarity.” In Rhode Island, Governor Gina Raimondo: “We are all in it together.” CBS News and local news: “We are all in this together.” A local car dealer: “Together we’ll get through this.” ESPN: “Let’s all do our part”; “#one team.” American citizens—all together.

Our renewed civic vocabulary is being matched with civic action on many levels: Digital titans Apple and Google are collaborating to build software for individual smartphones which will enable digital contact tracing. People will be able to determine if they have had contact with those infected. Numerous Facebook community groups have arisen for mutual assistance. Three New Yorkers created Invisible Hands, a website matching volunteers with seniors and other at-risk people needing food and medication. Firefighters have assembled outside of the Brooklyn Hospital Center to cheer workers. And there is the nightly Clapping in New York City: neighbors expressing civic spirit with those across the alley. Of this new ritual, Amanda Hess: “The Clapping is a communal outburst. It is a reminder that though we are isolated, we are not alone.”

The coronavirus pandemic, despite all the damage and death, will benefit us in many ways and make us better prepared for the next one, which may be much more lethal. The mortality rates of the 14th-century Black Death and the 1918 influenza pandemic were far higher. During the Black Death, European cities lost 20-60% of their populations, and in England and Germany entire villages simply disappeared. Historians estimate that between 1347 and 1351 the European population declined 25-50%.

In the 1918 influenza pandemic, about 500 million people worldwide were infected, one-third of the world’s population at the time. At least 50 million people died, with some estimates as high as 100 million, far more deaths than the fallen in World War I.

Having gone through a genuine pandemic and not just a simulation, we will develop effective vaccines and therapies, we will stockpile the necessary personal protective equipment (PPE), and we will refine our federal, state, and local organizations and protocols to wage effective war against the next pandemic.

However, one of the greatest benefits may be a new American civicism which can counter the mindsets of self and sub-culture, inimical to sustaining a truly American civilization.

Fred Zilian, Ph.D., is an adjunct professor of history and politics at Salve Regina University, Newport, Rhode Island. He is the author of “From Confrontation to Cooperation: The Takeover of the National People’s Army by the Bundeswehr.” Follow him on Twitter @FredZilian.


50th Anniversary of Earth Day: Occasion for Hope and Action

(Note: This essay originally appeared on the History News Network (https://historynewsnetwork.org/article/175118) on April 22, 2020. An abridged version appeared in the Newport Daily News on the same day.)

In his No. 1 hit, “Eve of Destruction,” Barry McGuire sang of the many national and international issues plaguing the world in 1965, including: violent conflict (The eastern world, it is explodin’/Violence flarin’, bullets loadin’), nuclear war (If the button is pushed, there’s no running away), racial prejudice (Think of all the hate there is in Red China!/Then take a look around to Selma, Alabama!), human hypocrisy (Hate your next door neighbor, but don’t forget to say grace), and simple fear (And can’t you feel the fears I’m feeling today?) and frustration (This whole crazy world is just too frustratin’).

Though the song did not address any environmental issues, perhaps songwriter P.F. Sloan should have. Five years later, on April 22, 1970, the first Earth Day underlined the many such issues the earth faced. Organized by Denis Hayes, the first Earth Day was celebrated in ceremonies at some two thousand colleges and universities, ten thousand primary and secondary schools, and hundreds of communities across the U.S.


On that first Earth Day, Nobel Prize-winning biochemist, George Wald, addressed an audience at the University of Rhode Island, warning that unless immediate action was taken, civilization would cease within 15-30 years. Writing in this month’s National Geographic magazine, Charles C. Mann paints the unsettling picture of the world in 1970. About 25% of the world’s population was undernourished; about half was living in extreme poverty. The average life expectancy in Africa was less than 46. Famines in West Africa had just killed about one million people. Violent conflict raged in Southeast Asia, the Middle East, Africa, and Latin America.

Environmentally, a flu pandemic originating in Asia in 1968 was still spreading to other parts of the world, eventually killing over a million. Major harbors around the world (London, Boston, Bombay/Mumbai) as well as many great rivers (Danube, Tiber, Mississippi) were polluted. Fumes from leaded gasoline and also smog filled the air. In early 1970, “Life” magazine predicted that “by 1985 air pollution will have reduced the amount of sunlight reaching Earth by one-half.”

Happily, since that first Earth Day there has been great progress on many fronts. Average life expectancy has increased by more than 13 years. Nitrogen fertilizers, better irrigation, and improved seed varieties have boosted food production, allowing it to outpace population growth. Fewer people are malnourished. The proportion of the population with access to improved drinking water jumped from 81% in 1990 to 90% in 2015. Better hygiene, health care, and nutrition cut the maternal death rate by 50% from 1990 to 2015. Pollution has fallen in many places. Masses of people, probably numbering in the billions, have been lifted from poverty into the “middle class.”

Mann concludes: “Thanks to technological advances, political and economic reforms, and cultural changes, average human well-being has, by almost every measure, improved since 1970.”

This substantial progress notwithstanding, the environment on many levels still beckons us to act. Many of its challenges are rooted in runaway CO2 emissions causing global temperatures to rise. The National Oceanic and Atmospheric Administration reports that from the pre-industrial level of 280 parts per million (ppm), the concentration stands today at 407 ppm. The agency also states that the atmosphere last held such a concentration three million years ago.

In October 2018, a report by a United Nations scientific panel stated that if greenhouse gas emissions continue at the current rate, the atmosphere will warm by as much as 2.7 degrees Fahrenheit above preindustrial levels by 2040. Consider how a person feels with a fever of just one degree.

Today’s environmental challenges, both internationally and nationally, are many and significant. Internationally, scientific projections indicate that the oceans stand to rise one to four feet by 2100. Along with stronger storms and higher tides, this rise will threaten the estimated 600 million people who live on the world’s coastlines.

The September 2019 National Geographic magazine focused on the Arctic and reported that the “trend is unmistakable. The Arctic Ocean will be ice free in summer by mid-century.”

Along with the Arctic, the Himalayan glaciers are melting. The Hindu Kush Himalaya Assessment, completed in February 2019 by 210 writers with input from 350 researchers and policymakers from 22 countries, states that rising temperatures in the Himalayas, home to most of the world’s tallest mountains, will melt at least one-third of the region’s glaciers by 2100, even if the world’s most ambitious environmental targets are met.

Tiny pieces of plastic—microplastics—in the world’s oceans have become a major concern over the past decade. Several studies have detected high levels in marine life. In 2017, microplastics were found in 83 percent of tap water samples around the world, and in 94 percent of samples from the U.S. Douglas Quenqua, writing in the New York Times in 2018, gave a sense of the scope of the problem: “in the next 60 seconds, people around the world will purchase one million plastic bottles and two million plastic bags.”

In January, researchers from Tel Aviv University concluded that Africa, with rising temperatures over the past seven decades, is experiencing bigger and more frequent thunderstorms.

Last fall, Italy, especially Venice, experienced historic environmental events. While the country experienced rain-swollen rivers, high winds, and an out-of-season avalanche, Venice in the space of one week had three floods over 1.5 meters. Since records began in 1872, the city had never before had even two such floods in a single year.

Down under in Australia during the past few months, there has been a string of weather events: first drought, then ravaging bushfires, capped with a foot of rain. Scientists call such a cycle “compound extremes.” Former dairy farmer Peter Ruprecht, buffeted by the extreme weather, stated: “We speak about the warmth of Mother Nature, but nature can also be vicious and wild and unforgiving.”

Off the shore of Australia stands the Great Barrier Reef, which recently experienced its third mass bleaching in five years due to warming temperatures. The Reef is one of the world’s most important marine ecosystems, supporting thousands of marine species. Scientists have indicated that other reefs around the world have also been dying at an alarming rate.

Great Barrier Reef


In the past five years, the United States has had its fair share of dramatic environmental events. Last year across the South, many areas faced both elevated temperatures and drought, even though summer had turned to fall. In October, new record highs were set across the region, with temperatures in the high 90s in cities such as Nashville, Tallahassee, and Louisville. This unusual heat wave struck as NOAA reported that the previous summer had tied for the hottest on record for the northern hemisphere.

In California, six of the ten most destructive wildfires in state history took place in 2017-2018, destroying many homes and forested areas. Insurers indicated that increasing population density and a warming planet add up to more uncertainty. According to the state’s Department of Insurance, the wildfires cost insurers $23 billion.

In March 2019, late-winter rains and snowmelt led to record-breaking floods which inundated the Midwest, causing dozens of levees to fail and billions of dollars in damage. According to the Army Corps of Engineers, at least 50 levees were breached. Counties in Iowa, Nebraska, and Missouri were especially hard hit.

Despite these environmental calamities, I remain optimistic about our ability to address and manage these challenges, both nationally and internationally. First, I believe the coronavirus pandemic may stun us, especially the go-it-alone nationalists, into thinking more globally. Global challenges require global cooperation and action.

Second, there are some truly positive trends regarding renewable energy sources. The U.S. government projects that the renewable contribution to our electricity will increase from 19 percent today to 38 percent by 2050. National Geographic reports that renewable energy—mainly wind and solar—is projected to top all other sources of electricity by 2045. During the past decade, electric charging stations for vehicles have finally shown a dramatic increase, with tens of thousands now across the country.

Third, I have tremendous faith in American technology and innovation. For example, scientists are working on technology which will allow the capture and storage of carbon emissions.

Fourth, in my studies of not only American civilization, but also civilizations throughout history, I still believe in the power of human agency: the ability of societies—with the right leadership—to make smart decisions to address clear threats to their civilizations.

Finally, and most importantly, leadership will soon pass to the younger generations, and in my experience—the reporting I read, the undergrad students I teach, the grandchildren I help to nurture—the world’s youth overwhelmingly supports addressing the environmental challenges vigorously. It is they who shall inherit the Earth.

Laura Parker, writing in National Geographic (April 2020), states: “Millions of children have come of age watching ice sheets melt and temperatures rise, and they are fed up with waiting for government leaders to act.” Political scientist Stephen Zunes states: “The Vietnam War served as a trigger to radicalize a generation. Climate change is going to do the same thing.”

Some examples: Beyond Greta Thunberg, the 17-year-old Swedish activist who has been in the headlines, there is Xiuhtezcatl Martinez, 19, a youth director for Earth Guardians in Colorado. He is one of 21 young people suing the federal government, demanding action on climate change to protect their constitutional right to life.

Ghilain Irakoze, 20, is the founder of Wastezon in Rwanda. It uses a mobile app to connect consumers with the recycling industry. Mayumi Sato, 25, from Tokyo, has done environmental work in Thailand, Laos, Nepal and elsewhere. She states: “We all have to take part in climate justice.”

Kehkashan Basu, 19 and now living in Toronto, started the Green Hope Foundation to give young people a voice. She has been active in planting trees in India and Bangladesh. In Germany, Felix Finkbeiner, 22, founded Plant-for-the-Planet in 2007, which has planted 8 million trees in 73 countries.

Delaney Reynolds, 20, in Florida, states in her speeches that at five foot two, she can look forward to age 60, when the sea level in her home state will be at her waist. She says: “It’s incredible that kindergartners can grasp this as a problem and politicians can’t.”

To paraphrase the final line of “Eve of Destruction”: I don’t believe we’re (yet) on the eve of destruction.

Fred Zilian is an adjunct professor of history and politics at Salve Regina University, Newport, RI (www.zilianblog.com; Twitter: @FredZilian)


“The Arctic Is Heating Up.” National Geographic, September 2019.

Diamond, Jared. Collapse: How Societies Choose to Fail or Succeed. NY: Penguin, 2011.

Friedman, Thomas. Hot, Flat, and Crowded: Why We Need a Green Revolution—And How It Can Renew America. NY: Picador/Farrar, Straus and Giroux, 2009.

Mann, Charles C. “New Challenges for Us All.” National Geographic. April, 2020, 36-41.

Parker, Laura. “Fighting for Their Future.” National Geographic. April, 2020, 70-79.


Forgetting Slavery in Rhode Island

(Note: This is the 9th essay in a series on “Slavery in Rhode Island.” For the complete series, please go to http://www.zilianblog.com. This essay was originally published in the Newport Daily News and the Providence Journal on April 13, 2020.)

In the final decades of the 18th and the first half of the 19th century, white New Englanders, including Rhode Islanders, responded to black freedom with rising hostility, seeking to distance themselves from free people of color and to bury and forget slavery.

Joanne Pope Melish, in her book, “Disowning Slavery: Gradual Emancipation and ‘Race’ in New England, 1780-1860,” analyzes the many dimensions of this process. Whites throughout New England took steps to exclude or segregate free people of color. They were generally excluded from juries. In houses of worship, they were restricted to the “Negro gallery” and to “Negro pews.”

This segregation emerged in burials. Whereas formerly the enslaved were buried near their master’s family, with emancipation whites began to segregate the burial plots of free people of color, as in “God’s Little Acre” in Newport’s Common Burying Ground. Also, the corpses of people of color seemed to become inordinately the target of grave robbers, probably for the purpose of dissection.

Headstone of Cuff Gibbs, engraved by his enslaved brother, Pompe Stevens, Common Burying Ground, Newport, RI (Photo: Fred Zilian)

In 1823-24, John Thompson, a free person of color, registered a complaint with the Providence Town Council that the body of his child and also of a woman of color had disappeared. While the Council did agree to offer a $100 reward to “any person who will prosecute to final conviction,” it apparently never followed up and advertised this. This case and others in New England, according to Melish, indicate that whites believed persons of color could be treated “like strangers and criminals, members of a dispossessed class that could be dispossessed of their own bodies with impunity.”

Another aspect of this segregation was the rise of separate schools for children of color. Because of white complaints over inter-racial schools, publicly-funded, separate schools for children of color emerged in Boston (1820), Hartford (1830), and Providence (1838).
In addition, people of color were denied the vote. In many New England towns, free people of color had been discouraged by custom from voting, and in 1822, Rhode Island rescinded their voting rights.

Free people of color fought back against these forms of segregation and discrimination in many ways. They organized their own schools while at the same time protesting public school segregation. They organized their own churches, mutual benefit societies, reading societies and newspapers. They held national and regional conventions to support northern equality and fight southern slavery.

But these achievements were a two-edged sword. Many New Englanders, including Rhode Islanders, derided the efforts by people of color to enact their citizenship. They constructed simple, crude caricatures of them, popularized in humorous anecdotes, cartoons, and broadsides (large posters) which, in general, ridiculed their activities and lifestyles.

A common occasion for this was the annual July 14 celebration by people of color, the anniversary of the closing of the Atlantic slave trade in 1808. Melish states that overall these broadsides sought to ridicule the public activities of free people of color as a sort of pathetic and ineffective “imitation citizenship,” a citizenship of which they simply were not capable. Once whites established this caricature of the “free Negro,” it proliferated to cartoons, stories, and eventually minstrel shows.

The caricatures also depicted people of color as disorderly, hard to control, and dependent. This led whites in New England increasingly to seek their physical removal. One strategy to achieve this was to “warn them out” of towns as “transients” to avoid a public burden.
The records of the Town Council of Providence show the method. There were periodic round-ups of people of color who were “likely to become chargeable” and who were warned out of town unless they could show clear “legal settlement.” However, the records show that many who had lived in Providence for years were still declared “strangers,” and that, compared to poor whites, they were not an inordinate financial burden to the town, according to Melish. She argues that the “menace to the town was imagined.”

By the 1830s and 40s, efforts to send people of color “back” to Africa also increased. By 1830, all New England states (except for Rhode Island) had branches of the American Colonization Society, organized in 1816. The supporters of this movement had various motives; however, as Melish states, all “cast people of color squarely in the role of strangers,” and therefore, “contributed to the effacement of their local history of enslavement and undermined their claims of entitlement to citizenship.”

The final dimension of the purging of free people of color was periodic mob violence against their communities. In Rhode Island, two incidents, both in the Providence area, stand out: In 1824, a mob of whites tore down a number of houses in the black community of Hard Scrabble. In 1831, over a thousand whites were involved in four days of rioting against the Snow Town neighborhood. Four rioters were killed, 14 were wounded, and 18 houses were damaged or destroyed.

At the Colored National Convention in Rochester, NY, in 1853, Frederick Douglass would say: “Our white fellow-countrymen do not know us. They are strangers to our character, ignorant of our capacity, oblivious of our history and progress.”

(I would like to thank Joanne Pope Melish for her assistance with this essay.)

Fred Zilian is an adjunct professor of history and politics at Salve Regina University and a regular columnist.


Earlier Pandemics More Deadly Yet Instructive

(Note: This essay was originally published by the Newport Daily News on March 30, 2020.)

The two most lethal pandemics in western history can help us think about the challenges and stakes we currently face with the coronavirus: the Black Death of the 14th century, and the Great Influenza Pandemic of the early 20th century.

Historian John Kelly, in his book, “The Great Mortality,” calls the Black Death “the greatest natural disaster in human history.”

By virtue of the increased connectedness of the world through globalization, the coronavirus has spread much more rapidly than the Black Death, which could travel only as fast as the fastest horse or sailing ship. Like COVID-19, the Black Death originated in Asia; however, it began not in China but in Central Asia, among the Mongols. As these fierce warriors came to dominate Eurasia, the subsequent movement of people, goods, and rats helped spread the disease east to China and west to Europe.

The plague reached Europe in 1347 when merchants from Genoa (Italy) sailed from the Crimean Peninsula (Black Sea) to Sicily and beyond. One contemporary wrote of the sailors infected with the disease: “When the sailors reached these places [Genoa, Venice, and other Christian areas] and mixed with the people there, it was as if they had brought evil spirits with them.” It took the plague five years to spread throughout all regions of Europe.

While the World Health Organization currently estimates the coronavirus mortality rate at 3.4%, the Black Death mortality rate was exceptionally high. European cities lost 20-60% of their populations. In England and Germany entire villages simply disappeared. Historians estimate that between 1347 and 1351 the European population declined 25-50%.


Reactions to the Black Death varied. With life suddenly so precarious, some indulged themselves. In 1348, Giovanni Boccaccio wrote: “Day and night they went from one tavern to another drinking and carousing unrestrainedly.” Others, seeing the hand of God, sought to repent and cleanse their souls, flogging themselves with whips to win God’s forgiveness. Anti-Semitism rose dramatically as Jews were accused of causing the plague.


The persistence of the Black Death should give us pause. It did not simply burn through Europe in five years and vanish. Rather, there were major outbreaks in 1361, 1369, and then recurrences until the end of the fifteenth century. It was only then that the European population began to recover.

Just over one hundred years ago the world was struggling with what the Centers for Disease Control (CDC) claim was “the most severe pandemic in recent history,” the 1918 Influenza Pandemic.

It was caused by an H1N1 virus with genes originating in birds. A total of about 500 million people worldwide were infected, one-third of the world’s population. At least 50 million people died, with some estimates as high as 100 million; indeed, far more deaths than the fallen in World War I.

The sickness was first identified in the United States in the spring of 1918 among military personnel, and it eventually claimed 675,000 lives in the US. It was so severe that in the period 1917-1918, life expectancy in the US declined about 12 years, to 36.6 for men and 42.2 for women, according to the CDC. It struck most age groups; however, it was unique in that it hit the 20-40 age group especially hard.

Historian John M. Barry quoted one person who lived through it: “It kept people apart. You couldn’t play with your playmates, your classmates, your neighbors. The fear was so great, people were afraid to leave their homes. You had no school life, no church life, nothing. It destroyed all family and community life. People were afraid to kiss one another, afraid to eat with one another. Constantly afraid.”

Like Thucydides writing about the plague in ancient Athens during the Peloponnesian War, David Brooks, writing recently in the New York Times, is correct in considering our current situation also through moral-ethical lenses. In examining several historic pandemics and severe epidemics including the Black Death, Brooks concludes that dread “overwhelms the normal bonds of human affection.” He points out that dire situations can even challenge these bonds within families. He quotes Boccaccio who wrote: “…scarcely to be believed, fathers and mothers were found to abandon their own children, untended, unvisited, to their fate.”

In the course of this current pandemic, we Americans—despite all the wonders of modern medicine and our world-class medical facilities and caregivers—will still be challenged with moral-ethical questions. How many provisions and bottles of sanitizer do I buy, and how much do I leave for my fellow citizens? If the disease spikes, who will get the limited number of beds and respirators, and the use of limited ICU facilities? How should I help the family of a sick or fallen fellow citizen?

Fred Zilian (zilianblog.com; Twitter: @FredZilian) is an adjunct professor of history and politics at Salve Regina University and a regular columnist.

Brooks, David. “Pandemics Kill Compassion, Too.” The New York Times, March 13, 2020.
Calfas, Jennifer. “In U.S., Threat Upends Daily Life.” The Wall Street Journal, March 13, 2020.
Kelly, John. The Great Mortality: An Intimate History of the Black Death, the Most Devastating Plague of All Time. NY: HarperCollins, 2005.
Spielvogel, Jackson. Western Civilization, Volume I, To 1715, 10th ed. Boston: Cengage Learning, 2018.
Zilian, Fred. “Remembering the Great Influenza Pandemic.” The Newport Daily News, December 17, 2018.


The Emancipation of the Enslaved in Rhode Island, Part II

(Note: This is the eighth essay—second part—in a series on Slavery in Rhode Island. It was originally published by the Newport Daily News on February 26, 2020.)

The process of emancipation of the enslaved began on a colony-wide basis with Quaker manumissions in 1773 and ended with the General Assembly abolishing slavery in 1842.

Change as transformational as the emancipation of a people requires the impetus of new ideas and the will and determination of human agents. The new ideas came from eighteen hundred years of Christianity and the 18th-century Enlightenment. The human agents of change were led by white Quakers and by ministers and lay people of other faiths who freed their slaves and fought for the abolition of slavery. They also included the many enslaved and free blacks who enlisted and served with distinction in the War for Independence in the 1st Rhode Island Regiment (the “Black Regiment”).

Of the role that blacks played in the war, historian Christy Clark-Pujara, in her book, Dark Work: The Business of Slavery in Rhode Island, writes: “The actions of the enslaved—running away, joining the military, and lobbying for freedom—in conjunction with an emerging abolition movement had torn at the fabric of slavery and challenged the morality and legitimacy of slaveholding in the new democracy.”

During and after the war, the final group of agents who brought about emancipation was the entire enslaved and free black population who—overtly and covertly, little by little, year in and year out, by acts of omission and commission—fought the system of slavery. Clark-Pujara states: “Enslaved northerners ran away in unprecedented numbers, volunteered for military service, and sued for, bargained for, and bought their freedom.” When freedom was achieved, it might have come quickly or it might have taken decades.

Beginning in the 1750s, these factors began to take effect. During that period, the Quakers began in earnest their criticisms of the trafficking of slaves. In 1769, at a meeting in Greenwich, RI, Quakers appointed a committee to begin manumissions, freeing 49 slaves between 1773 and 1803.

In early 1778, during the War for Independence, the Slave Enlistment Act was passed, providing for the enlistment of former slaves who “presented themselves.” Their masters were compensated between £30 and £120, depending on their age and skills.

In 1779, slaveholders lost the right to sell slaves out of state, a clear sign that slaveholders were losing control of their “property.”

After the War for Independence, the General Assembly—influenced by Quakers and black war veterans—took a major step forward by passing, in February 1784, the Act for the Gradual Abolition of Slavery. All children born to slave mothers after March 1, 1784, were declared free. However, they would be indentured to the town of their birth for a period of time—18 years for women and 21 years for men. While a significant step, the law freed no one immediately and meant nothing personally to those born before March 1. Later amendments raised the period of indenture for women to 21 years and specified that former masters, not towns, were responsible for educating and supporting freed children. While it gave freedom to future African Americans, the later legislation was clearly designed to avoid placing burdens on non-slaveholding whites.

Three years later, over great opposition from slave traders, the General Assembly passed the 1787 Act to Prevent, and to Encourage the Abolition of Slavery and Slave Trade. This act not only barred citizens from participating in the slave trade, it also made clear the intimate connection between slaveholding and the slave trade. Public opinion was shifting against the system of slavery.

By 1810, 97% of blacks in Rhode Island were free; however, Clark-Pujara indicates that most were not freed by these laws, which “were not the catalyst for the disintegration of the institution. Instead, these laws further contributed to an environment in which enslaved people could better negotiate for their freedom….”

Regrettably, despite this legislation and the clear shift in public opinion, the slave trade within the state increased and transformed in the decades after the American Revolution. In the period 1789-1793, the slave trade in the state increased by 30%. Newport resumed its slave trading; however, with the city so devastated by the three-year British occupation, more of the trade shifted to Providence and Bristol. In Providence, Moses Brown fought against slavery while his brother John enriched himself with the business of slavery. During the period 1784-1807, Bristol slave merchant James DeWolf and his family underwrote 88 African slave voyages.

The drive for wealth clearly prevailed over the rule of law and the voice of conscience.

Fred Zilian (zilianblog.com; Twitter: @FredZilian) is an adjunct professor of history and politics at Salve Regina University and a regular columnist.



The Emancipation of the Enslaved in Rhode Island, Part I

(Note: This is the eighth essay in a series on Slavery in Rhode Island. It was originally published by the Newport Daily News on February 25, 2020.)

The enslavement of people of color in the colony of Rhode Island began probably with the founding of the colony in 1636. The process of emancipation of the enslaved began on a colony-wide basis with Quaker manumissions in 1773 and ended with the General Assembly abolishing slavery in 1842.

Change of this order requires ideas and agents of change. In this case, the idea of the natural inequality of humans had to be displaced by the idea of their inherent equality and right to freedom. These profound new ideas emerged with force in the 18th century, in the movement known as the Enlightenment or the Age of Reason. Its core ideas included the inherent dignity, worth, beauty, and potential of humans and the agency of humans to reform society for the better.

Thomas Jefferson enshrined some of these new ideas in the opening paragraphs of the Declaration of Independence. “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights; that among these rights are Life, Liberty, and the pursuit of Happiness.” He spoke to the agency and role of humans in continuing: “That to secure these rights, Governments are instituted among men, deriving their just powers from the consent of the governed….”

When coupled with the basic beliefs of Christianity, these ideas became exceptionally compelling to some, even to those benefiting deeply from the business of slavery.

To effect such a dramatic change, these ideas needed human agents—white people and people of color—to drive them. For whites, the Quakers took the lead. As historian Christy Clark-Pujara states in her book, Dark Work: The Business of Slavery in Rhode Island: “The Quakers were the first European-descended religious group in the Americas to publicly question and eventually prohibit slaveholding among its members; they were also the driving political force behind legal restrictions placed on slaveholding.”

Quakers believed in the spiritual equality of all humans, that they all shared the ability to receive the “inner light” from God, a concept with its roots in the ancient Greek philosophy of Stoicism. Their commitment to emancipation evolved. Initially they saw no conflict with slavery. They believed they could remain holy as long as they treated their slaves well. Many educated their slaves and brought them to their Quaker meetings.

However, eventually many Quakers came to believe that slaveholding was “contrary to true Christianity” and tied slaveholding to slave-trading, making both the focus of their actions. Not only did they free slaves under their control, they also became fervent abolitionists, seeking to legislate the end of slavery. They published broadsides, commentary, and homilies roundly criticizing the horrors of slavery. They sent petitions to legislators. They claimed that the slave trade and slaveholding were “the most barbarous” institutions in history, and they declared that God would punish the entire nation because of these sins.

Other Protestant ministers also served as agents of change. Congregationalist minister Samuel Hopkins was exceptional in this regard. Serving as pastor in Newport, 1770-1803, he became a very vocal, resolute abolitionist after his first few years there. As historian Joseph Conforti writes: “For the first time in his life, the backcountry minister confronted the slave trade’s grim reality. Chained Africans were sometimes unloaded in Newport and sold before his eyes.”

Hopkins came to believe that British tyranny was God’s retribution for the slaveholding and slave trading of the colonists. Moved to action, he preached against slavery and sought to convince wealthy slaveholders and other ministers to join the cause.

Once the War for Independence began, the new nation needed soldiers to fill its ranks. By 1778, with the British offering freedom to slaves and with the Continental Congress calling for more battalions, the Rhode Island General Assembly decided to allow slaves to enlist in the 1st Rhode Island Regiment, which came to be known as the “Black Regiment.” The 1778 Slave Enlistment Act declared: “That every slave so enlisting shall, upon passing muster …, be immediately discharged from the service of his master or mistress, and be absolutely FREE.” Fighting in Rhode Island, New York, and New Jersey, the regiment eventually grew to 226 officers and enlisted men—perhaps 110 of the latter being former black slaves.

Christy Clark-Pujara indicates that this was “the first … step in the legal dismantling of the institution of slavery.” Though the act was revoked five months later because of stiff opposition, this act and the performance of the 1st Rhode Island in the war greatly undermined slavery in Rhode Island. Former slaves and free blacks had enlisted and were fighting next to whites against British tyranny and for “unalienable” human rights. Were not enslaved blacks also human and deserving of these same rights? (See “The Emancipation of the Enslaved in R.I., Part II.”)

Fred Zilian (zilianblog.com; Twitter: @FredZilian) is an adjunct professor of history and politics at Salve Regina University and a regular columnist.





Palmer Raids Attack Anarchists, Communists, but Also Rights

(This essay was originally published by the Newport Daily News on January 31, 2020.)

One hundred years ago this month, the second and final set of “Palmer Raids” took place. These government raids, named after Attorney General A. Mitchell Palmer, targeted mainly Eastern Europeans and Italian immigrants with ties to radical left organizations. This episode in US history again highlighted a recurrent issue for all liberal democracies like the US: In an emergency, when and to what extent is a government allowed to curtail civil liberties and rights in order to protect lives?

To understand the raids, one must understand the context. During World War I (1914-18), there was a strong nationalistic movement in the US against immigrants suspected of possessing excessive loyalty to their countries of origin. The xenophobia was especially strong against Germans, because the war pitted Germany against the United Kingdom, a country with strong ties to the US, and also against the Irish, because they were in revolt against the British.

In 1915, President Woodrow Wilson warned against those immigrants who “poured the poison of disloyalty into the very arteries of our national life,” declaring: “Such creatures of passion, disloyalty, and anarchy must be crushed out.”


In the fall of 1917, as World War I continued, the Russian Revolution erupted. The Russian monarchy was dissolved and replaced eventually by a communist government, led by Vladimir Lenin. Rooted in Marxism, communist ideology not only called capitalism an enemy, it also predicted its ultimate demise, spreading fears in Western democracies. One of the reasons that Lenin withdrew Russia from the allied war effort was his belief that workers of all warring countries, inspired by Russia’s example, would place class identity above national loyalty, forcing a peace settlement. This Revolution and its communist ideology gave rise to the Red Scare in the US, the fear of communist infiltration and subversion.

The fears of many in the US were confirmed when Italian radical anarchists (those shunning all government structures) conducted a series of bombings in 1919. In April, 30 letter bombs were mailed to prominent government and law enforcement officials and businessmen, some exploding and causing harm. On June 2, a second wave of bombings occurred. Italian anarchists exploded large package bombs in eight American cities. One damaged the home of Attorney General Palmer in Washington, DC. Accompanying each package were flyers declaring war on capitalism.

In October, the US Senate demanded action. In response, on November 7, agents of the newly formed General Intelligence Division of the Bureau of Investigation, headed by 24-year-old J. Edgar Hoover, executed raids against the Union of Russian Workers in 12 cities. The arrests exceeded the number of official warrants and were sometimes indiscriminate, sweeping up innocents who happened to be in the wrong place at the wrong time. Of 650 arrested in New York City, the government succeeded in deporting only 43.

On January 2, 1920, the Justice Department launched another series of raids, extending over six weeks. The raids, many again indiscriminate, were conducted in over 30 cities and towns in 23 states. At least 3,000 people were arrested, with some of the arrests and seizures made without search warrants and with the detentions conducted under harsh conditions.

Criticism of the raids eventually erupted. Resigning in protest, Francis Fisher Kane, the US Attorney for the Eastern District of Pennsylvania stated: “It seems to me that the policy of raids against large numbers of individuals is generally unwise and very apt to result in injustice. People not really guilty are likely to be arrested and railroaded through their hearings….” Palmer replied that the raids were warranted given the “epidemic” and asserted the government’s “right for its own preservation….” The Washington Post supported him, indicating: “There is no time to waste on hairsplitting over infringement of liberties.”

In May, 1920, the American Civil Liberties Union, established only five months earlier, published a report documenting and criticizing the unlawful and excessive government actions.

In June, a decision by the Massachusetts District Court Judge George Anderson ordered the discharge of 17 arrested aliens and criticized the government’s actions. He wrote: “…a mob is a mob, whether made up of Government officials acting under instructions from the Department of Justice, or of criminals and loafers and the vicious classes.” This decision essentially halted any further raids.

The anarchist bombing campaign continued intermittently for another 12 years.

Fred Zilian (zilianblog.com; Twitter: @FredZilian) is an adjunct professor of history and politics at Salve Regina University and a regular columnist.



Young Aware of Danger of Smartphones

(This essay is the second in a series on Technology, Society, and the Human Being. It was originally published on January 18, 2020, in the Newport Daily News.)

It is easy for the Baby-Boom Generation to criticize the younger generations for spending too much time on individual screens; however, maybe there is hope. If my undergrad students are any indication, more and more of them are aware of the downside of smartphones and social media.

Over the past several years, I have watched individual screens come to dominate the lives of some of my grandchildren. In some cases one screen is not enough; they need two. Sitting together in front of a large TV, some are not satisfied; simultaneously, most of them are also busy on their smartphones. Regrettably, smartphone use tends to keep them indoors and sedentary.

On a beautiful, sunny Saturday last September at a beach cottage in Little Compton, after devouring the scrambled eggs I had prepared for them, my four youngest grandchildren returned to their bunkbeds and individual screens rather than bike to the beach, explore the woods, climb a tree, shoot hoops, or explore the neighbor’s blackberry patch. Regrettably, this was not a singular event. They tend to default to the smartphone rather than to other activities.

A survey of 1,677 young people, ages 8 to 18, released in October by the nonprofit Common Sense Media, found that the average tween (8-12) spent four hours and 44 minutes daily with entertainment devices. For teens, the figure was an incredible seven hours, 22 minutes. (This did not include time spent in cyberspace listening to music, doing homework, or reading books.)

Since their introduction in 2007, I have been amazed at the places I find people using smartphones: driving a car, driving a lawn mower, riding a skateboard, riding a bike, using the urinal, sitting in a library surrounded by a million books, between sets at the fitness center, and walking out of a college classroom and bumping into me entering.

However, I have found grounds for hope in my Gen Z (born after 1995) college students. For the past four years, on their term exam, I have given Salve Regina undergrads in my Western Civilization class the opportunity to write about one “good aspect” and one “bad aspect” of their civilization. I provided a list of possible subjects but also encouraged them to choose any subject. For three years, 25% to 33% chose to write about the negative impacts of the new technologies. (In some cases they examined technology as a positive aspect as well.) This past fall, however, for the first time, 50% of my students chose to criticize them.

Over these years, the students made different criticisms of the smartphone and social media. Camille said: “I can honestly say that my phone and other electronic devices distract me. Even when I know I need to focus on something, I can’t if I know my phone is there …” Another student indicated: “…people no longer live in the moment. They are always on their phones.”

Several female students addressed the unrealistic beauty standards set by the internet. Lindsay said: “Media in American society is toxic to adolescents. Media portrays celebrities as perfect human specimens without flaws. This causes adolescents, especially girls, to form unrealistic expectations of physical beauty.” Nicole wrote of the ad pop-ups with “…pictures of beautiful people who make normal people self-conscious.” This can lead people to think they are not “good enough,” leading to mental disorders. Hanna made the same point: Social media makes users “feel inadequate,” “…as if you aren’t living up to a social standard.”

Andrew had another criticism: “Many people use social media to put down others and make fun of people,” adding that “social media can ruin a person’s life.”

Nicole addressed an opportunity cost: Smartphones are “taking over people’s lives…. They do not enjoy nature, or other people’s company …. They miss out on parts of life, human interactions, and the many things outdoors ….”

Carla was apocalyptic: “…I think technology is ruining our generation.” “…it is difficult for the modern person to stay connected to what is real.”

By far the most common criticism was what smartphones are doing to inter-personal communication. This past fall nearly every student who criticized smartphones addressed this point. Ray said it has led to “social disconnect.” Al stated bluntly: “The art of public speaking has been lost in my generation.” Sophia said: “We are so dependent on them, a lot of us cannot have a real conversation with someone. Digital screens have completely taken over our lives….” Ainsley pointed out: “…technology has become something that divides us rather than brings us together. It encourages deception and numbs social interaction.”

Finally, several students took their criticisms to a higher level, speaking to the essence of society and human-ness. Kyle said: “…social media can corrupt a human being…. [It] is limiting our ability to communicate with others in person, which is essential to our nature.” Kristin stated that because of the dependence on technology, “generations lack what is necessary to live fully.” There is “a loss of depth, meaning, and fulfillment in life’s experience….”

Dan addressed this same point and tied it to a giant of Western Civilization: “Smartphones prevent humans from fully exploring all the possibilities life has to offer. And as the immortal Socrates said: ‘the unexamined life is not worth living.’”

Fred Zilian (zilianblog.com; Twitter: @FredZilian) is an adjunct professor of history and politics at Salve Regina University and a regular columnist.





