Saturday, March 9, 2024

Cherry blossoms – celebrated in Japan for centuries and gifted to Americans – are an appreciation of impermanence and spring

Families relax under lush cherry trees in the Shinjuku Gyoen in Tokyo. shankar s./Flickr, CC BY
Małgorzata (Gosia) K. Citko-DuPlantis, University of Tennessee

Cherry blossoms mark the beginning of spring. Various festivals are regularly organized in California, Colorado, Georgia, Tennessee and Washington, D.C., to celebrate the bloom of cherry trees.

The blossoms, however, are short-lived and usually fall within a week. Indeed, “sakura,” as the cherry tree is known in Japanese, is a recognized symbol of impermanence in Japan and beyond.

Every year, many people all around Japan gather under the cherry trees in parks and gardens for a spring picnic to watch the blossoms fall while they chat with their companions over seasonal drinks and snacks. Such gatherings are called “hanami,” literally meaning “viewing the flowers.”

As a scholar of premodern Japanese literature and culture, I was introduced to the custom of viewing cherry blossoms early on in my education. It is an ancient ritual that has been celebrated and written about in Japan for centuries and continues to be an indispensable element of welcoming spring. In the U.S., the tradition of hanami started with the first cherry trees being planted in Washington, D.C., in 1912 as a gift of friendship from Japan.

Poetry about nature

The custom of viewing blooming trees in spring arrived in Japan from the Asian continent. Watching blooming plum trees, often by moonlight, as a symbol of strength, vitality and the end of winter had been practiced in China since antiquity. It was adopted in Japan sometime in the eighth century.

Poetic examples of blooming plums, or “ume” in Japanese, are found in “Man’yōshū,” or a “Collection of Ten Thousand Leaves,” the oldest collection of Japanese poetry, which dates to the eighth century.

Scholar of East Asian Literatures Wiebke Denecke explains that classical Japanese poets wrote poetry about plum blossoms when they were in season. Their compositions shaped Japanese court poetry, or “waka” in Japanese, which is rooted in nature and its constant seasonal cycle.

However, it is the sakura, not plum trees, that occupies a special place in Japanese culture. Imperial waka anthologies compiled in Japan between 905 and 1439 C.E. usually contain more spring poems composed about cherry blossoms than plum blossoms.

Central to waka composition

The first cherry blossom viewing was held by Emperor Saga in 812 C.E. and soon became a regular event at the imperial court, often accompanied by music, food and writing poetry.

Cherry blossoms became one of the regular topics of waka composition. In fact, I started studying Japanese poetry thanks to a sakura-themed poem written by a classical female poet, Izumi Shikibu, who is believed to have actively composed waka around 1000 C.E. The poem is prefaced with its author’s memory about her ex-lover wishing to see the cherry blossoms again before they fall.

tō o koyo
saku to miru ma ni
chirinu beshi
tsuyu to hana to no
naka zo yo no naka
Come quickly!
As soon as they start to open
they must fall.
Our world dwells
in dew on top of the cherry blossoms.

The poem is not the most famous example of waka about cherry blossoms in premodern Japanese poetry, but it contains layers of traditional imagery symbolizing impermanence. It emphasizes that once cherry blossoms bloom, they are destined to fall. Witnessing the moment of their fall is the very purpose of hanami.

Dew is usually interpreted as a symbol of tears in waka, but it can also be read more erotically as a reference to other bodily fluids. Such an interpretation reveals the poem to be an allusion to a romantic relationship, which is as fragile as evaporating dew on soon-falling cherry blossoms; it does not last long, so it should be appreciated while it exists.

A blossoming Japanese tree laden with clusters of pink flowers in a garden.
In Japan, cherry blossoms symbolize impermanence. Elvin/Flickr, CC BY-NC

The poem can also be interpreted more generally: Dew is a symbol of human life, and the fall of cherry blossoms a metaphor for death.

Militarized by the Empire of Japan

The notion of falling cherry blossoms was used by the Empire of Japan, a historic state that existed from the Meiji Restoration in 1868 until the enactment of the Constitution of Japan in 1947. The empire is known for the colonization of Taiwan and annexation of Korea to expand its territories.

Sasaki Nobutsuna, a scholar of Japanese classics with strong ties to the imperial court, was a supporter of the empire’s nationalistic ideology. In 1894, he composed a lengthy poem, “Shina seibatsu no uta,” or “The Song of the Conquest of the Chinese,” to coincide with the First Sino-Japanese War, which lasted from 1894 to 1895. The poem compares falling cherry blossoms to the sacrifice of Japanese soldiers who fall in battles for their country and emperor.

Commodification of the season

In contemporary Japan, the cherry blossoms are celebrated by many members of society, not only the imperial court. Blooming around the Lunar New Year, which was celebrated in premodern Japan for centuries, they are symbolic of new beginnings in all areas of life.

In the contemporary era, vendors have commodified the cherry blossoms, selling sakura-flavored tea, coffee, ice cream, drinks or cookies, turning the image of blooming sakura into a seasonal brand. Weather forecasts track the cherry trees’ bloom to ensure that everyone has a chance to participate in the ancient ritual of viewing sakura.

The obsession with cherry blossoms may seem trivial, but hanami gathers people during an era when much communication is conducted virtually and remotely, uniting family members, friends, coworkers and sometimes even strangers, as happened to me when I lived in Japan.

Viewing sakura is also evidence of modern Japan’s unique relationship with its own history. At the same time, it is a reminder that impermanence is possibly the only constant in life.

Two rows of tall trees with clusters of pink flowers on either side of a pathway.
Cherry trees, with their lovely blossoms, arrived in Washington, D.C., as a gift from Japan. Danny Navarro/Flickr, CC BY-SA

Today, cherry blossoms are celebrated in spring all around the world, encouraging the appreciation of impermanence through observation of nature.

Małgorzata (Gosia) K. Citko-DuPlantis, Assistant Professor in Japanese Literature and Culture, University of Tennessee

This article is republished from The Conversation under a Creative Commons license. 

Combatting Loneliness in Older Adults

The bonds found in friendships and other relationships are an important factor in health and wellness – even science says so.

According to the American Psychological Association, forming and maintaining social connections at any age is one of the most reliable predictors of a healthy, happy and long life. Studies show having strong and supportive friendships can fend off depression and anxiety, lower blood pressure and heart rates in stressful situations and change the way people perceive daunting tasks.

However, statistics show approximately half of U.S. adults lack companionship and feel socially disconnected, according to the U.S. Surgeon General’s Advisory on the Healing Effects of Social Connection and Community. In fact, 12% don’t have anyone they consider a close friend, per the Survey Center on American Life. This “epidemic of loneliness,” as coined by U.S. Surgeon General Dr. Vivek Murthy, can take a severe toll on mental and physical health.

As people age, the risks of isolation increase. With America’s older population growing rapidly – the 65 and older population reached more than 55 million in 2020 – discussing how older adults can combat loneliness is relevant to public health and individual well-being.

Consider volunteering, which is one of the best and most rewarding ways to combat loneliness.

Volunteering Combats Loneliness
People often volunteer to find a sense of purpose, learn new skills, improve their communities or establish new routines after retiring or becoming empty nesters. For many, making friends through volunteer work is a welcome bonus. The act of volunteering provides proven benefits for older adults.

Forming connections can make all the difference in a person’s volunteer experience and sense of well-being. People who meet through volunteer work inherently share a common interest and something to bond over. These friendships can carry over outside of volunteer work and lead to bonding over other hobbies and interests.

Connection-Focused Volunteer Opportunities
In addition to making friends with fellow volunteers, many older adults also form relationships with the people they’re serving, especially if those recipients are their peers.

For example, AmeriCorps Seniors is the national service and volunteerism program in the federal agency of AmeriCorps that connects adults aged 55 and up to local service opportunities that match their interests. Its Senior Companion Program pairs volunteers with other older adults or those with disabilities who need companionship or assistance. Volunteers may help with tasks such as paying bills, shopping or getting companions to appointments. In some cases, volunteers may also provide support and respite for family members caring for loved ones with chronic illnesses.

“We often think of volunteering as ‘giving back,’ but we’ve seen firsthand that it often becomes so much more than that,” said Atalaya Sergi, director of AmeriCorps Seniors. “By spending a few hours each week with another older adult in need of support, our volunteers are not only giving back to others, but they’re adding meaning to their own lives and establishing new connections. They’re helping to fight the loneliness epidemic one visit at a time.”

Growing older can come with challenges, but some of those can be minimized with a positive mindset and commitment to remaining connected and engaged – whether with friends, relatives or fellow community members. Fostering relationships is a key ingredient to a healthier and more fulfilling life.

For more information and to find volunteer opportunities near you, visit AmeriCorps.gov/YourMoment.

Meet Friends Who Connected Through Service

Ray Maestas felt unfulfilled post-retirement and began volunteering with the AmeriCorps Seniors Senior Companion Program. He was connected with Bob Finnerty, a man with blindness looking for assistance a few days each week. They quickly struck up a routine of errands, reading and conversation that’s since become a friendship they both cherish.

“The Senior Companion Program has provided an avenue to enrich the lives of not only the participants but the people who are volunteering,” Maestas said. “Bob and I have gotten to the point where he’s a very important part of my life.”

Finnerty echoed those sentiments and shared his own appreciation for Maestas’ friendship.

“I’ve always relished my independence and I feel Ray is not just a person who reads for me – he’s a friend,” Finnerty said.

In the last few years, Maestas moved and now serves with a different chapter of the Senior Companion Program. He and Finnerty keep in touch; Maestas said they talk roughly every three days.

SOURCE:
AmeriCorps Seniors

Friday, March 8, 2024

The atomic bomb, exile and a test of brotherly bonds: Robert and Frank Oppenheimer

A rift in thinking about who should control powerful new technologies sent the brothers on diverging paths. For one, the story ended with a mission to bring science to the public.

Every now and then, science serves up poison pills. Knowledge gained in the course of exploring how nature works opens doors we might wish had stayed shut: For much of the past year, our newsfeeds were flooded with stories about how computational superpowers can create amoral nonhuman “minds” that may learn to think better than we do (and then what?). On the big screen, the movie Oppenheimer explored a threat people have lived with for nearly 80 years: how the energy of the atom can be unleashed to power unimaginably destructive bombs.

When potentially catastrophic inventions threaten all humanity, who decides how (or whether) they’re used? When even scientists toss around terms like “human extinction,” whose voice matters?

Such questions were at the core of the Oppenheimer film, a blockbuster now nominated for more than a dozen Oscars. To me, the movie hit home for a different reason. I spent a great deal of time with Frank Oppenheimer during the last 15 years of his life. While I never knew his brother Robert, Frank remained anguished over what he felt was Robert’s squandered opportunity to engage the world’s people in candid conversations about how to protect themselves under the shadow of this new threat.

During the post-World War II years, the emotionally close ties between the brothers (Robert — the “father of the atom bomb” — and his younger brother, Frank — the “uncle” of the bomb, as he mischievously called himself) were strained and for a time even fractured. Both hoped that the nascent nuclear technology would remain under global, and peaceful, control. Both hoped that the sheer horror of the weapons they helped to build could lead to a warless world.

They were on the same side, but not on the same page when it came to tactics.

Robert — whose fame surged after the war — believed decisions should be left to experts who understood the issues and had the power to make things happen — that is, people like himself. Frank believed just as fiercely that everyday people had to be involved. It took everyone to win the war, he argued, and it would take everyone to win the peace.

In the end, both lost. Both paid for their efforts with their careers (although Frank eventually resurrected his ideas as a “people’s science museum” that had a worldwide impact).

Given that the question “Who decides?” underlies so much of today’s fast-evolving sciences, the brothers’ story seems more compelling and relevant than ever.

Ethical education

In many ways, the Oppenheimer brothers were very much alike. Both studied physics. Both chain-smoked. Both loved art and literature. Both had piercing blue eyes, inherited from their mother, Ella Friedman Oppenheimer, an artist with a malformed hand always hidden in a glove. Their father, Julius, was a trustee of the Society for Ethical Culture, dedicated to “love of the right.”

They shared a Manhattan apartment with maids, Renoirs, and books piled down the halls and into the bathrooms. Ella was terrified of germs, so tutors and barbers often came to them. Frank had his tonsils out in his bedroom. Both boys attended Ethical Culture schools in New York, so morality was baked into their upbringing.

But they were also in other ways opposites.

Robert was, by his own admission, “an unctuous, repulsively good little boy.” Frank was anything but. He sneaked out at night to scale New York City’s rooftop water towers; by high school, he was using the electric current in the family home to weld whatever metal he could get his hands on. He took apart his father’s player piano (then stayed up all night putting it back together).

Robert got through Harvard in three years and received his PhD from the University of Göttingen two years later, in 1927, at age 23. Frank didn’t get his PhD until he was 27. Robert was arrogant, picky about his company. Frank would talk with anyone and did, later befriending even his FBI tail.

When Robert joined the faculty at Caltech, he was described as “a sort of patron saint,” always center stage, smooth, articulate, captivating. When Frank arrived at Caltech many years later for graduate work, he was described as standing “at the fringe, shoulders hunched over, clothes mussed and frayed, fingers still dirty from the laboratory.”

Still, they loved each other dearly. Frank wept when Robert left for grad school in Europe. Robert wrote Frank that he would gladly give up his vacation “for one evening with you.” He sent his little brother books on physics and chemistry, a sextant, compasses, a metronome, along with letters full of brotherly wisdom. My personal favorite: “To try to be happy is to try to build a machine with no other specifications than it shall run noiselessly.”

In summer, they retreated to a cabin in the mountains of New Mexico, which Robert called Perro Caliente (Spanish for “hot dog”). They rode horses over 13,000-foot peaks, 1,000 miles a summer. During one night ride, Robert got knocked off his horse. “He was very thin anyway,” Frank said. “Here was this little bit of protoplasm on the ground, not moving. It was scary, but he was all right.”

On a road trip back to Caltech, Frank rolled the car into a ditch, breaking Robert’s arm. When Robert stopped at a store to get a sling, he came back with a bright red one, to cheer up his little brother, who he knew was feeling bad about the accident.

The world around them was fraught, with fascism on the rise in Germany, Italy and Spain. The Depression meant people were still out of work. Robert kept mostly aloof from politics, but Frank dived in. He married a UC Berkeley student who was a member of the Young Communist League, then joined himself. He admired the Communists for taking unemployment seriously — and for understanding the threats posed by Hitler and Mussolini. His personal tipping point was the treatment of Blacks at a Pasadena public pool: Blacks were allowed only on Wednesdays; the pool was drained before the whites came back on Thursday. Only the Communist Party seemed concerned.

Robert didn’t approve of Frank’s decision to join the party, and he didn’t approve of his wife, Jackie, either, referring to her as “that waitress.” He accused Frank of being “slow” because it took him what Robert regarded as too long to get his PhD. He called Frank’s marriage “infantile.” The feelings became mutual. Jackie later regarded Robert and his wife, Kitty, as pretentious, phony and tight.

Frank soon realized that he wasn’t cut out to be a Communist, and quit. He felt the party was too authoritarian, and not as interested in social justice as in petty bickering. (Robert never joined, although Kitty had been a party member.)

From quantum theory to atom smashers

The brothers were both working as physicists when the Japanese bombed Pearl Harbor in 1941. Robert, the theorist, was sharing the revolutionary physics of quantum mechanics with his American colleagues at Berkeley and Caltech, where he had joint appointments. Frank, a natural-born experimentalist, was working with Ernest Lawrence at Berkeley on the rapidly developing technology of particle accelerators — known to some as “atom smashers.”

Once it became clear that the enormous energy contained in the atomic nucleus could be used to build a bomb — and that Nazi Germany might well be doing just that — President Franklin Roosevelt approved a major American effort to beat them to it: the Manhattan Project. It came as a surprise to everyone when Gen. Leslie Groves tapped Robert as director. Seemingly overnight, the ethereal young man who enjoyed reading poetry in Sanskrit became the ringleader of the most concentrated collection of brilliant minds ever assembled — scientists summoned from around the world to a makeshift lab on a desolate New Mexico mesa, where they would build an atomic bomb to stop Hitler.

Frank, meanwhile, worked with Lawrence on what he called “racetracks” (officially calutrons) used to coax small but vital amounts of pure uranium-235 out of a dirty mix of isotopes by steering them in circles with magnets. Uranium-235, like plutonium-239, is easily split, just what was needed to set off a chain reaction. Since no one knew how to bring together a critical mass of the stuff to make an explosion, two designs were pursued simultaneously. The plutonium bomb acquired the nickname Fat Man; the uranium bomb was Little Boy.

Frank helped supervise an enormous complex for uranium separation at Oak Ridge, Tennessee. Frank liked Gen. Groves, and Groves, in turn, liked Frank — and later defended him when he was booted from physics for his politics.

As the time to test the bomb approached, Frank joined his brother at the Trinity site, a dry scrubby desert formerly part of the Alamogordo Bombing Range. Frank, who saw his job (ironically enough) as a “safety inspector,” mapped escape routes through the desert and made sure workers wore hard hats.

Finally, on July 16, 1945, the go-ahead was given. After a long night on edge watching driving rain and lightning rage around “the gadget” — a Fat Man-style plutonium bomb perched on a 100-foot-tall tower — the proverbial (and literal) button was pushed.

The brothers lay together at the nearest bunker, five miles away, heads to the ground. Frank later described the “unearthly hovering cloud. It was very bright and very purple and very awesome … and all the thunder of the blast was bouncing, bouncing back and forth on the cliffs and hills. The echoing went on and on.…” The cloud, he said “just seemed to hang there forever.”

Frank and his brother embraced each other: “I think we just said: ‘It worked.’”

On August 6, 1945, Little Boy was dropped on the pristine city of Hiroshima — which had been deliberately untouched by US bombs, the better to assess the damage. In an instant, the city was all but flattened, people reduced to charred cinders, survivors hobbling around with their skin peeled off and hanging from their bodies like rags. An estimated 140,000 people were killed in the attack and in the months after, according to Japanese authorities.

Frank heard the news outside his brother’s office at Los Alamos. “Up to then I don’t think I’d really thought of all those flattened people,” he said. The US bombing of Nagasaki with Fat Man just days later brought the death toll even higher.

Some physicists saw their success as a moral failure. Still, many — including Frank and Robert — also hoped this new weapon would cause people to see the world differently; they hoped it would ultimately bring about peace. “Those were the days when we all drank one toast only,” Robert said: “‘No more wars.’”

Intolerable weapon

After the war, the brothers’ lives diverged, driven by circumstance, in ways that were painful to both.

Robert was a hero; he mingled easily with the powerful. Famously, he was Einstein’s boss — director of the Institute for Advanced Study in Princeton. He chaired a committee to advise the government on a new and vastly more powerful type of bomb — the hydrogen bomb. Rather than split atoms, it fused them, using the physics of stars. The H-bomb could be 1,000 times more powerful than Little Boy.

Robert’s committee voted unanimously against developing it, warning: “The extreme dangers to mankind inherent in this proposal wholly outweigh any military advantage that could come from this weapon.” They described it as a “threat to the future of the human race which is intolerable.”

Frank, meanwhile, had joined the physics department at the University of Minnesota, building detectors to catch cosmic rays streaming from space with equipment tethered to balloons he frequently lost but chased gamely through Cuban forests and other remote locations. He was excited about their discovery that the cosmic ray particles were not merely protons, as people had assumed, but the nuclei of many elements — from hydrogen to gold — implying that some were forged in supernova explosions.

At the same time, he was giving speeches “all over the map,” as he put it, trying to educate the public about nuclear bombs, trying to explain what 1,000 times more powerful really meant. He spoke to bankers, civic associations, schools. He argued that so-called “smart” people weren’t all that different from everyone else. The mistrust of the “hoi polloi,” Frank thought, stemmed largely from the tendency of people to credit their own success to a single personal characteristic, which they then “idolize” and use to measure everyone else by the same yardstick.

He believed people would educate themselves if they thought their voices mattered. “All of us have seen, especially during the war, the enormous increase in the competence of people that results from a sense of responsibility,” he said. Building the “racetracks” during the war had required training thousands of people “fresh from farms and woods to operate and repair the weirdest and most complicated equipment.”

Soon, his physics career was cut short. The FBI had been keeping tabs on both brothers for years, pausing only for the war, when military intelligence took over. Agents followed them everywhere, tapped their phones, planted microphones in their houses.

In 1949, Frank received a summons to appear before the House Un-American Activities Committee (HUAC), where he refused to take the fifth, but also refused to testify about anyone other than himself. He was effectively fired from the University of Minnesota physics department, leaving the chair’s office in tears.

Attempts to find work elsewhere were blocked at every turn, despite support from multiple Nobel laureates, Gen. Groves and even H-bomb enthusiast Edward Teller. Finally, an FBI agent told Frank flat out: If he wanted a job, he had to cooperate. “Then I realized what the wall was.”

Out of options, and having just purchased a ranch to live on “someday,” Frank and Jackie became serious cattle ranchers, learning from neighbors and veterinary manuals. (The FBI was right on their tails, pestering neighbors for information, suggesting they were broadcasting atomic secrets to Mexico.) All the while, Frank thought and wrote about physics and peace, civil rights, ethics, education and the critical role of honesty in science and public life.

Robert did not approve of any of Frank’s activities. He thought there wasn’t time to bring the public in on the debate; he thought he could use his fame and power to influence policy in Washington toward peaceful ends. Frank expressed his disgust at what he considered his brother’s futile and elitist approach. Robert made it clear that he thought the idea of becoming a rancher was a little silly — as well as beneath Frank.

Frank felt he could no longer reach him. “I saw my bro in Chicago,” Frank wrote his best friend Robert Wilson at Cornell in an undated letter probably from the early 1950s. “I fear that I merely amused him slightly when, in brotherly love, I told him that I was still confident that someday he would do something that I was proud of.”

A man destroyed

Robert’s now-famous downfall was swift. Many great books have been written about the subject (not to mention Christopher Nolan’s colossal film); in effect, he was punished for his opposition to the H-bomb, and probably for his arrogance and naivete as well. After a series of secret hearings, his security clearance was revoked; he was, by all accounts, a ruined man.

It wasn’t something Frank liked to talk about. “He trusted his ability to talk to people and convince them,” Frank said. “But he was up against people that weren’t used to being convinced by conversation.”

Some of Robert’s most poignant testimony during the hearings involved Frank. Asked if his brother had ever been a Communist, Robert answered: “Mr. Chairman … I ask you not to press these questions about my brother. If they are important to you, you can ask him. I will answer, if asked, but I beg of you not to ask me these questions.”

The broader tragedy for both brothers was that the creation of the world’s most fearsome weapon of mass destruction — a thing too horrible ever to use — didn’t much change how people viewed war. The H-bomb was just another weapon.

“What undid him,” Frank said, “was not just his fall from official grace, but the fact that this fall represented a defeat for the kind of civilized behavior that he had hoped nations would adopt.”

Robert died at the age of 62, in 1967. Frank’s last memory of his brother is poignantly familial. Robert was lying in bed, in great pain from throat cancer. Frank lay down beside him and together they watched Perry Mason on TV.

A new path

While Robert was being politically destroyed, Frank had started teaching science in a one-room schoolhouse. Before long, students from Pagosa Springs, Colorado, were winning the state science fairs. Eventually allowed into academia by the University of Colorado in 1959, Frank promptly built a “library of experiments” out of equipment scavenged from other labs.

That “library” in time grew into a vast public playground of scientific stuff housed in the abandoned Palace of Fine Arts in San Francisco. Exhibits — sometimes sophisticated and delicate — were meant to be played with, even broken; no guards stopped people from touching anything, no rules prevented theft — and remarkably, there was almost none. He called it an Exploratorium so people wouldn’t think it was a “museum” where good behavior was expected (although he liked the idea that “no one flunks a museum”). Top scientists and artists from around the globe contributed time and talent. Barbara Gamow, wife of the physicist George Gamow, painted a sign to hang over the machine shop: Here is Being Created an Exploratorium, a Community Museum Dedicated to Awareness.

In the end, I like to think Frank proved his brother (and most everyone else) wrong about the willingness of everyday people to engage and learn. The “so-called inattentive public,” he’d said, would come to life if people didn’t feel “fooled and lied to,” if they felt valued and respected. And if people got addicted to figuring things out for themselves, they’d be inoculated against having to take the word of whatever bullies happened to be in power. Society could tap into this collective wisdom to solve pressing global problems — the only way he thought it could work.

Today, Exploratorium-style science centers exist in some form all over the globe.

I count myself as one of Frank’s many thousands of addicts, hooked on science (a subject I’d found boring) the minute I met him in 1971. (In a weird resonance with today, my first foray into journalism was a piece on the Soviet invasion of Czechoslovakia for the New York Times Magazine.) I was interested in peace, not physics. Frank talked me into writing for him, explaining optics and wave mechanics to the public. My first editor was Jackie. Over the years, Frank and I spent endless hours chatting about life, art, science and his family, including his brother.

Nolan’s film Oppenheimer doesn’t offer much insight into Robert’s thoughts on science and peace or science and human morality. However, Robert did think and talk about these ideas, many of which are collected in his 1954 book Science and the Common Understanding, as well as other places.

Frank continued to get upset (and a little drunk) every August 6, the day Hiroshima was bombed. He’d rub his forehead hard, as if he was trying to rub something out. He had much the same reaction to many previous dramatizations of the Oppenheimer story, because he thought they focused too much on the fall of his brother, rather than on the failure of attempts to use the horror of the bomb to build a warless world.

Frank’s fierce integrity permeated our work together: He refused to call me writer/editor because he said that meant writer divided by editor. Instead, I was his Exploratorium Expositor.

If someone said, “It’s impossible to know something, or impossible to adequately thank someone,” he’d argue: It’s not impossible, it’s only very, very, very hard.

No matter what impossible thing Frank was trying to do, he refused to be stopped by so-called “real world” obstacles. “It’s not the real world,” he’d rage. “It’s a world we made up.” We could do better. In fact, so many of what we came to call “Frankisms” seem more relevant today than ever:

“The worst thing a son of a bitch can do to you is turn you into a son of a bitch.”

“Artists and scientists are the official noticers of society.”

“If we stop trying to understand things, we’ll all be sunk.”

Navigating the dark side of science, I think, will require attending closely to all of these. The “real world” we’re presented with is not the way things have to be. We shouldn’t become sons of bitches. We can never stop noticing or trying to understand.

Easy Ideas to Stretch Your Retirement Budget: Ways seniors can save on enjoyable activities

Retirement may mean you have unlimited time to enjoy each day, but it doesn’t mean you have a budget to match.

You probably already know staying active is essential for aging with grace, so instead of letting limited funds keep you at home, explore some ways you can enjoy your leisure time without breaking the bank.

Hit the Gym
Many fitness centers offer special rates and programs for older adults. Hitting the track or joining a group fitness class are easy ways to socialize while getting some exercise. The discounted membership is also an investment since staying fit is important for physical and mental health.

Enjoy Early Dinner Deals
You can still enjoy dining out occasionally, especially if you take advantage of lower-cost meals designed with older adults in mind. Many specials are for meals earlier in the day, which is consistent with a growing trend toward earlier dining. According to Yelp, the share of people eating from 4-6 p.m. has grown 9 percentage points, to 26% from 17% in 2019. Eating earlier promotes better digestion, and earlier meals are often lighter portions for smaller appetites. For example, Cracker Barrel’s Early Dinner Deals feature smaller portions served from 4-6 p.m. on weekdays. Menu items include a variety of homestyle favorites like chicken n’ dumplins, meatloaf, catfish and more.

Check Out the Library
Your local library is filled with hours of free entertainment, and not just the kind you check out with a library card. You can undoubtedly find a book covering any genre or interest you can name, but most local libraries also offer programming tailored to special interests, with sessions typically free or low cost. It’s an easy, affordable way to pick up a new skill, meet a favorite author, learn about a topic that intrigues you and more. Other resources to explore include your library’s DVD collection and internet access if you don’t have a computer at home.

Nurture a Garden
Tending a garden may seem like a seasonal activity, but you can make it a year-round hobby. Researching and planning are good ways to carry your gardening enthusiasm into the cooler months, and you can start seedlings indoors to extend your growing season. While you’re digging into this low-cost pastime, remember the results of your efforts, such as fresh fruits and veggies, can help cut your grocery costs, too.

Mind Your Money with DIY
Saving money at the grocery store is just one way you can make DIY projects work for you. There are dozens of other examples of ways you can put your skills and interests to use by passing time doing something you enjoy while benefiting your bank account. If you like to tinker with cars, figure out what repairs you can handle yourself and avoid hefty service fees. Crafting and sewing might mean you have ready-made gifts for special occasions and a way to repair or repurpose damaged clothing instead of discarding it.

Ask About Discounts
You may be surprised by how many places offer discounts for older adults that they don’t readily advertise. In some cases, you’ll find the information on their website or signage, but other times, you may find it easier to just ask. When you’re booking an appointment or checking out, inquire about discounts for older adults, including any restrictions, age requirements, the amount of the discount and other pertinent details. Sometimes the discounts are offered on certain days or for specific services, or they may require you to join a loyalty club to access the discounts. When dining out, many restaurants offer a variety of loyalty perks. Rewards members at Cracker Barrel can earn points, or “Pegs,” on qualifying restaurant and retail purchases. Members can also take advantage of bonus birthday, anniversary and surprise rewards throughout the year.

To find a location near you, visit crackerbarrel.com/locations.

 

SOURCE:
Cracker Barrel

Sunday, March 3, 2024

Caitlin Clark’s historic scoring record shines a spotlight on the history of the Association for Intercollegiate Athletics for Women

University of Iowa guard Caitlin Clark celebrates after making the game-winning shot against Michigan State on Jan. 2, 2024. Matthew Holst/Getty Images
Diane Williams, McDaniel College

When University of Iowa women’s basketball star Caitlin Clark drained a 3-pointer against the University of Michigan on Feb. 15, 2024, she secured the NCAA women’s scoring record.

Announcers noted that Clark had surpassed Kelsey Plum’s 3,527 points. But few added that one more Division I women’s scoring record still stood.

That one belonged to guard Lynette Woodard, who scored 3,649 points while playing for the University of Kansas from 1978 to 1981. Her record was set before the NCAA offered women’s championships, when the Association for Intercollegiate Athletics for Women, or AIAW, was in charge.

When Clark surpassed Woodard’s AIAW milestone on Feb. 28, 2024, in the fourth quarter of a game against the University of Minnesota, it opened up another chance to revisit this buried piece of sport history.

The AIAW launched in 1972. Within a decade it was bigger than the NCAA, with nearly 1,000 member colleges and universities. It sponsored 19 sports in three divisions, was the sole organization for women’s intercollegiate athletics and the only one led by women. And the NCAA destroyed it through what SUNY Cortland sports management professor Lindsey Darvin described as a “hostile takeover.”

As a scholar of sport, gender and American culture, I study the AIAW as a key moment in sports history that has been buried, and I’m currently writing a book exploring its philosophy, impact and legacy.

In any history of women’s sports in the U.S., you’ll hear a lot about Title IX, the federal law dictating that female college athletes must receive equal opportunities in sports.

But you’ll rarely hear about the AIAW, a sporting body led by women that fundamentally changed intercollegiate sports. Its student-centered governance model continues to resonate as college athletes chip away at the power of the NCAA, whether it’s through the transfer portal or name, image and likeness deals.

Designed for women, by women

Throughout the early part of the 20th century, female college students participated in physical education classes focused on health and wellness. There were few opportunities for organized team sports.

By the 1960s, however, women students demanded school-sponsored intercollegiate teams and championships like the men had.

Women professors of physical education agreed. But they had watched the NCAA’s commercial model of sport descend into exploitation and scandal under what historians have called the “cynical fiction” of amateurism. As the NCAA remained exclusively male, there was an opportunity to create something different for women’s athletics.

The AIAW emerged from that momentum – an intercollegiate athletic governance organization designed for and by women, dedicated to creating high-level competition while maintaining focus on the well-being and education of student-athletes.

Under the AIAW, all teams and athletes were supported equally, not singled out for their ability to generate revenue. They had a right to due process, an appeals system and student representatives on local and national committees. The organization ran on dues from member schools and eventually some advertising and media contracts.

Women’s athletic programs were led by physical educators turned coaches and administrators. Some of the most famous coaches in women’s basketball got their start under the AIAW, including C. Vivian Stringer, Pat Summitt and Tara VanDerveer, who recently broke the all-time record for college basketball wins.

In addition to Woodard, other notable AIAW players include Ann Meyers-Drysdale, Nancy Lieberman and Lusia Harris, who was recently the subject of an Oscar-winning documentary.

Young woman with short hair poses while dribbling a basketball and wearing a red, white and blue Team USA jersey.
After starring at the University of Kansas, Lynette Woodard went on to play for the Harlem Globetrotters, Team USA and the WNBA. Tony Duffy/Allsport/Getty Images

Title IX backlash

There is no doubt that Title IX, which was signed into law in 1972, had a big influence on the growth of women’s college sports, mandating that educational activities, including athletics, should be the same for men and women.

Congress passed Title IX just before the AIAW’s first championship season, and the law spurred calls for more equitable resources for women’s sports.

There was immediate backlash from male-dominated sporting organizations, including the NCAA, which saw the addition of women’s sports as a loss for men’s sports. Walter Byers, then the NCAA’s executive director, said, “The possible doom of college sports is near.” One college football official told reporter Sally Jenkins that women’s sports advocates were trying “to tear the shirts off our backs.”

Despite the fearmongering, college sports continued to thrive. Nonetheless, over the past 50 years, even though nearly all schools have been out of athletic compliance with Title IX, none has lost federal funding for violations. As Title IX scholar Sarah Fields has written, “Without punitive damages, the law is limited: it is toothless.”

All along, change has come not from the law’s mere existence but from students filing complaints and lawsuits, and the determination of administrators to use the law to carve out and protect athletic opportunities for women. During the 1970s, those administrators were almost all in the AIAW.

The NCAA elbows its way in

By the late 1970s, the U.S. Department of Health, Education, and Welfare had laid out clearer standards for athletic compliance with Title IX.

While the NCAA and AIAW were not subject to the law, their member institutions were, and the two organizations’ efforts to collaborate failed. Instead, the NCAA, which had long fought Title IX’s application in athletics, changed course and set its sights on taking control of women’s sports.

The NCAA offered women’s championships in all three divisions for the first time during the 1981-82 school year. Leveraging all of its presumed legitimacy and financial resources, the 75-year-old men’s athletic organization offered all-expenses-paid women’s championships on the same weekends as the unpaid AIAW championships.

The strategy worked. The AIAW lost significant members and ceased operations in mid-1982, despite the fact that women athletes, coaches and administrators preferred its educational model and leadership structure.

The NCAA made vague promises to support women’s athletics but refused to give women more than token representation on its governance boards. Women student-athletes were, for the first time, led by a male-dominated governance organization.

To this day, institutional sexism remains entrenched in the NCAA.

Women hold only 41.3% of head coaching positions for women’s teams and 23.9% of athletic director positions – roles that were largely held by women under the AIAW. A recent gender equity review found that the organization under-resourced nearly all of its women’s championships, a result of gender bias and its focus on making money.

The NCAA and its corporate partners would like you to believe that their organization is the be-all and end-all of college sports.

But the story of the AIAW – created by and for women, rejecting the crass commercialism of the NCAA and empowering student-athletes to speak up – offers ideas for a more equitable future for college sports.

Diane Williams, Assistant Professor of Kinesiology, McDaniel College

This article is republished from The Conversation under a Creative Commons license. 

Friday, March 1, 2024

Find Your Perfect Spring Escape

4 pet-friendly and affordable destinations for a spring getaway

Whether you’re a family with kids or a young professional looking for a getaway, it isn’t too late – or out of budget – to plan a memorable spring trip. All you need is a full tank of gas and your furry best friend to make an unforgettable getaway.

Data from online travel agency Booking.com shows that half of travelers plan to choose vacation destinations in 2024 where the cost of living is lower than in their hometowns. Exploring lesser-known destinations with a variety of outdoor activities, opting for a road trip with your pet rather than a large group and traveling outside of peak season can all help make adventures more affordable.

To help travelers feel confident selecting their road trip destinations this spring, Motel 6, where pets always stay for free, and Bert Sperling’s Best Places recommend these undiscovered destinations that offer sight-seeing, access to dog parks, cultural experiences, green spaces, authentic cuisine, dog-friendly restaurants and affordable lodging.

Santa Fe, New Mexico
Dive into desert culture in Santa Fe with stunning views of the Sangre de Cristo mountains, Pueblo-style architecture, historic landmarks and pet-friendly dining patios. With near-endless activities like shopping for handcrafted jewelry, visiting the Museum of International Folk Art or walking the Santa Fe Plaza, there are entertainment options for everyone to enjoy. There are also plenty of affordable lodging options within walking distance of downtown attractions.

Branson, Missouri
Situated in the heart of the iconic Ozarks, Branson is a dream small-town getaway for family vacations with a plethora of dining and entertainment options such as Silver Dollar City, Dolly Parton’s Stampede and the Titanic Museum. The city has a dog-friendly culture with plenty of parks and outdoor activities. Located just two miles from many of these local attractions, Motel 6 Branson welcomes the whole family, including those on four legs, at no additional cost. This location also offers amenities like free Wi-Fi, an expansive cable channel selection, a microwave and refrigerator in each room and guest laundry facilities.

Tempe, Arizona
If you’re seeking sunshine and fresh air, look no further than Tempe, a vibrant city located just south of Phoenix. From festivals and outdoor activities like golfing, hiking, kayaking or stand-up paddle boarding on Tempe Town Lake to visiting the Tempe Center for the Arts or local pet-friendly eateries, there are plenty of things to do in the low desert valley.

Chattanooga, Tennessee
For those looking to escape fast-paced city life with an outdoor getaway, Chattanooga is a perfect destination to enjoy outdoor activities, such as exploring Lookout Mountain or walking along the Tennessee River. In a city full of culture and history, visitors can enjoy local artwork at the Hunter Museum of American Art or go sightseeing in the historic Bluff View Art District. As the temperatures rise, embrace the rays at the seasonal pool alongside pet-friendly lodging at Motel 6 Chattanooga.

As you look to plan your getaway, visit Motel6.com to find pet-friendly and affordable lodging accommodations as well as more ideas to point you in the right direction on your spring excursion.

SOURCE:
Motel 6


The future of work: Why we need to think beyond the hype of the four-day week


Yaëlle Amsallem, ESCP Business School and Emmanuelle Léon, ESCP Business School

Is reducing working hours a sign of progress? Since the 19th century, the number of hours spent at work has been steadily declining in developed countries.

The four-day week emerged in the 1990s as a political and economic demand for a more equal division of work. The idea was to reduce the number of hours worked so that more people could access employment. This approach, developed in 1993 by French economist Pierre Larrouturou, was tested in 1996 with the de Robien law on the organisation of working hours. In France, business leaders such as Antoine Riboud, CEO of the multinational food-products firm Danone, championed the idea as a way of boosting recruitment. However, the law was repealed in the early 2000s with the labour reform that introduced the 35-hour week. Elsewhere, in Germany, Volkswagen adopted the four-day week in 1994 to save 30,000 jobs, only to abandon it in 2006.

The Covid crisis and its associated lockdowns brought this debate back into the spotlight. The widespread adoption of working from home, the use of new technologies and the increase in flexibility have profoundly transformed the way we work. This period also reinforced employees’ desire for a better work-life balance. As a result, 56% of British employees would accept earning less money in exchange for more free time.

Against this backdrop, the debate on the four-day week is resurfacing. Countries in Asia and Oceania are looking at ways to organise their workforces in order to reengage their employees. In New Zealand, the government introduced a four-day week at the end of the pandemic to boost productivity and improve work-life balance. In Japan, several companies have also come on board, including Hitachi and Microsoft. This measure, presented as a means of combating overwork culture, is also an opportunity to significantly improve productivity (by 40% in the case of Microsoft).

Europe is following suit, starting with the countries of Northern Europe, followed by the UK, Germany, Spain, Portugal and France.

This reform can take various shapes – each of them presenting specific challenges.

A four-day week or a week squeezed into four days?

The first approach is the most popular: an unchanged number of working hours, concentrated over four days. This is the model implemented by Belgium and the Nordic countries. In autumn 2022, Belgium passed a law on the four-day week, called the “deal for employment”: employees can work four days without any reduction in salary because their weekly working time remains the same. In Italy, the Intesa Sanpaolo bank is doing the same. In France, a similar scheme was proposed in March 2023 to the employees of Urssaf Picardie, but it was a complete failure. The cause: parenthood. The longer days left parents unable to take their children to and from school.

This is a new form of temporal flexibility, without any reduction in working hours. As economist Éric Heyer points out:

“We shouldn’t confuse the ‘four-day’ week, which reduces working time, with the ‘week in four days’, which compresses it.”

The challenge, then, is to work differently so that the quality of work does not suffer as a result of intensification.

Working less, working better

The second approach is the true ideal of the four-day week, namely the 32-hour week: shorter working hours thanks to increased productivity. It has been implemented in Southern Europe (Spain, Portugal).

This formula is based on the idea of maintaining productivity by identifying and reducing unproductive time and by streamlining certain processes, notably reporting and participation in meetings. Working less, yes, but above all working better. In effect, it would cut everything considered superfluous. That said, putting the organisation on a diet reduces its ability to adapt to rapid changes in its environment. For example, we now know that “down times” facilitate the exchange of information between teams.

This approach is deeply embedded in the idea that technology will compensate for any loss of productivity, a recurring theme since the publication of The End of Work in 1995 by American essayist Jeremy Rifkin. The arrival of generative artificial intelligence has brought the concept back to the forefront. Bill Gates even talks about the imminent arrival of the three-day week.

Since the advent of the industrial world, organisations have constantly sought to optimise working time. For many years, it simply kept pace with the production line. Working time and time at work were perfectly synonymous. Today, we don’t have to go to the office to work: work has moved into our personal spaces. Working time has become detached from office time. With the four-day week, the aim is to frame work in terms of time rather than space. Sarah Proust, an expert associated with the Fondation Jean-Jaurès, explains:

“What is at issue here is the organisation and distribution of work, rather than the place we intend to give to work in society.”

Toward a new work paradigm?

Instead of focusing on the volume of hours, shouldn’t we be talking about the very nature of work? In the words of economist Timothée Parrique, we need to stop predicting the future of work with ideas like the four-day week, and start inventing the work of the future.

A growing body of research, notably in the wake of anthropologist David Graeber, is highlighting the loss of meaning at work, the rise of “bullshit jobs” and the “revolt of the top of the class”, to borrow the title of journalist Jean-Laurent Cassely’s book.

Unfortunately, reorganising working hours will not be enough to reengage one’s workforce. Working time is above all a “hygiene factor”, as psychologist Frederick Irving Herzberg explains. It cannot deliver the motivation so hoped for by managers; it can only temper employee dissatisfaction. To make work a source of personal fulfilment and satisfaction, higher-ups need to activate genuine “motivational factors”, such as valuing the work accomplished, supporting employees’ autonomy or making work tasks more interesting.

Perhaps we need to create new utopias of work along the lines of Ecotopia: The Notebooks and Reports of William Weston, Ernest Callenbach’s book (1975) that imagined three West Coast states seceding from the USA to establish a radically ecological way of life. In it, Callenbach imagines a new model of society where people only work 22 hours a week. This utopia depicts economies where a large proportion of the available hours are devoted to social, political, cultural and environmental activities. Ecotopia advocates personal and collective fulfilment before individual success. Businesses are self-managed, public transport is free, education and health are accessible to all, criminal violence is absent, universal income is in force and recycling, sobriety and degrowth are the rule.

Callenbach wanted to give us a glimpse of a world he believed to be better, not only for the environment, but also for the individual balance of each person. As we live longer than ever, and as work occupies less time in our lives, we need to imagine, not a new way of working, but a new way of living.

Yaëlle Amsallem, Doctorante, Assistante de recherche de la Chaire Reinventing Work, ESCP Business School and Emmanuelle Léon, Professeure associée, Directrice scientifique de la Chaire Reinventing Work, ESCP Business School

This article is republished from The Conversation under a Creative Commons license. 

Tuesday, February 27, 2024

Why probation and parole don’t work as advertised

The current system of supervised release in lieu of imprisonment may do more harm than good, some experts say. How can society do a better job of rehabilitating law-breakers while keeping them from re-offending?

Marcella Soto had four children by the time she was 22 and was on and off welfare during the years they were growing up. In 2018, she was working a government job in California when she was charged with six counts of welfare fraud. Unable to prove that she had not misreported her income many years earlier, she pleaded guilty to a single felony count.

Soto was sentenced to the maximum probation term — five years — and was required to check in with a supervisor every month. Because of the conviction, she lost her job; because of the conditions of her probation, she was unable to travel to Texas to attend the birth of her first grandchild.

Being sentenced to probation put Soto in the largest group of people in America’s criminal justice system. About 1.9 million US residents are behind bars, and 3.7 million are being monitored while they are on probation in lieu of incarceration or on parole after being let out of lockup.

Probation and parole — collectively known as community supervision — were originally conceived as alternatives to incarceration, allowing convicted criminals to be rehabilitated under supervision. But criminal justice leaders say the practices have strayed from that original mission and become so ineffective that, ironically, they contribute to America’s overcrowded prisons. These critics call for an overhaul of community supervision, including shorter terms and more support for rehabilitation. In some states, new laws are making headway.

“Nearly half of the people going into jails and prisons are coming in from a failed and broken probation and parole system,” says Robert Rooks, chief executive officer of Reform Alliance, a nonprofit advocacy group. “So if you want to address mass incarceration in this country, you have to address the phase of probation and parole.”

Changing goals

Parole and probation were originally intended as opportunities to rehabilitate offenders through support, such as help finding jobs or housing. This focus on rehabilitation began to recede in the 1970s when a tough-on-crime public sentiment emerged. Parole and probation morphed instead into systems of surveillance — intense scrutiny over long periods of time — supposedly in support of public safety.

With this shift, critics say, community supervision became a form of “net-widening,” keeping people in the criminal justice system rather than easing them back into society. “In many people’s minds, this is a good thing you get instead of going to prison,” says Vincent Schiraldi, a former commissioner of the New York City Department of Probation. “But this is a bad thing. It’s got a lot of bad outcomes for a lot of people.”

For instance, probationers must comply with a growing, often complex list of conditions — which typically number 18 to 20 — that can be difficult to meet, Schiraldi says. Avoiding contact with anyone who has a criminal conviction is hard if an offender’s family or support network includes convicts; returning home by curfew can interfere with keeping a job. In Pennsylvania, where a released prisoner could be on parole for the rest of their life, probationers are prohibited from leaving the state, which means an Uber driver on probation in Philadelphia cannot drive into the suburbs that spill into Delaware.

“That right there shows you that our current policies are arbitrary, unnecessary and hinder people’s ability to do the things they need to do to become stable, get back to work and provide for their families,” Rooks says. “And that’s the opposite of what probation and parole should be doing.”

Indeed, a 1993 survey of people imprisoned in Texas found that 66 percent said they would choose incarceration over a 10-year probation sentence. When Schiraldi was New York City’s probation commissioner in 2010, he saw a woman give up her probation, choosing instead to go to Rikers Island jail, because she was unable to find childcare that she needed for her probation check-ins, where children were not allowed.

All these problems might be acceptable if probation and parole were meeting the goal of reducing incarceration without leading to more crime. Schiraldi and two colleagues examined that proposition in the 2023 Annual Review of Criminology — and concluded that they are not.

Community supervision clearly fails at reducing incarceration. The more people living under probation and parole in one year, the higher the incarceration rate the following year, according to work by Schiraldi and his coauthors. That reflects, in part, the fact that failing to comply with terms of supervision can mean a ticket to jail. In 2017, revocation of parole or probation accounted for 45 percent of prison admissions, and in 20 states, more than half of those revocations were not for new crimes, but for violating the terms of supervision.

The evidence on public safety is more equivocal. If releasing people on parole and probation poses a risk of further crime, then the more people released on supervision in a given year, the higher the crime rate would be in the following year. But that isn’t the case. Overall, a state’s probation, parole and total supervision rate in one year does not predict the state’s rate of index crime — a term that includes violent crime plus several kinds of property crimes — in the following year.

For parole alone, however, the researchers found that the more parolees in a given year, the more violent crime the next year. That implies that parole could be risky.

But looking at the issue in a different way, Urban Institute researchers showed no clear risk, as well as no benefits, from parole. The team reviewed long-term Bureau of Justice Statistics data on 38,624 prisoners from 14 states released from prison in 1994 and found that parole supervision does not substantially affect recidivism or public safety. People simply released from prison without supervision were no more likely to be rearrested than those who were required to complete a term of parole after their sentence, they found — though people paroled before the end of their prison sentences did have lower rearrest rates.

Other evidence also suggests that recidivism may arise early, so long supervision terms may not be helpful in reducing crime. For example, among felony probationers in Oregon who were rearrested within three years of their probation, 69 percent were arrested in the first year, according to data from the Oregon Criminal Justice Commission. That suggests that the early months of probation and parole may be a critical period for helping offenders change their behavior and connecting them with community services.

In addition, making community supervision less punitive has been shown to work. One in-depth study of 283 offenders in an intensive supervision program in Wyoming compared the effect of rewards, such as praise or removal of electronic monitoring, to punishments, such as reprimands or tightened curfew. The offenders were most likely to complete their supervision successfully if they received at least four times as many rewards as sanctions.

Several studies show that employment for people under supervision helps to reduce recidivism. A transitional job program in New York City for people leaving prison led to a 9 percentage-point reduction in any type of recidivism (arrest, conviction or jail) over the three-year follow-up period compared to results with parolees not in the program. That suggests that supporting employment — rather than making it difficult by travel restrictions — is a good idea.

Evidence like this led Schiraldi and Barbara Broderick, former chief probation officer for Maricopa County in Arizona, to launch Exit, a group made up of current and former community supervision leaders, victims, prosecutors and others who want to see probation and parole downsized and returned to their original purpose of rehabilitation.

Among other aims, they want people on probation to be able to earn time off supervision if they maintain good behavior and achieve milestones like graduating from high school or keeping a job. They want the money saved from reduced supervision to be invested in community-based services that support people on probation or parole. They want incarceration for technical violations to be eliminated in favor of more and better supportive services that help people on parole and probation reintegrate into their communities.

A better way

Some of the greatest progress has been in California, where a series of reform laws since 2007 has transformed the incarceration and community supervision landscape, Rooks says. The reforms reduced prison sentences and supervision periods for many offenses and encouraged the use of evidence-based supervision strategies such as needs assessments, tailoring the intensity of supervision to a probationer’s risk of recidivism, and increased referrals to counseling, substance abuse treatment and employment services.

“In many people’s minds, [probation] is a good thing you get instead of going to prison. But this is a bad thing.”

VINCENT SCHIRALDI

Because of these and other reforms, the number of Californians under community supervision fell from 477,733 in 2006 to 306,500 in 2020, a decrease of about 36 percent. Meanwhile, reported crime declined by 7.4 percent during those years — and a study found the reforms had no measurable effect on violent crime, suggesting that less-punitive treatment did not increase crime.

Another California reform, passed in 2020, limits most probation sentences for misdemeanors to one year and most probation sentences for felonies to two. The change is projected to reduce the number of people on probation by a third, avert more than 48,000 prison stays because of technical violations and save the state $2.1 billion over five years, according to calculations by Recidiviz, a nonprofit organization that compiles and analyzes data to support criminal justice reform.

The saved money could be used to address the root causes of criminal behavior. For example, in early 2021 the El Dorado County Probation Department opened a house where probationers experiencing or at risk of homelessness can live and receive support services. “It allows you to free up the resources to give people the help that they need, which is what probation really should be about,” Rooks says.

When Soto learned about California’s new probation term limits in 2021, her probation officer did not support reducing her probation sentence, so Soto went to court. “The judge right away said, ‘You know what? I got her report. She’s never been in trouble and she’s working,’” she recalls. “And they let me off three years early.”

Now living in Oklahoma, where she works as a warehouse manager and lives with her daughter and the grandson whose birth she missed, Soto remembers that day. “I could travel and be free, and I didn’t have to worry about the visitation from the officer,” she said. “I was able to get my life back.”

By Lola Butcher

Saturday, February 24, 2024

Why is free time still so elusive?

Massive gains in productivity haven’t led to more time free from work. J Studios/DigitalVision via Getty Images
Gary Cross, Penn State

There have been massive gains in productivity over the past century.

So why are people still working so hard for so long?

Output per worker increased by almost 300% between 1950 and 2018 in the U.S. The standard American workweek, meanwhile, has remained unchanged, at about 40 hours.

This paradox is especially notable in the U.S., where the average work year is 1,767 hours compared with 1,354 in Germany, a difference largely due to Americans’ lack of vacation time.

Some might argue that Americans are just more hardworking. But shouldn’t more productive work be rewarded with more time free from work?

This is the central theme of my new book, “Free Time: The History of an Elusive Ideal.”

Keynes misses the mark

Many economists see the status quo mostly as a choice: People would simply rather have more money. So they prioritize work over free time.

However, in the past, many economists assumed that people’s need for more stuff would eventually be met. At that point, they would choose more free time.

In fact, one of the most famous economists of the 20th century, John Maynard Keynes, confidently predicted in 1930 that within a century, the normal workweek would decrease to 15 hours. Yet Americans in their prime working age are still on the job 41.7 hours per week.

John Maynard Keynes. Library of Congress

Why was Keynes wrong?

Obviously, people’s needs or wants were not fully met. In the first half of the 20th century, advertising shifted in ways that emphasized emotions over utility, making consumers feel like they needed to buy more stuff; planned obsolescence shortened how long products remained functional or fashionable, spurring more frequent purchases; and new, exciting – but costly – goods and services kept consumerism churning.

So workers continued to labor for long hours to earn enough money to spend.

Furthermore, as wages rose, the opportunity cost of time spent away from work also grew. This made more free time less economically appealing. In a consumption-saturated society, time spent neither producing nor consuming goods increasingly appeared as wasted time.

Interest in slower, cheaper activities – reading a book, meeting a friend to catch up over coffee – started to seem less important than buying a pickup truck or spending an hour at the casino, pursuits that demand disposable income.

Forced labor

It’s still important to consider whether there’s even any choice to be made.

Almost everyone who works 40 hours a week or more does so because they have to. There are bills to pay, health insurance coverage to maintain and money to squirrel away for retirement. Some jobs are more precarious than others, and many workers even forgo earned vacation time for fear of losing out on promotions.

This hardly makes for a free choice.

But the 40-hour week isn’t the result of a personal calculation of costs and benefits. Rather, it’s the result of a hard-fought political battle that culminated in the Fair Labor Standards Act of 1938, which established the standard 40-hour workweek, along with a minimum wage.

Pressed by a labor movement that was far more powerful than today’s, the government implemented a range of progressive economic policies during the 1930s to help the nation emerge from the Great Depression.

Many government officials viewed setting a standard workweek as a way to curtail exploitation and unfair competition among employers, who would otherwise be motivated to push their employees to work for as long as possible. It was an emergency measure, not a choice of more time over more personal income. Nor was it a step toward the progressive reduction of hours worked, as Keynes had envisioned.

In fact, it was hardly a radical measure.

Labor leaders had initially proposed a 30-hour week, which government officials resoundingly rejected. Even New Deal liberals saw a shortening of working hours as a potential threat to economic growth.

So the 40-hour week ended up as the compromise, and the standard hasn’t been updated since.

Woolworth’s employees strike for a 40-hour workweek in 1937. Underwood Archives/Getty Images

For most Americans, this was an acceptable trade-off. They might be working long hours, but they could afford television sets, cars and homes in the suburbs. Many families could live on the wages of the full-time work of the father, making the 40-hour week seem reasonable, since the mother had time to care for the family and home.

But this consensus has long since been undermined. Since the 1970s, inflation-adjusted wages haven’t risen with economic growth. In many households that include married or partnered couples, a single wage earner has been replaced by two earners, both of whom find themselves working at least 40 hours per week.

It’s almost as if the 40-hour week has been replaced by an 80-hour week – at least in terms of hours worked per household.

Who has time to raise kids? Who can afford them? It’s no wonder the birth rate has declined.

Separating economic growth from well-being

For decades, the amount of work we do has been talked about as “just the way things are” – an inevitability, almost. It doesn’t seem possible for society to take a different tack and, like flipping a switch, work less.

To me, this resignation points to a need to reconsider the social contracts of the past. Most Americans will not abandon their work ethic and their insistence that most people work. Fair enough.

Many people prefer working over having vast stores of free time, and that’s OK. And there’s still immense value in work that doesn’t produce a paycheck – caregiving and volunteering, for example.

But reducing the standard workweek, perhaps by transitioning to a four-day week, could ease stress for overworked families.

These changes require political action, not just individuals making the personal choice to arrive at a better work-life balance. And yet a national reduction in the standard workweek seems almost impossible. Congress can’t even pass legislation for paid family leave or guaranteed vacation time.

It doesn’t help that elected leaders continue to insist that well-being be measured mostly by economic growth, or that the U.S. media breathlessly reports quarterly economic growth data, with increases deemed “good” and decreases deemed “bad.”

Why shouldn’t free time and its benefits be included in the equation? Why aren’t figures on the social costs of unlimited growth publicized? Does it even matter that the Dow Jones Industrial Average has doubled in less than a decade when economic security is so fragile and so many people are overstressed?

The idea that stratospheric increases in productivity can allow for more time for life is not simply a romantic or sentimental idea. Keynes viewed it as entirely reasonable.

Opportunities like the one that led to the 40-hour workweek in the 1930s rarely appear. But some sort of paradigm shift is urgently needed.

Something has to give.

Gary Cross, Distinguished Professor of Modern History, Penn State

This article is republished from The Conversation under a Creative Commons license.