Max Frankel in February 1968, covering President Lyndon B. Johnson’s trip to Vietnam. Tom Johnson/The New York Times
The Lives They Lived
b. 1930

Max Frankel

As a child, Max Frankel was an outsider. As an editor, he couldn’t resist a good human story.

In April 1943, the Hebrew Tabernacle Congregation in Manhattan’s Washington Heights held a Passover Seder for about 100 refugee children from Nazi-occupied Europe. When, during the retelling of the Jews’ Exodus from slavery in Egypt, the moment came for the blessing over the wine, the honor was given to a 13-year-old boy named Max Frankel, according to an account published at the time in The Yiddish Daily Forward.

“When Max stood up with the wine glass in his hand and started chanting in the traditional melody, one didn’t need to be religious to be transported,” the author wrote, as translated by The Forward in 2018. “He, too, had a past filled with migration and persecution. The role he acted in the Exodus story was not foreign to him.”

Three years earlier, Max’s determined and daring mother had managed to arrange passage from Germany to New York for the two of them, at a moment when the full horror of the Holocaust was unfolding and America was hostile to Jews seeking sanctuary. In the chaos of their collapsing world, his father was imprisoned by the Russians in Siberia and separated from the family for seven years. 

Yet whenever Max would dwell on “the chain of absurd circumstance” that allowed them to escape, his mother would dismissively respond, “Everybody who got out has got a story to tell,” as he would later write in his memoir, “The Times of My Life and My Life With The Times.” “Mom would scoff, speak a word for God or Fate, and tell me to get on with life.”

He obeyed his mother. He added English to his string of childhood languages, became a young self-described American “super patriot,” was admitted to Manhattan’s High School of Music and Art as a painter but fell for the craft of journalism, which he pursued at Columbia University and then The New York Times, where he began working nights as a college student and decades later became its top editor. Still, throughout his 94 years, Frankel told an interviewer in 2011, “a piece of me never stopped being a refugee.”

The struggle and serendipity that led to his rescue on what he called “the ark America” left deep memories, as did the awkwardness of being a foreign boy who stood out in every way — from the length of his pants to his clunky lunch box to his toddler’s English. His personal exodus, those years as a stranger in a new land and his embrace of the promise and principles of his adopted country, were threads that wove throughout his long life.

“It gave him a certain outside-looking-in perspective,” says his daughter, Margot.

The Times sent him to Vienna in 1956, where he witnessed Hungarians fleeing Soviet aggression during the short-lived Hungarian uprising. This was only 16 years after his escape from Germany, and the memories of his own border crossings were still raw. “I instantly recalled the helplessness of a stateless refugee,” he wrote in his memoir. The following year he became a correspondent in Moscow for the paper, just a decade after his own father had fled Siberia. “I despised the Russians and sided passionately with their victims,” he wrote. “I had no patience for any government’s lies and propaganda, but I wrote unashamedly of and for ‘the free world,’ the half that did not censor its news or surround its populations with barbed wire.”

By chance, when the “free world” tried to censor The Times’s publication of the Pentagon Papers in 1971, alleging that it risked disclosing dangerous classified information, Frankel was running the newspaper’s Washington bureau. In an affidavit in The Times’s legal case, he detailed how the government often used the classification system not to protect live-or-die information but rather to strategically create and reveal “secrets” as a kind of currency in dealings with the public, the press, allies and adversaries.

David Rudenstine, a professor and former dean at Cardozo Law School, who recruited Frankel to teach classes in press freedom after his retirement, says that “given his history, Max wrapped himself around the First Amendment in a way that was deep and emotional.”

Alongside his principles, Frankel just couldn’t resist a good human story. In particular, he loved obituaries. “He just liked the storytelling of obits,” his son David says. His interests apparently rubbed off on David, who in the mid-1990s explored the creation of a magazine called Obit, devoted to tales of the departed. It was a short-lived idea, given the slim odds that advertisers would support issue after issue about dead people. But the son’s idea rubbed off on the father, who encouraged this magazine to publish an end-of-year issue on notable deaths.

On Jan. 1, 1995, Lives Well Lived appeared in The New York Times Magazine, with send-offs for such disparate luminaries as Ralph Ellison and Roy J. Plunkett, who discovered Teflon. For the second issue, the name was tweaked to The Lives They Lived, perhaps to accommodate fascinating scoundrels, and it was declared the start of a tradition. So it has remained for 30 years, now outliving the refugee who loved a good story.

Matthew Purdy is the editor at large for The Times. He has recently written about the vagueness of the Trump administration’s rulemaking and the enduring legacy of George Orwell.

Jane Goodall in the jungle surrounding her home and research station in Tanzania, 2014. Michael Christopher Brown
b. 1934

Jane Goodall

She approached chimps as she did people: on their own terms.

In the beginning, Jane Goodall named the chimpanzees. This was controversial. It was 1960, back when scientists weren’t supposed to dignify the animals they studied with names — they were supposed to use numbers, to remain objective, to avoid anthropomorphism at all costs. (This is still largely the case.) But Goodall wasn’t that kind of scientist. Technically, she wasn’t a scientist at all. She was 26, and her only degree was a certificate from secretarial school. In fact, it was partly this inexperience that inspired Goodall’s employer, the paleoanthropologist Louis Leakey, to send her off to study the wild chimps of Tanzania. Fresh eyes, and a novel approach, might finally penetrate the mysterious world of humanity’s closest nonhuman relatives.

This turned out to be true. Day after day, from dawn to dusk, Goodall stalked the forests, often barefoot, trying to establish rapport with the chimps. She waited, quietly filling notebooks, hoping that the creatures would come to accept this “strange white ape.” For months, they mostly avoided her. Finally, mutual curiosity won out. The chimps came closer and closer — eventually all the way into Goodall’s camp. She fed them bananas, groomed them and was groomed in return, celebrated their births, cried over their deaths. And as she got to know them, she gave them names: David Greybeard, Mr. McGregor, Lord Dracula, Flo, Melissa. As the decades passed, as Goodall established herself as the world’s leading chimpanzee expert, the roster of names grew: Beethoven, Freud, Groucho, Midge and on and on.

It occurred to me, reading the remembrances of her life, that “Jane Goodall” itself sounds like a made-up name. It’s like “John Niceguy” — the name of a saint, or of a cartoon character fighting to save the world. It sounds like a name a chimpanzee might make up to describe the first human who really took them seriously. 

But it actually was her name — or at least part of it. She was born Valerie Jane Morris-Goodall, to a middle-class family in England. As a child, she was outdoorsy, adventurous and obsessed with animals. She once coaxed a bird to nest in her bookshelves. Her mother encouraged these enthusiasms; her father was mostly absent. At night, Goodall often dreamed that she was a man, doing things that only men were allowed to do — which often meant traveling to Africa, like the heroes in her favorite storybooks.

As soon as she could, after saving up money from waitressing and secretarial jobs, Goodall got herself there. And once she made contact with the chimpanzees, her path was set. For the rest of her life, 65 years, she documented the chimps and advocated on their behalf. In middle age, when Goodall could no longer live in the bush, she traveled the world relentlessly, some 300 days a year, as an educator and environmental activist. When she died, at age 91, she was in the middle of a speaking tour.

Just as Jane Goodall was her real name but not quite all of it, “Jane Goodall” was a real person but also an invention — a media sensation created largely by the National Geographic Society, which funded her research. One curiosity of Goodall’s career is that her earthshaking scientific discoveries — that chimpanzees eat meat and make and use tools — all came from her first stint in Tanzania, before the cameras arrived. The wider world still had no idea what Goodall looked like, and she would have been happy to keep it that way. But National Geographic sensed a visual gold mine: this photogenic blonde woman in khaki shorts, face to face with Africa’s primordial great apes. They insisted on sending a photographer along with a checklist of poses: Jane looking through binoculars, Jane laughing, Jane cooking, Jane washing her hair in a stream.

Goodall resented this. But she agreed to play along. She understood that, in the same way that a chimp could use a blade of grass to fish termites out of a nest, she could use fame to advance her larger project: studying and protecting the chimps. “Certainly I understand that it is necessary to build up a story around ‘Jane Goodall,’” she conceded in an early letter. The resulting books and articles and photos and films made her an international celebrity. A 1965 documentary, “Miss Goodall and the Wild Chimpanzees,” drew an estimated 25 million viewers on CBS. (The photographer and videographer, the Dutchman Hugo Van Lawick, would become Goodall’s first husband and the father of her only child, a son who grew up partly in the bush, playing inside a cage so the chimpanzees wouldn’t hurt him.)

Goodall eventually got a Ph.D. from Cambridge — another tool, one that would make the scientific community take her more seriously. But she continued to bring her whole self to the work. As a performer, Goodall was sly and charismatic. She delighted human crowds by demonstrating her mastery of chimp language, an elaborate dialect of pant-hoots, food-grunts, waa-barks and SOS screams. (I love this clip of her insulting John Oliver, chimp-style, before eating a banana sideways.)

In Goodall’s books, chimpanzee society is as convoluted as anything in Jane Austen. There is infinite drama: jealousy, alliances, schisms, betrayals. Goodall’s chimps tickle each other, get colds, feast on delicacies (termites, army ants) and tease baboons by shaking wet branches over their heads. They appear to dance in the rain, mourn their dead and sometimes die of grief. She presents them as distinct, reasoning, worthy individuals. “David Greybeard had the most beautiful eyes of them all, large and lustrous, set wide apart,” she writes in her book “Through a Window.” “They somehow expressed his whole personality, his serene self-assurance, his inherent dignity — and, from time to time, his utter determination to get his way.”

But chimps can also be shockingly violent. Once, Goodall wrote, a male named Frodo stomped on her head so hard that it almost broke her neck. She describes males competing in apparent dominance displays: scowling, hair bristling, sprinting around and dragging branches on the ground. She documents chimps killing baby monkeys with a bite to the skull. She once chronicled a coordinated campaign of intrachimp extermination that resembled organized human warfare.

Still, Goodall never lost her admiration and devotion. She and the chimpanzees were a perfect match. They were both in-between figures. Goodall was a human living among nonhumans, a media-savvy scientist, a woman in a misogynistic world. (“Comely Miss Spends Her Time Eyeing Apes,” an early headline read.) And chimpanzees, as Goodall put it, “blurred the line between humans and animals.” Most of our genetic code is nearly 99 percent identical to chimps’; they are more closely related to us than they are to gorillas. Chimps can sort objects, learn sign language and recognize their own reflection. To a species as narcissistic as humans, this presents a serious identity crisis. Chimpanzees are a fun-house mirror, both us and not-us — familiar yet alien, fascinating but terrifying.

One of Goodall’s greatest virtues was that she refused to approach this relationship from a position of superiority or fear. She engaged the chimpanzees, as much as humanly possible, on their own terms. “For a long time I never liked to look a chimpanzee straight in the eye,” she wrote. “I assumed that, as is the case with most primates, this would be interpreted as a threat or at least as a breach of good manners. Not so. As long as one looks with gentleness, without arrogance, a chimpanzee will understand, and may even return the look.”

Sam Anderson is a staff writer for the magazine. He last wrote about Dwayne Johnson, the actor also known as the Rock.

Marcia Marcus in her New York City loft in 1966. Marcia Marcus Media Inc/ARS-NY
b. 1928

Marcia Marcus

The painter never wavered, whether she was being celebrated or overlooked.

It was 1956 and Marcia Marcus was 28, single and sure of herself. She pushed open the heavy door of a warehouse loft with no heat but plenty of room to paint. She looked at the tin ceilings and the hardwood floors and thought: I will build my life here.

And she did. For the next seven decades, Marcus lived in Manhattan, and she painted. She painted as she raised her two daughters, and she painted as her husband taught gym class for a living. She painted when collectors and galleries sought her out, and she painted when no one cared.

“Why can’t you be like other moms?” her daughters lamented. 

“It’s not so much that she worked at home,” her daughter Kate Prendergast says. “It’s that she raised her family in her studio.”

Marcus often painted herself, hair wild, eyes unflinching, but she also painted portraits of downtown friends and artists such as Jack Kerouac, LeRoi Jones and Lucas Samaras, many of whom achieved the recognition she craved. She discovered her style — dramatic portraiture, bold colors, inventive wardrobes — when she was young, and she rarely wavered.

She found success early, with a show in 1957 and inclusion in some high-profile group exhibits in the 1960s. She won a Fulbright award in 1962 and spent the spring traveling and soaking up art across Europe. A letter she wrote about that trip captured two themes that coursed through her life: unblinking self-confidence and burning umbrage.

The letter was addressed to the Internal Revenue Service, protesting its decision to reject her travel expenses as legitimate business spending. Her terse frustration at I.R.S. bean counters comes through, but so does a resentment of the oblivious, pinch-lipped world that was refusing to recognize her.

“My prime business asset is myself,” she wrote. “My trip, therefore, was a way of putting profits back into the business, by improving my ability to do my work.”

With exquisite disdain, she pointed out that she was not an “average tourist” but belonged to a lineage of artists from Michelangelo to Whistler who sought inspiration across Europe. Implicit in the art history lesson was her certainty that the great artists were her peers.

It’s unclear whether her arguments swayed the taxman, but her family likes to think so.

Marcus showed her work regularly into the 1980s, but by 1987 her star had begun to dim. A gallery in Bridgehampton, N.Y., showed her work that fall; it would be her last significant show for nearly 30 years.

She got by on her husband’s paycheck and by teaching art classes. She worked the floor at the Gap near the World Trade Center, folding T-shirts and helping women match skirts and turtlenecks.

All the while, she kept painting. She found a pediatrician and a dentist who treated her daughters in exchange for artworks. The family spent summers in Provincetown, Mass., where they scavenged castaway furniture and ate free fish caught by a fisherman pal. They knew where vegetables grew wild, and they shared tomato-and-peanut-butter sandwiches for lunch.

She depicted herself in her canvases because she was always at hand, but also as an act of defiance in a sexist world. “She’s basically saying that I’m doing this to show that women were there, here and everywhere in different historical settings,” her daughter Kate says.

She was so sure of her gifts that she regularly sent off boxes of her papers to the Archives of American Art. She saw herself more clearly as part of the cultural narrative than the culture did. “Rauschenberg and de Kooning and people who had done better financially than she had — she didn’t think of herself as apart from them,” her other daughter, Jane Barrell Yadav, says.

That’s not to say that she took the indignities in stride. “She could be very, very angry,” Jane recalls.

“Self-Portrait in Fur Jacket” (1959) Marcia Marcus Media Inc/ARS-NY

When Marcus was in her 80s, a curator putting together an exhibition of downtown New York artists came across her name, which the curator did not recognize. She sought out Marcus’s work and wound up including a striking self-portrait — the artist poised and stylish in a fur jacket — in a 2017 show at the Grey Art Gallery in Manhattan.

By the time the show opened, Marcus was nearly 89. So much time had passed since her last major show that she could have been the grandmother of her younger self gazing back from the wall. In his review for The Times, Holland Cotter singled out her painting as a “way-ahead-of-its-time self-portrait.”

Within the year, another New York gallery staged a solo show of Marcus’s work. Other galleries and critics took notice.

Finally.

Since the last time she was onstage, her daughters had become mothers and her peers had become luminaries. Galleries had opened and shuttered, trends had exploded and disappeared. And without any affirmation, Marcus had been painting all along.

She enjoyed the late-in-life recognition, but the fanfare didn’t change her conception of herself. The world had simply caught on, and not too late. Kate says, “It was like: This is who I am. I’m really good at what I do. I’m great.”

Sam Dolnick is a deputy managing editor for The Times. He wrote about the basketball player and executive Jerry West in last year’s The Lives They Lived issue.

George Foreman squaring off with Muhammad Ali in the Rumble in the Jungle, in Kinshasa, Zaire, 1974. Abbas/Magnum Photos
b. 1949

George Foreman

Before he became a lovable pitchman, he was just plain mean.

Before the hamburger grills, and the ABC sitcom (“George”) that lasted eight episodes, and the boisterous appearances on “Late Night With David Letterman,” and the monthly “30-Second Sermons” for Esquire magazine — before it became impossible to picture him without that big teddy-bear smile — George Foreman was, in his own words, “the kind of man it was easy to root against.” To use more of his own words, he “concocted reasons for cold-cocking someone,” and he couldn’t stop, “because it was the one thing I did well.”

According to the legend of the boxing match billed as the Rumble in the Jungle — Foreman’s epic heavyweight title defense against Muhammad Ali in Kinshasa, Zaire, now the Democratic Republic of Congo, which began at about 4 a.m. local time on Oct. 30, 1974, so that it could air in prime time back in America — Foreman is the villain of the story. He is terse, glowering, a pitiless 25-year-old punching machine with “Samson’s arms” and “nuclearology in his fists,” as Norman Mailer put it. The two fighters would split a record-setting $10.45 million purse, but Mailer, who would later write a book about the fight, feared that the aging Ali, then 32, wouldn’t survive to collect any of that money.

But nine days before the bout, Foreman was hit above his right eye during a sparring session, opening up a cut that forced the fight to be delayed six weeks. He was never able to get comfortable in Zaire — he hated the food, the bugs, the stares wherever he went — and he rarely left his hotel except to train. Ali jogged through villages with local kids and charmed the continent. On the night of the fight, he entered the ring in the Stade Tata-Raphaël to a chorus of 60,000 spectators chanting: “Ali, boma ye! Ali, boma ye!” (“Ali, kill him.”) 

Back home, in the parts of Black America where the two fighters were raised — Foreman in Houston, Ali in Louisville — Ali was a civil rights hero who had forfeited his heavyweight title over his refusal to serve in Vietnam, and then fought his criminal conviction all the way to the Supreme Court. Foreman had won a gold medal in boxing for the United States at the 1968 Summer Olympics in Mexico City, but even some of his kin were mortified by his celebration, prancing around the ring with a tiny American flag at the same Games where John Carlos and Tommie Smith raised their fists.

Foreman’s gradual transfiguration, from global villain to beloved pitchman, is one of the great reverse heel turns in all of sports, and it began that morning in Kinshasa. In the fourth round, Foreman wobbled Ali and moved in for the kill, only to hesitate because he spotted one of his closest friends in the crowd — “a man I’d considered family” — booing and rooting against him. “That sight is, in fact, the image I recall most vividly from the fight,” Foreman later wrote in his 1995 memoir, “By George.” “In a state of shock, I couldn’t deliver the punch that probably would have ended the fight.”

Four rounds later, Ali ended it instead. Foreman had thrown so many punches he could barely lift his arms. He had exhausted all his nuclearology. It was his first defeat in 41 fights.

Losing his title humiliated him. He fled from city to city — Paris for a week, then Los Angeles, then Houston, then back to Los Angeles — holing up in hotels, sleeping with different women nightly, embarrassed to show his face in public. Eventually he retreated to his ranch a few hours north of Houston with little but his German shepherds and pet lion to keep him company. He grew consumed with regaining the title, even putting up a photo on his wall of Ali knocking him to the mat in Zaire.

“After I’d lost to Ali,” he wrote, “I’d decided I needed more hate. I’d hit you in the kidneys or on the back of the head.”

Six months later, Foreman began his comeback by knocking out five men in one night at an exhibition described in Sports Illustrated as “border[ing] on vaudeville.” He pulverized an aging Joe Frazier and loomed over him on the mat, drinking in the boos from the crowd. “He needs a shrink and a trainer, in that order,” Howard Cosell later declared.

By March 1977, Foreman and Ali had agreed in principle to a rematch. The purse would be $13 million. Foreman already had a fight scheduled in San Juan, Puerto Rico, against another title contender; all he had to do was win and he would get his shot at revenge. But he was fixated on proving he had the stamina to go the distance in a fight, and once again he pulled the decisive punch, got knocked down in the 12th round and wound up losing in a unanimous decision. The rematch was gone.

In his training room after the fight, Foreman was so consumed by despair and self-loathing that his body crumbled, he felt himself sink into a deep void — “the bottom of the bottom of the bottom” — and on the very brink of death, according to his account, he saw Jesus and was born again.

The boiling rage he had always felt “was no longer in my computer programming,” he wrote. “This was nothing like being hungry and forcing yourself not to eat. The anger just wasn’t there.” Foreman was 28, and he was done with boxing. “To knock a guy out, I needed to psych myself into a state of viciousness,” he went on. “But that George Foreman didn’t exist anymore.”

Within months, he was standing on a street corner in tiny Marshall, Texas, with a portable amplifier and a microphone, preaching the word of God. Then he drove to Shreveport, La., and did the same thing. He had shaved his head and muttonchops and mustache, and he was smiling ear to ear, not scowling, so it took people a minute to recognize him as the former heavyweight champion of the world. In town after town, he would get out of his car somewhere and start preaching, and people would gather and slowly realize to their astonishment who it was.

Word spread, of course. He got invitations to appear on “The 700 Club” and “The PTL Club” with Jim and Tammy Faye Bakker. He baptized hardened criminals in a small tub at San Quentin. Soon he opened up his own church, the Church of the Lord Jesus Christ — nothing special, just an old building that he and his congregants fixed up with the $25,000 he got from selling a tractor. “It’s not Madison Square Garden, but I like it,” he told Sports Illustrated.

His sermons were funny, and the butt of his jokes was always George Foreman: his waistline, his hairline, his failures at marriage (four divorces before the fifth union stuck) and fatherhood (12 children in all, including five boys named George and a girl named Georgetta). This was another explicit instruction from God: Shed the fearsome persona and become “a pure entertainer.” His job, he wrote, “was to keep them laughing; never get serious; always have a joke; turn all dark questions into something funny.”

At one point after his rebirth, Foreman was invited to join an evangelical mission to Africa, and in Kinshasa, just steps from the spot where Ali had knocked him out in 1974, he delivered a sermon in front of 60,000 people. “This time,” he wrote, “they cheered.”

Devin Gordon is a contributing writer for the magazine. He wrote about Willie Mays in last year’s The Lives They Lived issue.

Anna Ornstein in Hungary in 1942. From the Ornstein family
b. 1927

Anna Ornstein

An Auschwitz survivor, she pushed psychoanalysts to think about the Holocaust in a new way.

Dr. Anna Ornstein survived Auschwitz-Birkenau at age 17. There, her father and grandmother were killed in the gas chambers. Her two brothers were pressed into labor for the Axis armies and never returned. Later, as a psychoanalyst, she published academic writing that sometimes took a personal turn and held a muted yet unmistakable rage. That anger was not focused on Hitler or the memory of especially cruel SS guards; it was aimed at a prevalent psychoanalytic perspective that she felt failed to see, let alone learn from, the experience of Holocaust survivors.

Ornstein grew up in a Hungarian farming town, where, for the tiny Jewish minority, antisemitism was severe but not insurmountable. Then came German occupation and the packed cattle cars that hauled Jews toward almost inevitable extermination. At Auschwitz-Birkenau, living on a once-daily chunk of bread and “some kind of cooked grass,” as Ornstein would recall in a short memoir, she and her mother fended for each other. Ornstein “became my mother’s eyes” after her mother’s glasses were confiscated. She masked her mother’s weakness so she wouldn’t be marked for death. Her mother, for her part, persuaded Ornstein that it was only rumor when prisoners spoke of the distinctive smell in the air as coming from burning bodies.

After stints in two more Nazi camps and, finally, liberation, mother and daughter made their way back to Hungary, where Ornstein’s mother took charge of an orphanage for 40 Jewish children who lost their families. She insisted that every child be bathed in attention by any available adult — that “the cook, the gardener, the maid, whoever else was around,” Ornstein wrote, should stay at the bedside of any child who struggled to fall asleep. The healing of the children was essential to her mother’s own. Ornstein herself found healing in her marriage to a young man, Paul Ornstein, whom she had adored since meeting him when she was 14 and he 17. Paul had escaped from forced labor with the German Army and, later, Soviet Army detention. The children from the orphanage gathered in a choir to sing at the wedding. 

If the orphanage and the wedding sound like sentimental set pieces in a Holocaust movie, they would also become crucial to Ornstein’s vision as a clinician and an academic. Ornstein followed Paul to medical school and, after they immigrated to the United States, into psychoanalytic training in the 1960s. But Ornstein felt that classical psychoanalysis failed to reckon with the individual and complex experiences of what she and other Holocaust survivors endured — and how, in a great many cases, they overcame what they went through.

Analysts in training undergo their own analyses. One day, Ornstein took a written account of her experiences to her analyst, who was also the chairman of the psychiatry department at the University of Cincinnati, where she had studied. Two of Ornstein’s grown children, Rafael and Sharone Ornstein, both psychoanalysts themselves, told me about the incident. They didn’t know exactly what their mother had written, but they guessed that she made an urgent effort to have her specific story — and her capacity to recover — understood.

The next day, the chairman’s secretary handed the pages back, saying only, “This is yours.” Ornstein reiterated that the document was meant for her analyst. The secretary replied, “He said, ‘This is yours.’” “From his responses,” Ornstein recounted in a 2014 essay, “I learned early on that it was preferable for me not to share my Holocaust experiences with him.”

Psychoanalysis was built on Freud’s ideas about the unconscious sexual drives of childhood and the guilt, fear, repression and other drives that follow from early erotic yearnings. In classic psychoanalytic treatment, a patient’s troubles were seen as almost purely internal. Breakthroughs depended on unburying conflicts and torment. The Holocaust, so immensely and devastatingly external, didn’t fit readily within this paradigm. It posed a tremendous challenge for mainstream psychoanalytic theory. If the field didn’t always look away from the Holocaust, as Ornstein’s analyst seems to have done, it often did something that was, in Ornstein’s mind, worse. It reduced survivors to extreme victimhood, presuming that they could be summarily categorized as broken beings.

But Ornstein and her husband found an emerging alternative. In the late ’60s and early ’70s, a psychoanalyst named Heinz Kohut was beginning to focus on the fundamental need for human connection. “The role of the other in the experience of the self — we take this as a given now,” Sharone told me. “But back in the day, the whole idea was the isolated mind with drives and instincts.” Ornstein and her husband became part of a small circle who helped Kohut develop his insights. They emphasized the necessity of human bonds.

This spoke to Ornstein’s experiences. She saw herself not as broken but as resilient — because of the bond with her mother, because of bonds with other concentration-camp prisoners, because of fortifying familial bonds built into her life before the Holocaust, because of the bond between her and Paul. She recognized similar strength and foundations in many survivors.

In her 2014 essay, she lamented, in tones of open anger, that “psychiatrists and psychoanalysts had missed a unique opportunity to research a most remarkable phenomenon in modern history” — they failed to ask “what made psychological survival in concentration camps” possible. “Instead, the professionals restricted their inquiry to the study of the pathological consequences of this unparalleled historical event and then proceeded to theorize about the transgenerational transmission only” — the italics are hers — “of the traumatic aspects of the survivors’ experiences.” Her field had reduced and failed to examine not only her experiences but also the experiences of her children.

All three of Ornstein’s children — Miriam, a child psychiatrist, as well as Rafael and Sharone — brought up in separate conversations how much their mother loved to dance. At weddings and bar and bat mitzvahs, she threw herself into the hora, the traditional Jewish dance usually done in a circle of held hands, with wild communal hopping and kicking. Her dancing was, in her children’s words, fueled by “pleasure at being with people” and filled “with a sense of triumph.”

Rafael spoke, too, of the defiance that accompanied his mother’s capacity for joy. “It was, Screw you, look at me, look at my kids,” he said. “She was militant about resilience.”

Daniel Bergner is a contributing writer for the magazine and the author of “The Mind and the Moon: My Brother’s Story, the Science of Our Brains and the Search for Our Psyches.”

Diane Keaton in 1975. Norman Seeff
b. 1946

Diane Keaton

She worried that she was inadequate at love, until she became a mother.

When Diane Keaton was nearly 50 years old, she found herself on a bumpy flight home from Cannes, terrified. She hated flying, and turbulence sent her into a panic. The film she was promoting had been a modest critical success — a relief after many years of middling reviews — but as she surveyed her life, she felt lonely. She missed her grouchy dog. How could it be that this was the only creature she longed for? And where was the loving partner who could hold her hand in this moment of fear?

When she finally got home, she steadied herself by writing in her journal — a longstanding habit she had learned from her mother, who meticulously documented her thoughts for 40 years. Keaton wondered how she had ended up here, and whether she needed to take more risks in her life, “especially those revolving around intimacy,” she wrote in her memoir “Then Again.” “I know I have to make a decision that will or will not lead to the experience of a different kind of love, a love of less expectations on the receiving end,” she wrote, speaking of her own disappointments. It was then that she decided to adopt a baby.

Becoming a mother at age 50 to her daughter, Dexter, and again at 55 when she adopted her son, Duke, quelled her unease. Motherhood seemed to come easily to her — she was embracing, fun, totally engaged. She saved nearly everything her children created — drawings, cards, scraps of projects and also the letters she wrote to them every so often (several of which she included in “Then Again”), attempts to cross “the barriers of our 50-year age gap, sort of explanations and apologies pertaining to who I am.” 

She was aware of the unconventionality of her choice to start a family as a single, older woman and was open with her kids about the emotional challenges that adoption might involve. “Being adopted is to start life with loss,” she wrote in a letter to Duke. “It’s not necessarily a bad thing. Loss helps us learn how to handle goodbyes.” Starting out knowing something others will have to learn, she went on, has its advantages. “You will already have the tools to make you more open to the many varieties of love.” This introspection and emotional transparency were typical of Keaton; they were the wellspring of her acting as well as her mothering.

Keaton always credited her first New York acting teacher, Sandy Meisner, with her unpretentious, lucid style. When she arrived in the city at 19, the beloved eldest daughter in a family of four siblings who were raised in California, she desperately wanted to succeed, but she worried she wasn’t beautiful enough. She hadn’t yet taken possession of her expressive face or her offhand elegance. She soon fell in with a group of eccentric artists and met Woody Allen, whom she dated for a few years. He cast her in her breakout role, at age 31, as Annie Hall. The character is based on Keaton, and her indelible wardrobe — trench coats, fetching ties and bowler hats — was pulled from Keaton’s own closet. Annie revealed to the world Keaton’s essence: a conspiring warmth overlaid with a jangling, neurotic charm.

In short order, Keaton became a major star, playing a variety of complex roles — Theresa Dunn in “Looking for Mr. Goodbar” and Louise Bryant in “Reds” — and having love affairs with Warren Beatty and Al Pacino. She admired these men all her life, writing sharp portraits of them in her memoirs, but she seemed to blame herself for failing to settle into a long-term partnership. This is a leitmotif in Keaton’s writing: a sense that she was inept at intimacy and needed to fix herself, until the moment she became a mother.

Despite her huge early success, there was a stretch of years — in the late ’80s and early ’90s — when many of the films she was in did poorly; she described to a friend having to essentially beg for roles. But her insecurities lived alongside a sturdy trust in her own artistic impulses. She was a collector — of California Monterey furniture, Bauer pottery, houses that she would exquisitely renovate and sell. She edited books of photographs (California Spanish architecture, tabloid photos and old Hollywood promotional images, to name a few) and kept reams of journals, spending decades in psychoanalysis laying herself bare. When her mother got sick with Alzheimer’s disease, Keaton documented her decline, even lovingly transcribing confused voice mail messages.

After her mother died, Keaton wrote “Then Again,” the richest of her three memoirs, in which she layered her mother’s account of her life in her journals (anguished, determined, beautifully composed) with her own. It’s a relentlessly self-inspecting double portrait, in which Keaton reckons with having lived the varied, freewheeling life her mother never got to. As her mother’s mind was fading, Keaton was immersed in the clamor of motherhood — driving Dexter to swim practice, roughhousing with Duke. Her deepest desire, she wrote, was to give her children what her mother had given her, the freedom to become themselves. “I believe people evolve into who they want to be,” she wrote to Dexter in a letter. “In a way you create who you are.”

Amid all the change in her life, Keaton’s career revived. She had transformed herself from the elusive, uncertain woman of her early films into a stately one whose comedy was bolstered by worldliness. In “Father of the Bride,” “The First Wives Club” and — her favorite film — “Something’s Gotta Give,” she plays women of a certain age, firmly planted in their lives with wisdom and mirth to spare. But she never lost that quiver of sensitivity, her ready access to emotional truth.

In “Something’s Gotta Give,” Keaton plays Erica Barry, a lauded playwright who has forsworn romance. Jack Nicholson is Harry Sanborn, a notorious cad who dates only women under 30. Unexpectedly thrown together in a summer house, they find themselves submitting to a surprising passion. After they first have sex, Harry clams up, getting up from bed and declaring that he prefers to sleep alone. “I’m an old dog,” he tells her. In response, Keaton delivers a performance that in the span of 90 seconds lets us in on a whole history of emotional defense and disarmament. Her face, more radiant than ever, opens and closes like a fan, revealing all the nakedness and yearning that new love can still summon, and then the desolation it can bring, the gathering of courage it requires. Moved by her, Harry eventually comes back to bed. This capacity to ask for and receive love may be this accomplished woman’s biggest accomplishment.

Erica’s self-possession is Keaton’s. She was a woman whose unyielding singularity, her urge to really know herself, gave her access to her deepest feelings — and eventually, to the boundless, giving self she had always wanted to be.

Sasha Weiss is a deputy editor at the magazine.

Clint Hill tried to shield President John F. Kennedy and Jacqueline Kennedy on Nov. 22, 1963. Justin Newman/Associated Press
b. 1932

Clint Hill

The Secret Service agent was haunted by J.F.K.’s death and his failure to prevent it.

For nearly all his life, Clint Hill discussed the events of Nov. 22, 1963 — “the situation in Dallas,” he called it — exactly twice: in 1964, with the Warren Commission, and in 1975, with Mike Wallace of “60 Minutes.”

He did not talk about that day with friends, or medical professionals, or his wife at the time. He did not talk about it with fellow Secret Service agents, to whom he had flashed a desperate thumbs-down from the president’s car. He did not talk about it with Jacqueline Kennedy, whose life he probably saved in Dallas, or with Robert F. Kennedy, whom Hill had informed by phone from Parkland Memorial Hospital, “It’s as bad as it can get.” The attorney general hung up.

By then, Hill’s Friday afternoon — chronicled in due course across feature film, amateur film, good television, bad television, oral history, federal inquiry, pre-internet fever swamp, internet fever swamp, podcast-era fever swamp — was already doomed to shadow his next six decades. 

Hill, then 31, had been assigned to protect the first lady. He was riding through Dealey Plaza on the running board of the follow-up car when he heard an explosive noise from his right. He saw President John F. Kennedy grabbing at his throat and sagging to his left.

Sprinting from his vehicle to the Kennedys’, Hill grasped a trunk handle and yanked himself aboard — the man in the business suit, almost superheroic in the Zapruder footage, racing into history but too late to reverse it.

Mrs. Kennedy, in her pink suit and pillbox hat, had climbed onto the limousine’s rear, reflexively collecting “the material,” Hill said later, that had escaped the president’s head. Hill pushed her back into her seat. If he hadn’t, according to commission testimony, she most likely would have fallen to the roadway, in the motorcade’s path.

Hill threw his body over the president and first lady, gripping the door frame to elevate himself into the line of anything else coming. He could see a hole in the president’s skull.

Roughly six seconds had passed between the first shot and the fatal third shot, Hill noted afterward. He would forever believe he might have been able to save Kennedy with half a second more.

“It was my fault,” he told Wallace 12 years later, sitting next to his wife and fighting tears while clawing at a cigarette.

“No one has ever suggested that for an instant,” Wallace said, accurately.

“If I had reacted just a little bit quicker — and I could have, I guess,” Hill insisted. “And I’ll live with that to my grave.” He looked like a man who would keep his word.

Until his death last February at 93, Hill was consigned to a life spent revisiting the best view of the worst thing, the best chance to have no real chance to save the president.

He had attempted suicide 37 days after Dallas, he shared in 2022, walking fully clothed into the ocean off Palm Beach, Fla., where he was accompanying Mrs. Kennedy and her children. (A police officer rescued him.)

Instead, Hill lived long enough to shudder through future executive assassination attempts, from Gerald Ford to Donald Trump; a reel of fresh and harrowing Kennedy family tragedies; an entwined escalation of political hostilities and tinfoil-hatted plots to rationalize them, for which Nov. 22 remains an American ur-text; a procession of lone-gunman mass shooters who reminded Hill of Lee Harvey Oswald.

He winced at every grassy-knolled yarn about phantom accomplices and magic bullets, affirming the nonmagic of the mainstream Dallas account not because he was an apologist or a lackey but because no one has ever remembered and re-examined more about six seconds of their workday in the history of employment.

Hill on duty with Jacqueline Kennedy in Italy in 1962. Associated Press

Hill left the Secret Service in 1975 but had never wanted to retire from the greatest job he’d known — the one that had turned a former North Dakota orphan into “Code Name: Dazzle,” delivering him to Brazil with Dwight D. Eisenhower and Australia with Lyndon B. Johnson and Greece with Mrs. Kennedy, where her husband had instructed Hill not to let her near Aristotle Onassis. No, Hill was retired, by medical dictate, after failing a physical with what nobody yet called severe PTSD.

He spent much of the next decade self-medicating in his basement. He developed a habit of walking around Washington with his head down, lest anyone recognize him as the agent who couldn’t quite get there.

In 1990, Hill decided to visit Oswald’s sixth-floor perch at the Texas School Book Depository building. He signed in under a pseudonym. He interrogated every angle. He concluded that he’d never had a true chance after all, though he would revert to sporadic self-flagellation in perpetuity.

In his later years, at the urging of Lisa McCubbin Hill, an author who became his second wife, Hill started talking and writing about Dallas — what he saw, what he did, what it did to him — in books, on television and at public events. It was cathartic, he found.

He could talk about other things, too: his efforts to bring women into the Secret Service; his frustration with merry Kennedy conspiracists like Oliver Stone and Donald Trump, who in 2016 pushed a lie tying Ted Cruz’s father to Oswald.

By 2017, when the government released a long-awaited trove of Kennedy documents, Hill was comfortable enough to make the capital rounds again — and identifiable enough to be ambushed by TMZ outside his hotel.

“People always have these different conspiracy theories,” the cameraman reminded him, in an Olympic feat of Kennedy-splaining. Didn’t he ever wonder?

“Never,” Hill said. “I was there.”

Matt Flegenheimer is a correspondent for The Times focusing on in-depth profiles of powerful figures.

Jill Sobule in 1996. Yves Beauvais
b. 1959

Jill Sobule

After her label dropped her, fans showed her she could still make it in music.

“Thank God we finally have a straight female singer-songwriter,” executives from Atlantic Records told Jill Sobule after signing her in 1995. The industry had already seen hits from openly gay artists like Melissa Etheridge, but the executives were apparently hoping for someone with even more mainstream appeal.

Sobule was then 36 years old. She’d had crushes on girls since she was a seventh grader in Catholic school wearing a Batman utility belt. She had her first affair with a woman when she was a 20-year-old busking on the streets of Seville, Spain. She’d worked as a cocktail waitress at the Cubbyhole, a lesbian bar in New York’s West Village. She’d also had relationships with men, but she certainly didn’t consider herself “straight.”

Sobule, a tiny blond performer known for her girlish voice, serious guitar chops and sly storytelling, already had one record come and go; now she was being given a second chance. She didn’t correct the executives, but she soon handed them a song with a giddy chorus and a triumphant confession: “I Kissed a Girl.” Two friends get together — “we had a drink, we had a smoke” — complain about their dopey boyfriends and discover they have chemistry. The executives thought it was a possible hit; it was catchy, just the right balance of sweet and subversive. They were right: In 1995, “I Kissed a Girl” became the first song with an openly gay theme to reach the Top 20 on the Billboard chart. (Thirteen years later, a song with the same title would deliver a breakout hit for Katy Perry, who had been signed by one of the same executives who signed Sobule.) 

In the music video, Sobule is a 1950s-era housewife living in suburbia and cheating on her husband (played by the superhunk Fabio) with a woman. Sobule thought the video was going to end with the two women kissing, but the executives balked. The video ended instead with Sobule’s character pregnant with Fabio’s baby. When the time came to make a video for her next song, another hit titled “Supermodel,” Sobule created a spoof of the movie “Carrie,” in which her character burns down a fashion show. MTV never got behind it.

Sobule’s subsequent album, “Happy Town,” failed to deliver a hit despite rapturous reviews, and the label dropped her. At 39, she began to piece together a life as a touring musician. She had enough hard-core fans, many in high places, to keep her afloat. In 2008, a year before Kickstarter made crowdfunding mainstream, she raised $85,000 by offering fans the opportunity to sponsor her next album. “Some of the levels of sponsorship we created were just like jokes, things I thought nobody would ever actually do,” she told one interviewer. “‘For 10 grand, you will be my personal lord and savior.’ Very funny, right?” The producer Joss Whedon bought it.

Sobule was a kind of demicelebrity, talented enough to be embraced by the famous yet humble enough to keep singing for anyone who would listen. She opened for Neil Young and flew on a private jet to perform at a party alongside Arianna Huffington. But she also played house concerts in small ranches, Sears homes or condos whose owners gathered 20 or so of their friends to pay $40 each. “You’re small but mighty,” she’d tell the crowd, and sing her heart out for them. At almost every concert, people approached her to tell her that “I Kissed a Girl” changed their lives.

For the next three decades, Sobule continued to perform, writing soulful songs in the tradition of John Prine and Leonard Cohen, and light, witty ones like those of her contemporary Lyle Lovett. She sang with charm and, occasionally, ferocity, about antidepressants and eating disorders, nostalgia and heartbreak. She often wrote with humor about the ongoing struggle to make it big (“Almost Great” was the title of one song; “Sold My Soul” was another).

Her lyrics were loose and unpredictable, like Sobule herself, who spent most of her time on the road. She could have afforded to buy a house, but she never so much as signed a rental lease, instead staying for days or weeks at the homes of well-placed friends. If she was a hobo, she was “the fanciest hobo that ever lived,” her friend Bill DeMain said. More than one friend noted that she had fallen asleep doing the crossword puzzle with a fountain pen and ruined the sheets on the guest bed.

The life of a working musician entails a lot of physical labor: lugging gear into venues, standing for hours onstage, driving all day for a gig that might not draw a crowd. By her 60s, Sobule sometimes laughed with friends about how she’d do something else if she had any other marketable skills. “I think about it about once a day,” she wrote in a memoir she was drafting. In April, she posted a rare public confession on Instagram about the toll that performing was taking on her body. “It’s been hard being on tour,” she wrote. “I somehow messed up my back and have sciatica.” She asked her fans for advice, and it came pouring in with love.

On May 1, she was crashing in Minneapolis at the stately home of a lawyer and superfan. She texted back and forth with her best friend, a fellow artist named Marykate O’Neil, comparing notes about movies and their favorite podcast, “A History of Rock Music in 500 Songs.”

The next day, O’Neil received a call: There had been a house fire in the early morning. Sobule didn’t make it out. O’Neil, stricken, saw that she’d missed a text from Sobule, sent around 2 a.m. It was a clip of the Ramsey Lewis Trio recording a smooth jazz version of the 1972 hit “Summer Breeze.” The musicians, resplendent in wide collars and groovy patterns, look blissed out on the melody and the sheer joy of making music. O’Neil watched it dozens of times while she grieved. “I find comfort in thinking that she drifted off to sleep listening to this amazing music,” O’Neil said. “It was very her.”

Susan Dominus is a staff writer at the magazine. Last year she wrote about Bob Newhart for The Lives They Lived issue.

Assata Shakur at a Black Panthers political-education class in Harlem, N.Y., in 1970. Stephen Shames
b. 1947

Assata Shakur

Freedom came with shattering costs for her and her family.

The United States government called her one of the world’s most-wanted terrorists. Assata Shakur called herself a 20th-century escaped slave.

The runaway-slave narrative proved a powerful and inspirational metaphor. Drawing on historical memory, Shakur placed herself in the pantheon of Black freedom fighters from Nat Turner to Harriet Tubman who, by any means necessary, took their liberation into their own hands. Shakur was lionized in rap songs and taught in college classes, and her likeness could be found in classrooms and community centers in Black neighborhoods across the nation.

But the lore of Assata Shakur, as lore often does, obscured more complicated truths. Like many of those who ran before her, Shakur claimed her freedom only at a devastating cost: It meant relinquishing the ability to raise her only child; it meant she could never again return home, not to bury her mother, not to see her own grandchildren, not to be buried herself. 

Born JoAnne Deborah Byron in 1947 into a family of strivers in Queens, she split her time between her mother’s home in New York and her maternal grandparents’ in Wilmington, N.C. (She changed her birth name in 1971, rejecting it as a slave name.)

Her grandparents in the segregated South imbued Shakur with an unshakable pride and dignity in being Black. In her 1987 autobiography, “Assata,” Shakur describes being forbidden from acting subservient around white people: Hold your head up high, look white people in the eye, “don’t you respect nobody that don’t respect you.”

Coming of age during the throes of the civil rights movement, while witnessing the Northern version of segregation, poverty and police brutality that seemed impervious to it, radicalized her.

She joined the Black Panther Party just as it, and other Black movements, were being decimated by the often illegal tactics of the F.B.I.’s secret spy program, COINTELPRO. Facing constant surveillance as she watched the party’s leadership imprisoned, discredited and assassinated, Shakur came to believe in the necessity of a covert, armed revolution.

She joined the Black Liberation Army, a loosely confederated antiracist and anticapitalist underground guerrilla movement. Its members were accused of bombings, robberies and murdering police officers. By the early ’70s, Shakur had been indicted 10 times, but only one indictment resulted in a conviction. In 1977, an all-white jury found her guilty of murdering a New Jersey state trooper who died in a shootout after a car that Shakur and her colleagues were riding in was stopped by the police. Officers later claimed Shakur fired the first shot. Shakur, who was shot twice, said her hands were in the air and she didn’t shoot anyone.

While Shakur was incarcerated pending her murder trial, she was tried for robbing a bank in the Bronx, along with Kamau Sadiki. The pair were removed from the courtroom after disrupting the proceedings and spent the remainder of the trial locked together in a holding cell, where Shakur fell in love and became pregnant. The woman who had vowed to never bring a child into the world decided that “if a child comes from that union, I’m going to rejoice,” she wrote in her autobiography. “Because our children are our futures, and I believe in the future and in the strength and rightness of our struggle.”

Shakur gave birth to a girl she named Kakuya in a hospital surrounded by police officers. While she maintained her innocence, Shakur was sentenced to life plus 33 years and surrendered Kakuya to her mother.

In 1979, when her daughter was 5, Shakur helped plot her own daring escape from prison, and disappeared. In the years after, every time the doorbell rang, Kakuya’s heart skipped a beat, thinking her mother would be standing there.

But as time passed without a word, Kakuya hardened herself, coming to believe that her mother must be dead. Until one day, five years after what she now calls her mother’s liberation, Kakuya found herself sitting in her aunt’s law office, phone pressed to her ear, talking to her mom. “It was surreal,” Kakuya told me from her Chicago home. “When I heard her voice, I realized I didn’t even remember what she looked like.”

Shakur had been hidden in the United States for several years by a sort of Underground Railroad before being smuggled into Cuba and granted asylum as a political prisoner. She sent for her daughter to come live with her. But when Kakuya got there, she remembers not wanting to hold her mother’s hand, not trusting that she wouldn’t disappear again, not understanding why she had chosen to have a child she knew she could not raise.

“We had to really work through my grief and her grief,” Kakuya said. “There was a part of me that was angry and a part of me that always, you know, wanted to be with my mother.”

Shakur met her daughter’s resistance with a love both fierce and patient. “She reminded me that for us there was never an idea that we were born free,” Kakuya said. “It was very important for me to feel her love and to understand that her struggle was for me and for all children.”

When she turned 15, Kakuya decided to return to her grandmother and her life in America, assuming she would always be able to visit her mom. And for a while, she could. Protected by her asylum status, Shakur lived openly in Cuba. She worked as a translator, jogged daily, read voraciously and continued to write and speak out against oppression.

But in 2005, more than two decades after her escape, the F.B.I. classified Shakur as a domestic terrorist, and in 2013 placed her on its list of most-wanted terrorists, the first woman to earn that designation.

In an open letter, Shakur once posed the question: “Why, I wonder, do I warrant such attention? What do I represent that is such a threat?”

Angela Davis, the activist who was wrongly imprisoned during that same tumultuous period, told me women were the backbone of Black radical movements and “the government probably recognized more than even our own people did the power of Black women.” In relentlessly targeting Shakur, she said, “it’s my opinion that the government was attempting to deter Black women from joining the liberation struggle.”

With a $2 million bounty for her capture, Shakur was forced back into hiding, and Kakuya stopped visiting for fear of revealing her location to the F.B.I. Kakuya never saw her mother again. It haunts her.

“Most of my life has been defined by this history of trying to be with my mother,” she said, “and always holding onto the hope that one day I would be able to be with my mother again.”

Liberation came with unbearable costs. But Shakur, who saw herself as an escaped slave, died free.

Nikole Hannah-Jones is a domestic correspondent for The New York Times Magazine covering racial injustice and civil rights.

Derek Humphry in his office in Santa Monica, Calif., circa 1991. From the Humphry family
b. 1930

Derek Humphry

He pioneered the Right to Die movement but thought it didn’t go far enough.

The book was slim and to the point. There was a chapter on how to kill yourself with pills and a plastic bag, and another on death by starvation. Also sections on electrocution, hanging and poisonous plants. The author promised, in an interview, straightforward “instructions for a perfect death, with no mess, no autopsy, no post-mortem.” And he urged his readers to be cleareyed about what was to come. “This is the scenario: You are terminally ill, all medical treatments acceptable to you have been exhausted,” the book read. “The dilemma is awesome. But it has to be faced. Should you battle on, take the pain, endure the indignity and await the inevitable end, which may be days, weeks or months away? Or should you take control of the situation … ?”

By the time he wrote that book, “Final Exit,” Derek Humphry had been a prominent Right to Die campaigner — sometimes called the founding father of the movement — for over a decade. Still, when he tried to find a publisher for his book, he couldn’t. Nobody, it turned out, really wanted to publish a suicide manual — or what its author called a guide to “self-deliverance.” So Humphry got Hemlock, the Right to Die advocacy group he had founded, to print the book in 1991 and started handing out free copies around Los Angeles. He was as surprised as anyone when, later that year, the book made its way onto the New York Times best-seller list and stayed there for 18 weeks, eventually selling over one million copies.

Commentators described the book’s success as an indictment of the medical profession: of doctors whose duty to heal had, in recent years, become a compulsion to prolong life at all costs. The people who bought “Final Exit” were afraid of dying badly, of dying slowly, of dying in a hospital, hooked up to machines. 

Humphry’s introduction to assisted death came in the early 1970s, when he was still living in his native England. His first wife and the mother of his three sons, Jean, had discovered a lump in her breast. Later the cancer metastasized. Jean didn’t want to die slowly and in pain, the way her mother had, and she asked her husband to help her find another way.

Two years later, on an afternoon in March, Humphry made Jean, who was just 42, a cup of coffee laced with painkillers and sleeping pills. She drank it and died quickly, which was a relief to Humphry, who worried that the medications would fail and that he would have to suffocate his wife with a pillow.

Soon after, Humphry moved to Santa Monica, Calif., and started the Hemlock Society, one of the country’s first Right to Die advocacy groups, named for the poison that Socrates drank in ancient Athens. This was the 1980s. It was the time of Ronald Reagan, of Jerry Falwell, of the so-called Moral Majority. “My God,” someone cried out at an early meeting organized by Humphry. “They firebomb the houses of pro-abortion people. … What do you think they’ll do to us?”

Over the next decade, Humphry opened 80 chapters and enrolled around 45,000 members who raised money to fund Death With Dignity ballot initiatives in a handful of states.

But then a decade passed, and all of Hemlock’s efforts had come to very little. And Humphry, in turn, had grown impatient with his own creation. It was, in part, that everything was moving too slowly; after all those years, physician-assisted death was still illegal everywhere. Also, legal change didn’t seem to be what Hemlock’s rank and file really wanted. Humphry had noticed that at chapter meetings, members seemed less interested in consciousness-raising than in the practicalities of how to die painlessly. The people with AIDS especially wanted specific instructions: What pills? How many?

These people knew that it wasn’t always easy to die. Some deaths were agonizing. Some took hours and left panicked bystanders to resort to guns or pillows. Others were violent and left a mess behind. Some people tried to end their lives only to wake up feeling worse off than before. In small members-only seminars, and later in brash public gatherings, Humphry instructed attendees on what he claimed were more foolproof methods.

By 1994, when Oregon passed the Death With Dignity Act, the world’s first law permitting physician-assisted dying, Humphry was disenchanted with the political process he had helped to set in motion. The Oregon law was so much narrower — so much less radical — than he had hoped. It applied only to people who were terminally ill and within six months of a natural death — people who were going to die soon anyway. It did nothing for so many other people who were suffering: those with degenerative conditions like multiple sclerosis and Alzheimer’s; those with spinal cord injuries and paralysis; those with unrelenting chronic pain. Also it left so much power in the hands of the medical profession and its bureaucracies. It meant that dying people had to beg for approval from their doctors to be allowed to stop living.

Humphry and the larger movement parted ways. He left Hemlock, which, in the early 2000s, merged with another group and then splintered. The largest faction reconstituted itself as Compassion & Choices, an advocacy group with an eight-figure annual budget committed to passing Oregon-style laws in other states. (Medical aid in dying is now legal in 12 American jurisdictions.) Later, Compassion & Choices’ founder would dismiss Humphry’s plastic-bag method as “sort of the end-of-life equivalent of the coat hanger.”

Humphry retreated to a small house near Eugene, Ore., by the Willamette River. He continued advocating and writing and, from time to time, published revised editions of “Final Exit,” updating drug doses and adding a new section on inert gas canisters.

In earlier editions of the book, Humphry had included his personal phone number; he still got a call or two from readers almost every day. Most of them were dying and were scared.

One might have expected Humphry to “self-deliver” — particularly when he started to experience more symptoms of congestive heart failure. He did consider it, and friends say he had “a stash of a lot of stuff” to use. But his wife, Gretchen, explained that her husband ended up in the hospital and then was transferred to hospice, where his organs began to fail and where he died a so-called natural death. In the end, she said, he was comfortable enough to hold on.

Katie Engelhart is a contributing writer for the magazine. She received a Pulitzer Prize in 2024 for feature writing.

David Lynch in 2001. Kiino Villand/Trunk Archive
b. 1946

David Lynch

He lived a disciplined life to make space for his wild, surrealist visions.

When David Lynch and his younger brother, John, were kids, they were out biking near their home in Boise, Idaho, one evening when they saw a startling image. “Out of the darkness — it was so incredible — came this nude woman with white skin,” Lynch wrote in his memoir, “Room to Dream.” Her skin seemed to be the color of milk but she had a bloodied mouth. John cried, but David was fascinated. “It was very mysterious, like we were seeing something otherworldly,” he recalled.

That mingling of the familiar and unsettling, the quotidian and the uncanny, would characterize Lynch’s artistic career. A folksy trickster whose wholesome persona had little evident overlap with the horrors he brought to life on film, Lynch charmed audiences into exploring perverse evils and secret desires. Across an oeuvre that included 10 feature films, a television show and myriad musical projects, short films and prankish internet clips, his off-kilter perspective shined a light on the surreal violence that gathered at the periphery of the American dream.

Lynch was born in 1946 to Donald, a research scientist for the U.S. Department of Agriculture, and Edwina, a tutor; he and his two siblings grew up in a succession of Middle American idylls, moving from Missoula, Mont., to Spokane, Wash., to Durham, N.C., to Boise to Alexandria, Va., as Donald pursued his research. The family was close and devoutly Christian. Lynch himself was a charismatic boy who was happiest playing war outside with his friends and drawing. Edwina stoked his interest in art, favoring blank sheets of paper over coloring books, fearing that the books would constrain her son’s imagination.

Lynch described his childhood as happy: He experienced immense care and support from his family and enjoyed the material comforts of America’s golden age. “My childhood was elegant homes, tree-lined streets, the milkman, building backyard forts, droning airplanes, blue skies, picket fences, green grass, cherry trees,” he said in an interview. “Middle America as it’s supposed to be.” But his attention was also drawn to undercurrents of decay and disorder like that naked woman. Riding his bike around his Boise neighborhood at night, he’d take note of homes whose lights were out. “I’d get a feeling from these houses of stuff going on that wasn’t happy.”

Lynch’s fascination with the underbelly of Middle American life found its form in “Blue Velvet,” which opens with a vision of cloudless blue skies and white picket fences before cutting to a friendly neighborhood fireman waving to the camera from the side of his fire truck. Eventually, the eye notices that something is off: That sky is a sickening shade of blue that seems impossible in nature; the firefighter’s grinning face is obscured by the flickering shadows of the lush trees his truck passes beneath, just out of perception’s reach. The dream seems to be curdling into a nightmare.

The substance of that nightmare became clear in the television series “Twin Peaks,” a cosmology of the spiritual struggle against evil disguised as a weekly mystery drama about the murder of a homecoming queen, Laura Palmer. The show was interested in the detestable impulses — brutal misogyny, incest and child sexual abuse among them — that shadowed a seemingly wholesome Pacific Northwest town. But Lynch had also perfected, in both his person and his work, a strangeness that endeared viewers to his vision despite its darkness. Characters like the Log Lady, who carried around a log whose voice only she could hear, turned the show into a pop phenomenon — even as audiences learned that Laura’s killer was her own father, who raped and murdered her under the influence of a demonic entity.

Lynch arrived at his visions through serendipity, dream states and intuition. Chance encounters often found their way into his work. The presence of Bob, the entity that haunts the Palmer family, for instance, emerged from a mistake: After filming a scene of Laura’s mother grieving in front of a mirror, Lynch noticed that the camera had accidentally caught the set dresser Frank Silva’s reflection — and Bob was born. But Lynch embraced rigid structure to facilitate that creativity: He wore a uniform of khakis with a collared white shirt and ate the same meals every day — a chocolate milkshake and coffee at Bob’s Big Boy in Burbank, for example — for long stretches. (Bob’s Big Boy became a pilgrimage site for fans seeking to emulate his approach.) More than anything, Lynch credited Transcendental Meditation with making his creative life possible, allowing him to access “an ocean of pure, vibrant consciousness.” He meditated twice a day, every day.

Even as his increasingly hallucinatory, nonlinear storytelling mystified audiences, Lynch matured into a guru figure who expounded on art in the folksy idiom of his Missoula origins. “I always say getting inspiration is like fishing,” he said in a Marie Claire interview. “If you’re quiet and sitting there and you have the right bait, you’re going to catch a fish eventually.” That sage status was reaffirmed in Steven Spielberg’s 2022 autobiographical film, “The Fabelmans,” in which he cast Lynch, who was then 75 years old, as a wizened John Ford giving advice to a young filmmaker, a stand-in for Spielberg. “When the horizon’s at the bottom, it’s interesting. When the horizon’s at the top, it’s interesting,” he says, speaking for a sensibility that echoes his own. “When the horizon’s in the middle, it’s boring!”

Ismail Muhammad is a story editor at the magazine. His writing often examines the tensions and temptations of identity.

Roberta Flack in 1971. Anthony Barboza/Getty Images
b. 1937

Roberta Flack

She dreamed of being a classical pianist but made pop history instead.

The piano, she remembered, didn’t smell very good. Her father had rescued it from a junkyard, and a distinct odor lingered, left by what she imagined to be “little rat tiny people” that once lived inside. Still, it was the piano her family could afford, and she took to it immediately, the beauty of a Chopin prelude or a Mozart sonata overcoming the smell and any other rigors of regular practice. By the time Roberta Flack was in high school in Arlington, Va., she had grown so proficient that she was voted most musical, and classmates predicted she would make it to Carnegie Hall. “I was very popular,” she said, “because I could play anything.”

She graduated at 15 and won a scholarship to Howard University, where she dreamed of becoming a professional classical pianist — not the easiest pursuit, especially given the headwinds of racism and sexism. But the music community at Howard was rich; the future R&B great Donny Hathaway would meet her there later, among others. And at least one Black woman, the opera singer Leontyne Price, had risen to the top of the classical field in the 1950s. Why couldn’t Flack, with her talent, be next?

But those headwinds proved too strong, and she realized that the prospects for a classical career weren’t very promising. “The idea in my mind was always to do something musical — it was not necessarily if I can’t be a concert pianist, I’m going to quit,” she told NPR in 1989. “I mean, after you practice all those years and all those hours, you must find a light of some kind.” She got a degree in music education, sometimes teaching Bach chorales to kids who ran in from tobacco fields to rehearse with her. The “concert artiste” alter-ego she imagined, “Rubina Flake,” was never far away, waiting for her chance to emerge. 

It would take almost a decade, but she figured out a strategy: applying classical leanings to other types of music. There were club dates and, in 1969, an album, “First Take,” the blueprint for an entire career. It had a compelling mixology of classical, jazz, gospel, folk and blues, as well as lyrics about civil rights, gay men (who were among her early supporters) and justice, all presented in a plain-spoken, slowly unspooling style. The jazz musician Les McCann, who helped discover Flack, wrote in the album’s liner notes that she carried “the listener beyond every barrier as though it never existed.”

The larger public wasn’t quite ready for her. The Top 40 was still mostly the province of car-radio caffeine and not Flack’s brand of slow-drip testimonies. But one listener with more patient ears was Clint Eastwood, who two years later put Flack’s version of “The First Time Ever I Saw Your Face” in the movie “Play Misty for Me.” An old folk song turned into a five-minute-plus meditation on erotic bliss, it played in its entirety, with no dialogue. Something clicked. It became a huge hit. But it was only a warm-up for her magnum opus, “Killing Me Softly With His Song,” another long piece of musical mesmerism. Even if you didn’t recognize the classical touches, these records made you feel smarter. And of course there were her trademark deep steeps of emotion. Fans would come up to her and say, “Yeah, that’s my life,” she recalled in a radio interview.

Now the 36-year-old former schoolteacher was the premier female vocalist in pop music, in an era that included Diana Ross and Aretha Franklin. There were Grammys and an apartment in the Dakota, where she lived next door to John Lennon and Yoko Ono. With the jazzy, spritzy “Feel Like Makin’ Love” in 1974, she became the first solo female artist of the rock era to land No. 1 pop hits in three consecutive years. “All of a sudden you get that rush of 20,000, 30,000, 50,000 people — THE WORLD,” she told Time in 1975. “All these people love me, you think. Then you’re back in a hotel room by yourself in Missouri, your stomach hurts and your humanness just overwhelms you.”

As her stature grew, the headwinds picked up again. Some critics started to suggest that her sound was a little too easy on the ears, maybe a little too … white. “One of the hassles of being a Black female musician,” she said in 1975, “is that people are always backing you into a corner and telling you to sing soul. I’m a serious artist. I feel a kinship with people like Arthur Rubinstein and Glenn Gould. If I can’t play Bartok when I want to play Bartok, then nothing else matters.” She added, “It doesn’t make me very popular in certain communities.”

She continued to defy trends and critics. In 1978, at the height of disco fever, she released a luxuriant ballad, “The Closer I Get to You,” a duet with Hathaway. It sold a million copies. A few years later she put out maybe her most placid single, “Making Love,” the theme song from one of the first mainstream movies about a gay romance. The lyrics were general, but this was the 1980s, when just being associated with the movie posed a risk. “I could never be afraid to sing a song about love, whether between a man and woman, two men or two women,” she said. The song made the Top 20.

Flack was still giving concerts in her 80s, until A.L.S. forced her to stop singing in 2022. She did make it to Carnegie Hall — 16 times, in fact. While she may never have played a program of her beloved Chopin or Bartok there, she did perform works by other master composers: the likes of Stevie Wonder, Carole King, Lennon and McCartney. And with pieces like “Killing Me Softly” and “Feel Like Makin’ Love,” there was no shortage of classics for her to play — her own.

Rob Hoerburger, the former copy chief of the magazine, is the author of the novel “Why Do Birds.”

Ranger Doug at Lake McDonald in Montana, with Mount Cannon in the distance. From the Follett family
b. 1926

Douglas Follett

The park ranger witnessed America’s glaciers melting in real time.

Douglas Follett believed that when you visited Sperry Glacier, you left your spirit behind. That made him the Hermes of Glacier National Park in Montana, for it was his job to guide souls to the glacier. Over his six-decade tenure as a park ranger, however, the nine-hour hike from Lake McDonald lengthened. Not because Follett grew older and somewhat less spry. No, the glacier moved.

For decades its edge had kissed the top step of a narrow staircase, carved into a cliff, that marked the trail’s terminus. But in the late 1990s, upon emerging from the stair one spring, Follett saw red: six inches of exposed red argillite.

“That’ll be covered up next year,” he told himself. 

The following year he encountered six feet of red rock.

“This has to stop,” he commanded the glacier.

But the third spring, the red patch extended 16 feet — then 60, then 600. Today you have to hike another half-mile from the stair before you can even see the tip of the glacier, glinting in a distant hollow of Gunsight Mountain.

In 1961, when Follett, then 35, became a ranger, there were at least 37 glaciers in the park. All have shrunk since then, with a dozen relegated, by the United States Geological Survey, to anonymity. Follett was among the last rangers to witness the mighty Two Ocean Glacier — as well as Gem, Hudson and Baby — and among the first to connect the phenomenon to global warming.

The world, or at least a substantial percentage of the three million people who annually visit Glacier, knew him as Ranger Doug. But Follett didn’t call himself a ranger. He preferred “interpreter,” a Park Service appellation that in Follett found its fullest expression. In boat tours, campfire talks and impromptu exchanges at the visitor center — where he became the main attraction — Follett interpreted the meaning of Glacier to generations of disciples. The mountains were a fount of wonder, sanctuary, even democracy. Short-tailed weasels were “friends.” Waterfalls, carrying away a glacier’s “lifeblood,” ventured forth “to seek their destiny” in the sea. A cottonwood tree didn’t die but “decided to lay down.” While Follett spoke, he also drew out his listeners, hoping to awaken an intimacy with the splendor all around.

The story Follett told about his own life began when he was 1 year old and his father, a railroad man for the Great Northern, was assigned to East Glacier Park. While his father worked the depot, Follett and his mother roamed the grand lodge, where the terraces were draped with bear pelts and visitors were greeted by hired Blackfoot Indians in feathered headdresses.

In high school he met Anastasia, his wife of more than seven decades, by offering her a stick of gum. After serving in the Air Force at the end of World War II, he never boarded another airplane. Instead, he liked to say, the world came to him.

During the school year he taught American history in Columbia Falls — another act of exuberant interpretation. At home the stories continued, with his four daughters each night before bed, and on road trips in their 1948 Chevy. The unifying theme, said his daughter Karen, was that “the world was all tied together.”

Unable to abide the conventions of town life, Follett bought seven acres of cedar forest on Whitefish Lake, within earshot of the Great Northern. He did not get around to installing plumbing until the year after they moved in; the Folletts hauled water from the lake in buckets. The house was warmed by a fireplace Follett assembled from slabs of green argillite. When his daughter Jen missed three days of high school one particularly freezing winter, a teacher asked after her absence. “I had to keep the home fires burning,” she said. She wasn’t joking. The summers at Glacier, where the family lived in a rustic cabin, felt like a vacation.

In the ’80s, Ponderosa pine saplings began to colonize their backyard — an ominous indicator of warmer, drier weather. Then Follett noticed the red rock devouring Sperry. The stories he told about Glacier began to change. Some park visitors question the scale of climate change, said Michael Faist, a former colleague about 70 years Follett’s junior. Faist summoned data, charts. “But Doug could say, ‘I walked to the edge when I was 40.’ That affected people differently.”

Follett’s medium changed too. One afternoon, struck by the glory of a white mountain goat, he decided to commit his thoughts to paper; to his surprise, they emerged in quatrains. Follett described the poems as his “memories” of life, though fittingly they are often narrated by nonhuman spirits. From “The Mountain”:

“I wonder / If they notice me / And think / That I don’t move / At all / That standing here / Has been my call / That this is where / I’ll always be / From now unto eternity / But if that’s so / I need to say / It’s wishful thinking / Gone astray / For I am changing / Day by day … ”

Nathaniel Rich is a contributing writer for the magazine. He is a 2025 Guggenheim fellow.

Robert Jay Lifton in 1992. From the Lifton family
b. 1926

Robert Jay Lifton

He peered into the darkest corners of humanity and remained an optimist.

When Robert Jay Lifton visited Hiroshima for the first time in the spring of 1962, he spent five fevered days interviewing the leaders of various survivor groups, religious figures, university professors and many others. It had been 17 years since the United States dropped an atomic bomb on the Japanese city, and it had since been rebuilt. But the new buildings and carefully planned streets belied a population still very much grappling with the existential impact of having lived through a near-apocalyptic act of destruction and death.

Lifton was 35 years old and had recently been awarded a chair in the psychiatry department at Yale School of Medicine. But after his brief, intense trip to Hiroshima, he wrote to the head of his department with a new plan. He was going to return to the city to conduct a study of the psychological effects of the world’s first nuclear strike.

At the time, Lifton was still early in a career that would span seven decades, but he was already in the process of carving an unusual path for himself. After finishing his psychiatry residency in New York in 1951, he served as an Air Force psychiatrist during the Korean War. The field of psychiatry was booming in the United States; following his discharge, in 1954, it would have been easy for him to return home with his wife and open a lucrative practice, as many of his peers were doing. Instead, he put his training to a different sort of use, remaining in the Far East to write a book about Communist China’s efforts to manipulate the minds of its citizens and captives through “thought reform.” 

Lifton undertook a similar study in Hiroshima, though his new interview subjects were survivors of a much more visceral atrocity. The physical effects of the atomic bomb had been the subject of previous scientific inquiries, but no one had taken the measure of its all-encompassing impact on the mind. Some of his interviewees had been near the hypocenter of the blast and witnessed the mass, indiscriminate death firsthand. “My body seemed all black, everything seemed dark, dark all over,” one of them recalled. And “then I thought, The world is ending.”

Lifton was confronting death in its most complete sense — nuclear extinction — while looking for shared themes that might help him bring a psychological dimension to the suffering. The research transformed him, or it may be more accurate to say that it nourished a different part of his identity as a healer. Reeling from a wrenching interview with a man who was orphaned by the bomb as a boy, he experienced “a feeling that I must do whatever I can, by means of this research or any other path, toward preventing the kind of thing that I was hearing described,” he wrote in his journal. He titled his nearly 600-page book on Hiroshima “Death in Life,” an allusion to a ghostlike existence described by many of his subjects. It won the National Book Award in 1969.

For the remainder of his career, Lifton would live a dual existence as both scholar and activist. As a pioneer in the field of psychohistory, he examined the intersection of individual psychology with broader historical forces. He was drawn, in particular, to the victims and perpetrators of what he called “atrocity-producing situations.” His many books included exhaustively researched portraits of Vietnam veterans, Nazi doctors who participated in Hitler’s genocide and the Japanese cult that launched a sarin gas attack on the Tokyo subway in 1995.

To Lifton, bearing witness was critical to recovering truth and inspiring resistance to totalitarian ideologies. In this sense, his writing was itself a form of activism. But he also felt compelled to act, joining and leading protests against nuclear weapons and the Vietnam War. He was, in his words, “a relatively well-behaved rebel,” but his moral fervor nevertheless troubled some of his critics; to them, “scholar-activist” was a contradiction in terms. Time magazine denounced “Home From the War,” his study of Vietnam veterans, as “a polemic in which moralizing smothers analysis.”

And yet for all of his moral passion, Lifton wasn’t really a polemicist. On the contrary, his decades studying the psychological fallout from mass atrocities gave him a hard-earned appreciation for the value of nuance and dissent. Reflecting on the contemporary American political scene in an interview with M. Gessen for The New Yorker in 2023, he warned against “totalism” — authoritarian political or religious systems that demand an all-or-nothing commitment to an ideology, and in so doing suppress all independent thought. “A totalist seeks to own reality,” he said.

Though he spent much of his career peering into the darkest corners of humanity, Lifton was, by nature, an optimist. When his daughter was young, she once asked him why he didn’t write about more-uplifting events. He didn’t have a good answer for her. But over the decades, he came to believe that only by studying a society’s response to catastrophe could you gain a sense of its resilience and health. His work delved into some of the worst atrocities of the modern era. But it also affirmed the continuity of life, exploring how victims transformed themselves into survivors.

Lifton’s final book, “Surviving Our Catastrophes,” which he completed just after his 96th birthday in 2022, was a meditation on the collective power of these survivors. As he saw it, their stories of enduring extreme grief and pain could light the way for future generations. Ever hopeful, he wrote that their voices can inspire us to confront — rather than turn away from — whatever daunting challenges we as a society might be facing: “What we speak of as future renewal is inseparable from present engagement.”

Jonathan Mahler, a staff writer for The New York Times Magazine, has been writing for the magazine since 2001.

Kanzi in 2011, holding a lexigram device that he used to communicate. Gregg Segal
b. 1980

Kanzi

The bonobo who had a lot to say.

Nearly everything about Kanzi’s life was manipulated by humans, including his conception — handlers at Emory University in Atlanta handpicked his bonobo parents. He was born inside a man-made enclosure on Oct. 28, 1980, and soon after was snatched from his birth mother by another female bonobo named Matata whom researchers had been trying to teach language. His birth mother went back to a zoo, but baby Kanzi tagged along with his adopted mother to the Language Research Center at Georgia State University. There, he clambered on furniture and stole treats while scientists trained Matata to recognize objects, like a banana or a peanut, and point to the corresponding symbol on a lexigram keyboard.

At the time, a fierce debate was raging in the fields of linguistics and psychology: Could an ape learn language, or did only humans have that capacity? Matata, for her part, wasn’t showing much promise. Then one day when she was elsewhere, Kanzi began using the symbols, despite having received no training. He made an expression that primate researchers call a play face, touched “apple” and “chase” and then picked up an apple and ran away. He used the lexigrams 120 times that first day. The researchers were shocked: Had this lanky little bonobo with the large eyes acquired language through social exposure, much as a human child would?

Scientists had long believed that apes lack the neural control over their vocal tract muscles required to speak. So in the 1960s, researchers began training apes in a variety of nonverbal linguistic systems, including American Sign Language. A chimp named Washoe was taken home and taught ASL signs by married psychology professors. Later, Koko the gorilla and Chantek the orangutan learned signs too. Other apes communicated using tokens and symbols on a lexigram board. But the notion that a nonhuman species could ever grok grammar was controversial. The linguist Noam Chomsky famously said, “It’s about as likely that an ape will prove to have a language ability as that there is an island somewhere with a species of flightless birds waiting for human beings to teach them to fly.” 

That kind of skepticism angered some researchers, who harbored a dream that they would one day take these linguistic apes back to their native habitats where the animals would translate between their wild counterparts and humans. It was a very 1970s vision, one shared by a psychologist named Herbert S. Terrace who set out to prove that apes could, in fact, learn language. His team worked with a chimp called Nim Chimpsky, a jab at their chief intellectual adversary. But after reviewing hundreds of hours of videotape of the chimp signing, Terrace published a study in Science concluding that the chimp wasn’t using words, let alone syntax: Nim was essentially just begging. The paper put a damper on talking-ape research.

Back in Georgia, though, Kanzi and a psychologist and primatologist named Sue Savage-Rumbaugh pressed on with the lexigram symbols. Together they roamed 55 acres of wooded property surrounding the center with a printed lexigram board in tow. Increasingly, Kanzi seemed to exhibit an ability to understand spoken English. When he was 4, Kanzi was featured on the front page of The New York Times in an article titled “Pygmy Chimp Readily Learns Language Skill.” At 8, he spent months in a head-to-head contest with a 2-year-old girl in which Savage-Rumbaugh gave them 660 novel sentences, including “Put the toothbrush in the lemonade.” Kanzi completed the requests 72 percent of the time, more than his toddler rival.

But questions were inevitable: What was he actually capable of? Were Kanzi’s communication skills language or something else? Perhaps it was all what behavioral researchers call the Clever Hans effect, named for a famous horse in early 20th-century Germany who appeared to be doing arithmetic when quizzed, but turned out to be responding to some nearly imperceptible clues — like changes in breathing — from his questioners.

By the time the millennium turned over, scientists’ interest in the question of whether apes were capable of language had waned. Even as talking animals moved back into the realm of myth and fable, Kanzi kept his place in the culture. He played music with Paul McCartney and Peter Gabriel, who later called the cross-species jam session one of the most remarkable experiences of his life. He was featured on “The Oprah Winfrey Show,” and Anderson Cooper paid him a visit dressed in a bunny suit. Kanzi lived out his days at a 230-acre bonobo sanctuary in Des Moines, funded for a while by a meatpacking magnate. At the start, Kanzi and the bonobos ate, groomed and sometimes slept alongside the researchers in what the humans called a hybrid “Pan/Homo” culture (“Pan” for Pan paniscus, the Latin name for bonobos, and “Homo” for Homo sapiens).

But communes are notoriously difficult to sustain even when all of their members share a mother tongue; eventually the humans were told they could no longer enter the bonobos’ enclosure. Still, through plexiglass, a YouTube influencer taught Kanzi to play Minecraft. The actor Jesse Eisenberg screened a movie for the bonobo in which Eisenberg plays a sasquatch. Right up to his death, the last of the “talking” apes was a bright-eyed, empathic creature, inclined to connect with other apes — including us humans, with our lofty ideas and our odd requests.

Malia Wollan is a contributing writer for the magazine. She wrote about an orphaned sea otter named Rosa in last year’s The Lives They Lived issue.

A self-portrait from 2017. Thomas Sayers Ellis
b. 1963

Thomas Sayers Ellis

A wily poet whose art captured the rhythms of his communities.

Before Thomas Sayers Ellis was an award-winning poet, he was drawn to music. Before he even knew very much about poetry, he was surrounded by soul. “I came to reading and writing via the oral tradition, via soul music,” he said in an interview. Other, darker influences pushed Ellis to take up drumming: It reminded him, he later recounted, not just of his father’s profession as a boxer, but also of “a violent noise, a home noise, my father’s noise, him dominating the household, his yelling and his hitting.”

From an early age, Ellis converted that noise into a medley of artistic acts. Music — especially go-go, a popular funk variant created by Black D.C. musicians — was central to his life. He played the drums and rototoms in local clubs, and go-go’s communal quality became the bedrock of an artistic career that celebrated Black culture’s collective spirit.

All the while, he was writing poetry. When a substitute teacher gave him the poet Robert Hayden’s obituary, Ellis noticed that Hayden had worked for the Library of Congress, where his Aunt Doris also worked. In the poem “View of the Library of Congress from Paul Laurence Dunbar High School,” Ellis wrote that he began thinking like a poet when he realized that he and Hayden “Were connected, but years apart, / As was Dunbar to other institutions— / Ones I could see, ones I could not.” 

In 1986, intrigued by the possibilities of other forms of creativity, Ellis purchased a stolen camera and started snapping grainy black-and-white photographs at local clubs. Shortly after, he moved with his son, Finn, to Cambridge, Mass., with dreams of studying film and poetry. There, he curated his own idiosyncratic artistic apprenticeship, becoming a student of Seamus Heaney and Derek Walcott, and serving as a teaching assistant for Spike Lee. At the Harvard Film Archive, he met another poet, Sharan Strange, and the two bonded over their shared interest in film and Black art.

Soon Ellis moved into an old Victorian house on Inman Street where Strange was living with other artists. Ellis and Strange were both book collectors, and over time they began to amass so many books by Black authors of the diaspora that they converted a former darkroom in the Inman house into an archive and library, which they called the Dark Room.

When James Baldwin died the following year, the duo bundled into a car with housemates and drove to New York City to attend his funeral. The service was so moving that upon coming home, they transformed Inman into the site of a reading series. Their mission was simple: to ensure that their literary heroes were honored with an audience. Pooling together their resources to afford airfare, hotels and dinners for their readers, they attracted the likes of Alice Walker, Rita Dove and Yusef Komunyakaa to read alongside graduate students and young writers. Gradually, and with the help of a fellow poet, Janice Lowe, the series grew into the Dark Room Collective, which the poet Cornelius Eady has argued might be “as important to American letters as the Harlem Renaissance.”

Over the next decade, Ellis won fellowships like the Guggenheim and developed a reputation as a cerebral poet with a determination to celebrate the multiplicity of Black life. In “All Their Stanzas Look Alike,” a critique of the homogeneity of American letters, Ellis ends the poem by questioning the legitimacy of being celebrated for his supposedly singular genius. “Exceptional or not, / One is not enough,” Ellis wrote, celebrating the communal quality that he thought animated Black art.

His art juxtaposed the larger city with the intimacies of his own life. “The Maverick Room,” his award-winning first collection, was an ode to his birth city that borrowed its title from a go-go club Ellis once frequented. The poems in that collection shout out specific neighborhoods, clubs, streets, landmarks and history with enough care and detail that one might use it to draw a map of the District of Columbia. Meanwhile, he continued his exploration of different forms. With the tenor saxophonist James Brandon Lewis, he founded Heroes are Gang Leaders, an experimental jazz band formed in the wake of Amiri Baraka’s death in 2014.

In 2016, the women’s literary group VIDA published anonymous accusations against Ellis that it described as sexual misconduct. The Iowa Writers’ Workshop, where he was teaching, canceled his classes. Criminal charges were never pursued; Ellis, who made no public comment, retreated from the literary spotlight.

He moved to St. Petersburg, Fla., where he spent his final years working. In 2023, he became St. Petersburg’s first photographer laureate, and the next year a book of his photographs titled “PARADISE | PARADISE layered” was published. The photos used double and possibly triple exposures to build a multitudinous story of a year in the life of that city. Ellis seemed to be piecing together an artistic vision of the new community around him.

Reginald Dwayne Betts is the founder and C.E.O. of Freedom Reads and author of the poetry collection “Doggerel.” He is the recipient of a 2018 Guggenheim Fellowship and a 2021 MacArthur Fellowship.

Michelle Trachtenberg in 2005. Bryan Adams
b. 1985

Michelle Trachtenberg

She seemed to channel her childhood bullies when playing the villain.

For the past 18 years, a four-second video clip has shot all across the internet like a low-orbit satellite. It’s just a woman raising her glass, smiling coolly at someone across the table. But the measured twitch of her cheek is a dare, the swift wink of her eye a guillotine. She is Georgina Sparks, a money-, boyfriend- and scene-stealing miscreant from the CW show “Gossip Girl,” as played by a peak-of-her-powers Michelle Trachtenberg. Georgina is lovably scabrous and brilliantly bratty; she spikes drinks, puppets breakups and plots the downfall of entire family trees, just because she can. When Trachtenberg died in February, mourning fans posted GIFs of the character’s most infamous moments: Wicked as she was, Georgina was also mesmerizingly layered, aching and pitiable and devilishly amusing.

She was far from Trachtenberg’s only great mean girl. Before and after “Gossip Girl,” Trachtenberg schemed and tantrumed on “Six Feet Under” and “Law & Order,” “Inspector Gadget” and “Weeds,” and she tormented her sister as the sly, petulant Dawn Summers on “Buffy the Vampire Slayer.” It’s not hard to see why directors jockeyed to have Trachtenberg play nefariousness incarnate. No one could intimidate or menace as she could. Her sighs and smirks were little Russian dolls of cunning, yearning and disdain. To say she had a top-notch “resting bitch face” would be to do it an injustice, for her face on the screen never rests. It works overtime, smacking you with a half-dozen maladies until you’re left dizzied and wondering: How can villainy be so complicated, so fun?

Maybe it’s because before her eyebrows became synonyms for vengeance, Trachtenberg was on the other side of the equation. 

Born to immigrant parents in 1985 in Brooklyn, Trachtenberg got her acting start at the audacious age of 3. (As she remembered it, she staked her claim after seeing a friend appear in an ad: “Mommy, I want to be on TV!”) Armed with vast blue eyes and an impish grin, she was leading diaper and detergent commercials when most kids are just learning how to brush their teeth. Her big film break came at age 11, when she starred as the precocious child detective of “Harriet the Spy.” Critics whistled, casting directors chased, fandom assembled.

Other children, though — not so much. Schoolmates scoffed at Trachtenberg on the playground and shunned her when exchanging valentines. Kids called her “Harriet the Slut, Harriet the Bitch, Harriet the Bitchy Spy,” Trachtenberg once confessed to her friend Mara Wilson, a fellow child actor (“Mrs. Doubtfire,” “Matilda”). Wilson was shocked: “She never let anybody know it got to her,” she told me recently. Michelle “could talk to anyone. She could hold anyone’s attention and make anybody laugh,” she said. “I never saw her as anything other than able to take on the world.”

As Trachtenberg bagged more Hollywood projects — from the heartwarming Disney sports film “Ice Princess” to the blistering “Mysterious Skin” — her bullies grew nastier. Girls at her high school pushed her into lockers. They stole her clothes during gym class, threw glass bottles at her and even knocked her down a flight of stairs, fracturing her ribs and nose. Still, in photo shoots and on red carpets, Trachtenberg oozed impervious cool. Learning of her behind-the-scenes torment was like “seeing your parents cry,” Wilson says — “seeing somebody you think is so strong admit to being hurt.” The adult Trachtenberg was tight-lipped about her upbringing and personal life, but details emerged here and there. “People always contact me on social media saying, ‘Oh, my brother, my sister went to school with you, you were best friends!’” she once wrote on Instagram. “False. The kids were cruel.”

A different kind of person might have spent a lifetime trying to forget that cruelty. Trachtenberg seemed to alchemize it. She imbued her spitfire characters with a bully’s viciousness, but also a profound, empathetic understanding of what could simmer underneath meanness. Her villains are so compelling — so memorable, so endlessly memeable — because they are not storybook antagonists. They’re frightened, insecure; they feel pain; their lashing out is an obvious attempt to carpet over that pain. Georgina Sparks spends 20-odd episodes plotting to ruin her ex-friend’s life, yet you feel for her the entire time: You get the sense she is only desperate to bring someone else down to hell with her so that she can have the company.

There comes a point in “Gossip Girl” when Georgina’s lies are exposed and her machinations laid bare. Cornered, she makes a final plea to one of the show’s lead characters, Dan Humphrey: “So you’re just going to go back to Serena like nothing happened and just leave me all alone?” On that “alone,” Trachtenberg’s mouth becomes, for just a moment, a quivering gawp, a yawning chasm of something that you want to wrap in blankets and a bear hug. Then it toughens again, weakness papered over. It is the kind of thing that stays with you long after the scene or the TV series ends — the kind of gut-punch acting that put Trachtenberg’s face into permanent circulation online, her protean expressions as funny and sinister as they are wrenching.

“Do not put your value in someone else,” the adult Trachtenberg once told kids dealing with bullying. “Not letting them win is your win.” She herself did one better. Trachtenberg flipped the script, cackling infectiously as she played her former foes on TV. She beat the mean girls at their own game.

Amy X. Wang is a story editor for the magazine. She last wrote for The Lives They Lived about the poet Louise Glück and the beloved pets that people left behind.

Mabel Landry at Stagg Field in Chicago in 1955. Chicago Sun-Times/Chicago Daily News Collection, Chicago History Museum
b. 1932

Mabel Landry Staton

The track and field phenom raced her way through the Jim Crow South.

In the spring of 1945, Mabel Landry and her father, Paul, were walking near the family’s apartment in the Ida B. Wells Homes, a vast housing project on the South Side of Chicago, when they spotted a few teenagers running sprints across the lawn. Paul turned to Mabel and said, “Dolly, I think you might be faster than them,” recalls Paula Staton, Mabel’s daughter. “That was what he called her, by the way — Dolly. Never Mabel. She was his only girl. His little doll.” Thirteen-year-old Mabel agreed to take up the challenge and, as her father had predicted, easily won. Soon after, says Patricia Staton, Mabel’s older daughter, “Joe Robichaux was knocking at her door.”

As a coach for the Catholic Youth Organization, Robichaux was looking to start a girls’ track team. Would Mabel consider becoming the inaugural member? “She couldn’t say yes fast enough,” Patricia says.

Unfortunately, Paul did not live long enough to see how far running would take his daughter. Later that same year, he was diagnosed with tuberculosis and died in a segregated sanitarium outside Chicago. His last words to Mabel were “Run, Dolly, run.” 

Mabel ran her way through middle school and high school and much of the Jim Crow South, too. She seemed to view the whole experience with a teenager’s proud defiance: “It was fun going into the white bathrooms when we weren’t supposed to,” she recalled in a newspaper interview, while later conceding that “segregation wasn’t a bed of roses.”

Proof arrived in 1949, the year Mabel, running for the C.Y.O., boarded a train bound for Odessa, Texas. “In those days, Caucasians usually sat at the back of the train, farther away from the engine smoke,” Paula says. “And that’s where Mr. Robichaux put my mom — she was the star of the team, and he wanted her to have the rest.” But once the train crossed the Mason-Dixon line, a conductor told Mabel that she had to move to the front. Mabel happily obeyed: She was lonely in her sleeper. Robichaux felt differently, and upon their return to the South Side, he filed a lawsuit against the Illinois Central Railroad for discrimination. Robichaux and the C.Y.O. won, and Robichaux used the resulting damages to expand the girls’ track squad to include both Black and white runners, which was nearly unheard-of at the time.

“My mom didn’t think she’d done anything spectacular,” Paula says. “She just wanted to keep competing.” During her sophomore year at DePaul University, she became the only American woman to qualify for the 1952 Olympic Games in the long jump and, along with a largely white contingent of athletes, flew to Helsinki, Finland. Less than a decade after that formative race on the lawn of the Ida B. Wells Homes, she had managed to run her way onto the biggest stage on earth.

Her first jump, in the preliminaries, was measured at 19 feet 3½ inches, then an Olympic record. But in the finals, she succumbed to nerves and fouled out during the next five jumps, putting her four spots away from a podium finish. “Still, this was a resilient, courageous woman,” says Thad Dohrn, the former senior associate athletic director for development at DePaul. Returning to the United States, Mabel won national championships in the 50-meter dash and the long jump; later, in 1955, she was a member of the winning 4x100-meter relay team at the Pan American Games.

Then, as quickly as she’d taken up running, she gave it up. At a U.S.O. event, she met a sailor named Rodman Staton and followed him to his home state of New Jersey. “Of course, the four of us kids knew that our mom had been an Olympian,” Paula says. “But she didn’t like to dwell on her past accomplishments.” Instead, Mabel threw herself into caring for her family and her work, teaching high school physical education.

In 2004, Paula saw a story on the Philadelphia local news about a resident who volunteered at the Summer Games in Athens and called the station. “I go, ‘Hey, you know, there’s a famous Olympian right in your backyard,’” Paula says. “They jumped on that, did a segment, the whole thing, and before you knew it, the old stories had started to resurface again.”

Mabel, a widow since 2000, allowed herself to revel in the attention, attending events at the Chicagoland Sports Hall of Fame and the Hall of Fame at DePaul University. “It was like something in her had been opened up again,” Dohrn says. “I was watching all these generations of athletes cluster around her, posing for pictures, realizing the person they were meeting was a living part of history.”

Matthew Shaer is a contributing writer at the magazine and the host of the podcast “Origin Stories.”

Agnes Gund in 2018 in her New York apartment with Mark Rothko’s “Two Greens and Red Stripe” (1964) and Christo’s “Wrapped Champagne Bottles” (1965). Stefan Ruiz; Kate Rothko Prizel & Christopher Rothko/Artists Rights Society (ARS), New York
b. 1938

Agnes Gund

She sold a painting for $165 million and gave it all away.

One December morning in 2016, I got a call from Darren Walker, who was then the president of the Ford Foundation and my boss. “I need you to meet me at Aggie’s at 4 o’clock today,” he said. “We need to help her build something.” We met at the elegant Park Avenue apartment of Agnes Gund — or Aggie, as she was known to everyone. That afternoon, she sat on her sofa beneath the intricate universe of Jasper Johns’s 1963 painting “Map.” A Roni Horn totem stood near her like a sentry, and I could see the smooth, ribbonlike curve of a Martin Puryear wood sculpture in the dining room.

Gund was an avid art collector and philanthropist, and over the years she had served as the president of MoMA and donated more than 1,000 works to the museum. “I feel guilty about having so much more than most people,” she once explained in an interview. “If I can have it, others should be able to enjoy it.” She rarely sold the work, preferring to give it to art institutions.

But that afternoon, she explained that she had sold a painting from her collection, one that she loved and had lived with for 41 years. It was the iconic midcentury Pop Art work “Masterpiece,” by Roy Lichtenstein, which hung for years over the mantel in her dining room and, before that, in her home in Concord, Mass. For her, the painting was a daily and beloved connection to Lichtenstein and his wife, Dorothy, both of whom had been her friends. 

The painting sold for $165 million, at the time one of the highest prices for a work of art, and Gund wanted to use the money on a campaign to end mass incarceration in the United States. It was a wildly ambitious notion, but she was determined. We were there to help her design the vision.

Over time, I came to understand how incisive Gund was about all aspects of her life, from how she dressed to the institutions she shaped. She welcomed people into her home and her life with generosity and grace, especially artists, whom she adored. She had a dry wit, a love for family of blood and choice and a painfully felt sense of right and wrong.

She grew up in a wealthy family in Cleveland, part of a generation and class that emphasized divisions. She once said she knew only one Black person, Henrietta Givens, whom her parents employed as a housekeeper and cook. Early on, she was pulled toward art, taking drawing classes as a child and memorizing the works she loved most. When her father died, she inherited a trust and became a wealthy woman. It was then that she bought her first significant work, a sculpture by the English midcentury artist Henry Moore.

Her wealth — and a divorce — gave her the freedom to move to New York with her children in her 40s. She continued to study art history and collect, going on to develop deep relationships with some of the greatest artists of the 20th and 21st centuries — Johns, Robert Rauschenberg, Willem de Kooning, Louise Bourgeois, Kara Walker and many more. As Marina Abramovic put it: “So many collectors never have any relationship with the artist at all. They will buy the art, and then never meet the artist.” But Gund “found the key to our hearts and really made friends.”

Gund believed that art should be available to everyone, and this tenet informed her philanthropy. When New York City significantly cut funding for art in its public schools in 1977, she founded Studio in a School, a nonprofit to bring artists into the schools. In the 2010s, her sense of responsibility expanded after a series of infamous killings of Black men and boys, including Trayvon Martin and Eric Garner. She was influenced by books, including “The New Jim Crow” by Michelle Alexander and “Just Mercy” by Bryan Stevenson. It was after watching Ava DuVernay’s documentary “13th,” which explored interlocking questions of race, injustice and mass incarceration, that Gund told Darren Walker she wanted to use her resources to fund criminal-justice reform and creative expression.

Over the six months after our meeting in her living room, our small team helped her design what would become the Art for Justice Fund: an organization that, from 2017 to 2023, awarded $127 million in grants to support bail reform, criminal-justice advocacy and individual artists, many of whom had themselves been in prison. Gund welcomed formerly incarcerated people into her home to share their work and ideas.

There are many stately photographs of Gund, some taken in that Park Avenue living room, attired in pale lemon, saffron, coral, teal, her hair perfectly in place — and often with a beloved painting behind her. But a different photograph is my favorite. We see her from behind, next to a barbed-wire fence, in a quilted vest. She is walking beside an African American man wearing a jacket that says “prisoner” across the back. She is touching his arm, leaning toward him.

They are standing together inside the San Quentin prison in California, where the man she is speaking to has been incarcerated. And Aggie is listening to him closely.

Elizabeth Alexander is a poet and the president of the Mellon Foundation.

Angie Stone in 2003. Marc Baptiste
b. 1961

Angie Stone

She always knew she had what it took to make it, even if the music industry did not.

On Oct. 20, 1979, 17-year-old Angie Brown and her two best friends, Gwendolyn Chisolm and Cheryl Cook, were whisked into a concert headlined by the Sugarhill Gang, whose rollicking hit “Rapper’s Delight” was taking over the airwaves. The friends were a burgeoning girl group, singing gospel at their churches in Columbia, S.C., but they were also experimenting with hip-hop, crafting spunky songs inspired by funk music and early rap records. A road manager for the band spotted the girls and offered to let them into the show. Before they knew quite what was happening, they were backstage, standing face to face with Sylvia Robinson, the ruthless, glamorous co-owner of the fledgling Sugar Hill Records, which would go on to become hip-hop’s premier label in the early ’80s.

The trio boasted that they could rap better than the Sugarhill Gang, and started performing for her. Robinson was only mildly encouraging, but on their way out the door, Cheryl remembered that they hadn’t done one of their best songs and persuaded Robinson to listen to one more. “Funk You Up” had a playful chorus that mimicked a game of musical chairs: “Get up, get up, get up, get up, get up, get up, get up, get up, get up” — and then, suddenly, “sit back down.” The girls barely got through it before Robinson stopped them. She’d found what she was looking for: another bona fide hit to anchor her new label. That night, they went onstage and danced with the Sugarhill Gang.

The song was released just weeks later, and soon the Sequence (the name the girls took on, because of their stair-step ages and heights) toured the United States with the Gang before Angie even graduated from high school. “Funk You Up” was the first charting rap hit performed by women. But that burst of recognition was brief. By the mid-’80s, Angie left the label, feeling her contract was exploitative — the deal paid a pittance and locked her in for too many years. She decided to go out on her own.

Over the next decade, Angie hustled, hard. She married (and took her husband’s last name, Stone), had a daughter named Diamond and divorced. She sang jingles for commercials and became sought after as a songwriting collaborator for other artists. In 1990, she sang background and played saxophone on tour with Lenny Kravitz. He remembers that she would say, “Lenny, I am overly confident,” and this confidence must have sustained her spirit during the years she spent supporting other people’s talents while still hoping to focus on her own.

In 1994, Stone met 19-year-old D’Angelo, a singing and composing savant. She was asked to help him put the finishing touches on an album that would become “Brown Sugar,” one of the greatest soul albums of the last century. Her commitment to long hours in the studio, honed over 15 years as a pro, ensured he would finish, and her writing and arranging helped ground his dreamy keyboard musings. While working together, Stone and D’Angelo fell in love.

“Brown Sugar” was a hit, and it turned D’Angelo into a sex symbol. But the people managing his career didn’t approve of the romance with Stone — they wanted him to appear single, or with someone younger and slinkier. The nadir of their relationship may have been a moment in 1996, when D’Angelo and Stone, who was pregnant with their son, Michael, were headed to a party given in D’Angelo’s honor by Giorgio Armani. When they were about to arrive, one of D’Angelo’s handlers had him get out of the car he was riding in with Stone and hop into another one, where a woman they thought was more suited to his image was waiting to accompany him on the red carpet.

Not content to be relegated to the wings, Stone decided to place herself center stage. Stone and D’Angelo broke up, and in the fall of 1999, at age 37, she released her first solo album, “Black Diamond,” which sold more than 500,000 copies, buoyed by a single that promised that the tears were dried up, that “there’s no more rain in this cloud.” Her second album, “Mahogany Soul” (2001), also went gold. These records modulate between moods: oceanic longing, effected by the undertow in Stone’s rich, resonant tone, mixing with waves of great, insurgent belief. Just as she did 20 years earlier with hip-hop, she became an architect of neo-soul, the subgenre that focused on feeling and musicianship in an era of rampant commercialism.

Over the next two decades, she released eight more studio albums and acted in films. But sustaining a career remained difficult. “Funk You Up” was sampled or interpolated more than 50 times, but the Sequence’s publisher had added her own name to the writing credit, claiming a chunk of the royalties for herself. According to Stone and her bandmates, Mark Ronson and Bruno Mars used it on their megahit “Uptown Funk,” but the women received no royalties for that. In her 60s, when Stone was beset by health challenges, she toured more than she wanted to, and lamented her financial situation. “Everybody that works hard for 50 years should be able to retire in peace,” she said in a livestream last year.

Instead of saying that she died, Stone’s friends have been saying that she’s still on tour. Perhaps it’s a joke to ease the sorrow of her absence, or it could be a dark acknowledgment of her overworked later years. But it might also have something to do with way-finding as a through line in her life. She never lost the sense of herself as an innovator. “She had a hustle and a grind in her that wouldn’t stop,” Chisolm told me. She sowed and sowed and stayed on the go. All roads lead back to that.

Niela Orr is a contributing writer for the magazine. She interviewed Angie Stone in 2023 for a feature on women in hip-hop.

Norma Swenson (right) with other members of the Boston Women’s Health Book Collective in the late 1970s. From the Schlesinger Library
b. 1932

Norma Swenson

She became radicalized by working on ‘Our Bodies, Ourselves.’

Norma Swenson was 26 and pregnant when her obstetrician told her that during labor she would receive an injection of scopolamine. Swenson left the office, put a nickel in a pay phone and called a local library. Scopolamine, the librarian explained, is a cerebral sedative. That was all Swenson needed to know: She told her OB that she would skip the medication.

When Swenson was ready to give birth in April 1958 on a 12-bed ward at Boston’s Lying-In Hospital, she watched as women received a cocktail of scopolamine and morphine, designed to create “twilight sleep,” during which they would feel less pain and, in theory, forget any pain they did experience. But scopolamine disoriented them. On the ward, Swenson heard women screaming and saw them thrashing in their beds after the injection. Some tried to climb out, leading nurses to restrain them with cloth ties. They cried and begged for their mothers. The morphine made them groggy. Their contractions slowed, making it harder for them to push. Doctors used forceps to extract the babies.

Since high school, when she read an excerpt from Grantly Dick-Read’s “Childbirth Without Fear” in her parents’ Reader’s Digest, Swenson had been intrigued by what is known as natural childbirth. Now, a decade later on the maternity ward, medical residents gathered to watch the woman who refused painkillers. She put one foot on a doctor’s shoulder and her other foot on another’s. Nurses held her hands. And she pushed and pushed. 

In the years after her daughter, Sarah, was born, Swenson became a childbirth activist and the president of the Boston Association for Childbirth Education. One day in 1971, a friend handed her a booklet. Originally called “Women and Their Bodies,” that year the authors renamed it “Our Bodies, Ourselves.” More than 130 pages, printed on newsprint and stapled together, the publication was largely circulated underground.

As she first read the pages, Swenson was shocked — and then awakened. The book was an explicit guide to taboos: abortion, which was illegal in most states; venereal diseases; sexuality. The images were detailed, the writing frank. One chapter highlighted women’s stories about masturbation and descriptions of the clitoris. “Hardly anybody knew there was such a thing,” Swenson later said.

She soon joined the roughly one dozen women, known as the Boston Women’s Health Book Collective, who had written and edited “Our Bodies, Ourselves,” or “OBOS,” as they called it. At 39, with a 13-year-old daughter, Swenson was about a decade older than most of the other women. To the group, she brought her vast knowledge about childbirth and the medical system.

In turn, the women and the work radicalized her. As a childbirth activist, she was in her “pre-feminist consciousness,” as she put it, charming doctors to make them allies and focusing only on childbirth. The women in the collective took a different approach, writing critically about systemic sexism in the medical establishment in chapters like “Women, Medicine and Capitalism,” “Some Myths About Women” and “Medical Institutions.”

They worked collaboratively, usually meeting weekly in living rooms in Cambridge, Somerville, Newton and Brookline, where they debated whether they should take grants from medical groups that, Swenson worried, would influence their writings and whether to leave their small left-wing publisher for a mainstream one. (Simon & Schuster began publishing “OBOS” in 1973; the women used their royalties to give grants to small women’s-health organizations.) Throughout, Swenson contributed and helped write sections on pregnancy, childbirth, postpartum care and, later, menopause and aging.

In the 1980s, the book grew to more than 600 pages (it would eventually balloon to 900-plus) and became a fixture in women’s bedrooms, given as gifts among friends and sisters, handed out to first-year college students, used in medical schools, distributed to farmworkers in Spanish. It eventually sold more than four million copies and was translated and adapted into 34 languages.

Swenson, meanwhile, was the first coordinator of international programs for the collective and traveled the world — the Philippines, India, Brazil, Mexico, Uganda — to help women adapt “OBOS” for their own countries and organize for abortion rights, birth control and the rights of mothers.

At home, she also helped build a new generation of activists, including through a class she taught at Harvard University until she was in her early 80s called Women, Health and Development. Many of her international students planned to return home after graduation to work in women’s health. Swenson pushed them to hone their research and public-speaking skills so they could testify in front of governments and talk to NGOs and the press. They practiced and performed for Swenson, who wrote up single-spaced, multipage critiques. She knew her students would have to be meticulous, factual and relentless. She also knew firsthand that it was entirely possible — if frustratingly slow and filled with setbacks — to shift the course of women’s health in ways no one could have imagined.

Maggie Jones is a contributing writer for the magazine and teaches writing at the University of Pittsburgh.

D’Wayne Wiggins in 2022. Bryon Malik
b. 1961

D’Wayne Wiggins

With Tony! Toni! Toné!, his songs captured the mood of how people lived and died in Oakland.

When his family moved from West to East Oakland, Calif., young D’Wayne Wiggins had to make new friends amid a tsunami of disinvestment and white flight. Determined to find community, he corralled stray dogs into a deserted house and soon had a crew of pals feeding the pack scraps from their parents’ tables. “The way D’Wayne talked about roaming Oakland as a kid,” his wife, Dori Caminong Wiggins, told me, “it was like ‘Stand by Me’ and ‘The Wonder Years.’”

Wiggins often began his days with grits and eggs served by the Black Panthers’ Free Breakfast for Children program. When he walked or biked to school, he often heard a snatch of the Pointer Sisters or Jimi Hendrix blasting from a car window. “Oakland was different then,” Wiggins once said. “It was all community.” Wiggins, at the beginning of his journey as a member of what turned out to be the last great American R&B band, believed in connection, cooperation and what so often emerged from both: fellowship, mercy, song.

Wiggins was a product of an Oakland that still glowed with the communal energies of Black Power activism. His talents were nurtured by a mix of neighborhood creativity and now long-defunct public school music programs. As a teenager, Wiggins fronted a handful of local bands, and his skills were polished in outfits like the backing band for Castlemont High’s famed choral group the Castleers (along with his half brother Charles Ray Wiggins, now known as Raphael Saadiq). He eventually toured with the gospel maverick Tramaine Hawkins, while Raphael and their cousin Timothy toured with Sheila E. on Prince’s 1986 tour. Later that year, when the three were home from their journeys, they founded Tony! Toni! Toné! — the band that became the main professional project of D’Wayne Wiggins’ life. 

In the late 1980s, Oakland was a mosh pit of musical scenes. MC Hammer and Too Short led rap, and En Vogue reclaimed the girl group, but the Tonys (as they came to be known) both worked from and re-engineered the traditions of gospel, soul and funk. In fact, the Tonys pulled so hard from gospel that their dirgelike 1988 single “Little Walter” repurposed “Wade in the Water.” It was a hit. A cautionary tale about the quick money to be made during the crack era, it reflected, without chastisement, the mood of how so many people lived and died in Oakland.

By 1990, after the destructive Loma Prieta earthquake and the murder of the Black Panthers co-founder Huey P. Newton, the Tonys released “Feels Good,” a song so celebratory, wry and wicked with funk that it felt like all of the backyard barbecues and wedding receptions it would ignite. The band itself — Make you feel good / Like Tony! Toni! Toné!, the Notorious B.I.G. rapped five years later — became synonymous with joy.

The fraternal rapport that made the band so distinctive began to fade after its 1996 album, “House of Music,” and the trio went their separate ways. In West Oakland, his original stomping grounds, Wiggins opened House of Music, a recording studio and artistic refuge. In the ’90s, he helped Keyshia Cole develop her sound and signed Destiny’s Child to his Grassroots Entertainment label, producing tracks on their debut. He went on to work with Alicia Keys, with whom he won a Grammy, and the multi-instrumentalist H.E.R. Outside of music, Wiggins opened Jahva House, a cafe near Oakland’s Lake Merritt, with Michelle Lochin, to whom he was married for 13 years. He eventually joined a successful reunion tour in 2023 with his half brother and cousin.

Wiggins last performed at a hotel ballroom in Oakland with a band he had created post-Tonys. There was an oxygen tank just out of view and two doctors in attendance. A week later, D’Wayne Patrice Wiggins — husband, father and top-tier charmer in a town that creates them by the thousands — succumbed to what had been a private war with bladder cancer.

There were 17 songs on D’Wayne’s last playlist, his final demonstration of taste and act of curation. What he created for comfort as he lay dying was a mixtape of artists like Grace Jones, Earth, Wind & Fire and Elton John, featuring their very best collaborative works. D’Wayne found solace in the band dynamic that had brought him so much joy — especially with Raphael and Timothy. D’Wayne, in his last moments, wanted us to feel good. According to Dori Wiggins, the song that her husband was listening to as he left his electric life was written by D’Wayne and his cousin Tim. It’s from Tony! Toni! Toné’s last album, and on it D’Wayne sings, When I pass away / Party — don’t cry.

Danyel Smith is a contributing writer at the magazine. She is the author of “Shine Bright: A Very Personal History of Black Women in Pop,” and the creator and host of the podcast “Black Girl Songbook.”

Terence Stamp in 1983. Clive Arrowsmith/Camera Press​, via Redux
b. 1938

Terence Stamp

He left acting for an ashram — and returned to the screen transformed.

In 1969, after filming “The Mind of Mr. Soames” — his 10th film in a decade during which he had been, according to one critic, “the seductive dark prince of British cinema” — Terence Stamp, then 31, came to the realization that his career was over. A certain enigmatic quality, coupled with extraordinary beauty, had made him the epitome of cool in his era. (His highly publicized affair with the British ur-model Jean Shrimpton didn’t hurt.) He had worked with major directors, including William Wyler and Federico Fellini. But suddenly the film offers had dried up. Looking back at that moment 45 years later, he told Charlie Rose, “I was so identified with the ’60s, but when the ’60s ended, I kind of ended with it.”

His reaction to perceived rejection was abrupt — and decisive. As he described it in his 2017 memoir, “The Ocean Fell Into the Drop”: “Carefully packed my weekend Gucci holdall and set off. First stop: Delhi.”

Lots of celebrities at the time were making that pilgrimage, but Stamp’s sojourn turned into something else entirely: an eight-year retreat, prompted as much by a life-changing lunch with the Indian sage Krishnamurti as by his career decline. What Krishnamurti tapped into was Stamp’s loss of contentment with “the outer trappings of success.” In India, Stamp found himself drawn to “the phenomenon of breath,” and became obsessed with studying it. His immersion went deep enough that he eventually took on a new name, Swami Deva Veeten, and moved into an ashram. 

But as serious as his Sufi practice became, the ambitious East End boy was always convinced that “the call” back to the movies would come, and he wanted to be prepared. Studying Sufi breathing, he seemed never to have forgotten the practical advice Laurence Olivier gave him at the start of his career: “Always work on your voice, Terence, as when your looks fade, the voice continues to be empowered.”

The call finally did come, in 1978, in the form of a telegram from his agent, inviting him to London to appear as General Zod in “Superman” and “Superman II.”

One of the legends that Stamp liked to spread about himself had him boarding a plane, still in his ashram clothes, and being approached on the set by his perplexed co-star Marlon Brando. He must have known how hard that story would be to accept, coming from a man who included the “Gucci holdall” in his quest narrative, and who insisted that his conversations with Krishnamurti “invariably began with a current discussion of the price of bespoke shoes.” He amended the story to one in which he met Brando, his boyhood idol, accompanied by two young women, whom Brando immediately offered to share.

Whatever the truth of Stamp’s arrival on set, his performance as General Zod led to a remarkable second act. He had become a different actor — no trace of the Cockney was left, his voice having taken on an impressive authority. (All that breath work had paid off.) His time away had “transformed” him, making it easier to accept supporting-actor status.

It was inevitable, though, that he would get back to playing leads. His role as a trans woman in “The Adventures of Priscilla, Queen of the Desert” came first, in 1994, but it was Steven Soderbergh’s “The Limey,” in 1999, that amounted to Stamp’s true restoration.

In that film he plays a man named Wilson, recently released from a British prison, headed to Los Angeles to avenge the murder of his daughter. Gone is the Etonian grace that Stamp’s voice had affected in all those ’80s and ’90s supporting roles. “Wilson would speak as my own father had,” he wrote in his memoir. The entire performance can be seen as a homage to that father, a coal stoker on merchant ships in World War II, a “silent” man who never offered his son a word of encouragement, but whom Stamp came to admire for having supported five children on 12 pounds a week. Harking back by way of his father to his own rough beginnings — embracing the self-described “East End spiv” he had worked to distance himself from — seemed to free him as an actor. For long passages, the film becomes a study of Stamp’s aging face in troubled repose. Of all his roles, it’s the one where he seems most completely there.

“The Limey” did not, alas, lead to a string of great leading roles. But perhaps it offered Stamp the opportunity for a deeper reckoning with an early career that had been cut short.

The wonderful conceit of Soderbergh’s film was to use, as back story for the aging Wilson, clips from Stamp’s performance in 1967’s “Poor Cow,” his loosest, most appealing early role. When an interviewer asked him, after a screening of the film, where the guy who played the young Wilson was, Stamp caught himself “feeling nostalgic for my own 25-year-old physique.” But then he responded: “He had his 15 minutes of fame. I haven’t heard from him lately.”

Anthony Giardina’s most recent novel, “Remember This,” will be published in paperback in March.