Haunted by His Brother, He Revolutionized Physics

To John Archibald Wheeler, the race to explain time was personal.

Nautilus

  • Amanda Gefter

Illustration by Wesley Allsbrook.

The postcard contained only two words: “Hurry up.”

John Archibald Wheeler, a 33-year-old physicist, was in Hanford, Wash., working on the nuclear reactor that was feeding plutonium to Los Alamos, when he received the postcard from his younger brother, Joe. It was late summer, 1944. Joe was fighting on the front lines of World War II in Italy. He had a good idea what his older brother was up to. He knew that five years earlier, Wheeler had sat down with Danish scientist Niels Bohr and worked out the physics of nuclear fission, showing that unstable isotopes of elements like uranium or soon-to-be-discovered plutonium would, when bombarded with neutrons, split down the seams, releasing unimaginable stores of atomic energy. Enough to flatten a city. Enough to end a war.

After the postcard’s arrival, Wheeler worked as quickly as he could, and the Manhattan Project completed its construction of the atomic bomb the following summer. Over the Jornada del Muerto Desert in New Mexico, physicists detonated the first nuclear explosion in human history, turning 1,000 feet of desert sand to glass. J. Robert Oppenheimer, the project’s director, watched from the safety of a base camp 10 miles away and silently quoted Hindu scripture from the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.” In Hanford, Wheeler was thinking something different: I hope I’m not too late. He didn’t know that on a hillside near Florence, lying in a foxhole, Joe was already dead.

When Wheeler learned the news, he was devastated. He blamed himself. “One cannot escape the conclusion that an atomic bomb program started a year earlier and concluded a year sooner would have spared 15 million lives, my brother Joe’s among them,” he wrote in his memoir. “I could—probably—have influenced the decision makers if I had tried.”

Time. As a physicist, Wheeler had always been curious to untangle the nature of that mysterious dimension. But now, in the wake of Joe’s death, it was personal.

Wheeler would spend the rest of his life struggling against time. His journals, which he always kept at hand (and which today are stashed, unpublished, in the archives of the American Philosophical Society Library in Philadelphia), reveal a stunning portrait of an obsessed thinker, ever-aware of his looming mortality, caught in a race against time to answer not a question, but the question: “How come existence?”

“Of all obstacles to a thoroughly penetrating account of existence, none looms up more dismayingly than ‘time,’” Wheeler wrote. “Explain time? Not without explaining existence. Explain existence? Not without explaining time.”

As the years raced on, Wheeler’s journal entries about time grew more frequent and urgent, their lines shakier. In one entry, he quoted the Danish scientist and poet Piet Hein:

 “I’d like to know
what this whole show
is all about
before it’s out.”

Before his curtain came down, Wheeler changed our understanding of time more radically than any thinker before him or since—a change driven by the memory of his brother, a revolution fueled by regret.

***

In 1905, six years before Wheeler was born, Einstein formulated his theory of special relativity. He discovered that time does not flow at a steady pace everywhere for everyone; instead, it’s relative to the motion of an observer. The faster you go, the slower time goes. If you could go as fast as light, you’d see time come to a halt and disappear.
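
The standard formula of special relativity makes this precise: if Δt is the time between two ticks of a clock in its own rest frame, an observer who sees the clock move past at speed v measures the longer interval

$$\Delta t' = \frac{\Delta t}{\sqrt{1 - v^{2}/c^{2}}}.$$

As v approaches the speed of light c, the denominator approaches zero and the moving clock, seen from outside, stops ticking altogether.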

But in the years following Einstein’s discovery, the formulation of quantum mechanics led physicists to the opposite conclusion about time. Quantum systems are described by mathematical waves called wavefunctions, which encode the probabilities for finding the system in any given state upon measurement. But the wavefunction isn’t static. It changes. It evolves in time. Time, in other words, is defined outside the quantum system, an external clock that ticks away second after absolute second, in direct defiance of Einstein.

That’s where things stood—the two theories in a stalemate, the nature of time up in the air—when Wheeler first came onto the physics scene in the 1930s. As he settled into an academic career at Princeton University, Wheeler was soft-spoken and impossibly polite, donning neatly pressed suits and ties. But behind his conservative demeanor lay a fearlessly radical mind. Raised by a family of librarians, Wheeler was a voracious reader. As he struggled with thorny problems in general relativity and quantum mechanics, he consulted not only Einstein and Bohr but the novels of Henry James and the poetry of Spanish writer Antonio Machado. He lugged a thesaurus in his suitcase when he travelled.

Wheeler’s first inkling that time wasn’t quite what it seemed came one night in the spring of 1940 at Princeton. He was thinking about positrons. Positrons are the antiparticle alter egos of electrons: same mass, same spin, opposite charge. But why should such alter egos exist at all? When the idea struck, Wheeler called his student Richard Feynman and announced, “They are all the same particle!”

Imagine there’s only one lone electron in the whole universe, Wheeler said, winding its way through space and time, tracing paths so convoluted that this single particle takes on the illusion of countless particles, including positrons. A positron, Wheeler declared, is just an electron moving backwards in time. (A good-natured Feynman, in his acceptance speech for the 1965 Nobel Prize in Physics, said he stole that idea from Wheeler.)

The puzzle of existence: “I am not ‘I’ unless I continue to hammer at that nut,” wrote John Archibald Wheeler. Photo from Corbis Images.

After working on the Manhattan Project in the 1940s, Wheeler was eager to get back to Princeton and theoretical physics. Yet his return was delayed. In 1950, still haunted by his failure to act quickly enough to save his brother, he joined physicist Edward Teller in Los Alamos to build a weapon even deadlier than the atomic bomb—the hydrogen bomb. On November 1, 1952, Wheeler was on board the S.S. Curtis, about 35 miles from the island of Elugelab in the Pacific. He watched the U.S. detonate an H-bomb with 700 times the energy of the bomb that destroyed Hiroshima. When the test was over, so was the island of Elugelab.

With his work at Los Alamos complete, Wheeler “fell in love with general relativity and gravitation.” Back at Princeton, just down the street from Einstein’s home, he stood at a chalkboard and gave the first course ever taught on the subject. General relativity described how mass could warp spacetime into strange geometries that we call gravity. Wheeler wanted to know just how strange those geometries could get. As he pushed the theory to its limits, he became fascinated by an object that seemed to turn time on its head. It was called an Einstein-Rosen bridge, and it was a kind of tunnel that carves out a cosmic shortcut, connecting distant points in spacetime so that by entering one end and emerging from the other, one could travel faster than light or backward in time. Wheeler, who loved language, knew that one could breathe life into obscure convolutions of mathematics by giving them names; in 1957, he gave this warped bit of reality a name: wormhole.

As he pushed further through spacetime, he came upon another gravitational anomaly, a place where mass is so densely packed that gravity grows infinitely strong and spacetime infinitely mangled. This, too, he gave a name: black hole. It was a place where “time” lost all meaning, as if it never existed in the first place. “Every black hole brings an end to time,” Wheeler wrote.

***

In the 1960s, as the Vietnam War tore the fabric of American culture, Wheeler struggled to mend a rift in physics between general relativity and quantum mechanics—a rift called time. One day in 1965, while waiting out a layover in North Carolina, Wheeler asked his colleague Bryce DeWitt to keep him company for a few hours at the airport. In the terminal, Wheeler and DeWitt wrote down an equation for a wavefunction, which Wheeler called the Einstein-Schrödinger equation, and which everyone else later called the Wheeler-DeWitt equation. (DeWitt eventually called it “that damned equation.”)

Instead of a wavefunction describing some system of particles moving around in a lab, Wheeler and DeWitt’s wavefunction described the whole universe. The only problem was where to put the clock. They couldn’t put it outside the universe, because the universe, by definition, has no outside. So while their equation successfully combined the best of both relativity and quantum theory, it also described a universe that couldn’t evolve—a frozen universe, stuck in a single, eternal instant.
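
In equations, the contrast is stark. An ordinary quantum system obeys the time-dependent Schrödinger equation, with an external clock parameter t on the left-hand side; the Wheeler-DeWitt equation for the wavefunction of the whole universe has no such parameter at all:

$$i\hbar\,\frac{\partial \psi}{\partial t} = \hat{H}\,\psi \qquad \text{versus} \qquad \hat{\mathcal{H}}\,\Psi[\text{geometry of space}] = 0.$$

With nothing playing the role of t, the universal wavefunction simply sits there, which is the frozen universe described above.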

Wheeler’s work on wormholes had already shown him that, like electrons and positrons, we too might be capable of bending and breaking time’s arrow. Meanwhile, his work on the physics of black holes had led him to suspect that time, deep down, does not exist. Now, at the Raleigh-Durham airport, that damned equation left Wheeler with a nagging hunch that time couldn’t be a fundamental ingredient of reality. It had to be, as Einstein said, a stubbornly persistent illusion, a result of the fact that we are stuck inside a universe that only has an inside.

Wheeler was convinced the central clue to the puzzle of existence—and in turn of time—was quantum measurement. He saw that the profound strangeness of quantum theory lies in the fact that when an observer makes a measurement, he doesn’t measure something that already exists in the world. Instead, his measurement somehow brings that very thing into existence—a bizarre fact that no one in his right mind would have bought, except that it had been proven again and again with a mind-melting experiment known as the double-slit. It was an experiment that Wheeler could not get out of his head.

In the experiment, single photons are shot from a laser at a screen with two tiny parallel slits, then land on a photographic plate on the other side, where they leave a dot of light. Each photon has a 50/50 chance of passing through either slit, so after many rounds of this, you’d expect to see two big blobs of light on the plate, one showing the pile of photons that passed through slit A and the other showing the pile that passed through slit B. You don’t. Instead you see a series of bright and dark stripes—an interference pattern. “Watching this actual experiment in progress makes vivid the quantum behavior,” Wheeler wrote. “Simple though it is in concept, it strikingly brings out the mind-bending strangeness of quantum theory.”

As impossible as it sounds, the interference pattern can only mean one thing: each photon went through both slits simultaneously. As the photon heads toward the screen, it is described by a quantum wavefunction. At the screen, the wavefunction splits in two. One version of the photon travels through each slit, and when the two emerge on the other side, their wavefunctions recombine—now with a relative phase that varies across the plate. Where the waves align, the light is amplified, producing stripes of bright light on the plate. Where they are out of sync, the light cancels itself out, leaving stripes of darkness.
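
A minimal numerical sketch makes the bookkeeping concrete (the wavelength, slit spacing, and screen distance below are arbitrary illustrative values, not figures from any actual experiment): when both slits are open, the two paths’ complex amplitudes add before squaring, and the cross term between them produces the stripes; when a which-path measurement is imagined at the slits, plain probabilities add and the cross term vanishes.

```python
import numpy as np

# Illustrative two-slit bookkeeping; all physical values below are assumptions
# made for the sake of the sketch, not parameters of a real experiment.
wavelength = 500e-9        # photon wavelength: 500 nm
slit_separation = 50e-6    # distance between the two slits: 50 micrometres
screen_distance = 1.0      # distance from the slits to the photographic plate: 1 m
x = np.linspace(-0.05, 0.05, 2001)   # positions along the plate, in metres

k = 2 * np.pi / wavelength
# Path length from each slit to each point x on the plate.
r1 = np.hypot(screen_distance, x - slit_separation / 2)
r2 = np.hypot(screen_distance, x + slit_separation / 2)

# No which-path detector: the two amplitudes add, and their cross term
# gives an intensity that oscillates between bright and dark stripes.
amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)
with_interference = np.abs(amplitude) ** 2

# Which-path detector at the slits: the photon takes one definite path,
# so probabilities add and the cross term (the stripes) disappears.
without_interference = np.abs(np.exp(1j * k * r1)) ** 2 + np.abs(np.exp(1j * k * r2)) ** 2

print("stripe contrast, both paths open: ", with_interference.max() - with_interference.min())
print("stripe contrast, which-path known:", without_interference.max() - without_interference.min())
```

In this idealized sketch the which-path case comes out completely flat; the two broad blobs seen in the real experiment reflect the finite width of each slit, which the sketch ignores.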

Things get even stranger, however, when you try to catch the photons passing through the slits. Place a detector at each slit and run the experiment again, photon after photon. Dot by dot, a pattern begins to emerge. It’s not the stripes. There are two big blobs on the plate, one opposite each slit. Each photon took only one path at a time. As if it knows it’s being watched.

Photons, of course, don’t know anything. But by choosing which property of a system to measure, we determine the state of the system. If we don’t ask which path the photon takes, it takes both. Our asking creates the path.

Could the same idea be scaled up, Wheeler wondered? Could our asking about the origin of existence, about the Big Bang and 13.8 billion years of cosmic history, could that create the universe? “Quantum principle as tiny tip of giant iceberg, as umbilicus of the world,” Wheeler scrawled in his journal on June 27, 1974. “Past present and future tied more intimately than one realizes.”

In his journal, Wheeler drew a picture of a capital-U for “universe,” with a giant eye perched atop the left-hand peak, staring across the letter’s abyss to the tip of the right-hand side: the origin of time. As you follow the swoop of the U from right to left, time marches forward and the universe grows. Stars form and then die, spewing their carbon ashes into the emptiness of space. In a corner of the sky, some carbon lands on a rocky planet, merges into some primordial goo, grows, evolves until … an eye! The universe has created an observer and now, in an act of quantum measurement, the observer looks back and creates the universe. Wheeler scribbled a caption beneath the drawing: “The universe as a self-excited system.”

The problem with the picture, Wheeler knew, was that it conflicted with our most basic understanding of time. It was one thing for electrons to zip backward through time, or for wormholes to skirt time’s arrow. It was something else entirely to talk about creation and causation. The past flows to the present and then the present turns around and causes the past?

“Have to come through to a resolution of these issues, whatever the cost,” Wheeler wrote in his journal. “Nowhere more than here can I try to live up to my responsibilities to mankind living and dead, to [his wife] Janette and my children and grandchildren; to the child that might have been but was not; to Joe…” He glued into the journal a newspaper clipping from The Daily Telegraph. The headline read: “Days are Getting Shorter.”

***

In 1979, Wheeler gave a lecture at the University of Maryland in which he proposed a bold new thought experiment, one that would become the most dramatic application of his ideas about time: the delayed choice.

Wheeler had realized that it would be possible to arrange the usual double slit experiment in such a way that the observer can decide whether he wants to see stripes or blobs—that is, he can create a bit of reality—after the photon has already passed through the screen. At the last possible second, he can choose to remove the photographic plate, revealing two small telescopes: one pointed at the left slit, the other at the right. The telescopes can tell which slit the photon has passed through. But if the observer leaves the plate in place, the interference pattern forms. The observer’s delayed choice determines whether the photon has taken one path or two after it has presumably already done one or the other.

For Wheeler, this wasn’t a mere curiosity. This was a clue to the universe’s existence. It was the mechanism he needed to get his U-drawing to work, a bending of the rules of time that might allow the universe—one that was born in a Big Bang 13.8 billion years ago—to be created right now. By us.

To see the point, Wheeler said, just take the delayed choice experiment and scale it up. Imagine light traveling toward Earth from a quasar a billion light years away. A massive galaxy sits between the quasar and the Earth, diverting the light’s path with its gravitational field like a lens. The light bends around the galaxy, skirting either left or right with equal probability and, for the sake of the thought experiment, arrives on Earth a single photon at a time. Again we are faced with a similar choice: We can center a photographic plate at the light’s arrival spot, where an interference pattern will gradually emerge, or we can point our telescope to the left or right of the galaxy to see which path the light took. Our choice determines which of two mutually exclusive histories the photon lived. We determine its route (or routes) start to finish, right now—despite the fact that it began its journey a billion years ago.

Listening intently in the audience was a physicist named Carroll Alley. Alley had known Wheeler in Princeton, where he had studied under the physicist Robert Henry Dicke, whose research group had come up with the idea of putting mirrors on the moon.

Dicke and his team were interested in studying general relativity by looking at subtle gravitational interactions between the moon and the Earth, which would require exquisitely accurate measurements of the distance to the moon as it swept along its orbit. They realized that if they could put mirrors on the lunar surface, they could bounce lasers off them and time how long it took the light to return. Alley became the principal investigator of the NASA project and got three mirrors on the moon; the first one was set down in 1969 by Neil Armstrong.

Now, as Alley listened to Wheeler speak, it dawned on him that he might be able to use the same techniques he had used for measuring laser light bouncing off the moon to realize Wheeler’s vision in the lab. The light signals returning from the mirrors on the moon had been so weak that Alley and his team had developed sophisticated ways to measure single photons, which was exactly what Wheeler’s delayed choice setup required.

In 1984, Alley—along with Oleg Jakubowicz and William Wickes, both of whom had also been in the audience that day—finally got the experiment to run. It worked just as Wheeler had imagined: measurements made in the present can create the past. Time as we once knew it does not exist; past does not come indelibly before future. History, Wheeler discovered—the kind that brews guilt, the kind that lies dormant in foxholes—is never set in stone.

Later that year, he wrote, “How come existence? How come the quantum? Is death the penalty for raising such a question?”

Still, some fundamental insight eluded Wheeler. He knew that quantum measurement allowed observers in the present to create the past, the universe hoisting itself into existence by its bootstraps. But how did quantum measurement do it? And if time was not a primordial category, why was it so relentless? Wheeler’s journals became a postcard of their own, written again and again to himself. Hurry up. The puzzle of existence taunted him. “I am not ‘I’ unless I continue to hammer at that nut,” he wrote. “Stop and I become a shrunken old man. Continue and I have a gleam in my eye.”

In 1988, Wheeler’s health was wavering; he had already undergone cardiac surgery two years before. Now, his doctors gave him an expiration date. They told him he could expect to live for another three to five years. Under the threat of his own mortality, Wheeler grew despondent, worried that he would not solve the mystery of existence in time to even the score for what he saw as his personal failure to save his brother. Under the heading “Apology,” he wrote in his journal, “It will take years of work to develop these ideas. I—76—don’t have them.”

Luckily, like scientists before them, the doctors had gotten the nature of time all wrong. The gleam in Wheeler’s eye continued to shine, and he hammered away at the mystery of quantum mechanics and the strange loops of time. “Behind the glory of the quantum—shame,” he wrote on June 11, 1999. “Why shame? Because we still don’t understand how come the quantum. Quantum as signal of self-created universe?” Later that year, he wrote, “How come existence? How come the quantum? Is death the penalty for raising such a question—”

Although Wheeler’s journals reveal a driven man on a lonely quest, his influence was widespread. In his last years, Stephen Hawking, along with his collaborator Thomas Hertog of the Institute for Theoretical Physics at KU Leuven in Belgium, developed an approach known as top-down cosmology, a direct descendant of Wheeler’s delayed choice. Just as photons from a distant quasar take multiple paths simultaneously when no one’s looking, the universe, Hawking and Hertog argued, has multiple histories. And just as observers can make measurements that determine a photon’s history stretching back billions of years, the history of the universe only becomes reality when an observer makes a measurement. By applying the laws of quantum mechanics to the universe as a whole, Hawking carried the torch that Wheeler lit that day back at the North Carolina airport, challenging every intuition we have about time in the process. The top-down approach “leads to a profoundly different view of cosmology,” Hawking wrote, “and the relation between cause and effect.” It’s exactly what Wheeler had been driving at when he drew the eye atop his self-creating universe.

In 2003, Wheeler was still chasing the meaning of existence. “I am as far as can be imagined from being able to speak so reasonably about ‘How come existence’!” he wrote in his journal. “Not much time left to find out!”

On April 13, 2008, in Hightstown, N.J., at the age of 96, John Archibald Wheeler finally lost his race against time. That stubbornly persistent illusion.

Amanda Gefter is a physics writer and author of Trespassing on Einstein’s Lawn: A father, a daughter, the meaning of nothing and the beginning of everything. She lives in Cambridge, Massachusetts.

This article was originally published on November 8, 2018, by Nautilus, and is republished here with permission.

The Prophecies of Jane Jacobs

She is renowned for championing urban diversity, but her real prescience lay in her fears about the fragility of democracy.

The Atlantic

  • Nathaniel Rich

Photo by Josh Cochran.

The year she turned 18, Jane Butzner traveled from her hometown of Scranton, Pennsylvania, to the Appalachian hamlet of Higgins, North Carolina, where she encountered a mystery that haunted her for the rest of her life. It was 1934, the midpoint of the Great Depression, a difficult time to hold a job, even an unpaid one. Butzner—later Jacobs—had been laid off from The Scranton Republican after almost a year working without pay as a cub reporter. At her parents’ suggestion, she went to live in the mountains with her aunt Martha Robison, a Presbyterian missionary. Robison had come to Higgins 12 years earlier on a mission and was so staggered by its poverty that she refused to leave. There were no paved roads, school was rarely in session, the illiterate preacher believed the world was flat, and commerce was conducted by barter. Robison built a church and a community center, adopted children, and established classes in pottery, weaving, and woodwork. Nevertheless, the townspeople continued to live a primitive existence in which, as Robison’s niece later said, “the snapping of a pitchfork or the rusting of a plow posed a serious financial crisis.”

Jane Jacobs wrote about Higgins in Cities and the Wealth of Nations (1984) and Dark Age Ahead (2004), but its negative example looms over her entire body of work. Higgins had not always been backwards. In the early 1700s, as Jacobs noted in Cities and the Wealth of Nations, its founders, three English brothers named Higgins and their families, possessed a wide range of knowledge and skills:

spinning and weaving, loom construction, cabinetmaking, corn milling, house and water-mill construction, dairying, poultry and hog raising, gardening, whiskey distilling, hound breeding, molasses making from sorghum cane, basket weaving, biscuit baking, music making with violins …

By the 1920s, the brothers’ descendants had lost nearly all of these, apart from making molasses, which was sold in the county seat, Burnsville, 12 miles away. Most residents had never traveled that far, however, because the only way to get there was by mule on a rough mountain track. Candles were a vanishing luxury. After the few remaining cows died, there would be no more milk or butter. One woman still remembered how to weave baskets, but she was close to death. When Robison suggested building the church with large stones from the creek, the community elders rebuked her. Over generations the townspeople had not only forgotten how to build with stone. They had lost the knowledge that such a thing was possible.

How had Higgins fallen so low? Mountain isolation contributed, but it was not the only factor. The same fate, after all, had befallen much larger cities, and even empires—Rome, the Olmecs, the New Kingdom of Egypt, and perhaps other civilizations, like the people who painted the Lascaux caves, for which we don’t even have names. “Suppose, hypothetically, that the world were to behave like a single sluggish empire in decline,” wrote Jacobs.

Such a thing could happen if cities in too many places stagnated simultaneously or in quick succession. Or it could happen if the world were to become, in fact, one single sluggish empire … We all have our nightmares about the future of economic life; that one is mine.

In the centenary year of her birth, Jacobs has been remembered as our Solon of cities: a shrewd theorist who revealed how cities work, why they thrive, and why they fail. Jacobs lived to the age of 89, long enough to see her renegade theories become conventional wisdom. No one questions anymore that lively neighborhoods require diversity of use and function, that more roads lead to more cars, that historic buildings should be preserved, that investment in public transportation reduces traffic and promotes neighborhood activity, that “flexible and gradual change” is almost always preferable to “cataclysmic,” broad-stroke redevelopment.

Urban life was Jacobs’s great subject. But her great theme was the fragility of democracy—how difficult it is to maintain, how easily it can crumble. A city offered the perfect laboratory in which to study democracy’s intricate, interconnected gears and ballistics. “When we deal with cities,” she wrote in The Death and Life of Great American Cities (1961), “we are dealing with life at its most complex and intense.” When cities succeed, they represent the purest manifestation of democratic ideals: “Cities have the capability of providing something for everybody, only because, and only when, they are created by everybody.” When cities fail, they fail for the same reasons democracies fail: corruption, tyranny, homogenization, overspecialization, cultural drift and atrophy.

In a year when American democracy has courted despotism, Jacobs’s work offers a warning and a challenge. Her goal was never merely to enlighten urban planners. In her work she argued, with increasing urgency, that the distance between New York City and Higgins is not as great as it seems. It is not very great at all, and it is shrinking.

***

Four books are united in their determination to undermine the seductive myth that Jacobs, as her biographer Peter L. Laurence puts it, “was primarily a housewife with unusual abilities to observe and defend the domestic surroundings of her Greenwich Village home.” This line was codified in 1962 by The New Yorker’s architecture critic Lewis Mumford in a 30-page review of Death and Life that called the book “a mingling of sense and sentimentality, of mature judgments and schoolgirl howlers.” (If Mumford was responsible for the article’s headline, “Mother Jacobs’ Home Remedies,” he seems to have regretted it. In his 1968 collection, The Urban Prospect, it appears under the moderately less chauvinistic “Home Remedies for Urban Cancer.”) The allegation of amateurism often went unchallenged because most of Jacobs’s considerable body of writing before The Death and Life of Great American Cities had been published without a byline.

University of Pennsylvania Press

Two biographies—Laurence’s Becoming Jane Jacobs, a close, vivid study of Jacobs’s intellectual development, and Robert Kanigel’s broader Eyes on the Street: The Life of Jane Jacobs—as well as an anthology of previously uncollected articles and speeches, Vital Little Plans: The Short Works of Jane Jacobs, and Jane Jacobs: The Last Interview and Other Conversations correct the record. By the time she published her masterpiece, at the age of 45, she had been writing about urban redevelopment for nearly a decade in dozens of lengthy articles for Architectural Forum. Before that she had written about, and in direct service of, American democracy.

After her six-month purgatory in Higgins, Jacobs moved to Brooklyn with the goal of becoming a writer. Within a year, she began writing a succession of Lieblingesque columns for Vogue about New York’s fur, diamond, leather, and flower districts: “All the ingredients of a lavender-and-old-lace story, with a rip-roaring, contrasting background, are in New York’s wholesale flower district.” She signed up for classes at Columbia’s University Extension Program, many of them in economic geography, the interdisciplinary study of economics, history, culture, and the environment. (She would later call herself a “city naturalist.”) In one of these courses, Jacobs likely encountered Henri Pirenne’s Medieval Cities: Their Origins and the Revival of Trade (1925), which explained how cities promoted democratic values, and which she cited frequently throughout her career. But it was a pair of classes in American constitutional law that inspired her first book.

Constitutional Chaff was published by Columbia University Press despite the author’s age (24) and lack of a college degree. It is a compilation of failed proposals from the Constitutional Convention of 1787, such as a third house of Congress and direct election of a Senate that never went out of session. In her introduction, Jacobs argued that the vigorous debate over the text of the Constitution reflected the soul of American democracy as vividly as the ratified document itself did. The losers deserved to be heard, even a century and a half after their arguments had been defeated. It was a sentiment the Founders, at pains to protect the rights of minority factions, would have cheered.

Knopf

After writing about the United States government, Jacobs went to work for it. She spent most of the next decade as a professional propagandist. At the U.S. Office of War Information, which she joined in the fall of 1943, she wrote articles about American history, industry, and politics for placement in the foreign press. Her bureau chief praised “her quick grasp of the propaganda job to be done.” After the war, she was hired by the State Department to join the staff of Amerika, one of the more glorious efforts in auto-mythopoeia that the nation has produced.

The publication, which originated in an agreement between Franklin D. Roosevelt and Joseph Stalin at Yalta to expand cultural diplomacy between the two nations, was designed to resemble Life magazine, with illustrated features about Radio City Music Hall, Benjamin Franklin, Arizona deserts, and the Senate. The circulation was initially limited to 10,000 copies, not nearly enough to satisfy demand; Laurence writes that though the official price in 1946 was 10 rubles (83 cents), it sold on the black market for 1,000. (The Soviets produced a counterpart publication, Soviet Life, but despite its editors’ best efforts—“Leonid I. Brezhnev’s Reminiscences,” “A Guide to the 15 Union Republics,” “Tashkent, Seattle’s Sister City”—it somehow failed to attract a commensurate following in the U.S.) In a Manhattan office building near Columbus Circle, Jacobs wrote articles about American cafeterias, the World Series, and modern art, and modeled maternity clothes for a feature on women’s fashion.

She was sensitive to reader opinion. Kanigel mentions one criticism that may have helped shape her later career. In 1949, V. Kusakov of the Academy of Architecture in the U.S.S.R. complained in a Soviet publication about two articles, uncredited but written by Jacobs, praising the work of Frank Lloyd Wright and other modern architects. Kusakov attacked Amerika for neglecting to cover the more important story: “the ever increasing housing crisis which the cities of America are experiencing.” American capitalism, Kusakov wrote, “dooms the majority of the population to a negative existence and death in ill-smelling cesspools, in slums deprived of air, sunlight, and trees or shrubs.”

The column unsettled Jacobs, who responded with a thorough investigation of life in America’s inner-city neighborhoods. In her article, she proposed slum-clearing and the construction of high-rise apartment towers, remedies she would later excoriate. Kanigel suggests that Jacobs was even then not entirely satisfied with these arguments. “This seemingly narrow question would slip out of its original borders,” he writes, “become something big to chew on, broaden into one of the biggest questions of all: What, really, was the Good Life?” How, in other words, could urban policy promote life, liberty, and the pursuit of happiness?

Random House

For her devoted, thoughtful work in service of American mythos, Jacobs came under federal investigation for suspected ties to the Soviet Union. At the State Department, she’d had the misfortune of serving under Alger Hiss. When she tried to travel to Siberia in 1945 to report a freelance article, she asked Hiss for help securing a visa. As Hiss was already under secret investigation for espionage, the request roused the suspicion of the FBI. Jacobs, Laurence writes, was likely one of the names on Joseph McCarthy’s infamous list of known Communists “working in and shaping policy in the State Department.” J. Edgar Hoover demanded to oversee her investigation himself. During the course of four years, Jacobs was required to sign multiple Oaths of Office, declare that she was not a Communist or a Fascist, and endure a series of interrogations by the State Department’s Loyalty Security Board. At least 13 of her friends, family members, and colleagues were interviewed by FBI agents. One informant said that he believed her to be a Communist sympathizer because she lived in Greenwich Village.

By 1952, she’d had enough. After yet another inquiry—requesting her views on, among other things, the Marshall Plan, the United Public Workers of America, and atomic energy—she sent the board an 8,000-word defense that remains the most powerful declaration of her moral convictions. “I was brought up to believe that there is no virtue in conforming meekly to the dominant opinion of the moment,” she wrote, defending her integrity and shaming her inquisitors.

I was encouraged to believe that simple conformity results in stagnation for a society, and that American progress has been largely owing to the opportunity for experimentation, the leeway given initiative, and to a gusto and a freedom for chewing over odd ideas. I was taught that the American’s right to be a free individual, not at the mercy of the state, was hard-won and that its price was eternal vigilance, that I too would have to be vigilant.

In The Death and Life of Great American Cities, Jacobs sought to translate these principles of individual liberty into urban design.

***

This was not an intuitive process. What ratio of green space to residential acreage was most conducive to individual liberty? Did tenement buildings or high-rise towers create better opportunities for experimentation? What block length, what width of sidewalk, what frequency of stoplights best encouraged the chewing-over of odd ideas?

She pursued this line of questioning at Architectural Forum, the leading architectural publication of its day, after resigning from the State Department in 1952. The magazine was edited by Douglas Haskell, whom Laurence identifies as the crucial figure in Jacobs’s intellectual maturation. Like Jacobs, Haskell lacked an academic pedigree and paired “an anti-utopian streak” with a faith in the power of architecture to bring about social change, for better and for worse. Shortly after her hire, Haskell announced that Forum would intensify its emphasis, already significant, on “the problems of cities.” Was urban renewal—which James Baldwin would later call “Negro removal”—improving the slums of America’s great cities? If not, what else should be done?

Melville House

Jacobs wrote numerous un-bylined articles in favor of theories she would later ridicule. She lionized her future nemeses, the city planners, writing that “the first—the most elementary—lesson for downtown is simply the importance of planning.” She continued to argue in favor of superblocks and for demolishing entire neighborhoods of blighted buildings. In a 24-page feature about shopping centers, she called for downtowns to model themselves after suburban malls. The flaw in her thinking was not purely ideological; she did write critically of the “homogenized simplicity” of new developments and praised planners who made a token effort to preserve older buildings. But poor journalistic habits cost her. She didn’t always travel to see the cities and building projects she wrote about, relying instead on the sketches, photographs, prospectuses, and blueprints sent by architects to the magazine. She violated one of the eventual maxims of Death and Life, that “no other expertise can substitute for locality knowledge in planning.”

A revelation came during a tour of Philadelphia—a tour she may well have taken only after she published a laudatory essay about the city’s redevelopment efforts in 1955. Her guide was Philadelphia’s planning-commission director, Edmund Bacon, “the grand poobah” of American planning, who would later appear on the cover of Time as the face of urban renewal. Bacon took Jacobs on a before-and-after tour of his city. “Before” was represented by a street in a condemned black neighborhood; “after” was a towering new housing project. Before Street, Kanigel writes, “was crowded with people, people spilling out onto the sidewalk, sitting on stoops, running errands, leaning out of windows.” After Street was flat and deserted, with the exception of a lone boy kicking a tire.

“Not only did [Bacon] and the people he directed not know how to make an interesting or a humane street,” Jacobs later said, “but they didn’t even notice such things and didn’t care.” In early 1956 she took a series of tours of East Harlem led by William Kirk, a community activist and the director of the Union Settlement Association. He showed her how the construction of 10 housing projects had destroyed not only the neighborhood’s small businesses but the communities they had sustained. A more personal incitement came from Robert Moses’s long-standing plans to redevelop Jacobs’s own beloved neighborhood of Greenwich Village. Moses proposed extending Fifth Avenue through Washington Square Park in the form of a sunken highway, and razing 26 blocks to clear space for a pair of gargantuan housing projects. When Douglas Haskell asked Jacobs to take his speaking slot at an urban-design conference at Harvard in April 1956, before an audience of the nation’s most powerful planners and critics, she let them have it.

Her 1,500-word speech, a version of which appears in Vital Little Plans, became the basis for The Death and Life of Great American Cities. Her main argument was Kirk’s: Small neighborhood stores, ignored by the planners in their grim demolition derby, were essential social hubs. She added that sidewalks, stoops, laundries, and mailbox areas were also indispensable centers of community activity, and that sterile, vacant outdoor space served nobody. “The least we can do,” she said, “is to respect—in the deepest sense—strips of chaos that have a weird wisdom of their own.”

That “weird wisdom” was the wisdom of crowds: the customs and habits that people in cities, left to their own devices, developed while living in close proximity to one another. The planners had been guided by aesthetic concerns, favoring clean lines, geometric shapes, and vast boulevards that were beautiful so long as they were seen from the window of an airplane. But Americans didn’t need a new utopia. They already had a system that, while messy and imperfect, produced a thriving society.

In Death and Life, Jacobs converted democratic values into design policy. This was no magic trick—it was achieved through close observation. Through better reporting, she became a better theorist. The vitality that planners like Bacon and Moses hoped to create already existed. She had seen it herself, not only in the tenements of East Harlem but in Greenwich Village. The two neighborhoods—one doomed, one celebrated for its bohemian spirit—were more alike than not, just as the blocky brick towers of Stuyvesant Town, the celebrated middle-class development where Jacobs’s sister lived, were as dreary as Harlem’s carceral George Washington Houses.

Reduced to a word, Jacobs’s argument is that a city, or neighborhood, or block, cannot succeed without diversity: diversity of residential and commercial use, racial and socioeconomic diversity, diversity of governing bodies (from local wards to state agencies), diverse modes of transportation, diversity of public and private institutional support, diversity of architectural style. Great numbers of people concentrated in relatively small areas should not be considered a health or safety hazard; they are the foundation of a healthy community. Dense, varied populations are “desirable,” Jacobs wrote,

because they are the source of immense vitality, and because they do represent, in small geographic compass, a great and exuberant richness of differences and possibilities, many of these differences unique and unpredictable and all the more valuable because they are.

***

James Madison couldn’t have put it better, though he tried. He addressed the issue in Federalist No. 10, his effort to answer one of the most vexing problems facing the Framers of the Constitution: how to safeguard their new democracy against insurrection or despotism. Madison argued that as you increase the “variety of parties and interests” contained within a republic, “you make it less probable that a majority of the whole will have a common motive to invade the rights of other citizens.” Jacobs saw that the same principle held in cities. It is not a coincidence that she described city planners, and the businessmen and politicians who enabled them, as tyrants: “neurotic,” “destructive,” and “impossibly arrogant.”

“We need all kinds of diversity,” Jacobs concluded in Death and Life, “so the people of cities can sustain (and further develop) their society and civilization.” In later books, particularly The Economy of Cities (1969) and Cities and the Wealth of Nations (1984), she expanded this point, arguing that the fate of a civilization rested on the vitality of its major cities. In her final book, published in 2004, she applied her analysis to our own civilization. What was the current condition of our great cities? How did America stack up against Rome, Mesopotamia, Babylon? What future could we expect if we continued on our current path? She called the book Dark Age Ahead.

Her Cassandra tale is given greater credibility by the fact that many of her direst predictions have already been realized. Within three years of publication, “the miracle of money growing on houses” was revealed to be a mirage that threatened to take down much of the financial system with it. Gentrification, which Jacobs first warned against in Death and Life, was exacerbated in New York City and elsewhere when local governments failed to set aside sufficient affordable public housing. The Total Information Awareness program, a government data-mining surveillance system that she warned against on the book’s final page, morphed into PRISM, the classified surveillance program exposed by Edward Snowden. These seemingly disparate dangers, Jacobs argued, arose from a common cause: a moral weakening, or drift, accelerated by cultural rot.

In her comparative study of fallen empires, Jacobs identifies common early indicators of decline: “cultural xenophobia,” “self-imposed isolation,” and “a shift from faith in logos, reason, with its future-oriented spirit … to mythos, meaning conservatism that looks backwards to fundamentalist beliefs for guidance and a worldview.” She warns of the profligate use of plausible denial in American politics, the idea that “a presentable image makes substance immaterial,” allowing political campaigns “to construct new reality.” She finds further evidence of our hardening cultural sclerosis in the rise of the prison-industrial complex, the prioritization of credentials over critical thinking in the educational system, low voter turnout, and the reluctance to develop renewable forms of energy in the face of global ecological collapse.

No reader of Jacobs’s work would be surprised by the somewhat recent finding by a Gallup researcher that Donald Trump’s supporters “are disproportionately living in racially and culturally isolated zip codes and commuting zones.” These zones are latter-day incarnations of Higgins: marooned, amnesiac, homogenous, gutted by the diminishment of skills and opportunities. One Higgins is dangerous enough, for both its residents and the republic to which it belongs. But the nation’s Higginses have proliferated to the point that their residents have assumed control of a major political party.

In the foreword to the 1992 Modern Library edition of Death and Life, Jacobs likens cities to natural ecosystems. “Both types of ecosystems,” she writes, “require much diversity to sustain themselves … and because of their complex interdependencies of components, both kinds of ecosystems are vulnerable and fragile, easily disrupted or destroyed.” Dark Age Ahead reminds us how many powerful, technologically advanced cities—and empires—have come before us, only to fade to dust. When they fall, they do not recover. The vanished way of life “slides into an abyss of forgetfulness, almost as decisively as if it had not existed.” Karl Marx, who spent his life studying the subject, observed that history repeats itself, the first time as tragedy, the second time as farce. This topsy-turvy election year makes one wonder whether he might have gotten that backwards. We’ve had farce, that much is certain. What will the next time bring?

Nathaniel Rich is the author of Odds Against Tomorrow and King Zeno.

This article was originally published on November 11, 2016, by The Atlantic, and is republished here with permission.

You Need to Practice Being Your Future Self

Being busy is not the same as being productive. It’s the difference between running on a treadmill and running to a destination. They’re both running, but being busy is running in place.

Harvard Business Review

  • Peter Bregman

I was coaching Sanjay,* a leader in a technology firm who felt stuck and frustrated. He wasn’t where he wanted to be at this point in his career.

He had come to our coaching session, as usual, prepared to discuss the challenges he was currently facing. This time, it was his plan for conducting compensation conversations with each of his employees. After a few minutes of listening to him talk through his plans, I interrupted him.

“Sanjay, you’ve had these kinds of conversations before, right?” I asked.

“Yes,” he said.

“And, for the most part, you know how to do them, right?”

“Yes,” he said again.

“Great. Let’s talk about something else.”

“But this is what’s on my mind right now,” he protested. “It’s helpful to think it through with you.”

“I’m glad it’s helpful, Sanjay,” I said. “But you don’t want me to be merely helpful. You want me to be transformational. And focusing on what’s top of mind for you right now is not going to get us there.”

You see, the reason Sanjay is stuck — and the reason many of us feel that way — is that we focus on what’s present for us at any particular moment.

On the other hand, what most of us want most is to move forward. And, by definition, paying attention to the present keeps us where we are. So, sure, I can help Sanjay be a better “present” Sanjay. But I will have a much greater impact if I help him become a successful “future” Sanjay.

It’s a familiar story: You’re busy all day, working non-stop, multitasking in a misguided attempt to knock a few extra things off your to-do list, and as the day comes to a close, you still haven’t gotten your most important work done.

Being busy is not the same as being productive. It’s the difference between running on a treadmill and running to a destination. They’re both running, but being busy is running in place.

If you want to be productive, the first question you need to ask yourself is: Who do I want to be? Another question is: Where do I want to go? Chances are that the answers to these questions represent growth in some direction. And while you can’t spend all your time pursuing those objectives, you definitely won’t get there if you don’t spend any of your time pursuing them.

If you want to be a writer, you have to spend time writing. If you want to be a sales manager, you can’t just sell — you have to develop your management skill. If you want to start a new company, or launch a new product, or lead a new group, you have to spend time planning and building your skills and experience.

Here’s the key: You need to spend time on the future even when there are more important things to do in the present and even when there is no immediately apparent return to your efforts. In other words — and this is the hard part — if you want to be productive, you need to spend time doing things that feel ridiculously unproductive.

I want to expand my writing abilities, so I have started waking up at 5:30 in the morning to write fiction. Unfortunately — and I am not being humble here — I am a terrible fiction writer. So my writing time feels painfully unproductive. I can’t sell it. I can’t use it. I can’t share it. Honestly, I can hardly bear to read it out loud. I have such a long list of things that actually need to get done, it is almost impossible to justify losing sleep in order to do something so unrelated to my present challenges. I know this is how my clients feel when I ask them to put aside their immediate concerns and focus on more distant challenges.

A question I hear a lot is: What about all the things I actually need to get done? Don’t I need to get through my cluttered email box, my pressing conversations, my project plans in order to create space to focus on my future self?

Nope.

That’s a trick your busy self plays on you to keep you away from the scary stuff you’re not yet good at and that isn’t yet productive. Sometimes you need to be irresponsible with your current challenges in order to make real progress on your future self. You have to let the present just sit there, untended. It’s not going away and will never end. That’s the nature of the present.

You may not end up with an empty email inbox. You may not have the perfect compensation conversations. You may not please everyone. But, as your coach, I’m willing to bet that you will do those things well enough.

It’s the other stuff I worry about. The wildly important stuff that never gets done because there’s not time or it’s not urgent or it’s too hard or risky or terrifying. That’s the stuff I want to help you work on.

Even though Sanjay is delighted at the idea of focusing on his future self, he resists it because it doesn’t feel as good as solving his current challenges. He isn’t as skilled at it yet. That’s why it’s his future.

And that is exactly why he needs to focus on it.

This article was originally published on March 28, 2016, by Harvard Business Review, and is republished here with permission.

Our Attitude Toward Aliens Proves We Still Think We’re Special

Why we downplay Fermi’s paradox.

Nautilus

  • Milan Ćirković

“How many kingdoms know us not!”
—Blaise Pascal, Thoughts (1670)

One summer’s day in 1950, the great Italian-American physicist Enrico Fermi was having lunch with the physicists Edward Teller, Emil Konopinski, and Herbert York at Los Alamos when the conversation turned to a flood of recent UFO sightings all over the United States. There were also, coincidentally, reports of trashcans going missing in New York City at the time. A New Yorker cartoon connected the dots and accused interstellar visitors of the misdeed. In the relaxed atmosphere of that lunchtime conversation, Fermi remarked that the New Yorker’s solution, by proposing a single common cause of two independent empirical phenomena, was in the very best traditions of scientific methodology.

The lunchtime chat stayed on the topic of ET. While they obviously didn’t take seriously the reports of flying saucers, Fermi and his companions began to earnestly discuss things like interstellar—and even superluminal—travel. Then, after some delay—and, one might imagine, in the midst of some tasty dish—Fermi allegedly asked his famous question. Where, indeed, is everybody? Where are the extraterrestrials?

Photo by Yan Wang / Flickr.

The Milky Way galaxy is about 100,000 light-years from edge to edge, Fermi reasoned, which means that a star-faring species would need about 10 million years to traverse it, even if moving at a very modest velocity of 1 percent of the speed of light. Since the galaxy is more than a thousand times older than this, any technological civilization will have had a lot of time in which to expand and colonize the whole galaxy. If one species were to fail in this endeavour, another wouldn’t. Consequently, if intelligent species were out there in any appreciable numbers, they would have been here already. And yet, we do not see them on Earth or in the solar system. For Fermi and many thinkers since, this constituted a paradox.
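
The arithmetic is easy to check (the figure of roughly 10 billion years for the galaxy’s age below is a gloss on “more than a thousand times older”):

$$t_{\text{cross}} \approx \frac{100{,}000\ \text{light-years}}{0.01\,c} = 10\ \text{million years}, \qquad \frac{t_{\text{galaxy}}}{t_{\text{cross}}} \approx \frac{10\ \text{billion years}}{10\ \text{million years}} = 1{,}000.$$

Even at that crawl, a single expansionist species would have had time to cross the galaxy a thousand times over.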

The volume of scientific literature that Fermi’s paradox has inspired testifies to its serious and provocative nature. When you consider fiction and movies, it’s clear that Fermi’s paradox has become an important part of contemporary culture, challenging us to think more deeply about our place in the cosmos.

But, still, the paradox remains incompletely understood by science, incompletely digested by popular culture, and even actively resisted or deliberately ignored. In this sense, it has become a type of Rorschach test: Our attitudes to the paradox tell us something about ourselves.

***

The strong version of Fermi’s paradox (it is only proper and intellectually honest to tackle the strongest version of any particular scientific problem) doesn’t just ask why there aren’t aliens here on Earth. It also asks why we don’t see any manifestations or traces of extraterrestrial civilizations anywhere in our past light cone—that is, the whole volume of space and time visible to us, extending billions of years into the past, to the epoch of the earliest galaxies.

The strong Fermi’s paradox became even stronger, so to speak, in 2001, with the work of Charles Lineweaver and collaborators on the age distribution of terrestrial planets in the Milky Way. Their calculations show that Earth-like planets in our galaxy began forming more than 9 billion years ago, and that their median age is 6.4 ± 0.9 billion years, which is significantly greater than the age of the Earth and the solar system. This means that a large majority of habitable planets are much older than Earth. If we believe that humans and the planet we live on are not particularly special compared to other civilizations on other planets, we would conclude that the stage of the biosphere and technology on other occupied planets must be, on average, older than the corresponding stages we see on Earth. If we humans are now on the cusp of colonizing our solar system, and we are not much faster than other civilizations, those civilizations should have completed this colonization long ago and spread to other parts of the galaxy.

We presume ourselves to be so special that the question “Where is everybody as complex and important as ourselves (or more)?” cannot be taken seriously.

Another piece of recent science amplifies Fermi’s paradox even further. Geochemical and paleobiological research has recently confirmed that the oldest traces of living beings on Earth are at least 3.8 billion years old, and probably as old as 4.1 billion. The Earth itself is only 4.5 billion years old. While the mechanism of abiogenesis (the origination of life) is still largely unknown, the evidence of abiogenesis occurring early in the Earth’s history seems incontrovertible. The consequences are rather dramatic: If life is quick to form after its host planet has formed, we get good probabilistic support for the existence of simple life on many planets in the Milky Way, and potentially complex life on some of them.

Now that we know the Earth is a latecomer, and believe that life has the power to take hold quickly, Fermi’s paradox is more puzzling than ever. In the evocative words of physicist Adrian Kent: It’s just too damn quiet in the local universe.

In spite of all this, Fermi’s paradox is not only downplayed and ignored by a large part of the scientific community, but also mocked and even censured. Distinguished SETI researchers, like Frank Drake or Seth Shostak, claim in their memoirs that they had not heard about Fermi’s paradox until very recently and that it should not be taken seriously. Astrobiology, one of the premier journals in the field, has recently instituted a policy of not considering manuscripts dealing with Fermi’s paradox, including even short communications and book reviews. There are many scientists who, like the British astronomer John Gribbin, are happy to proclaim that there is no paradox whatsoever, since “we are alone, and we had better get used to it.”

In principle there may be several reasons for this attitude. But in my opinion one underlies all of them: We humans still think we’re special.

***

In 1543, two revolutionary books transformed our view of both the universe and ourselves. One, written by the Flemish physician Andreas Vesalius, was titled De humani corporis fabrica (On the Fabric of the Human Body), and it laid the foundations of modern medical science by proving once and for all that our bodies are not mystical objects but physical systems amenable to scientific study—and not very different from the bodies of animals. The other, De revolutionibus orbium coelestium (On the Revolutions of the Heavenly Spheres), by a little-known Polish polymath named Nicolaus Copernicus, was of even greater significance. It overthrew the cosmological paradigm that had reigned for almost 2,000 years and was supported by the political and religious authorities of the day. In doing so, Copernicus inadvertently redefined the very word revolution, turning a purely technical term for the circling of the heavenly spheres into a household label for dramatic change in any field.


Photo by Ann Ronan Pictures/Print Collector/Getty Images.

The Copernican Revolution, sometimes called the Scientific Revolution, was not only about whether the Earth rests at the center of the universe with the sun and planets moving around it. It was also about whether humans were the most important objects in the universe. In a sense, the “de-centering” of the Earth brought about by the Copernican Revolution was a consequence, rather than a cause, of a new way of thinking about ourselves: We were becoming a part of nature, rather than its exalted goal. If Earth is a typical planet revolving around a typical star (and, as we learned much later, in a typical galaxy), then there is no scientific reason to assign special importance to ourselves. Copernicanism, broadly understood, is the assertion that humans are nothing special across space, time, and other, more abstract parameter spaces. It has enabled tremendous advances in science since the times of Vesalius and Copernicus by combating unsupported anthropocentrism.

But our institutions are still profoundly anthropocentric. We deny even the most basic rights to other parts of nature, including our close animal relatives, some of which share more than 97 percent of our DNA. We pollute our environment with close to zero regard for the well-being of its ecosystems—and we fight pollution only if and when it inconveniences us. Scientific experiments on human beings are not only illegal, but are considered barbarous even when they could provide some useful information. This is in sharp contrast to our practice of experimenting on lab animals, hunting foxes, or killing bulls in the arena. Even in the purely abstract realms of knowledge, one often hears the complaint that physical sciences are “cold” and “inhuman” exactly because they are less permeated by anthropocentrism than, say, philosophy or the humanities or arts. Almost 500 years after the onset of the Copernican revolution, we have a relic belief in the exalted nature of the human mind.

Where is everybody? Where are the extraterrestrials?

These remnants encourage us to resist animal rights, the reality of anthropogenic climate change—and Fermi’s paradox. We presume ourselves to be so special that the question “Where is everybody as complex and important as ourselves (or more)?” cannot be taken seriously. ET isn’t here, we think, because there’s no equal to us. After Copernicus came Darwin’s revolution, and then Freud’s, delivering blows to our illusions of uniqueness and grandiosity within the biological and mental domains, respectively. It’s not that Fermi’s paradox belongs in this progression—it doesn’t explode any myth of our specialness—but appreciating its full import relies on the perspective that Copernicus, Darwin, Freud, and others have given us.

That clearly is a step too far. Many of us choose to ignore Fermi’s paradox, or even fight it, because it requires too complete an acceptance of our cosmic mediocrity. We would rather secretly believe we are special than confront the real consequences of the paradox—consequences such as intelligence being a maladaptive trait, our universe being a simulation, or our living in a cosmic zoo. Some of us even go so far as to argue that we have become a navel-gazing, self-absorbed civilization, without much chance of developing a sustained cosmic presence and industrial bases all over the solar system. Destroying what Olaf Stapledon and R. Buckminster Fuller have dubbed the cosmic vision of humanity’s future lets us duck out of the conversation about Fermi’s paradox. If we can’t do it, the reasoning goes, our extraterrestrial peers can’t do it either, and we shouldn’t waste time and money searching for them. This subtle form of anthropocentrism leads us down a very dangerous path, since it impedes the best—and ultimately only—prospect for humanity to achieve its cosmic potential. Sir Fred Hoyle put it nicely in 1983:

Many are the places in the Universe where life exists in its simplest microbial forms, but few support complex multicellular organisms; and of those that do, still fewer have forms that approach the intellectual stature of man; and of those that do, still fewer again avoid the capacity for self-destruction which their intellectual abilities confer on them. Just as the Earth was at a transition point 570 million years ago, so it is today. The spectre of our self-destruction is not remote or visionary. It is ever-present with hands already upon the trigger, every moment of the day. The issue will not go away, and it will not lie around forever, one way or another it will be resolved, almost certainly within a single human lifetime.

The current generation is likely to live to see the 500-year jubilee of De revolutionibus orbium coelestium in 2043. Let’s hope that, by then, we will have completed the Copernican revolution and embraced the hard and deep problems that modern astrobiology is posing. We are now living at the tipping point—the very moment when a firm empirical resolution of our biggest and oldest puzzles is in sight. We should not miss that opportunity by fighting for an outdated vision of ourselves as pinnacles of complexity in the universe. Instead, we should reason as if we were near-typical for our epoch. Only then will we have a fighting chance of piercing the Great Silence.

Milan Ćirković is a senior research associate at the Astronomical Observatory of Belgrade and an assistant professor in the Department of Physics at the University of Novi Sad in Serbia.

This article was originally published on August 2, 2018, by Nautilus, and is republished here with permission.

Kids’ Video Game Obsession Isn’t Really About Video Games. It’s About Unmet Psychological Needs.

How much gaming is too much, and what can be done to help fill the void?

Nir Eyal|getpocket.com

  • Nir Eyal
  • Andrew Kinch

Many parents are concerned with their child’s seemingly obsessive video game play. Fortnite, the most recent gaming phenomenon, has taken the world by storm and has parents asking whether the shooter game is okay for kids.

The short answer is yes, Fortnite is generally fine. Furthermore, parents can breathe easier knowing that research suggests gaming (on its own) does not cause disorders like addiction.

However, there’s more to the story. A comprehensive answer to the question of whether video games are harmful must take into account other factors. Fortnite is just the latest example of a pastime some kids spend more time on than is good for them. But parents need to understand why kids play as well as when to worry and when to relax.

Addiction, Really?

The word “addiction” gets tossed around quite a bit these days. It’s not uncommon to hear people say that they are addicted to chocolate or shoe shopping, but if it isn’t causing serious harm and impairment to daily function, it isn’t an addiction. It’s an overindulgence.

This isn’t just semantics. An addiction involves a lack of control despite adverse consequences. Parents may worry their kids are addicted, but if the child can pull themselves away from a game to join the family for a conversation over dinner, and shows interest in other activities, like sports or socializing with friends, then they are not addicted.

Generally, parents panic when their kid’s video game playing comes at the expense of doing other things, like studying or helping around the house. But let’s be honest: Kids have been avoiding these activities for ages. Equally true is the fact that parents were complaining about their unhelpful children well before the first video game was plugged into its socket.

The real question should be: What is it about the special draw of gaming that makes it the preferred pastime of so many millions of kids? What makes it so difficult for even non-addicted kids to step away from video games sometimes?

The answer has to do with the way games address basic psychological needs.

Fortnite, like any well-designed video game, satisfies what we are all looking for. According to Drs. Edward Deci and Richard Ryan, people need three things to flourish. We look for competence — the need for mastery, progression, achievement, and growth. We need autonomy — the need for volition and freedom of control over our choices. And finally, we strive for relatedness — the need to feel like we matter to others and that others matter to us. Unfortunately, considering the state of modern childhood, many kids aren’t getting enough of these three essential elements.

School, where kids spend most of their waking hours, is in many ways the antithesis of a place where kids feel competence, autonomy, and relatedness. There, kids are told what to do, where to be, what to think, what to wear, and what to eat. Alarms and bells orchestrate their movements with farm-chattel precision while teachers opine on topics students couldn’t care less about. If they’re bored and want to leave, they’re punished. If they want to learn something else, they’re told to be quiet. If they’d like to go deeper on a topic, they’re prodded to stay on track. Of course, this isn’t every student’s experience, and different countries, schools, and teachers use different approaches to educating kids. But while some argue that discipline and control provide structure, it’s clear why teachers and students might struggle with motivation in the classroom.

Gamers feel competence when they practice strengths to achieve their aims. In a game, players have the autonomy to call the shots, do what they want, and experiment with creative strategies to solve problems. Games are also social outlets where players can feel relatedness. In Fortnite, for example, players often meet in the virtual environment to chat and socialize because doing so in the real world is often inconvenient or off limits. Whereas previous generations were allowed to simply play after school and form close social bonds, many kids today are raised by fearful and overworked parents who insist their kids either attend a regimented afterschool program or stay under lock and key at home.

We shouldn’t be surprised when the confinement kids find themselves in today often yields behaviors we don’t understand and don’t like. Games satisfy psychological needs that other areas of life are not meeting.

Of course, none of this is to say video games are a good substitute — quite the opposite. While a well-designed game attempts to satisfy these needs, it can’t come close to the deep satisfaction that real life and real human connection can provide.

No game can give a child the feeling of competence that comes from accomplishing a difficult task or learning a new skill of their own accord. Fortnite can’t compete with the exhilaration that comes from the autonomy of exploring reality, where a child is free to ask questions and unlock mysteries in the real world. No social media site can give a kid the sense of relatedness, safety, and warmth that comes from an adult who loves that child unconditionally, just the way they are, no matter what, and takes the time to tell them so.

Some kids suffer from gaming disorders, but such dependencies are often coupled with pre-existing conditions, including problems with impulse control. This, of course, does not absolve companies of their moral responsibility to help problem gamers. It’s time they implemented policies to identify and help those with disorders.

For most children, however, understanding the deeper truth behind what kids are getting out of games empowers parents to take steps to give kids more of what they need. It also helps parents get into a state of mind to talk rationally about overuse instead of succumbing to the kind of hysterics and moral panic our own parents used to try to stop us from listening to rock ’n’ roll, watching MTV, playing pinball, or reading comic books. Video games are this generation’s outlet, and some kids use them as a tool to escape, the same way some of us use our own flavor of dissociative devices to tune out reality for a while.

Instead of repeating the mistakes of previous generations with heavy-handed tactics, let’s understand the psychological source of the problem. Ultimately, parents’ goal should be to help kids learn strategies for coping with overuse on their own so that they do what’s good for them even when we’re not around. By teaching self-regulating habits, promoting intentional gaming, and helping kids find suitable alternatives, parents can help kids find what they are really looking for.

Andrew Kinch is the founder of GameAware.

This article was originally published on July 31, 2018, by Nir Eyal, and is republished here with permission.

SF Symphony premieres MTT’s ‘Rilke Songs’

Mezzo-soprano Sasha Cooke, San Francisco Symphony artist-in-residence, sings with the orchestra next week. (Courtesy Stephanie Girard)

German poet’s evocative writing at center of ‘Love and Lyricism’ concerts

A “first” in Michael Tilson Thomas’ last season as music director of the San Francisco Symphony is the world premiere of his composition “Meditations on Rilke,” on a program next week called “MTT & Mahler: Love and Lyricism.”

“I first read Rilke’s poems in English translation 30-40 years ago, loved them, and started reading and even memorizing them in German,” says the conductor, 75. “When you recite poetry, you can hear its music, and now that I am fully reconnecting with my ‘composer self,’ the music of these poems have turned into the compositions we will present.”

MTT joins Alban Berg, Paul Hindemith, Anton Webern, Arnold Schoenberg and Peter Lieberson among composers who have set to music works by Rilke (1875-1926), a Bohemian-Austrian poet and novelist described as “one of the most lyrically intense German-language authors.”

“Meditations on Rilke” features artist-in-residence mezzo-soprano Sasha Cooke and bass-baritone Ryan McKinny singing a six-part cycle set to the poems “Herbsttag” (“Autumn day”); “Das Lied des Trinkers” (“The drinker’s song”); “Immer wieder” (“Again and again”); “Imaginärer Lebenslauf” (“Imaginary life journey”); “Herbst” (“Autumn”); and “Ich lebe mein Leben in wachsenden Ringen” (“I live my life in ever-widening circles”), which, translated into English, goes:

I live my life in ever-widening circles

that stretch themselves out over all the things.

I won’t, perhaps, complete the last one,

but I intend on trying.

I circle around God, around the ancient tower,

and I circle for thousands of years;

and I don’t know, yet: am I a falcon, a storm,

or a mighty song.

Rilke’s poetry also is in Taika Waititi’s 2019 movie “Jojo Rabbit,” with a German-language rendition of David Bowie’s “Heroes” played over the Rilke quote: “Let everything happen to you/Beauty and terror/Just keep going/No feeling is final.”

MTT has been composing throughout his long conducting career, including setting poetry by Walt Whitman, sung at its premiere by Thomas Hampson; and Emily Dickinson, premiered by Renée Fleming.

In 1991, he and the New World Symphony presented benefit concerts for UNICEF featuring Audrey Hepburn as narrator of MTT’s “From the Diary of Anne Frank.” In 1995, he led the Pacific Music Festival Orchestra in the premiere of his composition “Shówa/Shoáh,” commemorating the 50th anniversary of the bombing of Hiroshima.

Both in Carnegie Hall and with the San Francisco Symphony, MTT led performances of his “Island Music” for four marimbas and percussion.

In June, as MTT concludes his 25-year tenure heading the orchestra before former Los Angeles Philharmonic music director Esa-Pekka Salonen takes over, SFS Media will release a recording of his music performed by the S.F. Symphony in recent seasons, including “Meditations on Rilke,” “From the Diary of Anne Frank” narrated by mezzo-soprano Isabel Leonard, and “Street Song.”

Next week’s concerts also include Cooke and McKinny singing songs from Mahler’s “Des Knaben Wunderhorn” (“The Boy’s Magic Horn”) as well as the overture to Berlioz’s “Benvenuto Cellini” and Ravel’s “La Valse.”

IF YOU GO

MTT & Mahler: Love & Lyricism

Presented by San Francisco Symphony

Where: Davies Symphony Hall, 201 Van Ness Ave., S.F.

When: 8 p.m. Jan. 9-11, 2 p.m. Jan. 12

Tickets: $20 to $185

Contact: (415) 864-6000, www.sfsymphony.org

Bass-baritone Ryan McKinny performs new music by Michael Tilson Thomas in “MTT & Mahler: Love and Lyricism.” (Courtesy Simon Pauly)

Winter Sleep

Edith Matilda Thomas
I know it must be winter (though I sleep)— 
I know it must be winter, for I dream 
I dip my bare feet in the running stream, 
And flowers are many, and the grass grows deep. 
 
I know I must be old (how age deceives!)
I know I must be old, for, all unseen, 
My heart grows young, as autumn fields grow green 
When late rains patter on the falling sheaves. 
 
I know I must be tired (and tired souls err)— 
I know I must be tired, for all my soul
To deeds of daring beats a glad, faint roll, 
As storms the riven pine to music stir. 
 
I know I must be dying (Death draws near)— 
I know I must be dying, for I crave 
Life—life, strong life, and think not of the grave,
And turf-bound silence, in the frosty year.      

This poem is in the public domain. Published in Poem-a-Day on January 4, 2020, by the Academy of American Poets.

About this Poem: “Winter Sleep” originally appeared in A Winter Swallow (Charles Scribner’s Sons, 1896).

Edith Matilda Thomas was born in Ohio in 1854. Her collections include A Winter Swallow (Charles Scribner’s Sons, 1896) and Fair Shadow Land (Houghton, Mifflin and Co., 1893). She died in 1925.

Astrophysicist Says He Knows How to Build a Time Machine

TIME TWISTER

But his peers are far from convinced that it’ll work.

KRISTIN HOUSER JANUARY 2ND 2020 (futurism.com)

Astrophysicist Ron Mallett believes he’s found a way to travel back in time — theoretically.

The tenured University of Connecticut physics professor recently told CNN that he’s written a scientific equation that could serve as the foundation for an actual time machine. He’s even built a prototype device to illustrate a key component of his theory — though Mallett’s peers remain unconvinced that his time machine will ever come to fruition.

To understand Mallett’s machine, you need to know the basics of Albert Einstein’s theory of special relativity, which states that time accelerates or decelerates depending on the speed at which an object is moving.

Based on that theory, if a person was in a spaceship traveling near the speed of light, time would pass more slowly for them than it would for someone who remained on Earth. Essentially, the astronaut could zip around space for less than a week, and when they returned to Earth, 10 years would have passed for the people they’d left behind, making it seem to the astronaut like they’d time traveled to the future.
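
As a rough illustration of what those round numbers imply (this sketch is not from the article and is not Mallett’s own calculation; it simply inverts the standard special-relativistic time-dilation formula), here is how fast such a ship would need to travel:

```python
import math

# How fast must a ship go so that roughly one week on board
# corresponds to roughly ten years back on Earth?
# Uses the textbook Lorentz factor: gamma = 1 / sqrt(1 - (v/c)^2).
# The "1 week" and "10 years" figures come from the article; everything
# else is an illustrative assumption.

WEEKS_PER_YEAR = 365.25 / 7          # about 52.18

earth_years = 10.0                   # time elapsed on Earth
ship_weeks = 1.0                     # time elapsed for the astronaut

# Dilation factor needed: Earth time divided by ship time, in the same units.
gamma = (earth_years * WEEKS_PER_YEAR) / ship_weeks   # about 522

# Invert gamma to get the required speed as a fraction of c.
v_over_c = math.sqrt(1.0 - 1.0 / gamma**2)

print(f"gamma  = {gamma:.0f}")        # roughly 522
print(f"speed  = {v_over_c:.7f} c")   # roughly 0.9999982 c
```

The required speed works out to about 0.9999982 of the speed of light, which is why this kind of forward “time travel” is accepted in principle yet remains far beyond any current technology.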

But while most physicists accept that skipping forward in time in that way is probably possible, time traveling to the past is a whole other issue — and one Mallett thinks he could solve using lasers.

As the astrophysicist explained to CNN, his idea for a time machine hinges upon another Einstein theory, the general theory of relativity. According to that theory, massive objects bend space-time — an effect we perceive as gravity — and the stronger gravity is, the slower time passes.
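
The article states this only qualitatively. For reference, the textbook general-relativistic result (not quoted in the article) for a clock held at rest at distance r from a non-rotating mass M relates the clock’s proper time τ to the time t measured by a faraway observer as

$$ d\tau = dt\,\sqrt{1 - \frac{2GM}{rc^{2}}}, $$

so the larger the mass, or the closer the clock sits to it, the more slowly it ticks relative to that distant observer.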

“If you can bend space, there’s a possibility of you twisting space,” Mallett told CNN. “In Einstein’s theory, what we call space also involves time — that’s why it’s called space time, whatever it is you do to space also happens to time.”

He believes it’s theoretically possible to twist time into a loop that would allow for time travel into the past. He’s even built a prototype showing how lasers might help achieve this goal.

“By studying the type of gravitational field that was produced by a ring laser,” Mallett told CNN, “this could lead to a new way of looking at the possibility of a time machine based on a circulating beam of light.”

As optimistic as Mallett might be about his work, though, his peers are skeptical that he’s on the path to a working time machine.

“I don’t think [his work is] necessarily going to be fruitful,” astrophysicist Paul Sutter told CNN, “because I do think that there are deep flaws in his mathematics and his theory, and so a practical device seems unattainable.”

Even Mallett concedes that his idea is wholly theoretical at this point. And even if his time machine does work, he admits, it would have a severe limitation that would prevent anyone from, say, traveling back in time to kill baby Adolf Hitler.

“You can send information back,” he told CNN, “but you can only send it back to the point at which you turn the machine on.”

READ MORE: Meet the scientist trying to travel back in time [CNN]

More on time travel: Paradox-Free Time Travel Possible With Many Parallel Universes

What Society Says To Men- Helly Shah | Spoken Word Poetry

Helly Shah: “I always felt like men were born with a sense of entitlement in this world. Until now, until this.” This poem is a commentary on the conditioning men receive from childhood and how it shapes their social interactions.

Written & Performed By- Helly Shah Production House- Abhedya Artworks (www.abhedya.in) Shot By- Pratik Bhadekar Edited By- Navaldeep Singh Shot At- The School of Thought, Andheri (W) Special Thanks- Forum Shah


Ontology


Ontology is quick-shifting consciousness expansion evolutionary/revolutionary action, which, like a computer, is rapidly working out enormous cultural “equations” via the responsiveness of the proto-mutant’s soma-consciousness.

–Thane of Hawaii