In October of 1883, a paper in the nation’s capital reported under the heading “Current Gossip” that “an Iowa woman has spent seven years embroidering the solar system on a quilt” — a news item originally printed in Iowa and syndicated widely in newspapers across the country that autumn and winter. The New York Times reprinted the report as it appeared in the Iowa paper, dismissively qualifying it as a “somewhat comical statement.”
Ellen Harding Baker’s Solar System quilt, begun in 1876 and completed in 1883 (Smithsonian)
The woman in question, Ellen Harding Baker (June 8, 1847–March 30, 1886), was not a person to be dismissed with a patronizing chuckle. Baker taught science in rural Iowa, in an era when most institutions of higher education were still closed to women, all the while raising her five surviving children. She used her Solar System quilt to illustrate her astronomy lectures. To ensure the accuracy of her embroidered depiction, Baker traveled to the Chicago Observatory to view sunspots and a comet — most likely the Great Comet of 1882, which had become a national attraction — through the professional telescope there.
Ellen Harding Baker (Smithsonian)
Baker was born in the year Maria Mitchell — the figure who sparked the inspiration for my book Figuring — made the landmark comet discovery that earned her worldwide acclaim and established her as America’s first professional female astronomer. When Baker began working on her Solar System quilt, she was the same age Mitchell had been when she discovered her comet — twenty-nine.
Quilt detail
The quilt, crafted long before we knew the universe contained galaxies other than our own, depicts an enormous radiant sun orbited by the planets known prior to Pluto’s discovery in 1930, as a comet — one of those mysterious and enchanting celestial bodies, extolled in poems and depicted as omens in medieval paintings — blazes in one corner. The quilt is made of wool, lined with a cotton-and-wool fabric, and embroidered in silk and wool.
Quilt detail
The convergence of the threaded arts and astronomy was not entirely uncommon in Baker’s day. Mitchell herself, while condemning the needle as “the chain of woman” and resenting the tyranny of “stitch, stitch, stitch” as society’s means of keeping women confined to the domestic sphere, believed that the needle could be reclaimed as an instrument of the mind. “The eye that directs a needle in the delicate meshes of embroidery will equally well bisect a star with the spider web of the micrometer,” she wrote in her diary.
Quilt detail
Nearly a century after Baker made her quilt, the pioneering astronomer Cecilia Payne-Gaposchkin — who revolutionized our understanding of the universe by discovering its chemical composition and became the first woman to chair a Harvard department, having ended up at the esteemed university thanks to a fellowship established there by the Maria Mitchell Association — would pick up where Baker left off, crafting a stunning yarn-on-canvas needlepoint depiction of the supernova remnant Cassiopeia A. In the year of Payne-Gaposchkin’s death, the artist Judy Chicago would also bring needlepoint and astronomy together in her iconic project The Dinner Party, which features a hand-embroidered runner celebrating Caroline Herschel — the world’s first professional woman astronomer and the subject of Adrienne Rich’s stunning tribute.
Baker’s quilt is available as an art print, with proceeds benefiting the endeavor to build New York City’s first-ever public observatory at Pioneer Works — a dome of possibility for future Ellens.
A century later, Albert Einstein recounted his takeaway from the childhood epiphany that made him want to be a scientist: “Something deeply hidden had to be behind things.” Virginia Woolf, in her account of the epiphany in which she understood she was an artist — one of the most beautiful and penetrating passages in all of literature — articulated a kindred sentiment: “Behind the cotton wool is hidden a pattern… the whole world is a work of art… there is no Shakespeare… no Beethoven… no God; we are the words; we are the music; we are the thing itself.”
This interleaved thing-itselfness of existence, hidden in plain sight, is what two-time U.S. Poet Laureate Howard Nemerov (February 29, 1920–July 5, 1991) takes up, two centuries after William Blake saw the universe in a grain of sand, in a spare masterpiece of image and insight, found in his altogether wondrous Collected Poems (public library), winner of both the Pulitzer Prize and the National Book Award.
Howard Nemerov
On Being creator and Becoming Wise author Krista Tippett brought the poem to life at the third annual Universe in Verse, with a lovely prefatory meditation on the role of poetry — ancient, somehow forgotten in our culture, newly rediscovered — as sustenance and salve for the tenderest, truest, most vital parts of our being.
FIGURES OF THOUGHT by Howard Nemerov
To lay the logarithmic spiral on
Sea-shell and leaf alike, and see it fit,
To watch the same idea work itself out
In the fighter pilot’s steepening, tightening turn
Onto his target, setting up the kill,
And in the flight of certain wall-eyed bugs
Who cannot see to fly straight into death
But have to cast their sidelong glance at it
And come but cranking to the candle’s flame —

How secret that is, and how privileged
One feels to find the same necessity
Ciphered in forms diverse and otherwise
Without kinship — that is the beautiful
In Nature as in art, not obvious,
Not inaccessible, but just between.

It may diminish some our dry delight
To wonder if everything we are and do
Lies subject to some little law like that;
Hidden in nature, but not deeply so.
Dr. X is a dad. Appropriately – boringly – at 4:37 p.m. on a national holiday, he is lighting a charcoal grill, about to grab a pair of tongs with one hand and a beer with the other. His kids are running around their suburban patio, which could be anywhere; Dr. X, though impressively educated now, grew up poor in a town that is basically nowhere. Like most Americans, he is a Christian. Like a lot of health-conscious men, he fights dad bod by working out once or twice a week, before going into his medical practice.
Somewhat less conventionally, two hours ago, he was escorting a woman around his yard, helping her walk off a large dose of MDMA. He’s the one who’d given it to her, earlier in the morning, drugging her out of her mind.
This would be psychedelic-assisted therapy, the not-new but increasingly popular practice of administering psychotropic substances to treat a wide range of physical, psychological and psycho-spiritual concerns. “Some people stagger out” of the room in Dr. X’s home that he uses for these “journeys,” as sessions are called in the semiofficial parlance. Some have to stay for hours and hours beyond the standard five or so, crying or waiting to emotionally rebalance, lying on a mattress, probing the secrets, trauma, belief or grief buried in their subconscious. Dr. X recalls a patient who was considering a round-the-clock Klonopin prescription for anxiety; she reluctantly decided to try a journey instead. On the “medicine,” she spent seven hours unraveling ballistically, picturing herself dumping sadness out of her chest into a jade box that she put a golden heart-shaped lock on and tossed into the sea. She’d been skeptical going in, but after it was over, Dr. X says, “She was so angry that it was illegal.”
Because Dr. X’s hallmark treatment – an MDMA session or two, then further journeys with psilocybin mushrooms if called for – is, absolutely, illegal. MDMA is a Schedule I controlled substance. Psilocybin is as well. Exposure could get his medical license suspended, if not revoked, and could cost him his parental rights, even his freedom. “This should be a part of health care, and is a true part of health care,” he says in his defense. The oversimplified concept behind MDMA therapy is that the drug – which causes intense neurotransmitter activity, including the release of adrenaline and serotonin (believed to produce positive mood) – tamps down fear, allowing people to interact with – and deal with – parts of their psyche they otherwise can’t. Psychedelics in general are thought to bring an observational part of the ego online to allow a new perspective on one’s self and one’s memories, potentially leading to deep understanding and healing.
As an internal-medicine specialist, Dr. X doesn’t have any patients who come to him seeking psychotherapy. But the longer he does the work, the more “I’m seeing that consciousness correlates to disease,” he says. “Every disease.” Narcolepsy. Cataplexy. Crohn’s. Diabetes – one patient’s psychedelic therapy preceded a 30 percent reduction in fasting blood-sugar levels. Sufferers of food allergies discover in their journeys that they’ve been internally attacking themselves. “Consciousness is so vastly undervalued,” Dr. X says. “We use it in every other facet in our life and esteem the intellectual part of it, but deny the emotional or intuitive part of it.” Psychedelic therapy “reinvigorated my passion and belief in healing. I think it’s the best tool to achieving well-being, so I feel morally and ethically compelled to open up that space.”
Currently – legally – we’re in the midst of a psychedelic renaissance. As of March 2017, New York University, the University of New Mexico, the University of Zurich, Johns Hopkins University, the University of Alabama and the University of California-Los Angeles have all partnered with the psilocybin-focused Heffter Research Institute, studying the compound for smoking cessation, alcoholism, terminal-cancer anxiety and cocaine dependence; the biotech-CEO-founded Usona Institute funds research of “consciousness-expanding medicines” for depression and anxiety at the University of Wisconsin-Madison. Since 2000, the Multidisciplinary Association for Psychedelic Studies (MAPS), a nonprofit based in Santa Cruz, California, has been funding clinical trials of MDMA for subjects with PTSD, mostly veterans, but also police, firefighters and civilians. In November 2016, the FDA approved large-scale Phase III clinical trials – the last phase before potential medicalization – of MDMA for PTSD treatment. MAPS, which has committed $25 million to achieving that medicalization by 2021, also supports or runs research with ayahuasca (a concoction of Amazonian plants), LSD, medical marijuana and ibogaine, the pharmaceutical extract of the psychoactive African shrub iboga. The organization is additionally funding a study of MDMA for treating social anxiety in autistic adults, currently underway at UCLA Medical Center. Another study, using MDMA to treat anxiety in patients with life-threatening illnesses, has concluded.
“If we didn’t have some idea about the potential importance of these medicines, we wouldn’t be researching them,” says Dr. Jeffrey Guss, psychiatry professor at NYU Medical Center and co-investigator of the NYU Psilocybin Cancer Project. “Their value has been written about and is well known from thousands of years of recorded history, from their being used in religious and healing settings. Their potential and their being worthy of exploration and study speaks for itself.”
Optimistic insiders think that if all continues to go well, within 10 to 15 years some psychedelics could be legally administrable to the public, not just for specific conditions but even for personal growth. In the meantime, says Rick Doblin, MAPS’ executive director, “there are hundreds of therapists willing to work with illegal Schedule I psychedelics” underground, like Dr. X. They’re in Florida, Minnesota, New York, California, Colorado, North Carolina, Pennsylvania, New England, Lexington, Kentucky. “Hundreds in America,” he says, though they’re “spread out all over the world.”
As within any field, underground practitioners vary in quality, expertise and method. Some are M.D.s, like Dr. X, or therapists, and some are less conventionally trained. They don’t all use the same substances, and don’t necessarily use just one. Some work with MDMA or psilocybin or ayahuasca, which has become trendy to drink in self-exploration ceremonies all over the country; others administer 5-MeO-DMT, extracted from a toad in the Sonoran Desert, or iboga or ibogaine, which, according to the scant research that exists, may be one of the most effective cures for opiate addiction on the planet – but may also cause fatal heart complications.
“Psychedelic therapy reinvigorated my belief in healing,” says one physician.
Underground psychedelic therapists are biased toward their preferred medicines, and those they think work best for particular indications. But they are united by true belief. “People that are involved are risking their careers, their freedom, in order to help others achieve a certain emotional freedom, and they disagree with prohibition,” says Doblin. “The fact that people are willing to do these therapies at great personal risk says something about what they think the potential of these drugs actually is to enhance psychotherapy.”
There are limitations. Psychedelics aren’t for everyone. Nor are they at all foolproof. Nary a researcher or provider, under- or aboveground, fails to point out that some pre-existing conditions make them inappropriate for use, and that though the dangers don’t rise nearly to the level of drug-war mythology (iboga/ibogaine is the major exception), adverse outcomes do happen. The toxicity of ayahuasca is on par with codeine – though codeine causes many thousands more deaths per year. Psilocybin’s is even less. Some studies have found brain damage in chronic Ecstasy users, but in 2010, researchers at Harvard Medical School studied a large sample of Mormons who used Ecstasy – which the LDS Church was late to ban – but no other drugs or alcohol, and failed to find cognitive consequences; safety studies of the dosages used in MDMA therapy have found no evidence of neurotoxicity or permanent changes in serotonin transporters. LSD does not stay in your body forever (its half-life is a matter of hours). But behaviorally, people on Ecstasy have died from heatstroke, or drinking too much or not enough water at raves; there have been assaults and even a murder at ayahuasca ceremonies for foreigners in Peru, which has seen a massive tourism boom around the substance’s popularity. Probably the most common concern – the specter of “freaking out” during or long after a bad trip – has yet to materialize in any of the clinical trials, though it’s not unusual for subjects to have tough experiences in their journeys. Dr. Charles Grob, a professor of psychiatry and biobehavioral science at UCLA, who has conducted studies with MDMA, ayahuasca and psilocybin, says that’s a function of screening, preparation and expert support. “This is serious medicine with a capital M,” he says, “and if you don’t watch yourself and you don’t pay attention to the essential basics, you could be in for a very difficult time.”
Even under the best of circumstances, the process catalyzed by psychedelic therapy is often far from painless. “It’s definitely not that people just get blissed out and it gets better,” says Dr. Michael Mithoefer, the lead clinician on the MDMA trials in Charleston, South Carolina (others are ongoing in Boulder, Colorado; Canada; and Israel). “It makes the healing process possible, not easy.” When you take 125 milligrams of pure MDMA, enough to nearly immobilize you, and someone invites you to take a look at your deepest self, “it is a destabilizing agent,” Dr. X cautions. But it’s purposefully so. “It opens us,” he says. “Sometimes the medicine can stabilize someone in a difficult situation. Sometimes it stirs up madness, so they can process that. Some people feel rejuvenated and ready to go back into their lives, but other people feel frazzled, spent, fragmented. I’ve had a few people say, ‘That shattered who I thought I was.’ ”
Limitations and challenges aside, the evidence so far still makes researchers cautiously optimistic that psychedelics hold potential for great healing and change. If they’re right, medicalization could address the deficits in treatment options for afflictions – trauma, depression, anxiety, addiction – that collectively impact millions of Americans, and ultimately shape our world. “If we move forward and understand that these substances should only be used under optimal conditions,” says Grob, “it will have a positive impact on an individual, family, collective and societal level.” In aboveground clinical trials like his, subjects routinely report that psychedelic therapy is among the top five most important experiences of their lives, akin to the birth of a child.
We’ve been here before: From the 1950s to the early Seventies, more than 40,000 cases of psychedelic treatment were studied in 1,000 different papers in the medical literature, covering everything from addiction to PTSD to OCD to antisocial disorders and autism. Despite encouraging results, says Grob, the “wild, uninhibited enthusiasm of the Sixties” contributed to some bad recreational outcomes that gave legislators ammunition to ban psychedelics from research for decades. But as the aboveground movement has again been picking up steam, so has the underground. More positive studies get published; more patients and doctors read them; more underground success stories spread through word of mouth. “The secret is out,” says Grob, and – with depression and opiate overdoses at all-time highs, skyrocketing civilian and veteran suicide rates, and trends toward personal optimization and wellness – demand is increasing. Researchers at NYU, UCLA and Johns Hopkins all stressed that they cannot and do not ever work with people in the underground, but some of them admired the willingness of certain health care professionals to act, however illegally, on their belief that sometimes healing can’t wait and that psychedelics are imperative to it. “I respect that in them,” NYU’s Guss says. “I really do. I’ve become a member of the most established establishment. And so in a way, we’re isolated from all the wisdom and knowledge in the underground community.” That vast, uncollected experience contains details about the medicines’ potential and pitfalls, challenges and inconsistencies – the variety of ways psychedelics might wholly, drastically change a life. “I’m very interested to learn,” Guss says, “what underground psychedelic psychotherapists have to teach us.”
***
My first introduction to underground psychedelic therapy was when, years ago, a doctor told me my vagina was depressed. I’d gone in for a pelvic exam because something felt wrong; at the follow-up appointment, when my test results were all negative and my answers to her hundred questions about the post-traumatic stress disorder I was in treatment for were all related to sexual threats and reporting on sexual violence, she said my genitals were just fucking bummed out.
This was San Francisco, and I did a lot of yoga; but even I rolled my eyes at the idea that my privates had an emotional disorder. I was very intrigued, however, when the doctor said she knew a therapist who could heal years of trauma in one five-hour swoop, so long as I had the secret password. The doctor gave me the number for that therapist – who worked with MDMA.
I never called. I moved across the country. Years later, I was on vacation on the coast when my husband went out for a run, and I stayed behind and may or may not have contemplated suicide.
OK. I did. In the car, on the road, running an errand, I thought about driving off the edge of a cliff into the brilliant, crashing Pacific.
“We can direct our own intellectual evolution by using psychedelics as self-hacking tools,” says a Silicon Valley magnate.
Yes, I had a history: the PTSD, with concomitant major depressive disorder, suicidal thoughts. On my official paperwork, I was technically permanently disabled, but I had been doing much better – working, going to karaoke, having a life. I had backslides and big episodes, but if my “issues” were not exactly handled, they were at least on a general upswing thanks to years of constant treatment. But then, the night before my drive, I had started yelling in a restaurant, feeling that I was spiraling out of control but unable to stop myself from making a scene. Now, having coaxed my car away from the cliff edge and back to the hotel, I lay facedown and screamed into the pillows. I called a local therapist and begged for an emergency appointment. As I lay there in her office, in the fetal position, I wondered aloud if I should try MDMA therapy.
Weirdly (or magically, as would later be obvious), she happened to have the number of another therapist who worked with it.
The therapist who gave me the second referral said she had a client with whom she’d been working for years who had done a journey. The difference in that patient’s suffering, she said, was like night and day. When I called the number, the woman who answered said we needed to meet in person, and when we did, she mentioned that my struggle was why the wait for MDMA to become widely available was untenable. She said, in a stunning lack of expectation management, that she could help me massively – more, in a few sessions, than all my years and dollars of hard therapeutic work had combined.
So after one more conversation, I showed up nervous, but excited, but desperate on a Monday morning (as scheduled) with an empty stomach (as directed) to a charming room with a couch at one end and a bed at the other. After we did something like a prayer, I took the see-through capsule of white powder and retired to the bed with the journal I was encouraged to bring while the therapist went out on the deck to give me space. I’d been told that the journey with psychedelics truly starts beforehand, the moment you decide to do it, and I had indeed been struggling extra since then. Waiting for the medicine to come on was no exception.
The Journey. 9:35 a.m.
I’m full of grief, and gratitude, and terror. I’ve been extra wound up and tight, extra untouchable, since we put this on the calendar. My body must be gripping and tensing in preparation to let go. . . .
9:55 is when the doubt sets in. About the pointlessness, the uselessness, the futility of this endeavor. A moment ago, I was envisioning lots of purple tears. I’m like, let’s just go read a newspaper and drink some tea somewhere.
This is when the therapist, who had come back inside, told me I was higher than I realized, and to lie down and let it ride.
I hadn’t anticipated tripping, or time-travel. But there were movies of my life, and visits with loved ones. The therapist had turned on jangly guitar music, which struck me as lame at first, but soon became the most beautiful, dynamic composition I’d ever heard because: Ecstasy. I breathed deep with my eyes closed and a hand on my chest. I cried, often, as I rewitnessed my life. My therapist said very little. She had said before that our collective job was to trust my intuition. I went back to the scenes where my PTSD started. In one of them, I revisited a remote, bleak room where a stranger cornered me. I watched the scenario – which, in reality, I had escaped physically unscathed – play out with an alternate ending. But I didn’t get overpowered and raped, which is what I’d always assumed was so scary about it. Instead, the stranger stepped forward and, in one swift move, landed his hands in a death grip around my throat.
Several times, the scene replayed. Repeatedly, I watched myself get strangled.
Ohhhhhhhhhhh, I could see, suddenly. This isn’t just a rape issue, as I’d been working through it in therapy for years. This is also a murder issue.
For weeks after the journey, every man I walked past triggered an automatic but definitive – and elated! – voice inside me that said: That guy’s not gonna kill you! Down the sidewalk in a city, that guy’s not gonna kill you, and that guy’s not gonna kill you. If I had realized at the conscious level that I thought they would, I would have stopped leaving the house. No wonder I was always exhausted. After the journey, I stepped down the street with wild new energy. Seeing, finally, the ultimate fear of that moment, my feared choking death, was sort of terrible, I guess, but not really, it wasn’t, because: Ecstasy. And as soon as I acknowledged it and saw it through, the moment lost its quiet, powerful rule over my system.
For some people, an MDMA journey ends after a few hours. They sit up and start talking. They drink the water and eat the snack given to them, and talk for a bit as the medicine wears off. And then they leave.
I had to be pulled out of mine. Whether because I have a genetic variation that makes people more sensitive to MDMA or because I am “a very intense person,” around 2 p.m. the therapist had to shake me; it was time to get ready to go – my husband was scheduled to pick me up, and the therapist had another appointment coming. She had me sit up and eat and drink and try to rejoin the present. When I left some half an hour later, I was cheerful and articulate, but still tripping. My husband, in utter bewilderment over how to handle me, took me to a nearby hotel, as planned. Later, we tried to go eat in a restaurant. I babbled, pleasantly at first, but then, about eight hours after my journey began, everything turned twitchy and dark. I called the therapist frantically and asked her if most people, post-journey, felt like every single thing in their entire lives needed to be burned down immediately, and she said no, not really, but that my job in any case was to “do nothing, very slowly.”
In the clinical trials of MDMA for PTSD, the protocol is to keep patients overnight. The sessions – typically there are three, spaced a month apart – last at least eight hours, because the heaviest processing sometimes only begins to kick in around that point, particularly for patients who have a history of dissociation, or severe detachment from reality – as I do. My MDMA therapist, who had been doing journeys for a long time, had never encountered a person quite like me, but for people like me, researchers say, it’s not unheard of for the journey to get ugly at around the time I was in the middle of a dinner date.
But I didn’t happen to know any of that.
That night, I ran, fleeing from the hotel into the rural darkness, alone. I had total conviction that every facet of my existence was a mistake. I was engulfed in panic. I had no idea what to do with myself, except for one specific thing, as the clear message of it kept ringing over and over in my head, and that message was: GET. DIVORCED.
***
“It’s harder to integrate if you have a life: a company, a house, a wife,” Dr. Y explains to a patient during a phone session one day. Dr. Y, who looks younger than his middle age, paces and stretches while he talks to the man, many states away, who recently started therapy after he lost his relationship, lost his job and moved – three of the top five stressful life events, psychologists say. Dr. Y is a psychiatrist, which means he has the ability to prescribe medications, but in this session, this patient’s third, he instead asks whether the patient is feeling open to taking ayahuasca after having read all the literature Dr. Y assigned last time. He wants to be sure the man is fully aware of the “integration” process that may follow, which could be less charitably called “picking up the pieces of inner-personal land mines.” Half of Dr. Y’s patients enact a major life change after ayahuasca. “Probably a quarter,” he says, strongly consider a breakup or divorce.
Dr. Y considers about 90 percent of his patients to be fit for ayahuasca. The one out of 10 he believes it isn’t right for could include people with a history of psychosis, mania or personality disorders, but more often it is those who don’t have the support necessary for integration, or aren’t ready to be led through symptom management while they’re weaned off antidepressants. That’s required by most knowledgeable practitioners: Like MDMA and psilocybin, ayahuasca increases serotonin in the body, and there’s a risk of serotonin poisoning if it’s taken with certain medications. Dr. Y’s patient today doesn’t have any of these contraindications. And Dr. Y believes the patient is strong enough to sort through his psychological contents as long as the patient also thinks he’s ready, which he says he is after airing some hesitations (“You know,” he says, “once you pull back a layer, there’s no going back, and you can’t unsee or unfeel what you saw”). Dr. Y will send him referrals to vetted, reputable providers in his preferred city. “Three nights [in a row] is better than two, and two is definitely better than one,” he tells him. First night, drink ayahuasca, open up; next night, dive deeper in. Layers of self-discovery. The soul as a somewhat coy onion. Sometimes, the peeling of it with ayahuasca involves experiencing your own death. Dr. Y gives the patient instructions for the month leading up to his journey: no other drugs, no alcohol, no sex. No reading news, no violent TV; reduce stress, meditate, find quiet. And, in the final week, no meat, no spice, no fermented foods. “The cleaner you go in,” Dr. Y, who himself has experienced hundreds of ceremonies, tells the man, “the more impactful the ceremony.” Whatever happens, during or after, Dr. Y will be available.
There are downsides to doing things underground. In addition to the obvious threat of arrest, illegality creates more risks at every step of the psychedelic-therapy process, providers say. There can be difficulty with something as basic as finding and ensuring clean compounds: MAPS helped run an MDMA testing program, and half of the pills sent in didn’t contain any MDMA at all; there have been reports of some shamans spiking ayahuasca with a more toxic hallucinogenic plant to intensify the trip. The best-cared-for patient is still disadvantaged by the general lack of cultural wisdom and support around the treatment. Even good providers aren’t as knowledgeable as they could be. Once a year, there is a secret conference that brings together 50 to 100 underground practitioners at a revolving location. “Information gets shared, and people learn new things,” says one regular attendee. Another participant recalls lectures on practicalities like the best and most therapeutic doses, how to screen for patients with borderline personality disorder – who many believe are not compatible with psychedelics – and how different music and sounds impact sessions. But nowhere near all the world’s practitioners are there. And none of the minutes or findings can be published.
“It’s really our best shot at solving the veteran suicide crisis,” says one marine who underwent MDMA therapy.
Plus, not every underground patient gets care as elaborate or expert as Dr. Y’s. Some don’t receive the preparation or follow-up they may need, because they can’t afford it, or because in an underground, patients don’t have the luxury to be picky about their providers; they may have to take anyone whose number they can manage to get their hands on, and it can be hard for laypeople to adequately vet providers anyway. An M.D. who used to administer psychedelics (he prefers not to say which) for depression and anxiety (and who, when I tell him he’ll have a secret identity – like Batman – asks if he can be Dr. Batman) doesn’t provide underground psychedelic treatment anymore because it started to feel too threatening to his legitimate practice, but in extreme cases he still refers opiate addicts to underground providers who work with ibogaine. “I know quite a few people who do that,” he says. “But I only trust two of them. Out of about 10. These are nurses, or respiratory therapists – people that know how to resolve an emergency.” Outside of that, there’s “a whole subculture” of more amateur iboga and ibogaine therapists, Dr. Batman says. “It’s a movement that’s driven by addicts helping other addicts. I don’t think that’s good, per se.”
It would be best, in Dr. Batman’s opinion, for people to get iboga-based addiction treatment in a reputable clinic outside the country. According to one such center in Mexico, one in 10 patients needs some medical care, one in 100 needs serious medical intervention, and, even in the hospital-like setting, people do occasionally die. But not everyone has the money to travel to the best treatment. “It’s very difficult for me to make that referral” to the underground for such a risky compound, Dr. Batman says. But sometimes his concern that someone will join the nearly 100 Americans who die of opioid overdose every day overrides his hesitation.
Even for comparatively safer MDMA and psilocybin, says Dr. X, “the fact that we have to do this and hide and send people back to their lives, versus doing it at an inpatient facility,” where patients could stay for more integration, is less than ideal.
But all these are risks that people who feel they need psychedelic therapy are willing to take. Nigel McCourry, a 35-year-old Iraq War veteran who participated in a MAPS MDMA study, was so transformed by the PTSD treatment that he was determined to get it for one of his fellow Marines. “This is my Marine battle buddy,” he says. “He needed help.” It took a lot of searching and ultimately traveling to another state to find an underground therapist, whom neither Marine knew, and McCourry was acutely aware of how difficult the process could be: For up to a year after his own treatment began, he says, “It was really wild. I had all of these emotions coming up out of nowhere. I would cry at random times. I had to give myself so much space to be able to let that out. I would be crying and I had no idea what I was crying about. It was just really intense.”
As a subject in the clinical trial, McCourry underwent three 90-minute preparatory sessions prior to dosing, another long integration session the morning after, a phone call every day for a week, and additional 90-minute sessions every week between the three journeys. His friend didn’t have the money or opportunity for nearly that kind of support. But he took the journey anyway. In their infantry unit, 2/2 Warlords, “guys are consistently committing suicide,” McCourry says. “I think [MDMA therapy] is really our best shot at solving the veteran suicide crisis.”
Elizabeth Bast, a 41-year-old artist and mother, also felt like she was out of options when she and her husband, Joaquin Lamar Hailey (better known as street artist Chor Boogie), flew to Costa Rica to get iboga therapy at a healing center after Hailey relapsed into an old heroin addiction that both of them felt was going to kill him. When he felt he needed a booster dose six months later, they turned to an underground provider closer by, in the States. Iboga “was crucial,” Bast says. “It saved his life.” The couple have started organizing and facilitating treatment trips for addicts to other countries (the drug is illegal in fewer than a dozen). But there are a lot of others they can’t help. Since Bast wrote a book about their experience, “I get inquiries every day: ‘My brother’s dying, and I can’t get out of the country.’ We would love to support that. But it’s too risky.”
Psychedelic medicalization isn’t without its own potential problems. There is squabbling in the underground community about whether it would provoke too much regulation over who can administer medicines, and who can take them and how; or whether it would lead to corporatization, or a boom in licensed but low-quality providers of substances that are so intense. Even now, in the aboveground in other countries, “There are places where it’s done that are very unprofessional,” says Ben De Loenen, executive director of the International Center for Ethnobotanical Education Research and Service (ICEERS), which provides resources for users and potential users of ayahuasca and iboga. UCLA’s Grob has been called by patients who’ve suffered severe, persistent anxiety for months after a psychedelic-therapy experience, which he says tends to be the result of bad preparedness, ethics, or practices of providers. There are also questions about sustainability. As both deforestation of the Amazon and popularity of ayahuasca increase, shamans have had to trek deeper into the jungle to find the plants that compose it. The increasing popularity of 5-MeO-DMT, called “the Toad” for its origins in the venom sacs of an amphibian – which are milked, the liquid then dried and basically free-based (smoking it is necessary; swallowing it can be fatal) – has led to incidents of people stealing onto Native American reservations to find the toad, leaving empty beer bottles and trash in their wake. If the broader culture ever accepted the species as the path to healing or enlightenment, one can surmise how long it might survive.
Guss, the NYU researcher, sees a future where psychedelic therapy is the specialty of highly and appropriately trained professionals and a robust field of scientific inquiry. For now, there’s the underground, some developing countries and the Internet. ICEERS offers tips for vetting practitioners, as well as free therapeutic support to people in crisis during or after ceremonies. MAPS has published a manual for how to do MDMA-assisted psychotherapy on its website, downloadable by anyone.
“Putting out info about how we do the therapy is more likely to contribute to safety than anything else,” says Doblin. On the dark Web, sellers of iboga and ibogaine thrive. There were a thousand people on the wait list for MAPS’ most recently completed MDMA trial. “People are desperate,” Doblin says. “People are doing this.”
***
Personally, my integration after MDMA was brutal. Though I eventually returned to my hotel room that first night, my state didn’t improve. I didn’t sleep, lying next to my husband, summoning every ounce of willpower to keep from saying that I was leaving, immediately and forever; my husband didn’t sleep either, blanketed in my agitation. For weeks, we found ourselves on the floor, or in bed, one or both of us crying as he asked if I still wanted to be married and I didn’t know; and I didn’t know, for that matter, what my personality was (callous? Funny? Was I funny? If so, was I really, or just performing?) or whether I was bisexual like I always thought or strictly gay. My moods swung from extreme openness and optimism to utter despair and stunned confusion. One day, I spent hours indulging a rich and specific fantasy about filling a bathtub with hot water, downing the years-old bottle of Ativan from when I was first diagnosed, and slitting my forearms from wrist to elbow. Later, in an entirely different temperament, I saw the plan in my Journey Journal and recognized it as active suicidal ideation; if someone had taken the notebook to the police, they could have legally committed me to an institution against my will.
From the beginning, my MDMA therapist had recommended more than one journey. Next time, she said in one of our multiple follow-up integration sessions, I’d stay all night. I agreed that another journey was in order, but I happened to talk to someone who mentioned an underground therapist with a different practice and whom I got a good feeling from when we talked, and so, three months after the first journey, in a dark and silent room with three other people after nightfall, concerns about my family history of schizophrenia thoroughly discussed and considered, I drank ayahuasca.
On the first night of the two-night ceremony, sitting on the “nests” we each built with yoga mats and sleeping bags on the floor, I was nervous again. But less than last time. After drinking about an ounce of the thick sludge, I lay down. There were the initial sparkles and shooting stars behind my eyes, and after a while, as the facilitators started singing – ancient songs they say come from the plant and help it work – a vision of myself as a five-year-old appeared. There was a suggestion of a history, something bad that happened that I didn’t remember; I did not like the direction it was going in; I also thought it was bullshit. The visions stopped. Instead, an abject, suffocating rage came over me, and I lay there in it for five hours thinking about getting in my car and driving away and wishing everyone else in the room would fucking die.
The next night, after a long, raw and still-irate day in the house, the first vision that showed up was five-year-old me again – pissed. She wouldn’t talk to me, however much I tried to coax her. I knew I had to get her to engage, which over the course of seven hours involved recognizing that I hated myself, that my self-hatred was my best and most reliable friend, and that my self-hatred would never die until I appreciated how it had protected me; when I did, and it did, I gave it a Viking funeral in the vision and in reality cried harder than I ever had in my life. Then I just had to reckon with shame. I sensed the five-year-old had brought it, actually, not me, but no matter, I assured her: I was the goddamn adult here, and I was going to take care of it. There was suffering and writhing and grief and nausea. I threw up, twice, prodigious quantities of black liquid, once so hard into a bucket that it splashed up all over the bottom half of my face.
A few inches away from me, a woman, who’d recently been in a car accident that put her in the hospital and in a wheelchair for a time, lay perfectly still and silent; a few inches from her, a man gnashed his teeth at visions of his abusive parent. At the other end of the room, another participant relived the night of his father’s suicide. In the vision, as in real life, he was unable to stop him from slipping out into the garage to do it. But this time, when the man discovered his father’s body and cut him down from the rope, he didn’t falter under the weight and drop him, as he did when he was a teenager. This time, he had the strength of his adult self, and when he caught him, he held him. Suspending his own sense of horror and failure, and the calling of the police, and the screams of his mother, he got to hold him for a very long time.
***
In November 2016, the results of two large studies showed that the majority of cancer patients who received one dose of psilocybin experienced lasting recovery from depression and anxiety. In February of 2017, a paper in the Journal of Psychopharmacology found that “experience with psychedelic drugs is associated with decreased risk of opioid abuse and dependence.” Medical-journal papers about ayahuasca suggest it can treat addiction, anxiety and depression, and change brain structure and personality. So far in the MDMA PTSD trials, every participant has improved at least somewhat, and more than 80 percent have recovered to the extent that they no longer qualify as having PTSD. Estimates for the effectiveness of other PTSD treatments range from 50 to 70 percent. The number is somewhat contentious, but even “if you think it’s only 25 percent” for whom conventional treatments don’t work, says Mithoefer, the lead clinician on the trials in Charleston, “that’s still millions of people a year in the United States alone.” All the participants in the trials had previously tried medication or therapy, usually both; as a cohort, they’d had PTSD for an average of 19 years.
But “ultimately, the decision to reschedule [psychedelics from Schedule I substances] is not a scientific one,” points out NYU’s Guss. “It’s a governmental one. We may be able to prove safety and efficacy. But there still may be governmental legislative reasons that rescheduling doesn’t move forward.”
Psychedelic use has been opposed and persecuted by authorities for centuries, both in Europe and in the New World. One reason, believers say, is the fear that widespread smart psychedelic use could foment societal upheaval. That’s not unlike the belief in the Sixties – but we know more now about what psychedelics do and how to optimize them. “We didn’t have as much data then as we do now,” says Dr. Dan Engle, a board-certified psychiatrist who consults with plant-medicine healing centers worldwide. “And we didn’t have as many of the safeguards as we have now.” He envisions “the psychedelic renaissance as a cornerstone in the redemption of modern psychiatric care.” Now, thanks to brain imaging, researchers can see that far greater “brain-network connections light up on psilocybin compared to the normal brain. More cross-regional firing. That’s what the brain actually looks like on the ‘drugs’ that we’ve been using for hundreds if not thousands of years.”
This has helped make psychedelics particularly popular in Silicon Valley, where a drive toward self-actualization meets the luxury of having the resources to pursue it. California, where Berkeley-born chemist Alexander “Sasha” Shulgin synthesized and distributed MDMA to therapists for decades before it was prohibited, has long been at the forefront of the movement; today, Doblin estimates, the state doesn’t have quite the majority, but probably 40 percent of underground psychedelic therapists in the nation. In 2016, California Sunday Magazine reporter Chris Colin profiled Entrepreneurs Awakening (EA), a company that arranges Peruvian ayahuasca sojourns primarily for tech and startup CEOs. The customers, says owner Michael Costuros, are “supersuccessful type-A people who use it to be better at what they do.”
“These things are so powerful,” says Eric Weinstein, managing director at Thiel Capital, Peter Thiel’s investment firm in San Francisco, “that they can get into layers of patterned behavior to show folks things that they could change and could do differently. And the brain has probably been playing with these ideas in the subconscious. This entire family of agents is extraordinary, as they appear to be very profound, unexpectedly constructive and surprisingly safe. Most people who take these agents seem to discover cognitive modes that they never knew even existed.” Weinstein has been considering trying to put together a series of opposite-land “This Is Your Brain on Drugs” public-service commercials, in which other Silicon Valley luminaries and scientists like himself – a Ph.D. mathematician and physicist – out themselves as having “directed their own intellectual evolution with the use of psychedelics as self-hacking tools.”
But even for the super-high-functioning, psychedelic use isn’t just about optimizing. It also, Costuros says, makes them better people: “What I’ve seen consistently happen is CEOs become a people-centric, people-focused person.” After well-administered and integrated psychedelics, “we’re not gonna see the kind of Donald Trump entrepreneurs that are only about extracting value.” After an ayahuasca journey with EA, an arms magnate left his multimillion-dollar company to build an art and music residency program. Chris Hunter, the 38-year-old inventor of caffeinated malt-liquor beverage Four Loko, went into his trip with EA’s Costuros as a regular former Ohio State University fraternity brother from Youngstown and came out a new man. “Why are you such a dick?” he says he asked himself on ayahuasca. “What if you approached masculinity in a different way – instead of being dominant and overseeing the women in your life, you came from the other side, underneath, fully supporting and lifting women up?” Ayahuasca users whom UCLA’s Grob has researched in other countries “have become better partners to their spouses, better parents to their children, better children to their parents, better employees, better employers, just more responsible overall, bringing a higher level of ethical integrity to everything they do,” he says.
It’s possible that psychedelics could transform a wide array of people. Clinical trials have included subjects across demographic categories, including soldiers and conservatives and the elderly and people who’ve never taken drugs at all before. Some of Dr. X’s patients most definitely do not vote Democrat. But the people who have access to psychedelic treatment underground (or overseas) do tend to have something in common: They are usually well-off. “If I could do it legally, I would not turn away anyone for treatment, if I could be aboveground and I could get them to supportive services [afterward],” Dr. X says. Because of the necessary secrecy and lack of outside support now, he considers it irresponsible to provide journeys to anyone without the time and resources to also pay for integration sessions. (McCourry had to pay for the first journey of his Marine friend, who didn’t have any money; they had to find a wealthy benefactor to cover the next two.) Clients are also mostly white – as are providers. “Sentencing for middle-class white people is a hell of a lot friendlier than for minorities and poor people,” Dr. X says. “It’s a tragedy that people with the most vulnerability, who need it most, we can’t do it with them.”
Doblin, for his part, speculates that the DEA hasn’t cracked down on underground psychedelic therapists because they have more pressing priorities than those trying to heal a select few of the rich, the traumatized and the addicted. It’s also one thing for psychedelics to be popular with millionaires – and some Nobel laureates and business celebrities you’d never believe, Costuros maintains – and the hip participants of the estimated 120 ayahuasca ceremonies that take place in New York City and the Bay Area every weekend. But who knows what might unfold if psychedelic therapy were available to people for whom the status quo doesn’t work so well?
It’s unclear if the current presidential administration, which includes some extremely drug-unfriendly members, will alter or slow the course of possible medicalization. For the time being, the researchers soldier on, and the underground grows. In 2017, K., a therapist with a traditional practice in an Appalachian state, administered her first MDMA journey with a client (with two additional medical professionals on hand for safety); the client, who’d still needed occasional suicide watch stemming from symptoms of complex PTSD despite 16 years of therapy, had brought her the MAPS manual, downloaded off the Internet. “I’m trained to provide the best care to my clients in a way that’s ethical,” K. says, “so if research is backing up that things that are now illegal are really helpful with little to no side effects, especially compared with psychiatric medications, which have a ton of side effects, then it’s something I’m open to.” When dosed, K.’s client, S., talked through a childhood of severe abuse and torture – “but none of it was terrifying,” S. says. “I talked in detail about a lot of horrific shit that happened. Then I said: The thing is, all those things are over, and I know they’re over, and my body knows that everything is going to be OK.”
For Silicon Valley’s Weinstein, the success stories show the importance of advocating for broader access. “If we don’t legalize, study and utilize these plants and other medicines, people who could be saved will die,” he says. “Families will break apart. Parents will continue to bury depressed children who might have been saved by these miraculous agents. Can we bring ourselves to ask if a single professionally administered flood dose of legalized ibogaine could have saved Prince from opioid addiction? Some of these agents are anti-drug drugs . . . and we are still against them. I definitely would like to attack the idea that any of this makes any sense.”
***
So I’d done an underground MDMA session, and a weekend of illegal ayahuasca ceremonies.
The integration, as the months went on, seemed to go a bit smoother.
After ayahuasca, I still had good and bad days. The process was still intense but less earthshaking, either because I’d done the first big, tough layer of processing post-MDMA, or because the journey was different, or I was getting used to being unsettled, or all of the above. Or maybe the smoother time was a little reprieve, since something more shattering was about to happen.
After all the months, all the pieces that had been stirred up were not quite connected. I felt I needed one more sitting with the therapist and the psychedelic that at that point felt right. So I settled into a nest on a little patch of floor, again, in the same house as last time, but in a large, high-ceilinged living room full of moonlight coming in through the windows, and I whispered into a cup of ayahuasca a plea for wholeness, and drank it.
The vision is about me, as a five-year-old. Again.
Psychedelics, they say, will not give you what you want. But they will give you what you need.
I’m shocked to encounter the child again, but ready to see what she shows me this time. The child remembers; I remember, though the realization is slow, and the acceptance is slower.
When I thought I cried the hardest in my life the last time I drank ayahuasca, I was wrong.
I cannot (and would not) begin to encompass, in a brief space, what happens in the next long hours, and the next day, and the next night. The second night, the facilitators have to end the ceremony without me. They bless and blow smoke and perfume on the others because after so many hours, they’re done, but I’m still deep in it. They take turns staying with me and singing. It goes on for so long, with so much shaking and sickness, that to be kind to my nervous system, my facilitator, who in her day job cares for homeless children, puts me in a bathtub of hot water.
I hyperventilate, for a long time, until I don’t. I remember the bathtub-suicide fantasy. The facilitator is sitting next to me, on the floor, putting a soaked hot washcloth against my face, my neck, on my head. I tell her about the fantasy, and that I have come to know, in this bathtub, that I am not going to kill myself.
For a second she thinks I mean I won’t kill myself in her bathtub, rather than in general. Then when she gets it, the two of us laugh about what a drag that would be for her, if I killed myself here, on drugs in her house, both of us joking about it: me, naked, her, trying to help me save my life.
We’re laughing, but this moment is a big deal, and we know it. I am not healed. But I am whole. I can go ahead and get divorced if that turns out to be the right thing, but not because I was violated too many times to bear intimacy. There will be many more spectacularly challenging, professionally supported months of working through the terror and pain imprinted on my body when it was tiny, powerless under adult darkness and weight, but one of the end results has already arrived. The too-many years of my life where I sometimes actively, and maybe always a little bit passively, thought about killing myself are over.
But what has changed, people keep asking me, since the journeys. In my life, what difference did it make?
Every single thing is different, I tell them. Because I was splintered before, but now: I’m here.
This article was originally published on March 9, 2017, by Rolling Stone, and is republished here with permission.
The postcard contained only two words: “Hurry up.”
John Archibald Wheeler, a 33-year-old physicist, was in Hanford, Wash., working on the nuclear reactor that was feeding plutonium to Los Alamos, when he received the postcard from his younger brother, Joe. It was late summer, 1944. Joe was fighting on the front lines of World War II in Italy. He had a good idea what his older brother was up to. He knew that five years earlier, Wheeler had sat down with Danish scientist Niels Bohr and worked out the physics of nuclear fission, showing that unstable isotopes of elements like uranium or soon-to-be-discovered plutonium would, when bombarded with neutrons, split down the seams, releasing unimaginable stores of atomic energy. Enough to flatten a city. Enough to end a war.
After the postcard’s arrival, Wheeler worked as quickly as he could, and the Manhattan Project completed its construction of the atomic bomb the following summer. Over the Jornada del Muerto Desert in New Mexico, physicists detonated the first nuclear explosion in human history, turning 1,000 feet of desert sand to glass. J. Robert Oppenheimer, the project’s director, watched from the safety of a base camp 10 miles away and silently quoted Hindu scripture from the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.” In Hanford, Wheeler was thinking something different: I hope I’m not too late. He didn’t know that on a hillside near Florence, lying in a foxhole, Joe was already dead.
When Wheeler learned the news, he was devastated. He blamed himself. “One cannot escape the conclusion that an atomic bomb program started a year earlier and concluded a year sooner would have spared 15 million lives, my brother Joe’s among them,” he wrote in his memoir. “I could—probably—have influenced the decision makers if I had tried.”
Time. As a physicist, Wheeler had always been curious to untangle the nature of that mysterious dimension. But now, in the wake of Joe’s death, it was personal.
Wheeler would spend the rest of his life struggling against time. His journals, which he always kept at hand (and which today are stashed, unpublished, in the archives of the American Philosophical Society Library in Philadelphia), reveal a stunning portrait of an obsessed thinker, ever-aware of his looming mortality, caught in a race against time to answer not a question, but the question: “How come existence?”
“Of all obstacles to a thoroughly penetrating account of existence, none looms up more dismayingly than ‘time,’” Wheeler wrote. “Explain time? Not without explaining existence. Explain existence? Not without explaining time.”
As the years raced on, Wheeler’s journal entries about time grew more frequent and urgent, their lines shakier. In one entry, he quoted the Danish scientist and poet Piet Hein:
“I’d like to know what this whole show is all about before it’s out.”
Before his curtain came down, Wheeler changed our understanding of time more radically than any thinker before him or since—a change driven by the memory of his brother, a revolution fueled by regret.
***
In 1905, six years before Wheeler was born, Einstein formulated his theory of special relativity. He discovered that time does not flow at a steady pace everywhere for everyone; instead, it’s relative to the motion of an observer. The faster you go, the slower time goes. If you could go as fast as light, you’d see time come to a halt and disappear.
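The slowing Einstein found can be put in a single line. As an aside (this is the standard textbook time-dilation formula, not an equation from this article): if a clock moving at speed \(v\) records an interval \(\Delta\tau\), a stationary observer measures

```latex
\Delta t \;=\; \frac{\Delta \tau}{\sqrt{1 - v^{2}/c^{2}}}
```

As \(v\) approaches the speed of light \(c\), the denominator shrinks toward zero, so a finite stretch of the observer’s time corresponds to a vanishing tick of the moving clock — which is the sense in which time comes to a halt at light speed.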
But in the years following Einstein’s discovery, the formulation of quantum mechanics led physicists to the opposite conclusion about time. Quantum systems are described by mathematical waves called wavefunctions, which encode the probabilities for finding the system in any given state upon measurement. But the wavefunction isn’t static. It changes. It evolves in time. Time, in other words, is defined outside the quantum system, an external clock that ticks away second after absolute second, in direct defiance of Einstein.
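That external clock is visible in the mathematics itself. As an aside (the standard time-dependent Schrödinger equation, which the article describes but does not write out), the wavefunction \(\Psi\) evolves as

```latex
i\hbar \, \frac{\partial}{\partial t}\,\Psi(t) \;=\; \hat{H}\,\Psi(t)
```

The parameter \(t\) is not a property of the quantum system; it is supplied from outside, ticking identically for everyone — the absolute, external clock of the paragraph above.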
That’s where things stood—the two theories in a stalemate, the nature of time up in the air—when Wheeler first came onto the physics scene in the 1930s. As he settled into an academic career at Princeton University, Wheeler was soft-spoken and impossibly polite, donning neatly pressed suits and ties. But behind his conservative demeanor lay a fearlessly radical mind. Raised by a family of librarians, Wheeler was a voracious reader. As he struggled with thorny problems in general relativity and quantum mechanics, he consulted not only Einstein and Bohr but the novels of Henry James and the poetry of Spanish writer Antonio Machado. He lugged a thesaurus in his suitcase when he travelled.
Wheeler’s first inkling that time wasn’t quite what it seemed came one night in the spring of 1940 at Princeton. He was thinking about positrons. Positrons are the antiparticle alter egos of electrons: same mass, same spin, opposite charge. But why should such alter egos exist at all? When the idea struck, Wheeler called his student Richard Feynman and announced, “They are all the same particle!”
Imagine there’s only one lone electron in the whole universe, Wheeler said, winding its way through space and time, tracing paths so convoluted that this single particle takes on the illusion of countless particles, including positrons. A positron, Wheeler declared, is just an electron moving backwards in time. (A good-natured Feynman, in his acceptance speech for the 1965 Nobel Prize in Physics, said he stole that idea from Wheeler.)
The puzzle of existence: “I am not ‘I’ unless I continue to hammer at that nut,” wrote John Archibald Wheeler. Photo from Corbis Images.
After working on the Manhattan Project in the 1940s, Wheeler was eager to get back to Princeton and theoretical physics. Yet his return was delayed. In 1950, still haunted by his failure to act quickly enough to save his brother, he joined physicist Edward Teller in Los Alamos to build a weapon even deadlier than the atomic bomb—the hydrogen bomb. On November 1, 1952, Wheeler was on board the S.S. Curtis, about 35 miles from the island of Elugelab in the Pacific. He watched the U.S. detonate an H-bomb with 700 times the energy of the bomb that destroyed Hiroshima. When the test was over, so was the island of Elugelab.
With his work at Los Alamos complete, Wheeler “fell in love with general relativity and gravitation.” Back at Princeton, just down the street from Einstein’s home, he stood at a chalkboard and gave the first course ever taught on the subject. General relativity described how mass could warp spacetime into strange geometries that we call gravity. Wheeler wanted to know just how strange those geometries could get. As he pushed the theory to its limits, he became fascinated by an object that seemed to turn time on its head. It was called an Einstein-Rosen bridge, and it was a kind of tunnel that carves out a cosmic shortcut, connecting distant points in spacetime so that by entering one end and emerging from the other, one could travel faster than light or backward in time. Wheeler, who loved language, knew that one could breathe life into obscure convolutions of mathematics by giving them names; in 1957, he gave this warped bit of reality a name: wormhole.
As he pushed further through spacetime, he came upon another gravitational anomaly, a place where mass is so densely packed that gravity grows infinitely strong and spacetime infinitely mangled. This, too, he gave a name: black hole. It was a place where “time” lost all meaning, as if it never existed in the first place. “Every black hole brings an end to time,” Wheeler wrote.
***
In the 1960s, as the Vietnam War tore the fabric of American culture, Wheeler struggled to mend a rift in physics between general relativity and quantum mechanics—a rift called time. One day in 1965, while waiting out a layover in North Carolina, Wheeler asked his colleague Bryce DeWitt to keep him company for a few hours at the airport. In the terminal, Wheeler and DeWitt wrote down an equation for a wavefunction, which Wheeler called the Einstein-Schrödinger equation, and which everyone else later called the Wheeler-DeWitt equation. (DeWitt eventually called it “that damned equation.”)
Instead of a wavefunction describing some system of particles moving around in a lab, Wheeler and DeWitt’s wavefunction described the whole universe. The only problem was where to put the clock. They couldn’t put it outside the universe, because the universe, by definition, has no outside. So while their equation successfully combined the best of both relativity and quantum theory, it also described a universe that couldn’t evolve—a frozen universe, stuck in a single, eternal instant.
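The frozen universe shows up directly in the equation’s shape. As a sketch (this is the standard schematic form of the Wheeler-DeWitt equation; the article itself gives no formula), the constraint on the wavefunction of the whole universe reads

```latex
\hat{H}\,\Psi[\text{universe}] \;=\; 0
```

No time derivative appears anywhere: with no clock outside the universe to tick against, there is nothing in the equation that can evolve.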
Wheeler’s work on wormholes had already shown him that, like electrons and positrons, we too might be capable of bending and breaking time’s arrow. Meanwhile his work on the physics of black holes had led him to suspect that time, deep down, does not exist. Now, at the Raleigh-Durham airport, that damned equation left Wheeler with a nagging hunch that time couldn’t be a fundamental ingredient of reality. It had to be, as Einstein said, a stubbornly persistent illusion, a result of the fact that we are stuck inside a universe that only has an inside.
Wheeler was convinced the central clue to the puzzle of existence—and in turn of time—was quantum measurement. He saw that the profound strangeness of quantum theory lies in the fact that when an observer makes a measurement, he doesn’t measure something that already exists in the world. Instead, his measurement somehow brings that very thing into existence—a bizarre fact that no one in his right mind would have bought, except that it had been proven again and again with a mind-melting experiment known as the double-slit. It was an experiment that Wheeler could not get out of his head.
In the experiment, single photons are shot from a laser at a screen with two tiny parallel slits, then land on a photographic plate on the other side, where they leave a dot of light. Each photon has a 50/50 chance of passing through either slit, so after many rounds of this, you’d expect to see two big blobs of light on the plate, one showing the pile of photons that passed through slit A and the other showing the pile that passed through slit B. You don’t. Instead you see a series of black and white stripes—an interference pattern. “Watching this actual experiment in progress makes vivid the quantum behavior,” Wheeler wrote. “Simple though it is in concept, it strikingly brings out the mind-bending strangeness of quantum theory.”
As impossible as it sounds, the interference pattern can only mean one thing: each photon went through both slits simultaneously. As the photon heads toward the screen, it is described by a quantum wavefunction. At the screen, the wavefunction splits in two. The two versions of the same photon travel through each slit, and when they emerge on the other side, their wavefunctions recombine—only now they are partially out of phase. Where the waves align, the light is amplified, producing stripes of bright light on the plate. Where they are out of sync, the light cancels itself out, leaving stripes of darkness.
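The bookkeeping in the paragraph above can be sketched in a few lines of Python. This is a back-of-the-envelope illustration under the usual small-angle assumptions; every number here is an arbitrary, hypothetical choice, not a parameter of any real apparatus:

```python
import cmath
import math

# Illustrative two-slit bookkeeping: the wavefunction splits at the screen,
# one piece per slit, and the pieces recombine at the plate with a phase
# difference that depends on position. All values are arbitrary choices
# made for illustration.
wavelength = 500e-9   # meters (green light)
slit_sep = 50e-6      # slit separation, meters
plate_dist = 1.0      # slit-to-plate distance, meters

intensity = []
for i in range(9):
    x = -0.02 + 0.005 * i                 # position on the plate, meters
    delta = slit_sep * x / plate_dist     # path difference (small angles)
    phase = 2 * math.pi * delta / wavelength
    amp = 1 + cmath.exp(1j * phase)       # two equal-amplitude paths add
    intensity.append(abs(amp) ** 2)

# Stripes of brightness where the two paths arrive in phase, darkness
# where they cancel: the values alternate between about 4 and about 0.
print([round(v, 3) for v in intensity])
```

Where the phase difference is a whole number of wavelengths the amplitudes reinforce; half a wavelength off, they cancel. That alternation is the interference pattern the plate records.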
Things get even stranger, however, when you try to catch the photons passing through the slits. Place a detector at each slit and run the experiment again, photon after photon. Dot by dot, a pattern begins to emerge. It’s not the stripes. There are two big blobs on the plate, one opposite each slit. Each photon took only one path at a time. As if it knows it’s being watched.
Photons, of course, don’t know anything. But by choosing which property of a system to measure, we determine the state of the system. If we don’t ask which path the photon takes, it takes both. Our asking creates the path.
Could the same idea be scaled up, Wheeler wondered. Could our asking about the origin of existence, about the Big Bang and 13.8 billion years of cosmic history, could that create the universe? “Quantum principle as tiny tip of giant iceberg, as umbilicus of the world,” Wheeler scrawled in his journal on June 27, 1974. “Past present and future tied more intimately than one realizes.”
In his journal, Wheeler drew a picture of a capital-U for “universe,” with a giant eye perched atop the left-hand peak, staring across the letter’s abyss to the tip of the right-hand side: the origin of time. As you follow the swoop of the U from right to left, time marches forward and the universe grows. Stars form and then die, spewing their carbon ashes into the emptiness of space. In a corner of the sky, some carbon lands on a rocky planet, merges into some primordial goo, grows, evolves until … an eye! The universe has created an observer and now, in an act of quantum measurement, the observer looks back and creates the universe. Wheeler scribbled a caption beneath the drawing: “The universe as a self-excited system.”
The problem with the picture, Wheeler knew, was that it conflicted with our most basic understanding of time. It was one thing for electrons to zip backward through time, or for wormholes to skirt time’s arrow. It was something else entirely to talk about creation and causation. The past flows to the present and then the present turns around and causes the past?
“Have to come through to a resolution of these issues, whatever the cost,” Wheeler wrote in his journal. “Nowhere more than here can I try to live up to my responsibilities to mankind living and dead, to [his wife] Janette and my children and grandchildren; to the child that might have been but was not; to Joe…” He glued into the journal a newspaper clipping from The Daily Telegraph. The headline read: “Days are Getting Shorter.”
***
In 1979, Wheeler gave a lecture at the University of Maryland in which he proposed a bold new thought experiment, one that would become the most dramatic application of his ideas about time: the delayed choice.
Wheeler had realized that it would be possible to arrange the usual double-slit experiment in such a way that the observer can decide whether he wants to see stripes or blobs—that is, he can create a bit of reality—after the photon has already passed through the screen. At the last possible second, he can choose to remove the photographic plate, revealing two small telescopes: one pointed at the left slit, the other at the right. The telescopes can tell which slit the photon has passed through. But if the observer leaves the plate in place, the interference pattern forms. The observer’s delayed choice determines whether the photon has taken one path or two after it has presumably already done one or the other.
For Wheeler, this wasn’t a mere curiosity. This was a clue to the universe’s existence. It was the mechanism he needed to get his U-drawing to work, a bending of the rules of time that might allow the universe—one that was born in a Big Bang 13.8 billion years ago—to be created right now. By us.
To see the point, Wheeler said, just take the delayed choice experiment and scale it up. Imagine light traveling toward Earth from a quasar a billion light years away. A massive galaxy sits between the quasar and the Earth, diverting the light’s path with its gravitational field like a lens. The light bends around the galaxy, skirting either left or right with equal probability and, for the sake of the thought experiment, arrives on Earth a single photon at a time. Again we are faced with a similar choice: We can center a photographic plate at the light’s arrival spot, where an interference pattern will gradually emerge, or we can point our telescope to the left or right of the galaxy to see which path the light took. Our choice determines which of two mutually exclusive histories the photon lived. We determine its route (or routes) start to finish, right now—despite the fact that it began its journey a billion years ago.
Listening intently in the audience was a physicist named Carroll Alley. Alley had known Wheeler in Princeton, where he had studied under the physicist Robert Henry Dicke, whose research group had come up with the idea of putting mirrors on the moon.
Dicke and his team were interested in studying general relativity by looking at subtle gravitational interactions between the moon and the Earth, which would require exquisitely accurate measurements of the distance to the moon as it swept along its orbit. They realized that if they could put mirrors on the lunar surface, they could bounce lasers off of them and time how long it took the light to return. Alley became the principal investigator of the NASA project and got three mirrors on the moon; the first one was set down in 1969 by Neil Armstrong.
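The timing trick Dicke’s group proposed reduces to one line of arithmetic: the distance is the speed of light multiplied by half the round-trip time. A minimal sketch, using an assumed, merely typical round-trip time rather than any actual measurement:

```python
# Lunar laser ranging in miniature: fire a laser pulse at a mirror on
# the Moon, time the round trip, and halve the light-travel distance.
# The round-trip time below is a typical illustrative value, not data.
c = 299_792_458.0      # speed of light, m/s (exact by definition)
round_trip_s = 2.5     # laser round trip to the Moon, seconds (assumed)

distance_m = c * round_trip_s / 2
print(f"Earth-Moon distance: {distance_m / 1000:.0f} km")
# roughly 375,000 km, within the Moon's actual orbital range
```

The hard part, as Alley’s team found, was not the arithmetic but detecting the handful of photons that make it back, which is why their single-photon techniques mattered.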
Now, as Alley listened to Wheeler speak, it dawned on him that he might be able to use the same techniques he had used for measuring laser light bouncing off the moon to realize Wheeler’s vision in the lab. The light signals returning from the mirrors on the moon had been so weak that Alley and his team had developed sophisticated ways to measure single photons, which was exactly what Wheeler’s delayed choice setup required.
In 1984, Alley—along with Oleg Jakubowicz and William Wickes, both of whom had also been in the audience that day—finally got the experiment to run. It worked just as Wheeler had imagined: measurements made in the present can create the past. Time as we once knew it does not exist; past does not come indelibly before future. History, Wheeler discovered—the kind that brews guilt, the kind that lies dormant in foxholes—is never set in stone.
Still, some fundamental insight eluded Wheeler. He knew that quantum measurement allowed observers in the present to create the past, the universe hoisting itself into existence by its bootstraps. But how did quantum measurement do it? And if time was not a primordial category, why was it so relentless? Wheeler’s journals became a postcard of their own, written again and again to himself. Hurry up. The puzzle of existence taunted him. “I am not ‘I’ unless I continue to hammer at that nut,” he wrote. “Stop and I become a shrunken old man. Continue and I have a gleam in my eye.”
In 1988, Wheeler’s health was wavering; he had already undergone cardiac surgery two years before. Now, his doctors gave him an expiration date. They told him he could expect to live for another three to five years. Under the threat of his own mortality, Wheeler grew despondent, worried that he would not solve the mystery of existence in time to even the score for what he saw as his personal failure to save his brother. Under the heading “Apology,” he wrote in his journal, “It will take years of work to develop these ideas. I—76—don’t have them.”
Luckily, like scientists before them, the doctors had gotten the nature of time all wrong. The gleam in Wheeler’s eye continued to shine, and he hammered away at the mystery of quantum mechanics and the strange loops of time. “Behind the glory of the quantum—shame,” he wrote on June 11, 1999. “Why shame? Because we still don’t understand how come the quantum. Quantum as signal of self-created universe?” Later that year, he wrote, “How come existence? How come the quantum? Is death the penalty for raising such a question—”
Although Wheeler’s journals reveal a driven man on a lonely quest, his influence was widespread. In his last years, Stephen Hawking, along with his collaborator Thomas Hertog of the Institute for Theoretical Physics at the KU Leuven in Belgium, developed an approach known as top-down cosmology, a direct descendant of Wheeler’s delayed choice. Just as photons from a distant quasar take multiple paths simultaneously when no one’s looking, the universe, Hawking and Hertog argued, has multiple histories. And just as observers can make measurements that determine a photon’s history stretching back billions of years, the history of the universe only becomes reality when an observer makes a measurement. By applying the laws of quantum mechanics to the universe as a whole, Hawking carried the torch that Wheeler lit that day back at the North Carolina airport, challenging every intuition we have about time in the process. The top-down approach “leads to a profoundly different view of cosmology,” Hawking wrote, “and the relation between cause and effect.” It’s exactly what Wheeler had been driving at when he drew the eye atop his self-creating universe.
In 2003, Wheeler was still chasing the meaning of existence. “I am as far as can be imagined from being able to speak so reasonably about ‘How come existence’!” he wrote in his journal. “Not much time left to find out!”
On April 13, 2008, in Hightstown, N.J., at the age of 96, John Archibald Wheeler finally lost his race against time. That stubbornly persistent illusion.
Amanda Gefter is a physics writer and author of Trespassing on Einstein’s Lawn: A father, a daughter, the meaning of nothing and the beginning of everything. She lives in Cambridge, Massachusetts.
This article was originally published on November 8, 2018, by Nautilus, and is republished here with permission.
The year she turned 18, Jane Butzner traveled from her hometown of Scranton, Pennsylvania, to the Appalachian hamlet of Higgins, North Carolina, where she encountered a mystery that haunted her for the rest of her life. It was 1934, the midpoint of the Great Depression, a difficult time to hold a job, even an unpaid one. Butzner—later Jacobs—had been laid off from The Scranton Republican after almost a year working without pay as a cub reporter. At her parents’ suggestion, she went to live in the mountains with her aunt Martha Robison, a Presbyterian missionary. Robison had come to Higgins 12 years earlier on a mission and was so staggered by its poverty that she refused to leave. There were no paved roads, school was rarely in session, the illiterate preacher believed the world was flat, and commerce was conducted by barter. Robison built a church and a community center, adopted children, and established classes in pottery, weaving, and woodwork. Nevertheless, the townspeople continued to live a primitive existence in which, as Robison’s niece later said, “the snapping of a pitchfork or the rusting of a plow posed a serious financial crisis.”
Jane Jacobs wrote about Higgins in Cities and the Wealth of Nations (1984) and Dark Age Ahead (2004), but its negative example looms over her entire body of work. Higgins had not always been backwards. In the early 1700s, as Jacobs noted in Cities and the Wealth of Nations, its founders, three English brothers named Higgins and their families, possessed a wide range of knowledge and skills:
spinning and weaving, loom construction, cabinetmaking, corn milling, house and water-mill construction, dairying, poultry and hog raising, gardening, whiskey distilling, hound breeding, molasses making from sorghum cane, basket weaving, biscuit baking, music making with violins …
By the 1920s, the brothers’ descendants had lost nearly all of these, apart from making molasses, which was sold in the county seat, Burnsville, 12 miles away. Most residents had never traveled that far, however, because the only way to get there was by mule on a rough mountain track. Candles were a vanishing luxury. After the few remaining cows died, there would be no more milk or butter. One woman still remembered how to weave baskets, but she was close to death. When Robison suggested building the church with large stones from the creek, the community elders rebuked her. Over generations the townspeople had not only forgotten how to build with stone. They had lost the knowledge that such a thing was possible.
How had Higgins fallen so low? Mountain isolation contributed, but it was not the only factor. The same fate, after all, had befallen much larger cities, and even empires—Rome, the Olmecs, the New Kingdom of Egypt, and perhaps other civilizations, like the people who painted the Lascaux caves, for which we don’t even have names. “Suppose, hypothetically, that the world were to behave like a single sluggish empire in decline,” wrote Jacobs.
Such a thing could happen if cities in too many places stagnated simultaneously or in quick succession. Or it could happen if the world were to become, in fact, one single sluggish empire … We all have our nightmares about the future of economic life; that one is mine.
In the centenary of her birth, Jacobs has been remembered as our Solon of cities: a shrewd theorist who revealed how cities work, why they thrive, and why they fail. Jacobs lived to the age of 89, long enough to see her renegade theories become conventional wisdom. No one questions anymore that lively neighborhoods require diversity of use and function, that more roads lead to more cars, that historic buildings should be preserved, that investment in public transportation reduces traffic and promotes neighborhood activity, that “flexible and gradual change” is almost always preferable to “cataclysmic,” broad-stroke redevelopment.
Urban life was Jacobs’s great subject. But her great theme was the fragility of democracy—how difficult it is to maintain, how easily it can crumble. A city offered the perfect laboratory in which to study democracy’s intricate, interconnected gears and ballistics. “When we deal with cities,” she wrote in The Death and Life of Great American Cities (1961), “we are dealing with life at its most complex and intense.” When cities succeed, they represent the purest manifestation of democratic ideals: “Cities have the capability of providing something for everybody, only because, and only when, they are created by everybody.” When cities fail, they fail for the same reasons democracies fail: corruption, tyranny, homogenization, overspecialization, cultural drift and atrophy.
In a year when American democracy has courted despotism, Jacobs’s work offers a warning and a challenge. Her goal was never merely to enlighten urban planners. In her work she argued, with increasing urgency, that the distance between New York City and Higgins is not as great as it seems. It is not very great at all, and it is shrinking.
***
Four books are united in their determination to undermine the seductive myth that Jacobs, as her biographer Peter L. Laurence puts it, “was primarily a housewife with unusual abilities to observe and defend the domestic surroundings of her Greenwich Village home.” This line was codified in 1962 by The New Yorker’s architecture critic Lewis Mumford in a 30-page review of Death and Life that called the book “a mingling of sense and sentimentality, of mature judgments and schoolgirl howlers.” (If Mumford was responsible for the article’s headline, “Mother Jacobs’ Home Remedies,” he seems to have regretted it. In his 1968 collection, The Urban Prospect, it appears under the moderately less chauvinistic “Home Remedies for Urban Cancer.”) The allegation of amateurism often went unchallenged because most of Jacobs’s considerable body of writing before The Death and Life of Great American Cities had been published without a byline.
University of Pennsylvania Press
Two biographies—Laurence’s Becoming Jane Jacobs, a close, vivid study of Jacobs’s intellectual development, and Robert Kanigel’s broader Eyes on the Street: The Life of Jane Jacobs—as well as an anthology of previously uncollected articles and speeches, Vital Little Plans: The Short Works of Jane Jacobs, and Jane Jacobs: The Last Interview and Other Conversations correct the record. By the time she published her masterpiece, at the age of 45, she had been writing about urban redevelopment for nearly a decade in dozens of lengthy articles for Architectural Forum. Before that she had written about, and in direct service of, American democracy.
After her six-month purgatory in Higgins, Jacobs moved to Brooklyn with the goal of becoming a writer. Within a year, she began writing a succession of Lieblingesque columns for Vogue about New York’s fur, diamond, leather, and flower districts: “All the ingredients of a lavender-and-old-lace story, with a rip-roaring, contrasting background, are in New York’s wholesale flower district.” She signed up for classes at Columbia’s University Extension Program, many of them in economic geography, the interdisciplinary study of economics, history, culture, and the environment. (She would later call herself a “city naturalist.”) In one of these courses, Jacobs likely encountered Henri Pirenne’s Medieval Cities: Their Origins and the Revival of Trade (1925), which explained how cities promoted democratic values, and which she cited frequently throughout her career. But it was a pair of classes in American constitutional law that inspired her first book.
Constitutional Chaff was published by Columbia University Press despite the author’s age (24) and lack of a college degree. It is a compilation of failed proposals from the Constitutional Convention of 1787, such as a third house of Congress and direct election of a Senate that never went out of session. In her introduction, Jacobs argued that the vigorous debate over the text of the Constitution reflected the soul of American democracy as vividly as the ratified document itself did. The losers deserved to be heard, even a century and a half after their arguments had been defeated. It was a sentiment the Founders, at pains to protect the rights of minority factions, would have cheered.
Knopf
After writing about the United States government, Jacobs went to work for it. She spent most of the next decade as a professional propagandist. At the U.S. Office of War Information, which she joined in the fall of 1943, she wrote articles about American history, industry, and politics for placement in the foreign press. Her bureau chief praised “her quick grasp of the propaganda job to be done.” After the war, she was hired by the State Department to join the staff of Amerika, one of the more glorious efforts in auto-mythopoeia that the nation has produced.
The publication, which originated in an agreement between Franklin D. Roosevelt and Joseph Stalin at Yalta to expand cultural diplomacy between the two nations, was designed to resemble Life magazine, with illustrated features about Radio City Music Hall, Benjamin Franklin, Arizona deserts, and the Senate. The circulation was initially limited to 10,000 copies, not nearly enough to satisfy demand; Laurence writes that though the official price in 1946 was 10 rubles (83 cents), it sold on the black market for 1,000. (The Soviets produced a counterpart publication, Soviet Life, but despite its editors’ best efforts—“Leonid I. Brezhnev’s Reminiscences,” “A Guide to the 15 Union Republics,” “Tashkent, Seattle’s Sister City”—it somehow failed to attract a commensurate following in the U.S.) In a Manhattan office building near Columbus Circle, Jacobs wrote articles about American cafeterias, the World Series, and modern art, and modeled maternity clothes for a feature on women’s fashion.
She was sensitive to reader opinion. Kanigel mentions one criticism that may have helped shape her later career. In 1949, V. Kusakov of the Academy of Architecture in the U.S.S.R. complained in a Soviet publication about two articles, uncredited but written by Jacobs, praising the work of Frank Lloyd Wright and other modern architects. Kusakov attacked Amerika for neglecting to cover the more important story: “the ever increasing housing crisis which the cities of America are experiencing.” American capitalism, Kusakov wrote, “dooms the majority of the population to a negative existence and death in ill-smelling cesspools, in slums deprived of air, sunlight, and trees or shrubs.”
The column unsettled Jacobs, who responded with a thorough investigation of life in America’s inner-city neighborhoods. In her article, she proposed slum-clearing and the construction of high-rise apartment towers, remedies she would later excoriate. Kanigel suggests that Jacobs was even then not entirely satisfied with these arguments. “This seemingly narrow question would slip out of its original borders,” he writes, “become something big to chew on, broaden into one of the biggest questions of all: What, really, was the Good Life?” How, in other words, could urban policy promote life, liberty, and the pursuit of happiness?
Random House
For her devoted, thoughtful work in service of American mythos, Jacobs came under federal investigation for suspected ties to the Soviet Union. At the State Department, she’d had the misfortune of serving under Alger Hiss. When she tried to travel to Siberia in 1945 to report a freelance article, she asked Hiss for help securing a visa. As Hiss was already under secret investigation for espionage, the request roused the suspicion of the FBI. Jacobs, Laurence writes, was likely one of the names on Joseph McCarthy’s infamous list of known Communists “working in and shaping policy in the State Department.” J. Edgar Hoover demanded to oversee her investigation himself. During the course of four years, Jacobs was required to sign multiple Oaths of Office, declare that she was not a Communist or a Fascist, and endure a series of interrogations by the State Department’s Loyalty Security Board. At least 13 of her friends, family members, and colleagues were interviewed by FBI agents. One informant said that he believed her to be a Communist sympathizer because she lived in Greenwich Village.
By 1952, she’d had enough. After yet another inquiry—requesting her views on, among other things, the Marshall Plan, the United Public Workers of America, and atomic energy—she sent the board an 8,000-word defense that remains the most powerful declaration of her moral convictions. “I was brought up to believe that there is no virtue in conforming meekly to the dominant opinion of the moment,” she wrote, defending her integrity and shaming her inquisitors.
I was encouraged to believe that simple conformity results in stagnation for a society, and that American progress has been largely owing to the opportunity for experimentation, the leeway given initiative, and to a gusto and a freedom for chewing over odd ideas. I was taught that the American’s right to be a free individual, not at the mercy of the state, was hard-won and that its price was eternal vigilance, that I too would have to be vigilant.
In The Death and Life of Great American Cities, Jacobs sought to translate these principles of individual liberty into urban design.
***
This was not an intuitive process. What ratio of green space to residential acreage was most conducive to individual liberty? Did tenement buildings or high-rise towers create better opportunities for experimentation? What block length, what width of sidewalk, what frequency of stoplights best encouraged the chewing-over of odd ideas?
She pursued this line of questioning at Architectural Forum, the leading architectural publication of its day, after resigning from the State Department in 1952. The magazine was edited by Douglas Haskell, whom Laurence identifies as the crucial figure in Jacobs’s intellectual maturation. Like Jacobs, Haskell lacked an academic pedigree and paired “an anti-utopian streak” with a faith in the power of architecture to bring about social change, for better and for worse. Shortly after her hire, Haskell announced that Forum would intensify its emphasis, already significant, on “the problems of cities.” Was urban renewal—which James Baldwin would later call “Negro removal”—improving the slums of America’s great cities? If not, what else should be done?
Melville House
Jacobs wrote numerous un-bylined articles in favor of theories she would later ridicule. She lionized her future nemeses, the city planners, writing that “the first—the most elementary—lesson for downtown is simply the importance of planning.” She continued to argue in favor of superblocks and for demolishing entire neighborhoods of blighted buildings. In a 24-page feature about shopping centers, she called for downtowns to model themselves after suburban malls. The flaw in her thinking was not purely ideological; she did write critically of the “homogenized simplicity” of new developments and praised planners who made a token effort to preserve older buildings. But poor journalistic habits cost her. She didn’t always travel to see the cities and building projects she wrote about, relying instead on the sketches, photographs, prospectuses, and blueprints sent by architects to the magazine. She violated one of the eventual maxims of Death and Life, that “no other expertise can substitute for locality knowledge in planning.”
A revelation came during a tour of Philadelphia—a tour she may well have taken only after she published a laudatory essay about the city’s redevelopment efforts in 1955. Her guide was Philadelphia’s planning-commission director, Edmund Bacon, “the grand poobah” of American planning, who would later appear on the cover of Time as the face of urban renewal. Bacon took Jacobs on a before-and-after tour of his city. “Before” was represented by a street in a condemned black neighborhood; “after” was a towering new housing project. Before Street, Kanigel writes, “was crowded with people, people spilling out onto the sidewalk, sitting on stoops, running errands, leaning out of windows.” After Street was flat and deserted, with the exception of a lone boy kicking a tire.
“Not only did [Bacon] and the people he directed not know how to make an interesting or a humane street,” Jacobs later said, “but they didn’t even notice such things and didn’t care.” In early 1956 she took a series of tours of East Harlem led by William Kirk, a community activist and the director of the Union Settlement Association. He showed her how the construction of 10 housing projects had destroyed not only the neighborhood’s small businesses but the communities they had sustained. A more personal incitement came from Robert Moses’s long-standing plans to redevelop Jacobs’s own beloved neighborhood of Greenwich Village. Moses proposed extending Fifth Avenue through Washington Square Park in the form of a sunken highway, and razing 26 blocks to clear space for a pair of gargantuan housing projects. When Douglas Haskell asked Jacobs to take his speaking slot at an urban-design conference at Harvard in April 1956, before an audience of the nation’s most powerful planners and critics, she let them have it.
Her 1,500-word speech, a version of which appears in Vital Little Plans, became the basis for The Death and Life of Great American Cities. Her main argument was Kirk’s: Small neighborhood stores, ignored by the planners in their grim demolition derby, were essential social hubs. She added that sidewalks, stoops, laundries, and mailbox areas were also indispensable centers of community activity, and that sterile, vacant outdoor space served nobody. “The least we can do,” she said, “is to respect—in the deepest sense—strips of chaos that have a weird wisdom of their own.”
That “weird wisdom” was the wisdom of crowds: the customs and habits that people in cities, left to their own devices, developed while living in close proximity to one another. The planners had been guided by aesthetic concerns, favoring clean lines, geometric shapes, and vast boulevards that were beautiful so long as they were seen from the window of an airplane. But Americans didn’t need a new utopia. They already had a system that, while messy and imperfect, produced a thriving society.
In Death and Life, Jacobs converted democratic values into design policy. This was no magic trick—it was achieved through close observation. Through better reporting, she became a better theorist. The vitality that planners like Bacon and Moses hoped to create already existed. She had seen it herself, not only in the tenements of East Harlem but in Greenwich Village. The two neighborhoods—one doomed, one celebrated for its bohemian spirit—were more alike than not, just as the blocky brick towers of Stuyvesant Town, the celebrated middle-class development where Jacobs’s sister lived, were as dreary as Harlem’s carceral George Washington Houses.
Reduced to a word, Jacobs’s argument is that a city, or neighborhood, or block, cannot succeed without diversity: diversity of residential and commercial use, racial and socioeconomic diversity, diversity of governing bodies (from local wards to state agencies), diverse modes of transportation, diversity of public and private institutional support, diversity of architectural style. Great numbers of people concentrated in relatively small areas should not be considered a health or safety hazard; they are the foundation of a healthy community. Dense, varied populations are “desirable,” Jacobs wrote,
because they are the source of immense vitality, and because they do represent, in small geographic compass, a great and exuberant richness of differences and possibilities, many of these differences unique and unpredictable and all the more valuable because they are.
***
James Madison couldn’t have put it better, though he tried. He addressed the issue in Federalist No. 10, his effort to answer one of the most vexing problems facing the Framers of the Constitution: how to safeguard their new democracy against insurrection or despotism. Madison argued that as you increase the “variety of parties and interests” contained within a republic, “you make it less probable that a majority of the whole will have a common motive to invade the rights of other citizens.” Jacobs saw that the same principle held in cities. It is not a coincidence that she described city planners, and the businessmen and politicians who enabled them, as tyrants: “neurotic,” “destructive,” and “impossibly arrogant.”
“We need all kinds of diversity,” Jacobs concluded in Death and Life, “so the people of cities can sustain (and further develop) their society and civilization.” In later books, particularly The Economy of Cities (1969) and Cities and the Wealth of Nations (1984), she expanded this point, arguing that the fate of a civilization rested on the vitality of its major cities. In her final book, published in 2004, she applied her analysis to our own civilization. What was the current condition of our great cities? How did America stack up against Rome, Mesopotamia, Babylon? What future could we expect if we continued on our current path? She called the book Dark Age Ahead.
Her Cassandra tale is given greater credibility by the fact that many of her direst predictions have already been realized. Within three years of publication, “the miracle of money growing on houses” was revealed to be a mirage that threatened to take down much of the financial system with it. Gentrification, which Jacobs first warned against in Death and Life, was exacerbated in New York City and elsewhere when local governments failed to set aside sufficient affordable public housing. The Total Information Awareness program, a government data-mining surveillance system that she warned against on the book’s final page, morphed into PRISM, the classified surveillance program exposed by Edward Snowden. These seemingly disparate dangers, Jacobs argued, arose from a common cause: a moral weakening, or drift, accelerated by cultural rot.
In her comparative study of fallen empires, Jacobs identifies common early indicators of decline: “cultural xenophobia,” “self-imposed isolation,” and “a shift from faith in logos, reason, with its future-oriented spirit … to mythos, meaning conservatism that looks backwards to fundamentalist beliefs for guidance and a worldview.” She warns of the profligate use of plausible denial in American politics, the idea that “a presentable image makes substance immaterial,” allowing political campaigns “to construct new reality.” She finds further evidence of our hardening cultural sclerosis in the rise of the prison-industrial complex, the prioritization of credentials over critical thinking in the educational system, low voter turnout, and the reluctance to develop renewable forms of energy in the face of global ecological collapse.
No reader of Jacobs’s work would be surprised by the somewhat recent finding by a Gallup researcher that Donald Trump’s supporters “are disproportionately living in racially and culturally isolated zip codes and commuting zones.” These zones are latter-day incarnations of Higgins: marooned, amnesiac, homogenous, gutted by the diminishment of skills and opportunities. One Higgins is dangerous enough, for both its residents and the republic to which it belongs. But the nation’s Higginses have proliferated to the point that their residents have assumed control of a major political party.
In the foreword to the 1992 Modern Library edition of Death and Life, Jacobs likens cities to natural ecosystems. “Both types of ecosystems,” she writes, “require much diversity to sustain themselves … and because of their complex interdependencies of components, both kinds of ecosystems are vulnerable and fragile, easily disrupted or destroyed.” Dark Age Ahead reminds us how many powerful, technologically advanced cities—and empires—have come before us, only to fade to dust. When they fall, they do not recover. The vanished way of life “slides into an abyss of forgetfulness, almost as decisively as if it had not existed.” Karl Marx, who spent his life studying the subject, observed that history repeats itself, the first time as tragedy, the second time as farce. This topsy-turvy election year makes one wonder whether he might have gotten that backwards. We’ve had farce, that much is certain. What will the next time bring?
Nathaniel Rich is the author of Odds Against Tomorrow and King Zeno.
This article was originally published on November 11, 2016, by The Atlantic, and is republished here with permission.
I was coaching Sanjay,* a leader in a technology firm who felt stuck and frustrated. He wasn’t where he wanted to be at this point in his career.
He had come to our coaching session, as usual, prepared to discuss the challenges he was currently facing. This time, it was his plan for conducting compensation conversations with each of his employees. After a few minutes of listening to him talk through his plans, I interrupted him.
“Sanjay, you’ve had these kinds of conversations before, right?” I asked.
“Yes,” he said.
“And, for the most part, you know how to do them, right?”
“Yes,” he said again.
“Great. Let’s talk about something else.”
“But this is what’s on my mind right now,” he protested. “It’s helpful to think it through with you.”
“I’m glad it’s helpful, Sanjay,” I said. “But you don’t want me to be merely helpful. You want me to be transformational. And focusing on what’s top of mind for you right now is not going to get us there.”
You see, the reason Sanjay is stuck — and the reason many of us feel that way — is that we focus on what’s present for us at any particular moment.
On the other hand, what most of us want is to move forward. And, by definition, paying attention to the present keeps us where we are. So, sure, I can help Sanjay be a better “present” Sanjay. But I will have a much greater impact if I help him become a successful “future” Sanjay.
It’s a familiar story: You’re busy all day, working non-stop, multitasking in a misguided attempt to knock a few extra things off your to-do list, and as the day comes to a close, you still haven’t gotten your most important work done.
Being busy is not the same as being productive. It’s the difference between running on a treadmill and running to a destination. They’re both running, but being busy is running in place.
If you want to be productive, the first question you need to ask yourself is: Who do I want to be? Another question is: Where do I want to go? Chances are that the answers to these questions represent growth in some direction. And while you can’t spend all your time pursuing those objectives, you definitely won’t get there if you don’t spend any of your time pursuing them.
If you want to be a writer, you have to spend time writing. If you want to be a sales manager, you can’t just sell — you have to develop your management skill. If you want to start a new company, or launch a new product, or lead a new group, you have to spend time planning and building your skills and experience.
Here’s the key: You need to spend time on the future even when there are more important things to do in the present and even when there is no immediately apparent return to your efforts. In other words — and this is the hard part — if you want to be productive, you need to spend time doing things that feel ridiculously unproductive.
I want to expand my writing abilities, so I have started waking up at 5:30 in the morning to write fiction. Unfortunately — and I am not being humble here — I am a terrible fiction writer. So my writing time feels painfully unproductive. I can’t sell it. I can’t use it. I can’t share it. Honestly, I can hardly bear to read it out loud. I have such a long list of things that actually need to get done, it is almost impossible to justify losing sleep in order to do something so unrelated to my present challenges. I know this is how my clients feel when I ask them to put aside their immediate concerns and focus on more distant challenges.
A question I hear a lot is: What about all the things I actually need to get done? Don’t I need to get through my cluttered email box, my pressing conversations, my project plans in order to create space to focus on my future self?
Nope.
That’s a trick your busy self plays on you to keep you away from the scary stuff you’re not yet good at and that isn’t yet productive. Sometimes you need to be irresponsible with your current challenges in order to make real progress on your future self. You have to let the present just sit there, untended. It’s not going away and will never end. That’s the nature of the present.
You may not end up with an empty email inbox. You may not have the perfect compensation conversations. You may not please everyone. But, as your coach, I’m willing to bet that you will do those things well enough.
It’s the other stuff I worry about. The wildly important stuff that never gets done because there’s not time or it’s not urgent or it’s too hard or risky or terrifying. That’s the stuff I want to help you work on.
Even though Sanjay is delighted at the idea of focusing on his future self, he resists it because it doesn’t feel as good as solving his current challenges. He isn’t as skilled at it yet. That’s why it’s his future.
And that is exactly why he needs to focus on it.
This article was originally published on March 28, 2016, by Harvard Business Review, and is republished here with permission.
“How many kingdoms know us not!” —Blaise Pascal, Thoughts (1670)
One summer’s day in 1950, the great Italian-American physicist Enrico Fermi was having lunch with the physicists Edward Teller, Emil Konopinski, and Herbert York at Los Alamos when the conversation turned to a flood of recent UFO sightings all over the United States. There were also, coincidentally, reports of trashcans going missing in New York City at the time. A New Yorker cartoon connected the dots and accused interstellar visitors of the misdeed. In the relaxed atmosphere of that lunchtime conversation, Fermi remarked that the New Yorker’s solution, by proposing a single common cause of two independent empirical phenomena, was in the very best traditions of scientific methodology.
The lunchtime chat stayed on the topic of ET. While they obviously didn’t take seriously the reports of flying saucers, Fermi and his companions began to earnestly discuss things like interstellar—and even superluminal—travel. Then, after some delay—and, one might imagine, in the midst of some tasty dish—Fermi allegedly asked his famous question. Where, indeed, is everybody? Where are the extraterrestrials?
Photo by Yan Wang / Flickr.
The Milky Way galaxy is about 100,000 light-years from edge to edge, Fermi reasoned, which means that a star-faring species would need about 10 million years to traverse it, even if moving at a very modest velocity of 1 percent of the speed of light. Since the galaxy is more than a thousand times older than this, any technological civilization will have had a lot of time in which to expand and colonize the whole galaxy. If one species were to fail in this endeavour, another wouldn’t. Consequently, if intelligent species were out there in any appreciable numbers, they would have been here already. And yet, we do not see them on Earth or in the solar system. For Fermi and many thinkers since, this constituted a paradox.
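Fermi’s back-of-the-envelope figure is easy to verify. A minimal sketch, assuming only the numbers given above (the 13-billion-year galactic age is an approximate figure used to illustrate the “more than a thousand times older” comparison, not a value from the article):

```python
# Fermi's crossing-time estimate: time = distance / speed.
# At a fraction f of light speed, covering d light-years takes d / f years.
GALAXY_DIAMETER_LY = 100_000       # light-years, edge to edge
SPEED_FRACTION_OF_C = 0.01         # a "very modest" 1% of light speed

crossing_time_years = GALAXY_DIAMETER_LY / SPEED_FRACTION_OF_C  # 10 million years

GALAXY_AGE_YEARS = 13e9            # rough age of the Milky Way (assumed here)
age_ratio = GALAXY_AGE_YEARS / crossing_time_years

print(f"Crossing time: {crossing_time_years:.0e} years")  # 1e+07
print(f"The galaxy is ~{age_ratio:.0f} times older than that")
```

The ratio comes out to roughly 1,300, which is the sense in which the galaxy is “more than a thousand times older” than the time any patient star-faring species would need to cross it.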
The volume of scientific literature that Fermi’s paradox has inspired testifies to its serious and provocative nature. When you consider fiction and movies, it’s clear that Fermi’s paradox has become an important part of contemporary culture, challenging us to think more deeply about our place in the cosmos.
But, still, the paradox remains incompletely understood by science, incompletely digested by popular culture, and even actively resisted or deliberately ignored. In this sense, it has become a type of Rorschach test: Our attitudes to the paradox tell us something about ourselves.
***
The strong version of Fermi’s paradox (it is only proper and intellectually honest to tackle the strongest version of any particular scientific problem) doesn’t just ask why there aren’t aliens here on Earth. It also asks why we don’t see any manifestations or traces of extraterrestrial civilizations anywhere in our past light cone—that is, the whole volume of space and time visible to us, extending billions of years into the past, to the epoch of the earliest galaxies.
The strong Fermi’s paradox became even stronger, so to speak, in 2001, with the work of Charles Lineweaver and collaborators on the age distribution of terrestrial planets in the Milky Way. Their calculations show that Earth-like planets in our galaxy began forming more than 9 billion years ago, and that their median age is 6.4 ± 0.9 billion years, which is significantly greater than the age of the Earth and the solar system. This means that a large majority of habitable planets are much older than Earth. If we believe that humans and the planet we live on are not particularly special compared to other civilizations on other planets, we would conclude that the stage of the biosphere and technology on other occupied planets must be, on average, older than the corresponding stages we see on Earth. If we humans are now on the cusp of colonizing our solar system, and we are not much faster than other civilizations, those civilizations should have completed this colonization long ago and spread to other parts of the galaxy.
We presume ourselves to be so special that the question “Where is everybody as complex and important as ourselves (or more)?” cannot be taken seriously.
Another piece of recent science amplifies Fermi’s paradox even further. Geochemical and paleobiological research has recently confirmed that the oldest traces of living beings on Earth are at least 3.8 billion years old, and probably as old as 4.1 billion. The Earth itself is only 4.5 billion years old. While the mechanism of abiogenesis (the origination of life) is still largely unknown, the evidence of abiogenesis occurring early in the Earth’s history seems incontrovertible. The consequences are rather dramatic: If life is quick to form after its host planet has formed, we get good probabilistic support for the existence of simple life on many planets in the Milky Way, and potentially complex life on some of them.
Now that we know that the Earth is a latecomer, and believe the foundations of life have the power to take hold quickly, Fermi’s paradox is more puzzling than ever. In the evocative words of physicist Adrian Kent: It’s just too damn quiet in the local universe.
In spite of all this, Fermi’s paradox is not only downplayed and ignored by a large part of the scientific community, but also mocked and even censured. Distinguished SETI researchers, like Frank Drake or Seth Shostak, claim in their memoirs that they had not heard about Fermi’s paradox until very recently and that it should not be taken seriously. Astrobiology, one of the premier journals in the field, has recently instituted a policy of not considering manuscripts dealing with Fermi’s paradox, including even short communications and book reviews. There are many scientists who, like the British astronomer John Gribbin, are happy to proclaim that there is no paradox whatsoever, since “we are alone, and we had better get used to it.”
In principle there may be several reasons for this attitude. But in my opinion one underlies all of them: We humans still think we’re special.
***
In 1543, two revolutionary books transformed our view of both the universe and ourselves. One, written by Flemish physician Andreas Vesalius, was titled De humani corporis fabrica (On the Fabric of the Human Body), and it laid the foundations of modern medical science by proving once and for all that our bodies are not mystical objects but physical systems amenable to scientific study—and not very different from the bodies of animals. The other, entitled De revolutionibus orbium coelestium (On the Revolutions of the Heavenly Spheres) and written by a little-known Polish polymath named Nicolaus Copernicus, was of even greater significance. It overthrew the cosmological paradigm that had reigned for almost 2,000 years and was supported by the political and religious authorities of the day. In doing so, Copernicus inadvertently redefined the very word revolution, from a purely technical term inside astronomy to a household label for any dramatic change in any field.
Photo by Ann Ronan Pictures/Print Collector/Getty Images.
The Copernican Revolution, sometimes called the Scientific Revolution, was not only about whether the Earth rests at the center of the universe with the sun and planets moving around it. It was also about whether humans were the most important objects in the universe. In a sense, the “de-centering” of the Earth brought about by the Copernican Revolution was a consequence, rather than a cause, of a new way of thinking about ourselves: We were becoming a part of nature, rather than its exalted goal. If Earth is a typical planet revolving around a typical star (and, as we learned much later, in a typical galaxy), then there is no scientific reason to assign special importance to ourselves. Copernicanism, broadly understood, is the assertion that humans are nothing special across space, time, and other more abstract parameter spaces. It has enabled tremendous advances in science since the times of Vesalius and Copernicus by combating unsupported anthropocentrism.
But our institutions are still profoundly anthropocentric. We deny even the most basic rights to other parts of nature, including our close animal relatives, some of which share more than 97 percent of our DNA. We pollute our environment with close to zero regard for the well-being of its ecosystems—and we fight pollution only if and when it inconveniences us. Scientific experiments on human beings are not only illegal, but are considered barbarous even when they could provide some useful information. This is in sharp contrast to our practice of experimenting on lab animals, hunting foxes, or killing bulls in the arena. Even in the purely abstract realms of knowledge, one often hears the complaint that physical sciences are “cold” and “inhuman” exactly because they are less permeated by anthropocentrism than, say, philosophy or the humanities or arts. Almost 500 years after the onset of the Copernican revolution, we have a relic belief in the exalted nature of the human mind.
Where is everybody? Where are the extraterrestrials?
These remnants encourage us to resist animal rights, and the reality of anthropogenic climate change—and Fermi’s paradox. We presume ourselves to be so special that the question “Where is everybody as complex and important as ourselves (or more)?” cannot be taken seriously. ET isn’t here, we think, because there’s no equal to us. After Copernicus came Darwin’s revolution, and then Freud’s, delivering blows to our illusions of uniqueness and grandiosity within the biological and mental domains, respectively. It’s not that Fermi’s paradox belongs in this progression—it doesn’t explode any myth of our specialness—but appreciating its full import relies on the perspective that Copernicus, Darwin, Freud, and others have given us.
That clearly is a step too far. Many of us choose to ignore Fermi’s paradox, or even fight it, because it requires too complete an acceptance of our cosmic mediocrity. We would rather secretly believe we are special than confront the real consequences of the paradox—consequences like, for example, intelligence being a maladaptive trait, or our universe being a simulation, or us living in a cosmic zoo. Some of us even go so far as to argue that we have become a navel-gazing, self-absorbed civilization, without much chance of developing a sustained cosmic presence and industrial bases all over the solar system. Destroying what Olaf Stapledon and R. Buckminster Fuller have dubbed the cosmic vision of humanity’s future lets us duck out of the Fermi’s paradox conversation. If we can’t do it, our extraterrestrial peers can’t do it either and we shouldn’t waste time and money searching for them. This subtle form of anthropocentrism leads us to a very dangerous path, since it impedes the best—and ultimately only—prospect for humanity to achieve its cosmic potential. Sir Fred Hoyle put it nicely in 1983:
Many are the places in the Universe where life exists in its simplest microbial forms, but few support complex multicellular organisms; and of those that do, still fewer have forms that approach the intellectual stature of man; and of those that do, still fewer again avoid the capacity for self-destruction which their intellectual abilities confer on them. Just as the Earth was at a transition point 570 million years ago, so it is today. The spectre of our self-destruction is not remote or visionary. It is ever-present with hands already upon the trigger, every moment of the day. The issue will not go away, and it will not lie around forever, one way or another it will be resolved, almost certainly within a single human lifetime.
The current generation is likely to live to the 500-year jubilee of De revolutionibus orbium coelestium in 2043. Let’s hope that, by then, we will have completed the Copernican revolution and embraced the hard and deep problems that modern astrobiology is posing. We are now living at the tipping point—the very moment when firm empirical resolution of our biggest and oldest puzzles is in sight. We should not miss that opportunity by fighting for an outdated vision of ourselves as pinnacles of complexity in the universe. Instead, we should reason as if we were near typical for our given epoch. Only then shall we have a fighting chance of piercing the Great Silence.
Milan Ćirković is a senior research associate at the Astronomical Observatory of Belgrade and an assistant professor in the Department of Physics at the University of Novi Sad in Serbia and Montenegro.
This article was originally published on August 2, 2018, by Nautilus, and is republished here with permission.
Many parents are concerned with their child’s seemingly obsessive video game play. Fortnite, the most recent gaming phenomenon, has taken the world by storm and has parents asking whether the shooter game is okay for kids.
The short answer is yes, Fortnite is generally fine. Furthermore, parents can breathe easier knowing that research suggests gaming (on its own) does not cause disorders like addiction.
However, there’s more to the story. A comprehensive answer to the question of whether video games are harmful must take into account other factors. Fortnite is just the latest example of a pastime some kids spend more time on than is good for them. But parents need to understand why kids play as well as when to worry and when to relax.
Addiction, Really?
The word “addiction” gets tossed around quite a bit these days. It’s not uncommon to hear people say that they are addicted to chocolate or shoe shopping, but if it isn’t causing serious harm and impairment to daily function, it isn’t an addiction. It’s an overindulgence.
This isn’t just semantics. An addiction involves a lack of control despite adverse consequences. Parents may worry their kids are addicted, but if the child can pull themselves away from a game to join the family for a conversation over dinner, and shows interest in other activities, like sports or socializing with friends, then they are not addicted.
Generally, parents panic when their kid’s video game playing comes at the expense of doing other things like studying or helping around the house. But let’s be honest, kids have been avoiding these activities for ages. Equally true is the fact parents have been complaining about their unhelpful children well before the first video game was plugged into its socket.
The real question should be what is it about the special draw of gaming that makes it the preferred pastime of so many millions of kids? What makes it so difficult for even non-addicted kids to step away from video games sometimes?
The answer has to do with the way games address basic psychological needs.
Fortnite, like any well-designed video game, satisfies what we are all looking for. According to Drs. Edward Deci and Richard Ryan, people need three things to flourish. We look for competence — the need for mastery, progression, achievement, and growth. We need autonomy — the need for volition and freedom of control over our choice. And finally, we strive for relatedness — the need to feel like we matter to others and that others matter to us. Unfortunately, when considering the state of modern childhood, many kids aren’t getting enough of these three essential elements.
School, where kids spend most of their waking hours, is in many ways the antithesis of a place where kids feel competence, autonomy, and relatedness. There, kids are told what to do, where to be, what to think, what to wear, and what to eat. Alarms and bells orchestrate their movements with assembly-line precision while teachers opine on topics students couldn’t care less about. If they’re bored and want to go, they’re punished. If they want to learn something else, they’re told to be quiet. If they’d like to go deeper on a topic, they’re prodded to stay on track. Of course, this isn’t every student’s experience, and different countries, schools, and teachers use different approaches to educate kids. But while some argue discipline and control provide structure, it’s clear why teachers and students might struggle with motivation in the classroom.
Gamers feel competence when they practice strengths to achieve their aims. In a game, players have the autonomy to call the shots, do what they want, and experiment with creative strategies to solve problems. Games are also social outlets where players can feel relatedness. In Fortnite, for example, players often meet in the virtual environment to chat and socialize because doing so in the real world is often inconvenient or off limits. Whereas previous generations were allowed to simply play after school and form close social bonds, many kids today are raised by fearful and overworked parents who insist their kids either attend a regimented afterschool program or stay under lock and key at home.
We shouldn’t be surprised when the confinement kids find themselves in today often yields behaviors we don’t understand and don’t like. Games satisfy psychological needs that other areas of life are not meeting.
Of course, none of this is to say video games are a good substitute — quite the opposite. While a well-designed game attempts to satisfy these needs, it can’t come close to the deep satisfaction real life and real human connection can provide.
No game can give a child the feeling of competence that comes from accomplishing a difficult task or learning a new skill of their own accord. Fortnite can’t compete with the exhilaration that comes from the autonomy of exploring reality, where a child is free to ask questions and unlock mysteries in the real world. No social media site can give a kid the sense of relatedness, safety, and warmth that comes from an adult who loves that child unconditionally just the way they are, no matter what, and takes the time to tell them so.
Some kids suffer from gaming disorders, but such dependencies are often coupled with pre-existing conditions including problems with impulse control. This, of course, does not absolve companies of their moral responsibility to help problem gamers. It’s time they implement policies to identify and help those with disorders.
For most children, however, understanding the deeper truth behind what kids are getting out of games empowers parents to take steps to give kids more of what they need. It also helps parents get into a state of mind to talk rationally about overuse instead of succumbing to the hysterics and moral panic our own parents resorted to when they tried to force us to stop listening to rock ’n’ roll, watching MTV, playing pinball, or reading comic books. Video games are this generation’s outlet, and some kids use them as a tool to escape the same way some of us use our own flavor of dissociative devices to tune out reality for a while.
Instead of repeating the mistakes of previous generations with heavy-handed tactics, let’s understand the psychological source of the problem. Ultimately, parents’ goal should be to help kids learn strategies for coping with overuse on their own so that they do what’s good for them even when we’re not around. By teaching self-regulating habits, promoting intentional gaming, and helping kids find suitable alternatives, parents can help kids find what they are really looking for.
Andrew Kinch is the founder of GameAware.
This article was originally published on July 31, 2018, by Nir Eyal, and is republished here with permission.
A “first” in Michael Tilson Thomas’ last season as music director of the San Francisco Symphony includes the world premiere of his composition “Meditations on Rilke” on a program next week called “MTT & Mahler: Love and Lyricism.”
“I first read Rilke’s poems in English translation 30-40 years ago, loved them, and started reading and even memorizing them in German,” says the conductor, 75. “When you recite poetry, you can hear its music, and now that I am fully reconnecting with my ‘composer self,’ the music of these poems has turned into the compositions we will present.”
MTT joins Alban Berg, Paul Hindemith, Anton Webern, Arnold Schoenberg and Peter Lieberson among composers who have set works by Rilke (1875-1926) — a Bohemian-Austrian poet and novelist described as “one of the most lyrically intense German-language authors” — to music.
“Meditations on Rilke” features artist-in-residence mezzo-soprano Sasha Cooke and bass-baritone Ryan McKinny singing a six-part cycle set to the poems “Herbsttag” (“Autumn day”); “Das Lied des Trinkers” (“The drinkers’ song”); “Immer wieder” (“Again and again”); “Imaginärer Lebenslauf” (“Imaginary life journey”); “Herbst” (“Autumn”); and “Ich lebe mein Leben in wachsenden Ringen” (“I live my life in ever-widening circles”), which, translated into English, goes:
I live my life in ever-widening circles
that stretch themselves out over all the things.
I won’t, perhaps, complete the last one,
but I intend on trying.
I circle around God, around the ancient tower,
and I circle for thousands of years;
and I don’t know, yet: am I a falcon, a storm,
or a mighty song.
Rilke’s poetry also is in Taika Waititi’s 2019 movie “Jojo Rabbit,” with a German-language rendition of David Bowie’s “Heroes” played over the Rilke quote: “Let everything happen to you/Beauty and terror/Just keep going/No feeling is final.”
MTT has been composing throughout his long conducting career, including setting poetry by Walt Whitman, sung at its premiere by Thomas Hampson; and Emily Dickinson, premiered by Renée Fleming.
In 1991, he and the New World Symphony presented benefit concerts for UNICEF featuring Audrey Hepburn as narrator of MTT’s “From the Diary of Anne Frank.” In 1995, he led the Pacific Music Festival Orchestra in the premiere of his composition “Shówa/Shoáh,” commemorating the 50th anniversary of the bombing of Hiroshima.
Both in Carnegie Hall and with the San Francisco Symphony, MTT led performances of his “Island Music” for four marimbas and percussion.
In June, as MTT concludes his 25-year tenure heading the orchestra before former Los Angeles Philharmonic music director Esa-Pekka Salonen takes over, SFS Media will release a recording of his music performed by the S.F. Symphony in recent seasons, including “Meditations on Rilke,” “From the Diary of Anne Frank” narrated by mezzo-soprano Isabel Leonard, and “Street Song.”
Next week’s concerts also include Cooke and McKinny singing songs from Mahler’s “Des Knaben Wunderhorn” (“The Boy’s Magic Horn”) as well as the overture to Berlioz’s “Benvenuto Cellini” and Ravel’s “La Valse.”
IF YOU GO
MTT & Mahler: Love & Lyricism
Presented by San Francisco Symphony
Where: Davies Symphony Hall, 201 Van Ness Ave., S.F.
When: 8 p.m. Jan. 9-11, 2 p.m. Jan. 12
Tickets: $20 to $185
Contact: (415) 864-6000, www.sfsymphony.org
Bass-baritone Ryan McKinny performs new music by Michael Tilson Thomas in “MTT & Mahler: Love and Lyricism.” (Courtesy Simon Pauly)
I know it must be winter (though I sleep)—
I know it must be winter, for I dream
I dip my bare feet in the running stream,
And flowers are many, and the grass grows deep.
I know I must be old (how age deceives!)
I know I must be old, for, all unseen,
My heart grows young, as autumn fields grow green
When late rains patter on the falling sheaves.
I know I must be tired (and tired souls err)—
I know I must be tired, for all my soul
To deeds of daring beats a glad, faint roll,
As storms the riven pine to music stir.
I know I must be dying (Death draws near)—
I know I must be dying, for I crave
Life—life, strong life, and think not of the grave,
And turf-bound silence, in the frosty year.
This poem is in the public domain. Published in Poem-a-Day on January 4, 2020, by the Academy of American Poets.
About This Poem
“Winter Sleep” originally appeared in A Winter Swallow (Charles Scribner’s Sons, 1896).
Edith Matilda Thomas was born in Ohio in 1854. Her collections include A Winter Swallow (Charles Scribner’s Sons, 1896) and Fair Shadow Land (Houghton, Mifflin and Co., 1893). She died in 1925.
Astrophysicist Ron Mallett believes he’s found a way to travel back in time — theoretically.
The tenured University of Connecticut physics professor recently told CNN that he’s written a scientific equation that could serve as the foundation for an actual time machine. He’s even built a prototype device to illustrate a key component of his theory — though Mallett’s peers remain unconvinced that his time machine will ever come to fruition.
To understand Mallett’s machine, you need to know the basics of Albert Einstein’s theory of special relativity, which states that time accelerates or decelerates depending on the speed at which an object is moving.
Based on that theory, if a person was in a spaceship traveling near the speed of light, time would pass more slowly for them than it would for someone who remained on Earth. Essentially, the astronaut could zip around space for less than a week, and when they returned to Earth, 10 years would have passed for the people they’d left behind, making it seem to the astronaut like they’d time traveled to the future.
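The arithmetic behind that scenario can be made concrete. As a rough illustration (not part of Mallett's work), this sketch uses the standard special-relativity time-dilation factor, the Lorentz gamma, to ask how fast a ship would have to travel for ten Earth years to pass during one week aboard; the scenario numbers are assumptions taken from the paragraph above.

```python
import math

def lorentz_gamma(v_fraction):
    """Time-dilation factor gamma = 1 / sqrt(1 - (v/c)^2)
    for a speed given as a fraction of the speed of light."""
    return 1.0 / math.sqrt(1.0 - v_fraction**2)

# Scenario from the article: ~10 Earth years elapse while
# less than a week passes for the astronaut.
earth_years = 10.0
ship_days = 7.0
gamma_needed = (earth_years * 365.25) / ship_days  # ~522

# Invert gamma to find the required speed as a fraction of c.
v_fraction = math.sqrt(1.0 - 1.0 / gamma_needed**2)
print(f"required speed: {v_fraction:.9f} c")
```

The required speed comes out vanishingly close to the speed of light, which is why this kind of forward time travel is accepted as possible in principle but far beyond any real spacecraft.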
But while most physicists accept that skipping forward in time in that way is probably possible, time traveling to the past is a whole other issue — and one Mallett thinks he could solve using lasers.
As the astrophysicist explained to CNN, his idea for a time machine hinges upon another Einstein theory, the general theory of relativity. According to that theory, massive objects bend space-time — an effect we perceive as gravity — and the stronger gravity is, the slower time passes.
“If you can bend space, there’s a possibility of you twisting space,” Mallett told CNN. “In Einstein’s theory, what we call space also involves time — that’s why it’s called space time, whatever it is you do to space also happens to time.”
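The gravitational slowing of time that general relativity predicts can also be put in numbers. As an illustration unrelated to Mallett's own equations, this sketch uses the standard Schwarzschild approximation, sqrt(1 - 2GM/(rc^2)), to estimate how much a clock on Earth's surface lags a clock far from Earth's gravity; the constants are textbook values.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0    # speed of light, m/s

def gravitational_dilation(mass_kg, radius_m):
    """Rate of a clock at distance r from mass M relative to a clock
    far away (Schwarzschild approximation): sqrt(1 - 2GM / (r c^2))."""
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (radius_m * C**2))

earth_mass = 5.972e24    # kg
earth_radius = 6.371e6   # m
factor = gravitational_dilation(earth_mass, earth_radius)

# Milliseconds "lost" per year by the surface clock.
seconds_per_year = 365.25 * 24 * 3600
print(f"lag: {(1 - factor) * seconds_per_year * 1e3:.1f} ms/year")
```

For Earth the effect is tiny, on the order of tens of milliseconds per year, yet it is large enough that GPS satellites must correct for it, which is why physicists take seriously the idea that sufficiently extreme gravity could distort time dramatically.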
He believes it’s theoretically possible to twist time into a loop that would allow for time travel into the past. He’s even built a prototype showing how lasers might help achieve this goal.
“By studying the type of gravitational field that was produced by a ring laser,” Mallett told CNN, “this could lead to a new way of looking at the possibility of a time machine based on a circulating beam of light.”
As optimistic as Mallett might be about his work, though, his peers are skeptical that he’s on the path to a working time machine.
“I don’t think [his work is] necessarily going to be fruitful,” astrophysicist Paul Sutter told CNN, “because I do think that there are deep flaws in his mathematics and his theory, and so a practical device seems unattainable.”
Even Mallett concedes that his idea is wholly theoretical at this point. And even if his time machine does work, he admits, it would have a severe limitation that would prevent anyone from, say, traveling back in time to kill baby Adolf Hitler.
“You can send information back,” he told CNN, “but you can only send it back to the point at which you turn the machine on.”