Abstract

Friday, March 25, 2016

A virtual resurrection

As Good Friday was approaching, I was reflecting on one of the most consequential events in human history—the death of Jesus of Nazareth on the orders of the Roman authorities. Paula and I were both raised Catholic and, through different paths, ended up as ex-Catholics. As a young woman, Paula began visiting other churches as well as synagogues. My break with the mother church was more abrupt. “Losing interest” would be an understatement.
My dad taught for many years at my home city’s only Catholic school, and so it was especially embarrassing for him when I refused to go through the rite of Confirmation. This was not simply a case of being stubborn—though I had a well-deserved reputation for stubbornness. I also caught a whiff of hypocrisy. When we were rehearsing the event, our C.C.D. teacher told us that each of us would tell the priest why we wanted to be confirmed. I asked him, “What if I don’t want to be confirmed?” This threw him off his rhythm. I can’t recall his precise words, but the sense was that I should say something even if it was untrue.
A less maddening adolescent would have gone through the motions, reciting words that meant nothing to him, and put the procedure behind him. Instead, I announced to my parents that I’d decided not to be confirmed. Neither of my parents was pleased by my decision, but my dad was especially vexed. Catholicism was central to his identity. He grew up in the 1920s and ’30s, when Catholics were mistrusted by the Protestant majority.
When I met Paula, I was pleased to learn that she, too, was an ex-Catholic. For years we didn’t attend church. But we reconsidered our church-less lives once we were expecting our first child. We met with a minister in an Episcopal church in Harvard Square about baptism, and the first question I asked her was whether her church welcomed agnostics. Her response surprised me. Not only did her church welcome agnostics; the minister suggested that she, too, at times, allowed doubt into her faith.
When our son was in preschool, Paula began taking him to the Episcopal church during Advent and then for Sunday school during the school year. At first, I did not attend, but these were years when I was reading a fair amount about Christianity and other religions. Of all the books I read, the most consequential for me was James Carroll’s Constantine’s Sword: The Church and the Jews. I’d been reading Carroll’s column in the Boston Globe for years, and his perspective was unlike that of any other writer I knew. He mixed politics and religion in unexpected ways. In the mid-1990s, he won the National Book Award for his memoir An American Requiem: God, My Father, and the War That Came Between Us. In the sixties, Carroll’s dad was an Air Force lieutenant general who directed the dropping of bombs in Vietnam; Carroll at that time was a young Catholic priest opposed to the war.
Constantine’s Sword was a more ambitious work. When I held this 700-page tome in my right hand recently, I was reminded that I still have vestiges of carpal tunnel syndrome. As the subtitle suggests, the book is about anti-Semitism going back to the time of the Roman Emperor Constantine, under whose rule fledgling Christianity emerged as the state religion—with calamitous consequences through the centuries for Jews.
My main takeaway, some fifteen years later, is Carroll’s heterodox interpretation of Jesus’s resurrection. The death of Jesus was, first of all, a disaster for his followers. Their gatherings, according to Carroll, “were like those of a bereft circle, and they were built around lament, the reading of texts, silence, stories, food, drink, songs, more texts, poems—a changed sense of time and a repeated intuition that there was ‘one more member’ than could be counted. That intuition is what we call the Resurrection.”
Carroll continued: “To the eyes of faith, Jesus was really present. Whether a video camera could have recorded his ‘appearances’ or not is less important than the fact that for those who loved him, and for those who sensed the full power of the love he’d offered to them, the continued presence of Jesus was no mere delusion…His presence, of course, was different now.” Carroll went on to underscore that “This is not knowledge of Jesus, but faith in him.” He describes himself as “one of those haunted friends who found themselves incapable of believing him simply gone, but I am also one who knows him in the first place only through the story those first friends gathered to tell.”
Do I share Carroll’s interpretation? Anyone with a deeply secular outlook on life—and by this I mean, a skeptical one—has a hard time conceding that the laws of natural science were once suspended for a brief period twenty centuries ago. The best I can say for myself is that I am consistent in my unknowing.

Friday, March 18, 2016

Idea density


In The End of Memory: A Natural History of Aging and Alzheimer’s, Jay Ingram devotes an entire chapter to “The Nun Study,” a project that started three decades ago with 678 participants and continues today in seven monasteries ranging from Minnesota to Texas to Connecticut. A nun named Sister Mary, who lived past 100, was the first to bequeath her brain to research. According to Ingram, when Sister Mary took the Mini-Mental State Examination near the end of her life, her score was almost seven times higher than what someone her age was expected to achieve.
Her autopsy was a shocker. Despite her high-end test scores, her brain was full of plaques and tangles, the telltale features of Alzheimer’s. As Ingram put it, “Sister Mary should have been demented.” Under the microscope, her brain tissue looked as diseased as that of Auguste Deter, the first patient to be diagnosed with what we now call Alzheimer’s disease. (When interviewed by Dr. Alois Alzheimer in 1901, Deter, a woman in her early fifties, could not recall her husband’s name.) How was it possible that Sister Mary could have so much tau and amyloid protein in her brain and still be cogent?
The answer is connected to a fantasy of mine. My strongest skill set—writing and editing—has remained largely intact, even as many of my other skills have gradually weakened. This seems to be an example of “cognitive reserve,” the notion that some people with Alzheimer’s retain certain skill sets deep into the disease’s progression, even while other skills atrophy. And according to Ingram, possessing good writing skills around the age of twenty somehow helps protect the brain from Alzheimer’s decades later. The evidence comes from essays the nuns wrote many decades ago as novices, around the age of twenty.
Two concepts are involved: “idea density” and “grammatical complexity.” Idea density, to me, sounds like what writers and journalists call “writing tightly”: jettisoning every unneeded word or phrase to maximize a sentence’s clarity and impact. Grammatical complexity is measured by the number of clauses, dependent or independent, in each sentence.
By my senior year in high school, I was writing with syntactical sophistication, pretty much the way I write today. But for my purposes, and for Ingram’s, idea density is the more relevant concept. As a young writer, I aimed to follow the models of Ernest Hemingway and George Orwell, each of whom made clarity and conciseness central tenets. But how might sophisticated writing as a young adult be a protective feature against the ravages of Alzheimer’s many decades later? As Ingram puts it, “It sounds bizarre, but in the case of the two fictitious essays [cited], the author of the first one would have been much more likely to die of Alzheimer’s than the second.”
Perhaps the most interesting aspect of Ingram’s book is his suggestion that Alzheimer’s may be a more fluid category than commonly assumed. About two out of five nuns enrolled in the Nun Study “had measurable memory deficits, even though they were still in these first two stages.” Findings like these “have led to the belief that there is no chasm between normal mental functioning and Alzheimer’s disease” [my italics]. Rather, “normal functioning slides almost imperceptibly into mild cognitive impairment…which, not always, but often, then moves into Alzheimer’s.”
Near the end of this illuminating chapter, Ingram comments, “Formal education, or something related to education [author’s italics], provides some type of cognitive or neural reserve.” A couple of paragraphs later he asks, “How do a few years of education lessen the impact of plaque accumulation sixty years later?”
I have no idea, and from what Ingram writes, the effect appears to be mysterious to him as well. What I do understand is this: Education is even more important than people assume.

Friday, March 11, 2016

A farewell to Aricept


Just before I was diagnosed with Alzheimer’s last June, a doctor prescribed donepezil, better known by its brand name, Aricept. I was already taking Strattera, a drug said to boost concentration but one that, in my case, made me jumpy. To complete my trio of medications, the doctor prescribed the antidepressant Zoloft. This last prescription was a mystery to me, since I was not depressed. Later I learned that the dosage was very low, perhaps negligible.
Soon I was taking only Aricept. But even at the modest level of 10 milligrams, the drug produced unpleasant effects. One was joint pain: soon after I began taking the medication, an arthritic joint in my lower back began, like clockwork, to ache early each afternoon, about fourteen hours after I had taken the pill. Other consequences were more dramatic. One night, when we were staying in a cabin in Vermont, I woke up feeling as if I had stepped into a factory at full capacity, with ear-splitting noise and bright lights. I was hesitant to journey down the steep flight of well-worn wooden stairs to get a glass of water, and I needed about a half-hour in the kitchen before I was ready to resume my sleep. Worst of all, one morning I got a foretaste of why the makers of Cialis and Viagra issue explicit disclaimers about their drugs’ side effects. What Aricept produced was not a pleasurable sensation.
In consultation with my doctor, I reduced my dosage to just 5 milligrams. And while the side effects went away, so, perhaps, did all traces of the drug’s efficacy. I took the low dose for a week, then went another week with no Aricept at all. It was hard to tease out whether there was any benefit at the lower level.
At the recommendation of a friend, I reached out to Robert Whitaker, a prominent critic of the biopharmaceutical industry. Whitaker, who is based in Cambridge, is the author of several books, including Anatomy of an Epidemic: Magic Bullets, Psychiatric Drugs, and the Astonishing Rise of Mental Illness in America. In an email, Whitaker made clear that he is not an expert on Aricept’s possible consequences. But he did offer a powerful anecdote: His father, who was in an early stage of Alzheimer’s, experienced a psychotic reaction after taking Aricept.
“It sent him into a tailspin from which he never fully recovered,” Whitaker said. And when he looked into the drug’s properties, he learned that the labeling, at that time, made clear that the medication was not intended for people with mild or early-stage Alzheimer’s.
Whitaker suggested that I look into Aricept’s possible side effects today, and the list is long and disturbing: weight loss, dizziness, depression, confusion, and hallucinations, among others. There are plenty of reasons for me to leave my Aricept tablets inside their bottle. Or, better yet, to drop them into the drug-disposal box at my local police station.

Friday, March 4, 2016

End game


A few weeks ago I found myself in a funk. One reason was that it was the dead of winter, but that was a bogus reason. What really had me down was that my mind kept drifting toward Alzheimer’s end game, many years from now. Will I still be me—my sense of self intact, however circumscribed—when plaques and tangles have colonized my brain? I was not feeling optimistic. I recalled a visit last year, a couple of months before I was diagnosed, to my mom’s assisted-care home. While my mom’s mind remains impressively sharp, an elderly man I encountered there was clearly in the late stage of the disease. He seemed to have lost the ability to speak and was struggling to pantomime his meaning to me. We made eye contact, and I nodded encouragingly. I could read the frustration on his face. But then he drifted away, as if he were a shade from the Greek underworld.
Last year, Greg O’Brien, author of the valuable book On Pluto: Inside the Mind of Alzheimer’s, made the decision to forgo treatment for his stage-three prostate cancer in favor of dying with his wits intact. Given my overall robust health at 54, with few risk factors for cancer or heart disease, it seems likely that my fate will be severe dementia. In discussing this recently, Paula reminded me that, while Massachusetts narrowly defeated a death-with-dignity measure a few years ago, Washington is one of the few states that have legalized doctor-assisted suicide. There was something comforting about the notion of going back, many years from now, to my beloved home city of Bellingham to die. But there was a catch. The regulations state that the person seeking to die must be “of sound mind.” There appears to be no mechanism to honor a wish, made many years earlier, that one’s life be ended once a deep, irreversible level of despondency is reached.
For advice, I reached out to my friend Mike Balchunas, whom I met while working at a newspaper in Connecticut more than 30 years ago, when I was fresh out of college. Mike was only twenty-eight but seemed wise beyond his years. Later I learned that from an early age he has endured the chronic intestinal disorder Crohn’s disease. When he was growing up, his parents were told that Mike would be lucky to live into his twenties. Now he is sixty.
When I emailed Mike last week about my dread of late-stage Alzheimer’s, his response was not what I expected. Rather than endorsing my view that life would not be worth living into the disease’s terminal stage, he challenged it. Dementia runs in Mike’s extended family, and his mother, though she ultimately died of complications from a broken hip, had almost no short-term memory capacity at the end of her life. As Mike, who has written about dementia, noted, “After age 80, if my mother put down a cookie, she would not remember whether or not she had taken a bite out of it. Either no consolidation [of what had just happened] was occurring or the retrieval function was mostly wiped out. Yet there were vestiges of more complex memory function that survived longer, such as card-playing ability.”
Mike said it was only in her last year that she needed help playing the card game Shanghai rummy.
“At these card games, there was a shelf near the table with a photo of her extended family, taken around 1945,” Mike said.  “Whenever she noticed it, sometimes several times in the course of a few hours, she would say, ‘What’s that picture of?’ We would take it down and show it to her, and she had no problem identifying by name everyone in it—her mother, father, uncle, aunt, siblings, cousins. She could not remember what she had said or done seconds earlier, or who some of us were, but still had recall of her childhood family members when prompted by the photo.” Other pastimes included watching the Boston Red Sox on TV. (Mike and most of his relatives live in Southern California, but the family’s roots are in Massachusetts.)
By the time Mike’s mother reached eighty, her short-term memory was basically shot. Yet even in her last months, she could appreciate a home run by David Ortiz. But, as Mike put it, by the time Big Papi had rounded first base, she would have forgotten what happened. And yet, she continued to read a daily newspaper.
Mike also passed along an anecdote about his mom and the 2012 presidential election. His mother happened to be watching a Republican primary debate on TV, and among the candidates was Herman Cain, who is black. Mike’s mom was a liberal Democrat, and when she noticed Cain among the candidates, she commented, “That black guy is running for president?” Mike confirmed that he was, and his mom replied, “Well, he’ll never win. Americans would never vote for a black president.” Mike explained that Barack Obama had been in office for more than three years.
“At times like that,” Mike said, “she would immediately become silent and become depressed.”
Now back to my original question: Will I still be me? Based on Mike’s observations, there is a case to be made that the terminal stage of Alzheimer’s will not be as bad as many of us assume. To maintain some simple pleasures, like sharing time with loved ones and watching sports on TV, might sound like a very circumscribed existence, but it is still an existence: one with much frustration, no doubt, but also with many fleeting moments of joy.