Friday, December 30, 2016

Can Biogen win where Eli Lilly failed?

At the beginning of December, I did some venting about the biopharmaceutical industry’s miserable record in slowing, let alone curing, Alzheimer’s. But on December 10, the lead article in the Boston Globe’s business section was headlined “Hopes rise for approach to combat Alzheimer’s.” Was this just another tease? The article, written by Robert Weisman, suggested that the failure of Eli Lilly opened the door to Cambridge-based Biogen. As with Eli Lilly’s failed drug candidate—solanezumab—Biogen’s compound has a long name: aducanumab. On December 21, Weisman wrote about Michel Vounatsos, Biogen’s incoming chief executive. The banner headline stated, “Alzheimer’s is top challenge, Biogen CEO says.”
According to Weisman, recent research by Biogen confirmed data indicating that aducanumab decreased levels of plaque among individuals who continued to take the drug after the clinical trial was over. This was encouraging—both to the people who took part in the clinical trial and, of course, to Biogen and its investors. Yet Weisman noted that optimism about the drug candidate was “tempered” because a biotechnology analyst estimated that there was only about a 35 percent chance the drug would ever be approved. In the world of Alzheimer’s research, however, anything better than a one-in-three chance sounds highly encouraging. Keep in mind that the only Alzheimer’s drugs on the market today are Aricept and a handful of others, none of which claims the ability to hinder the disease’s progress, let alone stop it. So any genuine progress in slowing the disease would be a big deal.
But, as always with Alzheimer’s research, big challenges lie ahead. One drawback: One of the people in the clinical trial suffered a seizure. Weisman’s article did not make clear whether the seizure was directly caused by aducanumab, and Samantha Budd Haeberlein, Biogen’s vice president of clinical discovery and development, took an optimistic view, suggesting that the data from clinical trials “shows that we can take patients more slowly to their designated doses and still see efficacy.”
Nothing will happen overnight, of course. According to Weisman, Biogen’s forthcoming clinical trial will be mammoth—as many as 3,000 participants spread among twenty nations. Will I be volunteering? I hesitate because I seem to tolerate Aricept only at very low doses, and I suspect that would be the case with the drug in the Biogen clinical trial as well. I cherish my sleep—as anyone with Alzheimer’s should—and I’m wary of being knocked off my daily regimen by poor sleep hygiene.

Friday, December 16, 2016

Virtual dementia

Is it possible to simulate the symptoms of late-stage Alzheimer’s, so people without the disease can experience the condition vicariously? That was the question explored by a television journalist whose father had died of Alzheimer’s. The journalist, Monica Robins, moved to the Cleveland area in the late 1990s so she could care for her father. After he died, Robins arranged to go through the “Virtual Dementia Tour Comprehensive Program,” created by the company Second Wind Dreams, to get a better understanding of what her dad went through.
One of the first steps in the simulation involved putting inserts in Robins’ shoes, intended to simulate neuropathy—the sensation of pins and needles. She was then told to put on a half dozen pairs of socks, presumably to recreate the difficulty of walking after suffering nerve damage in the feet. “What?” Robins exclaimed when she heard the directive. Another simulation involved compromised peripheral vision, which makes driving a car hazardous. Deeply confused by this time, Robins asked, plaintively, “What am I supposed to do?” Soon she had to attend to another task. A moment later she returned, musing “about a belt,” but she could not place the context. Another task was to read five instructions posted on a wall. By that time, Robins was bewildered and could remember only one of the directives.
“Wait! Wait!” she suddenly said. “Something about a picture.…Was I supposed to do something at the table? I don’t know what to do,” she said, her voice brimming with anxiety. She attempted to make the bed, as directed, but because of the hindrances to her vision, hearing, and spatial awareness, Robins became ever more disoriented, eventually breaking down in tears. “24/7,” she said, a few moments later. “I would honestly change everything that I did,” Robins said, referring to the years she spent caring for her dad. “I don’t want others to make the mistakes that I did.”
Robins, however, may have been too hard on herself. After all, it was her father she was caring for, not some stranger. How could she not feel some inadequacy, if only for being unable to do the impossible? And I, too, occasionally ponder what late-stage Alzheimer’s might be like. I recall reading that Ralph Waldo Emerson, one of the great minds of the nineteenth century, seemed relatively serene in his final years, despite his dementia, which manifested itself as aphasia (the loss of language, an especially cruel fate for someone who had been exceedingly articulate). Still, the Sage of Concord is said to have maintained his equanimity. “I have lost my mental faculties but am perfectly well,” Emerson purportedly said.
To view video of Robins’ challenges, do a search for “Virtual dementia reveals what life is like with brain disorder.”

With the holidays approaching, my next blogpost will appear on Friday, December 30.

Friday, December 9, 2016

A life cut short

Sometimes I need a break from writing about Alzheimer’s. The litany of failed drug candidates is too long, and the supposed progress too illusory, to provide me or my cohorts much in the way of hope. That’s why, in the summer of 2015, when I was preparing to launch this blog, I gave myself license to write the occasional post that barely mentions Alzheimer’s at all.
Lately I’ve been thinking of Pete Oswald, a friend of mine from high school. We met on the football team. Pete, one year ahead of me, was the team’s center, in more senses than one. Anyone who has played high school football is likely to remember at least one teammate who took great glee in bullying the underclassmen. Pete was the opposite. In his sophomore season, if not sooner, he’d learned humility. He’d developed a stubborn illness that fall, and though he soldiered on, he associated that season with the smell of the drying, soiled football gear. It was an unpleasant association. But by the time I became acquainted with him, he reported that he’d come to like the smell—not because of the particular odor, of course, but because of its association with his illness, which by then was in the past. And through his years of high school, he remained steadfastly loyal to his childhood friends, none of whom were jocks.
After Pete graduated, he enrolled in the engineering program at Harvey Mudd College, west of Los Angeles. He played on the school’s football team his freshman year, but by then he had a higher ambition: to transfer to the Naval Academy. There, too, he played football, earning All-East honors his senior year. In retrospect, the decision to follow a military path was almost inevitable. Pete’s father, Harold Oswald, was a Navy pilot during World War II. My friend Scott, who also was close to Pete, suggested that the senior Oswald was stationed in the Aleutians, from which pilots could carry out bombing raids in the north Pacific. After the war, according to Scott, Pete’s dad was stationed at the Whidbey Naval Air Station, north of Seattle. The graceful bridge north of the naval station is an architectural marvel. It was also alluring to Harold Oswald and one of his fellow pilots. Sometimes they would fly under the bridge—a practice that was halted after one of the pilots clipped a wire, possibly a power line.
Pete himself was the baby of the family. He had two much-older brothers, including a Navy pilot who went on to become an astronaut. (I remember, sometime in the mid-nineties, getting a postcard from Pete noting that he was in Houston, because his brother had “just got back from space.”) Pete had wanted to follow his brother’s path to become a pilot, but his eyesight was subpar. Instead he eventually became a Navy SEAL.
From the letters I received from him during his first months at Annapolis, it was clear that he was unhappy. He was admonished, for example, for having a poster of Jimi Hendrix in his dorm room. When he was visiting me at my apartment, not far from Western Washington University, he remarked that when he heard Hendrix’s “Purple Haze”—it was playing on my stereo—he wanted to fling himself over my balcony. At nineteen, he was eager to jump out of planes, and, many years later, when he was living in Coronado, California, jumping out of planes was part of his job.
But that was deep into the future. When I saw Pete at Christmas break in 1981, he was in a bleak mood. We drove up an old logging road on Chuckanut Mountain, just south of Bellingham, and as we surveyed the dark islands to the west, drinking our beers, Pete confided that he wasn’t sure he wanted to go back to Annapolis. He feared that he was too irreverent, or that he couldn’t endure the military strictures. But he did find that he had a sense of mission. While serving on a nuclear submarine in the north Atlantic, he described the submarine’s role as “keeping the world safe from itself.” He also told me about the armed guard, whose role was to make sure that some deranged sailor didn’t get access to the Trident nuclear missiles waiting in their bays.
A couple of years later, stationed at the Puget Sound Naval Base in Bremerton, Pete compared his circumstances to the old sitcom McHale’s Navy. The enlisted men maddened him. By then I was living in Waterbury, Connecticut, and, in a happy coincidence, Pete ended up studying at the Naval War College, in Newport, Rhode Island, just a couple of hours away by car. For more than a year, I saw Pete regularly. I particularly remember a weekend ski trip we made to Killington, Vermont, in early March 1985. The night before, I’d thrown up from the flu, but being a 23-year-old guy, I made the trip anyway. I remember skiing vigorously during the day, and throwing up at night. It seemed the reasonable thing to do. Only when I got back to Waterbury did I grasp just how sick I was.
After several more encounters with Pete that year, he left New England for good. The last time I saw him was the summer of 1989, when we both happened to be in Bellingham. A year later, Pete was preparing for the Gulf War, which turned out to be a lightning-swift campaign. At the time, I had no idea that Pete was serving. (Of course I didn’t. His mission, after all, was clandestine.) It was not until I visited Scott in Bellingham in September 1992 that I grasped that Pete had seen combat. The proof was in the photo—which, if I recall correctly, included the charred remains of an Iraqi vehicle. Nor did I know that later in his career he spent three years in Japan, where he became fluent in the language, or that, a few years before the Gulf War, he oversaw mine-sweeping operations during the Iran-Iraq war.
I never saw him again. But in the mid-nineties he reached out to me. I was surprised to learn that Pete, too, was a writer. He appreciated the time I spent poring over the draft of his novel. It was no surprise that I failed to make it back for the funeral in Bellingham; by the time I learned the news of his death, my friend was already in the ground.
What was the meaning of Pete Oswald’s life? What is the meaning of any life? Maybe the better question is, “How can I make the most of my remaining time?” You never know when fate might connect on a haymaker. At the time of Pete’s death—he died in a military-training accident in El Salvador near the end of August 2002—he had a wife and three young daughters. File this one in the “life isn’t fair” department. But it’s worthwhile, as Flannery O’Connor sort of put it, to be ready to die at any moment. None of us knows when our number will come up.

Friday, December 2, 2016

Another strikeout

I am becoming impatient. Season by season I sense my faculties eroding. Yes, I understand that mine is a progressive disease, and, as far as I can tell, my rate of decline is a bit slower than the norm. Four-and-a-half years since my earliest symptoms, in most social situations I can pass for normal. But drop me off alone at an airport, and my anxiety will spike. And if I had to change planes at, say, O’Hare, en route to Seattle, with limited time to catch my connecting flight, I would be courting disaster. The ever-flipping electronic boarding times seem part of a system-wide cruel joke on impaired travelers. Just as I found the SEA abbreviation, I would lose sight of the flight number. I would have to set down my carry-on luggage, pull out a mini-notebook from my jacket pocket, concentrate my hardest, and hope to jot down the correct gate number and time. It’s unlikely that I will fly alone again.
And yes, I am aware that many people are far worse off. I’m not only talking about people with Lou Gehrig’s disease, one of the cruelest neurodegenerative diseases. I’m also thinking of B. Smith, the African-American fashion maven who is now well into the middle realm of Alzheimer’s three stages, despite being diagnosed only about three years ago. What frustrates me is the parade of failed Alzheimer’s drug candidates. Many of us know that the vast majority of Alzheimer’s drug trials end in complete failure. If Alzheimer’s researchers, some of the smartest people on the planet, were a baseball team, their collective batting average would be not much above .000. There are only a few drugs on the market, and the best-known one, Aricept, provides limited efficacy.
The latest disappointment came, rather cruelly, just before Thanksgiving. The Boston Globe reported that Eli Lilly’s drug candidate hadn’t cleared the efficacy bar needed to continue the research. The company had already learned that the drug would not halt the disease. What else is new? But there had been alluring hope that the drug would slow the pace of the disease, and that gave people like me optimism that we might enjoy a significantly wider window before reaching the disease’s extremely unwelcome final stage. “We didn’t get the results we wanted or expected,” a spokesman stated. Isn’t this the way that Alzheimer’s trials always end?

Friday, November 25, 2016

Where Alzheimer's is the norm

Could a village in Colombia, devastated by Alzheimer’s like no other community, generation after generation, provide clues toward developing a cure for the disease? Dr. Pierre N. Tariot, the director of the Phoenix-based Banner Alzheimer’s Institute, is among researchers who believe so. Speaking at the annual Matthew & Marcia Simons Symposium on Alzheimer’s Disease on Nov. 9 in Newton, Massachusetts, Tariot displayed a pedigree chart reflecting the misery of the town of Yarumal. “Roughly every other person was laid low with Alzheimer’s,” Tariot said. “You can see it in every generation,” dating from the 1640s. Historians have long understood that diseases brought from Europe, such as smallpox, were ruinous for native populations. But the Spaniard who founded Yarumal did not bring a sickness that killed people quickly. Far from it. Because of a genetic mutation carried by the founder, the village’s plight over the centuries has been far-reaching. And that has made it important to Alzheimer’s researchers.
As the New York Times reported back in 2010, people in Yarumal show symptoms of Alzheimer’s as early as their early thirties. The typical resident with the disease has severe symptoms by age 47. One man, a reporter noted, “babbles incoherently, shreds his socks and diapers, and squirms so vigorously he is sometimes tied to a chair.” Another man was in denial about his condition, and when he was sent to the market to buy two basic staples—bread and milk—he was able to remember only one of the two items.
Could any good come out of this suffering? Not for the villagers who are dying in agonizing and humiliating ways. But there may be hope for their children and their descendants. Rowan Hooper, writing in the journal New Scientist about a year ago, noted the work of Kenneth Kosik and his colleagues at the University of California at Santa Barbara. Hooper cited a process called “identity-by-descent analysis.” Because Kosik’s team had information on the genome sequence around the Yarumal mutation, they were able to apply the approach to the town, to get a better sense of how people in the study were related. “It’s hard to explain why all these people would share a large chunk of DNA, if there hadn’t been a common founder,” Hooper wrote.
The scientists appear to concur that the village’s founder, a conquistador who established Yarumal around 375 years ago, brought the fateful gene with him. The strangeness of the disease seems to come straight out of One Hundred Years of Solitude, by the late Colombian novelist Gabriel García Márquez: as Hooper noted, residents call the disease La Bobera (foolishness), a term that would fit comfortably in the novel. In an article in the British newspaper The Telegraph, Michael Jacobs reflected on the eerie similarities between the forgetfulness of García Márquez’s characters and the real-life people suffering in Yarumal.
What is the likelihood that this previously obscure village will be the place where crucial insights about Alzheimer’s are gained, including how to slow the disease’s pace? Dr. Tariot told his audience in Newton that the key to stopping Alzheimer’s is to prevent it well before symptoms surface. Yarumal’s unique gene pool could shed light on an approach that leads to a cure.
That, of course, would not directly help those of us who already have been diagnosed with the disease. But that is beside the point. It’s our children’s generation who have the promise of a future when Alzheimer’s will have become a manageable disease. And, of course, the Yarumal residents would, over time, begin to escape from the plague that has shortened the lives of their ancestors for centuries.

Friday, November 18, 2016

The outlier

I met Jay Willis in a support group for people in the early stage of early-onset Alzheimer’s disease. I had already learned that this disease typically moves slowly, so it was no surprise that most of my cohorts remained articulate. But Jay, a former lawyer, is particularly well-spoken. Recently, he left our support group, but not because his condition was going downhill. Jay concluded that he had stopped getting worse. Most remarkably, I learned that he was diagnosed with the disease roughly fourteen years ago. The Alzheimer’s Association website notes that it is possible to live for up to twenty years after being diagnosed, but the assumption is that the person would be just a shadow of his or her former self. Jay, in contrast, no longer shows any symptoms of the disease at all. Is Jay a statistical freak? He says that the people at Brigham and Women’s Hospital, where he has taken part in many clinical trials over the years, nicknamed him “The Anomaly.”
Back in 1999, Jay was diagnosed with clinical depression. He recalled, “I didn’t know what was going on. Just that every day I was waking up and thinking, ‘Well, maybe I’ll die today and I won’t have to deal with this.’” Depression, he discovered, can be an early sign of Alzheimer’s. In 2002, he was diagnosed with mild cognitive impairment, often a precursor to Alzheimer’s. Jay was devastated. He had a prosperous law practice and what he described as a photographic memory. But then he began forgetting things. Alzheimer’s runs on both sides of his family, so he knew that he was at risk. Over the next four years, his weight ballooned from a trim 190 pounds on his six-foot-two frame to 245.
“My brothers and sisters thought I was faking, and I was also regressing,” Jay recalled. Jay’s doctor advised him to close down his law practice, and when Jay asked how soon, the answer was “immediately.” It took a year and a half to wind things down.
Over the years, Jay has aggressively sought out participation in clinical trials, in the hope that he could reap the benefits of the drug candidates before they received FDA approval. His earliest clinical trials came soon after his diagnosis. At that time, some researchers thought that the first symptoms of Alzheimer’s could be detected through the hippocampus, the seat of short-term memory, which in cases of Alzheimer’s is typically the first part of the brain to experience decline.
One of Jay’s earliest clinical trials was under the direction of the Alzheimer’s researcher Dr. Dennis Selkoe at Brigham and Women’s. In Jay’s view, if he had to walk the path of the disease, someone else should learn something from it. About five years ago, Jay qualified for a modest drug trial with roughly 200 participants. By then, he was much more social, his depression long behind him. Dr. Selkoe didn’t think Jay would score low enough to qualify for the trial, but in fact he did. “I continued to function, in spite of this chemical imbalance in my brain,” Jay said. “Over the next two years, I kept feeling better and better.” At the end of the five years, the study was terminated.
Are there lessons to be drawn from Jay’s long experience with Alzheimer’s? As one of his doctors said, “You should be mildly demented, but you’re not.” Jay’s experience is highly atypical, of course, if not downright freakish. But one practice of his—intense aerobic exercise on a daily or near-daily basis—is something that can benefit many of us who have the disease, by forestalling the trip to our murky destination, a depot where no one wants to arrive.

Friday, November 11, 2016

The Bredesen approach

If you have early-stage Alzheimer’s and want to know what you can do that might forestall the disease’s progress, you’ll find plenty of information on the Internet. The challenge is sorting out the wheat from the chaff. It has become conventional wisdom that a “Mediterranean Diet”—featuring lots of seafood, yogurt, olive oil, nuts and grapes, along with leafy greens such as kale and spinach—is a good thing to follow. But it appears that Dr. Dale Bredesen, a neurologist at UCLA, has furnished evidence that certain dietary and lifestyle changes may not only slow down the pace of the disease but reverse it.
Early this week on the Today show, Bredesen spoke about his research, bringing encouraging news to a wide audience. An abstract of the study, published two years ago, describes a personalized therapeutic regimen that, beyond sound nutrition and lots of aerobic exercise, also focuses on good sleep hygiene. To maximize sleep’s restorative power, Bredesen recommends fasting for at least ten hours after dinner. That rules out evening bowls of cereal and midnight snacks.
Over the past few years, Bredesen has been striving to bring his work to a wider audience. In a UCLA press release two years ago, Mark Wheeler laid out encouraging anecdotal evidence that Bredesen’s approach could reverse deterioration if treated in a timely manner: “Patient 1 had two years of progressive memory decline. She was considering quitting her job, which involved analyzing data and writing reports, she got disoriented driving, and she mixed up the names of her pets.” A second person was forgetting people’s faces along with the combination for his gym locker. A third person’s memory “was so bad that she used an iPad to record everything, then forgot her password. Her children noticed she commonly lost her train of thought in mid-sentence, and often asked them if they had carried out the tasks that she had mistakenly thought she had asked them to do.”
The abstract of the study notes that all but one of the ten subjects showed cognitive improvement, whether measured subjectively or objectively, during the trial; the one exception was already at a later stage of the disease, too late for the approach to help. Most strikingly, six of the ten patients improved enough to return to work. The study gives hope that something as simple as tweaking one’s diet and exercise regimen, along with good sleep hygiene and plenty of mentally stimulating activity, can forestall cognitive decline.

Friday, November 4, 2016

Not your typical Alzheimer’s movie

Movies about Alzheimer’s are more numerous than I realized. Back in 2007, Roger Ebert mentioned that he had reviewed five such movies in the first seven years of this century. They included Iris, an elegiac biopic about the superb British writer Iris Murdoch, who died of Alzheimer’s in 1999. The most memorable scene in Iris comes when Murdoch, played by Judi Dench, blanks out during a live television interview.
A movie that came out in 2006, Away From Her, starring Julie Christie and directed by Sarah Polley, struck a very different tone. Unlike the 2014 movie Still Alice, featuring Julianne Moore as a Harvard linguist who has a rare and fast-moving variant of the disease, Away From Her depicts a woman, Fiona (Christie), who finds the disease liberating. Not that Fiona needs encouraging. She and her husband, Grant, live in rural Ontario, and when she gets lost in the woods while cross-country skiing, rather than panicking, she throws herself on her back and stares up at the snow falling from the boughs overhead, as if she were making a snow angel. Grant dutifully retrieves her.
The ironic style of this film derives from the Canadian author Alice Munro, one of Paula’s and my favorite writers and the winner of the 2013 Nobel Prize in Literature. One of Munro’s stories was the basis for Away From Her. A key point in the movie comes when the decision is made that Fiona can no longer be trusted to live without assistance. One scene depicts her leaving a burner on, risking a fire, and soon preparations are underway to transfer her to a care facility. And rather than protesting the decision to leave her home, she embraces it. It turns out that the pricey care facility has a rather odd policy about loved ones: No one can visit until 30 days have passed. As I watched the film, I thought how awful it must be to be dumped in a nursing home, no matter how upscale, and not see your loved ones for an entire month.
And much can happen in that span. The surprise comes once Fiona is reunited with her husband Grant (Gordon Pinsent). Grant soon grasps that she is enjoying herself—too much, in fact, in his view. Early on, the film makes clear that Grant, a former college professor, had a long career as a philanderer; only in his senior years has he been loyal to Fiona. Now, at the care facility, he quickly learns that Fiona has become romantically involved with one of the male residents. Grant himself strikes up a friendship, or at least a confidence, with a middle-aged care attendant. Only fleetingly does this movie pause to let viewers glimpse the unpleasantness that awaits people with Alzheimer’s in the terminal stage. This is one cheery Alzheimer’s film.

Friday, October 28, 2016

Be organized

The Austin-based journalist Michael Andor Brodeur recently wrote about devices designed to locate missing items, such as car keys and cellphones. Among the companies active in this field is Tile, whose device will search for missing items within a 100-foot radius “and play a loud tune until you find it.” One of Tile’s competitors, The O, appears to be a bit more upscale, making its sensor-fobs look like jewelry. The cost for a starter kit is $139.
I imagine that for certain people, The O, Tile, and similar products would be helpful, maybe essential. And, yes, from time to time I do have to call my cellphone on my landline to discover where in my house I left it. Usually I hear it faintly beeping from inside the pocket of the jacket I last wore. But for people dealing with early-stage Alzheimer’s, as well as those who are just chronically absent-minded, there is a simpler way, at no real cost, to cut down on the time spent looking for things.
Last year at this time, I was working with Dr. Laura Phillips, a neuropsychologist at Mount Auburn Hospital in Cambridge. I met Dr. Phillips in March 2014, about four months after another doctor at the hospital had intimated that I might be experiencing the symptoms of dementia. When my diagnosis was confirmed more than a year later, I was assigned to work with Dr. Phillips, as a means of mitigating the symptoms. One of the first things Dr. Phillips suggested was to purchase a notebook, the kind that is bound like a book, along with a weekly planner.
The blank book, what I call the “purple book,” has been particularly helpful. Unlike with my weekly planner, the purple book rarely leaves our home. Occasionally I use it for brief journal entries, but the more important role is to serve as a reference for hints for various passwords—especially for those long strings of characters that Verizon requires when we’re having problems with our Internet service. The book also serves as a place to enter contact information for people I’ve met from the Alzheimer’s Association or on the state panel I serve on that focuses on providing services for people with Alzheimer’s.
Each object that I use daily—wallet, keys, cellphone, reading glasses—has a designated place. My wristwatch, for example, hangs on a hook near our sink, so there is no chance of my plunging the watch into the dishwater. Do I still misplace things? Yes, often. But not as often as I did when I first started experiencing the symptoms of dementia. The object I misplace most often is my favorite pair of reading glasses. If I did an inventory, I would probably find more than half a dozen other pairs, some of which I should have discarded already, because they no longer meet my needs.
Dr. Phillips also advised me to limit clutter in our home, and, while the results were modest, they were steps in the right direction. For the first time in a couple of years, I had tidied up our narrow, walk-in-a-crouch attic space. The effect was short-lived. Roughly six weeks into the new year—Valentine’s Day, to be precise—a pipe burst from the cold, and, a moment later, I beheld a fishbowl-shaped light fixture in our front hallway filling up with water. The only thing missing from the scene was a goldfish or two floating belly-up after dying from the shock. The good news was that nothing caught on fire. The bad news was overwhelming: All the boots, shoes, coats, and books in our front hallway, and all the books, photos, files, office supplies, and other items in the home office, had to be stored somewhere else while repairs were made over the next month. The result was a packed attic crawlspace, a packed enclosed front porch, a coat tree and desks moved into the living room, and important papers stored in the dining room. Not exactly what Dr. Phillips had prescribed.

Friday, October 21, 2016

A means to slow Alzheimer's?

This past Sunday, the Boston Globe ran a lengthy article devoted to an Eli Lilly drug candidate designed to slow—not cure—Alzheimer’s disease. The article, produced by the news organization STAT, stated that while the drug failed to slow mental and physical deterioration among people with mild-to-moderate Alzheimer’s, the news was not all bad. As the author of the piece, Damian Garde, noted, “a secondary analysis of pooled data in patients with mild forms of the disease” showed a 34 percent reduction in the patients’ cognitive decline.
This finding could be significant. While some people with Alzheimer’s deteriorate rapidly, the much more common scenario is a long, gradual decline—so gradual that in the earliest stage of the disease, neither doctors nor patients are likely to be aware that the person has it. This is especially true for people with early-onset Alzheimer’s: because they aren’t at an age when the disease is expected to occur, their symptoms are mistaken for other problems, such as attention deficit disorder. In my case, it took about three years from my first symptoms to my diagnosis. Like most people in my situation, I felt that I had been given a death sentence. And I had. But the executioner has turned out to be a lazy fellow, and four-and-a-half years later, my symptoms remain mild.
But there is a possibility that Eli Lilly’s drug candidate, solanezumab, could add many additional years to the arc of Alzheimer’s progression. At the far end of optimism is the notion that Alzheimer’s could become a manageable disease, as AIDS did in the early 1990s, thanks to a complex cocktail of drugs. According to Garde, people treated with solanezumab performed about 35 percent better on cognitive tests than those on a placebo.
In tandem with common-sense measures such as exercising regularly, following a reasonable diet and getting plenty of sleep, solanezumab could extend life expectancy significantly in the face of Alzheimer’s, according to Garde. But as he noted, roughly 99 percent of all Alzheimer’s treatments have failed. A particularly blunt statement came from Mayo Clinic neurologist David Knopman: “I hate having to tell people that we don’t have anything that can truly arrest the disease at this point.” The rather modest aim, according to Garde, is to see if the drug does better than the placebo. The plan is to inject the subjects, all of whom are at a relatively mild stage of Alzheimer’s, once a month with a drug that has the potential to clear away amyloid plaque, a dominant feature of the disease.
As the Alzheimer’s researcher Dr. Howard Fillit commented in Garde’s article, the best-case scenario would be that solanezumab would serve as a foundation for further Alzheimer’s research. That, Fillit said, would likely create the biggest market for any pharmaceutical ever. Under optimal circumstances, the drug could be approved by late next year. That strikes me as a long shot. Still, solanezumab is welcome news.

Friday, October 14, 2016

Delayed reaction

Paula and I recently attended a dinner party where everyone either had a spouse with Alzheimer’s or had been diagnosed with the disease itself. All of the diagnosed individuals happened to be men, and all but one of us reported suffering a concussion as a child. Mine came when I was around six, riding on the back fender of my brother’s bike. When I came to, I was lying in the back seat of our car, not long before we reached our doctor’s office, a trip of about fifteen minutes. Clearly, I had fallen backwards. My brother, who was four years older, recalls seeing an egg-shaped lump protruding from the back of my head. He raced down our hill to alert our mom. This was long before kids wore bike helmets.
Could that single childhood concussion have predisposed me to develop Alzheimer’s in middle age? From an early age I liked to play rough games—wrestling with my brother on our carpet, not far from our brick mantel, or playing tackle football in our backyard, without helmets, often with bigger boys. The playgrounds at our elementary school were entirely paved, which ruled out playing tackle football—unless snow was on the ground. In one of the years when we did get substantial snow, I banged heads with a friend of mine. I didn’t exactly feel the pain but I smelled it—an odor of rubbing alcohol, or something else one might encounter in a hospital. In sixth grade, I played my first of seven seasons of organized football.
An article published in 2011, “Long Term Consequences: Effects on Normal Development Profile after Concussion,” notes that “immature neural tissue differs from mature tissue in response to injury.” The key word is “plasticity.” A young brain, in other words, is more vulnerable to damage than an adult brain is. In one sense, I feel fortunate. Whatever damage I suffered from that long-ago head injury, it had little effect on me as I was growing up. I generally did well in school, though in math I got no further than geometry, a subject that I struggled with. It’s possible that my childhood bicycle accident did enough harm to damage my spatial reasoning. My sense of direction has never been good, and these days it is atrocious.
But according to the study I cited, many childhood head injuries come with much more damaging consequences. “Because the prefrontal cortex is one of the last brain structures to mature, it is not surprising that parents [often] report attention deficits, hyperactivity or conduct disorder,” following a concussion, according to the study. That the blow landed on the back of my skull, rather than over a more sensitive part of the brain, may have spared me from worse damage. My consequences, to the extent that I can document them, were a good deal more subtle. I was a high-spirited child, but I did well in school. Yet I do believe that my concussion roughly 50 years ago is at least partly responsible for my short-term memory and executive-function difficulties today. Throw in the countless helmet hits I endured in my seven years of organized football, and there should be little mystery about why I began to experience mild symptoms of Alzheimer’s a few months before my 51st birthday.

Friday, October 7, 2016

Lake Padden

Unlike Walden Pond, where Henry David Thoreau lived for more than two years, Lake Padden isn’t famous. Most people outside of northwest Washington are unaware of it, and that is for the better. As a child and a teenager, I spent more time at Lake Whatcom, the deep, cold, ten-mile-long body of water where I shivered through fruitless swimming lessons and, more than a decade later, learned to water ski. On Lake Padden, motorboats are not allowed. In high school, my friends and I would float on the lake in inner tubes, sometimes with a six-pack attached a few feet below the surface, keeping the beer exceedingly cold. In August, as the football season approached, I would often run two laps around the lake, a distance of just over five miles.
It wasn’t until my junior year in college that I fully appreciated Lake Padden. I’d been having a rough time on our college newspaper, which I’d worked on since the spring of my freshman year. No one questioned my competence as a reporter or an editor, but I was an aloof manager, assuming that the people I was supervising would work as hard as I would. In the fall quarter, when I was a junior, the editor of the newspaper was diagnosed with mononucleosis, and I temporarily filled his place. I thought I was leading by example, but what I was really doing was alienating the majority of my staff. One evening I walked into the newsroom, and I sensed that people had been gossiping about me. The chatter stopped as soon as I appeared.
On the day before the new editor would be chosen, I walked around Lake Padden. It was early March, a damp part of the year. No one else could be seen. The oxygen I was drawing in acted like a sedative. In the solitude of the muddy trail, I realized, for the first time, that I might not be appointed editor. I had no idea of how to respond. It didn’t occur to me that I should have called a staff meeting, with the aim of clearing the air. College is a place where, to one extent or another, most students are not fully mature. I certainly wasn’t, and it showed. But the walk around the lake enabled me to perceive the truth: There was a good chance that I would not be chosen as the next editor—a truth confirmed the following evening.
Nowadays, a walk around Lake Padden’s mild terrain is an essential part of my annual visit to my home town. Walking with me last month were my son Andryc and my cousin Jackie. During the week-plus that we spent in the area, there was only one day with any rain, and I would have felt cheated by drought if we missed out on what the author Timothy Egan called “the good rain”: the mossy cedars, the stately Douglas firs, the lush carpet of needles on the forest floor. Other than the occasional plop of horse poop, this is the most peaceful place I know.

Friday, September 30, 2016

Left behind

About eighteen months ago, on a trip from Boston to Seattle, I left a cherished leather jacket behind as I was going through security. When we were preparing to land, I wasn’t concerned that I didn’t have the jacket in my lap; I assumed that I had stuffed it into the overhead bin before take-off. I had taken a very early flight, as was my preference, to maximize my time on the ground that first day in the Northwest. By the time I was leaving the plane, I was anxious, and the airport authorities were not particularly helpful. It seems that the jurisdiction for lost items between the airline and the baggage handlers was murky. A visit to the lost-and-found turned up nothing, and for good reason: I am now almost certain that I lost that jacket going through security back in Boston.
And that was just the beginning. One day during that trip, I took my mom, who lives in an assisted-care facility just north of Seattle, on a drive up into the foothills of the Cascade mountains to the strangely named town of Index for lunch. The restaurant exuded rustic charm, and I tipped the waitress more than I usually do. She certainly earned it. After I helped my mom into our rental car, and I’d started the motor, and was just about to put the car into reverse, the waitress rushed out with my mom’s purse. Had we pulled out just 15 seconds later, we wouldn’t have learned of the missing purse until we arrived back at the assisted-care home, requiring a second two-hour round trip to retrieve the purse that my mom hadn’t needed in the first place: I was the one with the money.
Later on that trip, on a visit to a retired professor of mine and his wife who live on a gorgeous and remote estate east of Interstate 5, I left behind my baseball cap, which I was relying on to protect my baldness from the sun’s ultraviolet rays. For the second straight day, I bounced along a former logging trail for the roughly mile and a half that led to my professor’s home.
Clearly, by this time, in the spring of 2015, I was experiencing symptoms of Alzheimer’s, though I wasn’t willing to believe it. The return trip to Boston was difficult. In some previous years, my brother Matt had driven me to the Seattle airport, but in this case Matt wasn’t available. I had two choices, neither of which was ideal. To guarantee that I would get to Sea-Tac in time, I would have to catch the 2:50 a.m. shuttle from Burlington, one city over from Mount Vernon, where my brother lives. Instead I chose the 4:50 bus, and I failed to fully account for Seattle-area rush-hour traffic, much worse than when I first started driving in the Seattle area in the late seventies. Still, I was confident that I would catch my flight. And I almost did. There were just two people ahead of me when I learned that no one else would be allowed to board.
The next available direct flight to Boston was that evening. What did I do with myself for the twelve hours ahead of me? I boarded one of Seattle’s gliding commuter trains, which saved me from a vast expanse of boredom at the airport. I spent a couple of hours in Seattle’s futuristic library, an architectural marvel. I walked along the waterfront, with its controversial ferris wheel. I had lunch at Pike Place Market, where I wolfed down an oyster sandwich, and hung out for a while in Pioneer Square, where someone was strumming a guitar. Then I rode the commuter train back to the airport with plenty of time to spare before my overnight flight to Boston. Before takeoff, everyone had to deplane after a half-hearted bomb threat. Most amazingly, I actually fell asleep on a plane for at least three hours without the luxury of space to stretch out over empty seats. I arrived at my home feeling oddly refreshed, despite my lack of sleep. About two months later I learned I had Alzheimer’s.
The good news: On this year’s trip, I left nothing behind.

Friday, September 16, 2016

Is driving necessary?

I’ve been asking myself this question in one form or another ever since I was diagnosed with Alzheimer’s. In California, doctors must report patients to county health departments if their symptoms are “severe enough to impair a person’s ability to operate a motor vehicle.” Massachusetts’ regulations are less clear-cut, but I understand that the writing is on the wall: My days behind the wheel are likely numbered.
Last fall, this wasn’t the case. I was still occasionally driving my daughter to her school, a 28-mile round trip, and I also navigated my way on Route 128 in heavy traffic en route to an evening conference in Waltham. Returning from Thanksgiving in Rhode Island, Paula observed my driving, and I provided her with no evidence that my highway driving skills were in decline. I also had my peripheral vision tested—over time, people with Alzheimer’s are likely to experience difficulty at the margins of their vision, making driving unwise. The test I underwent indicated that my peripheral vision was fully intact.
But Alzheimer’s is a one-way disease; only the pace of the disease is in question. According to my neuropsychologist, who evaluated me just a few months ago, I scored very low in areas such as attention span and short-term memory. And spatial reasoning—how to get from Point A to Point B—has always been a weakness of mine, and it is getting worse. My doctor indicated that if I wanted to continue driving, I should line up a driving exam as soon as possible. But that may not be worth the trouble. In recent weeks, I’ve noticed trouble with parallel parking—an essential skill on a street where most people don’t have access to off-street parking.
And, as I’ve often reminded Paula, I haven’t had an accident since October 1984, when I hit a slippery patch of dead foliage and clipped a car’s rear quarter-panel. I’d like to keep my no-accident streak intact. And the only certain way to achieve this goal is to not drive.
There is a second reason why I’m not particularly disturbed about the likelihood of giving up driving. The autumn before I first visited Boston, I spent the fall term in what was then West Germany, in the ancient city of Cologne, home of the 516-foot cathedral that bombers spared when almost everything else was flattened. My roommate and I lived with a family in a suburb on the east side of the Rhine, but the commuter train was unfailingly on time, delivering us within a short walk of our school. Eight years later, when Paula and I were living in Hamburg, we rarely rode in a car, let alone drove one. The city’s gliding subways were so precisely calibrated that, more often than not, our wait for the connecting train was less than a minute. Sometimes the synchronization was perfect, as if the train conductors were directing an austerely modernist symphony, composed by Philip Glass.

One of the many reasons I wanted to live in Boston in the first place was that the city was presumed to be the most “European” of American cities, with its graceful Back Bay brownstones, a stunningly beautiful state capitol topping Beacon Hill, the quirkiest baseball stadium on the face of the earth, a history that dates to 1630, and, as of 1984, when I first visited, a good deal of neighborhood ethnic strife, spurred by the busing crisis of the previous decade.
And never have I valued my adopted city more than I do now. Technically, I don’t live in Boston. Paula and I reside in neighboring Somerville, which, with roughly 79,000 residents in 4.1 square miles, is one of the nation’s most densely populated municipalities. I can walk to Harvard Square in twenty minutes, but I am rarely aware when I’ve crossed the city boundary—many homeowners and businesses pay property taxes to each city, on a pro-rated basis.
There is no other place where I would rather live. This is not a recent sentiment. I’ve felt so for years, but nowadays I have another reason to love the city. I know it intimately. Brisk walking is one of the ways I get my exercise—bicycling is another—and so long as I stay within familiar territory, there is little chance that I will lose my way.

Note: I will be on vacation for much of the next two weeks, and this will be my last blog post until at least Friday, September 30. I hope to post fresh material within the first several days of October.

Also: On Sunday, Sept. 25, Paula will be participating in the Walk to End Alzheimer's. People were very generous in sponsoring me in the Ride for Alzheimer's research in June. If anyone would like to sponsor Paula in her walk, please visit my Facebook page for a link to the donation site.

Friday, September 9, 2016

A moment to regret

Five-plus years ago, on one of my visits to my hometown of Bellingham, Washington, where I also attended college, I encountered one of my old professors at the local supermarket. His name was Hugh Fleetwood, and his son Seth, now an attorney and former member of the City Council and the Whatcom County Council, was one year behind me throughout school.
In 1981, my freshman year at Western Washington University, I enrolled in Professor Fleetwood’s Introduction to Philosophy course. Did that course totally change my life? No, but it did expand my horizons. I’ve often recalled that course for its intellectual rigor. Was free will an illusion? Did one cause lead to another, ad infinitum, throwing into question the existence of the self, let alone a soul? Was Descartes correct, that the most compelling proof of his own existence was that he was capable of doubting it? I had no great knack for the conundrums of philosophy, but just the fact that people had argued about such questions for so many centuries stimulated me.
After the course ended, I received a letter from Professor Fleetwood, urging me to consider majoring in philosophy. My dad was so flattered on my behalf that he mimeographed the letter and sent copies to our relatives. Neither my parents nor I grasped that the likely reason I received the letter was that the number of philosophy majors was dwindling. I myself was setting out on a more practical double major in journalism and political science. Most of my poli-sci professors had come of age during the 1960s, and almost all of them were on the political left. Being moderately conservative at that age, I sometimes argued with my professors.
During the fall of my junior year, Professor Fleetwood became a controversial figure, at least among student activists. Over the expanse of decades, the details have grown hazy, but one quote from him, uttered in the Faculty Senate, has stayed with me: “Students are not a particular font of wisdom.” Or some other version of the same sentiment. From the uproar that ensued, you might have thought the professor had said that students shouldn’t be allowed to speak at all. He also noted, in a letter to our student newspaper, that the word in question was font, not fount. In 1982, not long after the dawn of the PC revolution, I had no idea what a font was. I don’t think anyone did. In any case, I editorialized in Professor Fleetwood’s favor.
In March 1984, I graduated from Western a quarter early, the better to position me to get my first professional newspaper job. It was in Waterbury, Connecticut. A year-and-a-half later, I moved to Boston. There was no likelihood that I would ever see Professor Fleetwood again. But there I was, on my annual trip to Bellingham, in what I believe was the spring of 2011. And there he was, looking, for lack of a better term, demented. He was leaning unsteadily against his shopping cart. Only later did I learn that his disease was Parkinson’s, not Alzheimer’s—not that the distinction would have meant much to me back then, when I expected to live to at least 86, the age my dad had achieved. There was a fleeting moment when I could have done the gracious thing: Greet him, lay my hand on his shoulder, tell him how much I appreciated his philosophy course.
But I suffered a lack of nerve. There was no possibility that he would have recognized me—by 2011, I scarcely resembled my twenty-year-old self, he had taught thousands of students, and his disease had done its dirty work. But he might have remembered my name, once I stated it. Perhaps I could have expressed my gratitude for helping me widen my intellectual ambitions. But I soon realized that I had flunked the moral philosophy exam. Professor Fleetwood, I recently learned, died a few months later.

Friday, September 2, 2016

High aspirations

In a recent article in New York magazine, journalist Benjamin Wallace described a new kind of aspiration to immortality, one achieved through the pharmaceutical breakthroughs in the years since the human genome was fully mapped in 2003. The article focuses on Leonard Guarente, a biologist who directs MIT’s AgeLab and cofounded Elysium Health, and is a strong proponent of the notion that longevity, through science, could stretch a good deal farther than most people assume. As described in Wallace’s article, Elysium Health has developed a “daily health product designed to optimize support for your most critical metabolic systems.” The claims include helping people with several chronic diseases, including Alzheimer’s. As Wallace put it, the Elysium brand “began pummeling my awareness for weeks, the ads barreling into my Facebook feed with claims of being the world’s first cellular health product.” “Cellular” suggests that the proposed therapy would work at the level of individual cells, rather than in the brain only.
What is particularly noteworthy is that the company bypassed the Food and Drug Administration, “effectively using its customers as human test subjects, sometimes reviewing their FitBit and other health-tracking data to determine if the pill delivers on its promise—or causes unexpected problems.” Wallace himself took part in the trials. “If I were going to trust anyone in a lab coat promising a magic pill to stay healthy longer, Guarente appeared to be a good bet. As the month’s end drew near, I was reluctant to stop taking Basis,” the drug in question. “But what promise!” Wallace exclaimed, predicting that in the next five to ten years, today’s research will bear fruit. He noted that while people won’t necessarily live longer, they might live better, suffering fewer of the consequences of aging.
Last fall I attended a forum on aging, and the figures cited on longevity were similar to what Wallace describes—the world’s oldest woman, who died at 122, was highlighted. Also noted was that babies born last year are expected, on average, to live past 100. But the speakers had little to say about how degraded our planet will have become, if current environmental trends continue.
This led me to reflect on two works of the imagination, one a movie from 1973 starring Charlton Heston, Soylent Green. The fictional year is 2022. The authorities keep a tight lid on rioting, and most of the population is miserable in a climate where global warming has run amok. More relevant is the recently published book by Don DeLillo, one of the most critically acclaimed novelists of the past three decades. To read DeLillo at his best—and his new novel, Zero K, is one of his best—is to perceive reality as it might be just around the corner. The cryonics business—the dream of suspended animation—is flourishing in out-of-the-way places. This is not science fiction; better to think of it as a preview of the near future. In this book, DeLillo’s obsidian-sharp prose is at its best. A key plot point, which I won’t divulge, floored me. And DeLillo is the only writer who could have pulled it off.