Friday, December 22, 2017

The importance of diet

Long before I started recognizing the symptoms of Alzheimer’s, I learned about the virtues of the “Mediterranean Diet.” When I joined the Massachusetts Municipal Association as an editor, reporter and project manager in 2005, I was introduced to the MMA’s insurance wing: the Massachusetts Interlocal Insurance Association—MIIA, for short. One of MIIA’s roles is to promote healthy habits among public-sector workers, such as police officers and firefighters. The premise is that healthier employees help keep down insurance premiums for cities and towns.
During my ten-plus years at the MMA, and a good deal before, I was following good eating habits. My dad was a commercial salmon fisherman in Puget Sound, and during the summers when I was growing up, we ate salmon very frequently. Not surprisingly, as a child, I got tired of eating salmon, and I would have been happier eating a hamburger. But what my Slav grandparents, who died before I was born, bequeathed to me were good genes. Ever since my teenage years, I’ve eaten anything I wanted to, with no more serious consequences than indigestion. Once, in my mid-thirties, my brother described me as “gaunt.” That was an exaggeration. But it is true that I weigh about five pounds less now than I did as a high school football player, when I weighed 145 pounds. I would prefer to be at that weight consistently now, but that’s the drawback of having what I describe as “hummingbird metabolism.”
If there is something like a Mediterranean gene, I would likely have it. And for those of us who have Alzheimer’s disease, consuming olive oil on a daily basis is a good idea. I happen to like the taste of olive oil, especially when I have a fresh loaf of crusty bread from a bakery. I sop up the excess oil with the bread. And when I prepare salmon, or other fish, I always chop up at least one clove of garlic.
What prodded my curiosity about olive oil was research in the science journal Nature: “Over the past two decades, substantial research has recognized that chronic [i.e., steady] exposure to the Mediterranean diet is beneficial with respect to reducing the incidence of cardiovascular diseases and metabolism syndrome.” The Nature article’s main point is that, unlike olive oil, canola oil falls short of the threshold to be useful for people with Alzheimer’s. But it’s worth noting that the price differential between olive oil and canola oil is not radical—at least if you shop at the Market Basket in Somerville, where most items are cheaper than at other grocery-store chains in the Boston area.
And since I left the MMA in the summer of 2015, I’ve had ample time to eat well—and slowly. Of course, it also helps to like the taste of olive oil, as I do. Not everyone does. The “metabolism syndrome” the Nature article mentions encompasses such features as high blood pressure and high cholesterol. More relevant to my purpose, longitudinal studies and clinical trials have shown that adherence to this type of diet is associated with slower rates of cognitive decline and a lower incidence of Alzheimer’s itself. And if you don’t like the taste of olive oil, just regard it as a kind of medicine, with a wealth of research backing its efficacy.

Friday, December 8, 2017

A heterogeneous condition

One of the distinctive aspects of Alzheimer’s is that no two people exhibit precisely the same symptoms. That was one of the key points in a speech by Dr. Brad Dickerson, a behavioral neurologist at Massachusetts General Hospital. I understand Dickerson’s comment as pertaining to the complexity of the human brain and the nervous system, compared to, say, heart disease, which in recent decades has become a far more manageable condition.
Dickerson, this year’s Simons Symposium speaker in Newton, Massachusetts, touted the promise of “precision medicine” as a possibly fruitful means of curbing Alzheimer’s. A key element in this quest is biomarkers, which are defined by one source as a “broad subcategory of medical signs—that is, objective indications of the medical state observed from outside the patient—which can be measured accurately and reproducibly.”
In a practical sense, Dickerson noted that biomarkers can help doctors determine whether patients will stay in mild cognitive impairment or advance into full-blown Alzheimer’s disease. “We actually do a pretty good job with the tools that we have now,” Dickerson noted. And there are now specialized amyloid PET scans involving biomarkers. In some cases, people have a 90 percent chance of developing Alzheimer’s. But if individuals knew well ahead of time that they were at risk of Alzheimer’s, they could change their lifestyles in terms of exercise, diet, sleep habits and other factors.
One question asked of Dickerson concerned the very broad range of life expectancy after diagnosis. And as anyone who has spent time in a support group for people with Alzheimer’s, or in a group for their care partners, knows, the progress of the disease can vary widely. “The bottom line is that Alzheimer’s is a heterogeneous condition,” Dickerson stated. “The speed can be very different among individuals.” The amount of shrinkage in the brain varies widely as well.
Dickerson also noted, pointedly, that no new Alzheimer’s drug has been approved in roughly fifteen years, despite the substantial money that has been raised from both private and public sources. And, as we learned about a year ago, there was hope that an Eli Lilly drug candidate might slow the pace of the disease. That, in itself, would have been a very encouraging step. But that, too, went by the boards. Still, for those of us in a relatively early stage of the disease, there remains hope for a breakthrough.

Friday, November 24, 2017

A bridge to nowhere

When I was a sophomore in college, I took part in a televised debate about the future of the Western Washington University football program. It was the early eighties, and the state, like much of the nation, was mired in recession. Staff layoffs were looming, and some untenured professors were in danger of losing their jobs. For people who didn’t care about football—or found the sport immoral—this was the time to scrap the football program forever. My opponent was Greg Sobel, who at the time was the student body president. While Sobel marshaled his facts and figures, I fell back on an organic conservative argument that I had learned from one of my political theory textbooks. This wasn’t Adam Smith’s economics, concerning the “invisible hand” of the marketplace. It was the thinking of the late-eighteenth-century British statesman Edmund Burke, an astute critic of the French Revolution: Change should be gradual; don’t scrap a venerable institution—in this case, a college football program that had endured at least 80 years but eventually was terminated in 2009. When I debated Sobel under the lights of the television studio, I was almost entirely at ease.
But this post isn’t about football; nor is it about the French Revolution. It’s about how wary I am these days of speaking publicly without a text in my hands. Roughly two years ago, when I was part of a panel discussion about Alzheimer’s in Brockton, Massachusetts, I had no difficulty speaking fluidly in front of an audience. But in November, in a very similar situation, I was feeling anxious. I did bring a prepared statement, but in the spirit of the forum, I was determined to field any questions that came my way. It is no surprise, given that I’m now five and a half years down the Alzheimer’s trail, that I would be experiencing more pronounced difficulties with the spoken word. But I didn’t anticipate just how steadily my train-of-thought function has been deteriorating. This past May, I rode with several friends to upstate New York to see a friend’s play. One of my companions and I conversed for hours. Over the two-day trip I lost my train of thought many, many times.
Much more awkward was a recent incident in Marlborough, Massachusetts, on November 9. Just before the forum, I started writing notes on what I wanted to say, but soon the panel discussion was underway, and I felt ill-prepared. And sure enough, when I launched into my answer, within a matter of 15 or 20 seconds, I lost my train of thought. I’m not sure anyone saw my face go red; I could certainly feel it. I was now on a bridge to nowhere, in front of perhaps 200 people. I did manage to get back on track, and I even drew some laughs when I imitated the electronic male voice that spoke to me when I was trying to add more credit to my subway pass: You’re running out of time. You’re running out of time. Still, the experience in Marlborough was humbling.

Friday, November 10, 2017

Memory concerns? Read this book

I wish that the book Seven Steps to Managing Your Memory: What’s Normal, What’s Not, and What to Do About It had been in print during the second half of 2013, when I was first showing prominent symptoms of early-onset Alzheimer’s. It would have saved me a lot of anguish over the next couple of years. Sure, either way, I would have ended up with a diagnosis of Alzheimer’s. But by reading “Seven Steps,” I could have learned what was likely ahead of me, and dispelled my worst fears.
Instead, I found myself in a downward spiral. Neither my editor nor my department head could account for my poor performance. And how could they have? For more than eight years I had earned solid performance reviews, but now I was struggling. No one at my workplace suggested that I might have Alzheimer’s or some other form of dementia. It was all beyond our comprehension.
The book, recently published, is exceedingly practical. The authors are Dr. Andrew E. Budson and Maureen O’Connor, an assistant professor at Boston University’s Alzheimer’s Disease Center. The book features many fictional dialogues involving four characters: an eighty-year-old woman, Sue, who ends up diagnosed with Alzheimer’s; Jack, Sue’s husband; Sara, Jack’s daughter; and Sam, a friend of both Sue and Jack. All of the fictional characters, with the exception of Sara, are 72 or older.
By far the most relevant information for me came in a chapter titled “What Kinds of Memory Problems Are Not Normal?” Starting in the late summer of 2013, I had pronounced difficulties, both in my job and in other facets of my life. My professional shortcomings were duly noted by my employer, with my performance evaluation changed from the standard annual review to one review every three months. Even in my leisure time, outside of our home, I was often anxious. One night in August I was planning to meet a friend at a prominent pizza chain ahead of attending a Red Sox game. I was having problems with my cellphone (it wasn’t the phone I normally used) and my anxiety was rising. Worse, I mixed up Bertucci’s with Pizzeria Uno; both establishments were close to Fenway Park. We still managed to get to our seats before the first pitch was thrown, but my enjoyment of the game, which had playoff implications, was tempered by my anxiety.
Before discussing dementia directly, the authors sketch the role of the hippocampus, the seat of short-term memory, which has two sides, one in the brain’s left hemisphere and the other in the right. The two sides have somewhat different roles, with the left specializing in verbal and factual information, and the right in nonverbal and emotional information. Other topics include forming long-term memories. As one section title declares, “You Need To Pay Attention To Form A Memory.” Otherwise the memory won’t endure. In this first chapter, the authors also bring up false memories, which are common among people with dementia. These aren’t necessarily fantasies; in my case, I think of them as errors of fact.
Significantly, Alzheimer’s typically attacks the hippocampus first, the seat of short-term memory. As the authors ask, “Why does Alzheimer’s cause rapid forgetting? Because Alzheimer’s damages the hippocampus, where new memories are formed and stored. So in Alzheimer’s, even if the frontal lobes are taking in new information related to episodes of our lives and sending it to the hippocampus, new memories are not forming (or are imperfectly forming) because the hippocampus is damaged.”
Other “Steps” covered in this very practical book include “What Can I Do To Strengthen My Memory?” and “Which Memory Aids Are Helpful?” For me, the answer is that I am now taking daily supplements of vitamin D, along with vitamin B12, which I’ve been taking daily for at least a year. I gather that the efficacy of these supplements is limited. But, as far as I can tell, there is no downside to these supplements—unless you can’t stomach the dosage, as I recently learned. Yet when you’re facing an adversary like Alzheimer’s, you should do whatever you can, even if the upside is marginal.

Friday, October 27, 2017

Into the night

Like many people who are diagnosed with early-onset Alzheimer’s, I really had no idea what to expect. Would I be just a shadow of my former self within a couple of years, reduced to struggling with the most menial tasks? In calmer moments, I would recall that I had been living with the symptoms of the disease as early as the spring of 2012, almost three years before I received my diagnosis. Soon after, I learned of Greg O’Brien and his book On Pluto: Inside the Mind of Alzheimer’s. For me, the book’s timing was perfect. Like O’Brien, I am a former journalist, and we both know how to tell a story.
But one element of O’Brien’s story has dismayed me: his rage. At first, I thought his anger was just part of his personality, and having Alzheimer’s caused him to lose his cool more often. Growing up, I often lost my temper, a trait that descended from my grandmother and mother. Outside of my family, I rarely blew my top, but there were exceptions. Playing in a Babe Ruth League tournament in Camas, Washington, not far from the Columbia River, at age fifteen, I threw a tantrum in front of at least 100 spectators, almost all of whom were hostile to me and my team. The flashpoint came when I was at bat. The pitcher was right-handed, and his pitch was a curveball, high and inside. As I curled away reflexively, the pitch struck my shoulder. I assumed the call would be “hit by pitch.” But the hometown umpire ruled that I had swung at the pitch, for strike three. Then I exploded. Before my coach could restrain me, I wheeled around and gave the hostile crowd the finger. This, of course, was a case of pouring gasoline on the fire. But that episode was singular. Once I got past my sophomore year of high school, my temper moderated.
That’s not to say that these days I am a mellow personality. But major outbursts have been infrequent during the five-plus years since I first noticed a decline in my short-term memory. The first major blow-up occurred in the summer of 2014, about a year ahead of my diagnosis. My wife, Paula, and I were driving to Winchester, Massachusetts, about six miles from our home in Somerville. Our destination was a swimming pond, where we had a membership. The topic of the argument that broke out, oddly, was laundry. We live in a two-unit condominium, and our downstairs neighbors share the washing machine and dryer with us in the building’s basement. We store our dirty laundry in a closet in the master bedroom. We do the laundry twice a week, Sundays and Wednesdays, and we separate the clothes into colors and whites. For the first eighteen years in our home, the process went smoothly.
But then Alzheimer’s entered the equation. The result was that Paula and I had a fierce argument about laundry. I complained that Paula’s system was too complex, like something Franz Kafka might have conjured up. The real problem in the system was my cognitive decline. Our dirty laundry resides in the master bedroom’s closet, three flights above our laundry facilities. Sometimes, by the time I reach the basement, I have forgotten Paula’s directive. Take the dried clothes out of the dryer? Simple enough. Use the correct amount of detergent? We only need a concentrated teaspoon. But some garments are more delicate than others. Dark fabrics must be separated from whites. My teenage daughter much prefers soft hand towels to stiff ones.
A more recent blow-up took place in September 2016, when I was organizing a trip to the Northwest with my adult son. Paula and I argued about whether he and I should rent a car once we were on the ground in the Northwest. My argument was that I knew the roads and highways of my home state very well, and there is less traffic in northwest Washington. Paula’s concern was that I might damage the rental car, boosting our insurance rates. Paula and I got into a shouting match about the issue on the way to our support group meeting at the Alzheimer’s Association, and we had to pull over until I calmed down. Within a matter of weeks, after talking to my doctor, I voluntarily turned in my driver’s license, with no deep regrets. I hadn’t had an accident since 1984, and I wanted to keep it that way. And in the dense community I inhabit, I can do most of my errands on foot or by bicycle.
This, after all, is prologue to Greg O’Brien’s forbidding article about Alzheimer’s and rage. Here is what O’Brien has to report from the middle of Alzheimer’s three stages. Channeling the mid-twentieth-century poet Dylan Thomas, best known for his poem “Do not go gentle into that good night,” O’Brien describes what happens as “sundowning”—increased agitation, often in the late afternoon and evening—takes hold: “The darkness can … be a time of great confusion. This also occurs often when neurons are not connecting properly, and those of us in the disease lash out unexpectedly, often far outside the traits of our personalities—cursing, screaming, hurling insults, even throwing objects without warning, like cellphones, utensils, shoes and other things, as I have. Those of us on this journey are not stupid, callous, uncaring, or intentionally unhinged: We just have a disease we can’t control. Such behavior can be initiated by loud, throbbing noises, penetrating confusion, excruciating stress, primal fear, cerebral numbness, paranoia, or all of the above.”
Well said, Greg O’Brien. And let me thank you for serving as Virgil to my Dante. But where do I go from here, once I myself have reached the disease’s unwelcome middle stage?

To read O’Brien’s article in its entirety, do a search for “Greg O’Brien rage.”

Friday, October 13, 2017

Things I don't know

There are known knowns; there are things we know we know; we also know there are unknowns; that is to say, we know there are things that we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know.

—Donald Rumsfeld, commenting on the run-up to the 2003 invasion of Iraq.

Within a couple of months after I was diagnosed with Alzheimer’s in June 2015, I had a revelation. My understanding, at that time, was at a rudimentary stage. Previously, I’d assumed that the only way I could have developed the disease was to inherit it from someone within my family tree. After examining both sides of my extended family, my Anglo side and my Croatian side, I could find no obvious culprit. Two of my four grandparents lived into their sixties, with no signs of dementia. Another died young, in a logging accident, but his parents each lived well into their nineties. My paternal grandmother, who grew up on an island in the Adriatic Sea, died in middle age, from complications of a childhood illness in an era before the advent of antibiotics. But given the link between diet and Alzheimer’s, it’s unlikely that my paternal grandmother would have been carrying the APOE4 gene, which heightens the risk of developing Alzheimer’s. The disease appears to be less common in the Mediterranean region than in northern Europe or North America.
Then I had an epiphany. I recalled that I’d been knocked out for about fifteen minutes when I fell off the back fender of my brother’s bicycle at the age of six or seven, in an era when almost no cyclists wore helmets. But, recently, I was disabused of the notion that my single concussion could have accounted for my Alzheimer’s diagnosis. The loss of consciousness would have had to last at least 30 minutes to cause the kind of brain damage that could lead directly to Alzheimer’s many decades later. And, recently, my mom confirmed that I was unconscious for only about fifteen minutes. What that concussion probably did do was leave me with a permanently poor sense of direction, but no more dire consequences.
A more plausible thesis, though, is that “subconcussions,” a relatively recently understood form of head trauma, could account for my dementia. In the 1970s, when I was playing football, coaches often sanctioned “live” hitting almost every day of practice. And I might have been particularly vulnerable because of my small stature (five-foot-seven, 135 pounds, as a high school sophomore). By the time I was a senior I had largely matured physically, but perhaps the damage had already been done.
Recently, I reached out to Shannon Conneely of Boston University’s CTE Center, to see whether my appropriate diagnosis might be CTE—short for chronic traumatic encephalopathy. Conneely noted that CTE’s symptoms tend to appear in one’s late forties, while early-onset Alzheimer’s (my diagnosis) typically begins in one’s fifties or later. The starkest difference between the two diagnoses is that people with CTE are much more likely to have weak impulse control and poor judgment, which can lead to aggression and, in some cases, suicide.
If I had to speculate on how I ended up with early-onset Alzheimer’s, I would go with this: Being on the small side, even in high school, I compensated with over-the-top aggressiveness. I knew how to bring bigger players down by chopping them at the knees. And I sometimes “saw stars” when I banged heads with another player, producing a sparkler effect at the margins of my vision. Decades later, soon after I was diagnosed with Alzheimer’s, I was contacted by my former football teammate Steve DeWitt, who was one year ahead of me in high school. Like me, Steve had no known family history of Alzheimer’s. At the time, it seemed obvious that my diagnosis was the result of high school football. Roughly 40 years after we’d last been in contact, we were each diagnosed, a single year apart. The gestation periods of the disease were virtually identical, even accounting for Steve being marginally older.
But my wife, Paula, offered another suggestion: Sometimes, in the realm of Alzheimer’s, things just happen, with no obvious cause. That could be the case. Still, if I were betting, I’d bet on head trauma, in the form of subconcussions. When Steve and I were playing football in the 1970s, no one had heard of this phenomenon. I expect to learn more about subconcussions in the years ahead. But for now I have to resign myself to the fact that there is no way to pin down the cause of my dementia with certainty. Maybe that is for the better. My focus, after all, remains on doing whatever I can to hinder my disease’s progress, through daily exercise, good nutrition, good sleep habits, and staying engaged mentally and socially. For now, short of a medical breakthrough, this is all we can do.

Friday, September 29, 2017

A passion for words

Throughout middle school and especially high school, I tended to be an underachiever. Both of my parents were teachers, so I couldn’t claim that I lacked the resources at home to do well. With the exception of my senior year, when I failed to make the cut for the varsity basketball team, I was engaged in a sport in every season. And, not coincidentally, it was that year that I became more focused on my classes. The book that triggered my transformation was 1984, George Orwell’s famous dystopian novel in which love and intimacy are regarded as crimes against the state.
Urged on by my high school football coach, I tried out for the Western Washington University team as a defensive back, but from day one I sensed that I was out of my league. The Western Washington football team was no powerhouse in those years, but many of those young men were huge. And when a flamboyant wide receiver burned me on a long pass completion during a scrimmage, he shouted, “Hey, Coach! Hey, Coach! I just burned your DB! I just burned your DB!” Yes, I got burned on that pass route, and that was certainly a blessing in disguise. If my history of injuries in high school was any guide, it seems likely that I would have gotten hurt frequently on the college team. And even after a summer of weight training, I weighed only 150 pounds.
I quickly redirected my energies. Unlike in high school, when I was often indifferent to my grades, I now took them seriously—maybe too seriously. Every academic quarter I looked forward to going to the university bookstore to buy my textbooks. Astronomy, to gaze into a dark winter night and contemplate the vastness of the stars? Rocks for jocks? Boring? How so? The fossil record intrigued me. How cool! Or, should I say, how molten? To have a sense of how old the earth is? And the eons before history, the small-brained reptiles, devoid of mother’s milk. And at the end of it all, us: Homo sapiens, humanity.
The significance of the wheel. The mute, inscrutable pyramids. Socrates imbibing the hemlock. The misnamed “dark ages.” And then Columbus, not just reaching a new land, a new continent, but carrying with him an old-world collection of microbes against which the native populations had no defense. Lust for Mexico’s gold. Martin Luther, fracturing western Christendom, leading to thirty years of religious war. Plimoth Plantation, suffering deeply, almost to extinction. The Declaration of Independence. Gettysburg. Chlorine and mustard gas. The Great Influenza of 1918. Stalingrad. Hiroshima. Stalemate in Korea. Quagmire in Vietnam. The American century.
My vocabulary, meanwhile, was rapidly growing. In my reading, I made a point to write down unfamiliar words. In the summer after my freshman year, I chose to read Moby-Dick. Melville’s themes went over my head, but I liked the action scenes. One of my favorite words I learned from Melville was ostentatious; he applied it to the whale itself and its terrible flukes. Elsewhere in that capacious book, I learned about cetology, the science of whales. Sometimes I would misuse words: I accused my dad, during an argument, of being fastidious—which, decidedly, he wasn’t. Also in Moby-Dick, I first encountered the word audacity—a more elegant version of “boldness.” While reading a newspaper account of a Vietnam veteran, I learned the word premonition, a hunch that something bad was about to occur. Then the bomb went off.
Within this framework, I began to think for myself. And, in doing this—acquainting myself with history and literature and philosophy in my college years—I apparently developed a partial prophylactic effect that has helped me forestall Alzheimer’s progress. The term for this is “cognitive reserve,” and it remains somewhat of a mystery. There are two facets to this. Idea density reflects the ability to pack a lot of information into a small space, a useful skill for journalists in particular. A related phenomenon is syntactical sophistication. If you’re familiar with the prose styles of Ernest Hemingway and William Faulkner, you can infer that Hemingway’s prose reflected “idea density,” while Faulkner’s embodied “syntactical sophistication.”
This information is worth repeating because cognitive reserve appears to protect the brain to a significant extent, particularly among people with high levels of formal education. Why this is true is still something of a mystery. Perhaps one’s early twenties are a key period: it is now known that brain development continues into one’s mid-twenties or later. Unfortunately, cognitive reserve won’t protect people to a ripe old age. The phenomenon is finite.

Friday, September 15, 2017

Is Namenda ‘neuroprotective’?

A couple of years before I was diagnosed with Alzheimer’s, one of my doctors, a gerontologist—I was 53 at the time—prescribed Aricept. The generic name is donepezil, and it’s likely that my gerontologist strongly suspected that I already had Alzheimer’s or some other emerging form of dementia. In the ensuing months, I underwent a neuropsychological examination and, despite my eroding short-term memory, I did well. When the neuropsychologist asked me to discuss a current event (this was the early spring of 2014), I chose the Putin regime’s meddling in Crimea—a region over which Russia and other great powers have vied for influence for centuries. I didn’t say it quite that way—I would have been more articulate in writing—but the doctor pronounced that I was “functioning at a high cognitive level.”
My relief was significant but premature. What the doctor was subtly conveying was that despite my weak short-term memory, I was otherwise doing quite well. Later I learned about “cognitive reserve,” the phenomenon that benefits some people with Alzheimer’s, particularly those with high levels of formal education. While cognitive reserve can be helpful, it can’t confer much protection in the short-term memory sphere, which is typically the first realm of the brain to show decline. A few weeks before I was diagnosed, I had a premonition. I was discussing something with my editor, and I was struggling to grasp what he wanted me to do. I didn’t quite formulate my thought this way, but the sense was, this is what dementia is.
The gerontologist delivered the bad news a few weeks later. That she had prescribed Aricept, the most widely prescribed medication for Alzheimer’s, before I was diagnosed, spoke to her confidence that I already had the disease. But Aricept did not sit well with me. Perhaps I didn’t give Aricept a chance. But after having an intense nightmare that had an almost psychedelic quality to it, I decided to leave my Aricept in the medicine cabinet. For a time I took the drug at a reduced dose, but I decided to go down a different road. By now, I’d learned that regular daily exercise—in my case, swimming, vigorous walking and cycling—could help slow down the disease’s pace. And, no longer employed, I have time to exercise. The exercise, in turn, led to good sleep habits. Sometimes I disturb Paula’s sleep, but that’s another story, still in progress.
A couple of weeks ago, though, my gerontologist noted that another Alzheimer’s drug—memantine—has shown some efficacy in clinical trials. The brand name is Namenda. According to the website drugs.com, Namenda “reduces the actions of chemicals” in the brain that may contribute to the symptoms of the disease. But if you have the patience, there is a little nugget waiting to be exhumed. The research literature begins discouragingly, pointing out that there is scant evidence that this drug can help people still in the mild or moderate stage of the disease. But one study found a “very small but statistically significant effect” from Namenda on cognition over six months. The basic question remains whether Namenda can help people in the early and middle stages of the disease.
According to a study with 431 participants, there was “no substantial benefit” from taking Namenda. If this is progress, I must be blind to it. But there is at least one thing in Namenda’s favor, at least for me: In the two weeks since I started taking Namenda, I’ve continued to sleep well. That was not the case with Aricept. Nor have I experienced significant dizziness, said to be a common side effect of Namenda. But to be honest, I am not optimistic. Researchers have been studying Namenda since at least 2003, and there is as yet no hard evidence that people like me can benefit from this particular drug. It’s a stretch to call that “progress.”

Friday, September 1, 2017

Is football in decline?

Not in financial terms. Overall revenue for the 2016 NFL season was around $13 billion. That’s about $3 billion more than that of the second-most lucrative North American sport, Major League Baseball. And major-college football, especially in the Midwest and below the Mason-Dixon line, continues to draw enormous crowds as well as lucrative sponsorships.
What I’m most interested in is football’s appeal at the grass roots. When I was growing up in the seventies, 90 miles north of Seattle, organized team sports were basically limited to the big three: football, basketball and baseball. I played all three. Hockey was a club sport. There were signs, however, by the late seventies that soccer, the most popular sport globally, was establishing a beachhead. A coach and teacher at my middle school, Dominic Garguile, went on to a highly successful run coaching the women’s soccer program at my alma mater, Western Washington University, putting soccer on the local sports map.
A friend of mine, a football teammate for seven years, dislikes soccer. He once complained that soccer was siphoning off potential football players. I understand his grievance, but I don’t share it. Like my friend, I have a deep, if ambivalent, attachment to football. As a seven-year-old, watching Super Bowl III, in January 1969, I was moved to tears when the great Johnny Unitas couldn’t rally the Baltimore Colts against the upstart New York Jets and their “mod” quarterback, Joe Namath, the first quarterback celebrity. A few years later I hitched my wagon to the Pittsburgh Steelers, who were just emerging from roughly 40 years in the NFL wilderness. In the seventies, the Steelers turned the tables in a big way, winning four Super Bowls in six years.
By that time, I’d earned a bit of football glory of my own, playing a significant role in our victory over our rival high school, boosting my confidence to the point that I believed, if I remained disciplined, I might achieve something substantial with my life. My aim was to establish myself as a prominent journalist. A few years later I took a left turn, abandoning journalism to devote myself to the writing of fiction. For a year or two ahead of the millennium, I barely watched any football at all—not even the Super Bowl. But living in the Boston area, how could I resist the alchemy between Bill Belichick, a dour football genius, and Tom Brady, likely the most poised athlete on the face of this earth? Throughout our young century, the Patriots have won almost unceasingly: five Super Bowl victories, while almost always qualifying for the playoffs.
But football, as we’ve learned in recent years, has a dark side. I can’t prove that my diagnosis of Alzheimer’s is connected with head trauma. But given the lack of evidence of Alzheimer’s in my family tree, it seems probable that a serious childhood concussion, followed by seven years of organized football at a time when head trauma was not nearly as well-understood as it is now, accounts for my diagnosis.
It may be instructive to view football through a Red state–Blue state lens. Certainly football is especially popular below the Mason-Dixon line, the fault line of American politics going back well before the Civil War. On the other hand, the birthplace of professional football—Ohio—is in the North. A more accurate gauge might be the nation’s roughly 3,000 counties. In affluent, more educated counties, I suspect, parents are likely to be less eager to expose their children to football’s hazards—especially given what we now know about the consequences of untreated concussions. Of course, football isn’t the only sport to spawn concussions: thanks to youth soccer’s popularity, many young soccer players are suffering concussions as well.
And if you are my age or older (I’m 56), you may recall that heavyweight boxing—particularly the bouts between Joe Frazier and Muhammad Ali in the 1970s—made for great sports theater. Part of the allure was Ali’s magnetism, but there was a panoply of other boxers with large followings. Without resorting to Google, the only contemporary boxer I could think of was Floyd Mayweather. But going back to the 1930s, when my dad was a teenager, young men in many communities were expected to have basic boxing skills. And we’re not talking about the streets of Brooklyn. This was Bellingham, Washington, not a particularly tough place to grow up, either in my era or my dad’s.
Here in Massachusetts, the city of Brockton’s moniker is the “City of Champions.” The name alludes to two famous boxers who grew up there: the heavyweight champion Rocky Marciano, who died in a plane crash in 1969, and “Marvelous” Marvin Hagler, who dominated the middleweight class throughout most of the 1980s. Further buttressing the city’s sports legacy, Brockton has won many state football championships over the decades. But in recent decades, Brockton has struggled as a community. According to data from my former employer, the Massachusetts Municipal Association, Brockton ranked ahead of only 28 cities and towns in “income per capita,” a key statistic for gauging a community’s wealth or poverty. And being well-known for football or boxing isn’t likely to make much of a dent in local economic terms.

Friday, August 18, 2017

Executive dysfunction

If you’re a normal, functioning person, you may not be acquainted with the term “executive function.” The key word is “executive,” as in the sentence, “Executives make more money than their employees do.” During the seventeen years I worked in conventional office jobs (1998-2015), I became acutely aware of my strengths and weaknesses. After a long period in academia in my late twenties and thirties, usually up to my neck in work without much compensation, let alone job security, I was hired as a copy editor at a company that was just beginning to expand. My timing couldn’t have been better—or luckier. It was the only job of mine in which I felt I was being overpaid.
After the dot-com bubble collapsed in March 2000, things slowed down, but not for long. My boss had the judgment to sell the company at an opportune time, and he generously rewarded those of us who had been around for a while and had a good deal of responsibility. The new ownership was quite different. From day one, I sensed that my job was in jeopardy. We had been sold to a conglomerate called Wicks—ironically, a private equity firm, the very subject of our publications. The irony quickly grew thin. It was clear that my function, as a copy editor, could be performed elsewhere within the wider organization; in the emerging world-is-flat economy, my role could be handled within Wicks’ central office. When I introduced myself to the new chief executive, he happened to be eating from a bowl of cereal, and showed no sign that he wanted to become acquainted with me. At home, I had a two-year-old daughter, and Paula’s income at the time was scant. I managed to survive for another two years, by which time the company had been sold again, this time to Dow Jones—a much more reputable organization than Wicks, but one that still left me at risk of seeing my job eliminated. It was not until my going-away dinner in December 2004, after I’d secured a job in downtown Boston, that I learned that Dave Barry, my supervisor, working behind the scenes, had saved my copy-editing position from elimination. Dave’s argument was that my editing was essential to the quality of our publications.
Only once I started my new job, in downtown Boston, at the Massachusetts Municipal Association, did I fully grasp just how easy my previous role had been for me. During my seven years working west of Boston, I rarely worked more than eight hours a day. And editing had always come easily to me. A colleague of mine once introduced me as “Super-copyeditor.” My job was tedious, yes, but stable. But outside of writing and editing, my range of professional skills was limited. And that was the case long before Alzheimer’s introduced itself. My dad, as I learned many years after the fact, struggled as a junior high school teacher; his organizational skills were subpar. But in a semi-retirement that lasted decades, he became an extremely popular substitute, even among the middle school kids.
An accurate statement in one of my later performance evaluations at the Massachusetts Municipal Association, where I worked for just over a decade, was that I tended to work in isolation. Even if my problems had not grown into Alzheimer’s, I still might have struggled with complex administrative tasks, such as managing our 200-page-plus municipal directory.
And, no surprise, these days, two years after my diagnosis, I continue to observe the erosion of my executive-function skills. My cooking, for example, is limited to simple recipes, such as rice and beans, and seafood that I can prepare in a Pyrex oven dish. And I’ve always had a poor sense of direction—possibly caused by the concussion I suffered around the age of six, after which I didn’t come to for at least fifteen minutes.
The likely consequences of that one concussion, a half-century ago, include what appears to be a permanently compromised sense of direction. Once, in Brooklyn, in 1985, I led two friends, one of them limping, into the heart of the Bedford-Stuyvesant neighborhood, back when Billy Joel’s song was code for don’t go there. These days, I have learned, the message is, You’ll never afford the rent.

Friday, August 4, 2017

Stay engaged

The brain, like other organs, tends to show some wear and tear as a person ages. The aging can be particularly disconcerting if you fear your condition could bloom into dementia. A Harvard Medical School publication I recently dredged up delineates the steps that people in their sixties and beyond can take to maintain good mental hygiene.
As many people understand, regular exercise brings multiple benefits. A 2010 study involving more than 13,000 women indicated that those getting the most exercise at age 60 were roughly twice as likely to become what the article described as “successful survivors”—those who live beyond 70 “without developing cognitive, physical, or mental health limitations or a major chronic health problem.” Those women also scored higher on “executive function” skills—being able to prioritize decisions and then implement them. Men’s exercise habits were studied as well. One finding: “Older men who walked less than a quarter-mile per day were 1.8 times more likely to develop dementia” than those who walked more than two miles each day.
As the article makes clear, exercise can benefit people in many ways. Not only is exercise good for the lungs; “people who have good lung function are sending a higher volume of oxygen through their blood vessels and into their brains.” A second factor is that exercise can reduce the risk of stroke, high blood pressure and high cholesterol—factors that raise the risk of dementia. Most interestingly, the article suggests that regular exercise can boost neurotrophins—“substances that nourish brain cells and help protect them against stroke and other injuries.”
The article also highlights the importance of lifelong learning. One major misperception about Alzheimer’s is that once you’ve been diagnosed, you’re not likely to learn much that is new. The Alzheimer’s Association makes a point of arranging outings to art museums and theater performances, among other activities. And I remain a voracious reader. I meet with a friend on a regular basis to discuss books. Occasionally I will read from one of my books written in German. This is toilsome. But it’s another way to exercise my brain. After an hour of reading German, I’m mentally exhausted, and ready for sleep.
Writing, of course, is essential to me. Ditto for reading. To draw an analogy: reading is to writing as weightlifting is to muscle mass. To some extent, I am what I have read. Other valuable advice from the article includes staying socially engaged. One study from 2008 found that “the higher the individual’s level of social interaction, the better their mental function.”
One topic—managing stress—is particularly relevant to me. In my last two years of employment, I was frequently anxious, and for good reason: I knew that my job was hanging by a thread. Once I learned that I had Alzheimer’s, I started to relax. But it took a full week or more for me to decompress and feel like my normal self again.

Note: My next post will appear on Friday, August 18.

Friday, July 28, 2017

Where a bad gene may be a good thing

My neighbor Pagan Kennedy is a multifaceted writer. When I met her two decades ago, she was writing mainstream novels, one of which received the Orange Prize, a major honor for women writers in the United Kingdom and other English-speaking nations. In one of her first nonfiction books, The First Man-Made Man: The Story of Two Sex Changes, One Love Affair, and a Twentieth Century Medical Revolution, Kennedy went where other writers hadn’t.
These days, Kennedy is filing articles about science, technology and innovation for the New York Times, and her latest article concerns dementia. In “An Ancient Cure for Alzheimer’s?” Kennedy suggests that indigenous populations in Bolivia could lead researchers to a new understanding of Alzheimer’s, and perhaps, eventually, to a cure. Back in 2011, the anthropologist Ben Trumble spent extensive time in the Bolivian jungle, collecting vials of saliva from tribesmen to gauge their testosterone levels; in return, Kennedy noted, Trumble agreed to field-dress their kills. The aim was to see whether, absent industrialization, outcomes concerning dementia might be different.
Trumble himself was touched, indirectly, by Alzheimer’s. He learned that a favorite uncle had been diagnosed with Alzheimer’s, and was declining rapidly. The uncle died in 2015. As Trumble commented, “We know almost nothing about how dementia affected humans during the 50,000 years before developments like antibiotics and mechanized farming.”
Researchers understand that Americans who carry two copies of the APOE4 gene are ten times likelier to develop late-onset Alzheimer’s than the Tsimane people. And here’s the surprise: “The Tsimane people have the cleanest arteries that have been studied.” Even more counterintuitive: “Many of those with an extra copy [of the gene] seemed to do better on the cognitive tests.” As Kennedy speculated, “Perhaps the APOE4 gene provided a survival advantage in ancient environments.”
Back at his office at Arizona State University, Trumble discovered what appeared to be a large pimple on his nose. But when the growth continued to enlarge, he recognized it for what it was: a flesh-eating parasite. As Kennedy noted, “Chemotherapy saved his nose, and perhaps his life.” Trumble went on to review the data from the Tsimane volunteers. “Sure enough, he found that the Tsimane with infections were more likely to maintain their mental fitness if they carried one or two APOE4 genes. For them, the ‘Alzheimer’s gene’ provided an advantage. For the minority who’d managed to elude parasitic infection, however, the opposite was true.”
Kennedy speculated that the APOE4 gene served as a means of survival in a prehistoric period. “Today only about a quarter of us have a single copy of the APOE4 gene, and only about two in a hundred carry a double dose. But DNA analysis of ancient environments shows that thousands of years ago, the APOE4 genotype was ubiquitous.”
Given my shallow understanding of genetics—I did poorly in high school biology—I suggest that those of you who are interested in the topic google “Pagan Kennedy ancient cure for Alzheimer’s.” The article is worth reading in its entirety.

Friday, July 21, 2017

Unreliable narrators

Those of you who were English majors likely recall the term “unreliable narrator.” If you’re not familiar with it, here’s one durable definition, coined in 1961 by the literary critic Wayne C. Booth: a fictional narrator whose credibility has been compromised. I was frustrated recently while trudging through a 500-page novel by the prolific Victorian-era writer Anthony Trollope; I found the unnamed narrator irritating—and, more to the point, confusing. A better-known example, written a century later, is Vladimir Nabokov’s Lolita: a novel whose unreliable narrator is obsessed with a twelve-year-old girl. And unreliable narrators don’t need to be fictional. One of them is the president of the United States. More to the point, I have at times shown signs of being an unreliable narrator myself.
This week I revisited a post from November 2015, titled “A kitchen accident.” At that time, I was more confident in my cooking skills, and I was looking forward to doing more of the cooking, as a means of taking the pressure off Paula. That afternoon I rode my bike fifteen miles, and the release of endorphins made me feel exceedingly calm. But that evening’s dinner, a pumpkin melange, didn’t turn out so well. I recall the difficulty of using a paring knife to remove the pumpkin’s rind, and at one point the knife slipped under my right thumb’s fingernail, drawing blood and stinging pain. Then things became murky. The Dutch oven I was cooking in had a large Pyrex top, and rather than setting the top aside, I set it directly on a back burner, which was in the process of becoming cherry-red hot. When I probed the lid with a dinner knife, the lid disintegrated. After that, time slowed down, as in a nightmare.
That was my version, anyway. Paula’s version was different, and, not surprisingly, hers was the accurate one. I had written in my blog that Paula was in the kitchen when the Pyrex top shattered. But that made no sense. If Paula had been in the kitchen with me, she would have been the one dealing with the shattered Pyrex top.
The upside, if there was one, came a couple of days later, when I met with my neuropsychiatrist: she assuaged my concerns that I had experienced a hallucination. That, of course, was welcome news.

Friday, July 14, 2017

A temporary constellation

As I mentioned in my previous post, the writer I most admire of my generation of novelists is Jonathan Franzen, whose sprawling novels evoke the ambition of nineteenth-century writers such as George Eliot. Only recently did I learn that Franzen’s father had died from Alzheimer’s.
Writing in 2001, the same year he published his first major novel, The Corrections, he commented in a New Yorker essay that Alzheimer’s is “a disease of insidious onset.” But, he continued, “The problem was especially vexed in the case of my father, who was not only depressive and reserved and slightly deaf but also taking strong medicines for other ailments. For a long time it was possible to chalk up his non sequiturs to his hearing impairment, his forgetfulness, his depression, his hallucinations, to his medicines; and chalk them up we did.”
A meticulous observer, Franzen duly noted that his father’s brain weighed 1,225 grams. This suggests that Alzheimer’s had done its work thoroughly: a typical healthy brain weighs in the range of 1,300 to 1,400 grams. According to the research at the time, “The brain is not a photo album in which memories are stored discretely, like unchanging photographs.” Instead, a memory is “a temporary constellation of activity—a necessarily approximate excitation of neural circuits that bind a set of sensory images and semantic data into the momentary sensation of a remembered whole.” Franzen went on to comment, “The human brain is a web of a hundred billion neurons, with trillions of axons and dendrites exchanging quadrillions of messages by at least 50 different chemical transmitters…. The organ with which we observe and make sense of the universe is, by a comfortable margin, the most complex object we know in the universe. And yet it’s also a lump of meat.”
And, in the central thrust of the article, Franzen stated, “I’ve come to believe, then, as I try to forgive myself for my long blindness to his condition, that [my father] was bent on concealing that condition and, for a remarkably long time, retained the strength of character to pull it off.” Referring to Alzheimer’s as a disease of classically “insidious” onset, Franzen commented, “Since even healthy people become more forgetful as they age, there’s no way to pinpoint the first memory to fall victim to it.” I’m not sure that’s correct. Thanks to a journal I keep, I’m certain that my short-term memory decline was in progress as early as the spring of 2012.
But back to Franzen’s father. As long as the elder Franzen was still working, he and his wife “enjoyed autonomy in the respective fiefdoms of home and workplace.” But after the father retired in 1981, the marriage became strained. A letter from Franzen’s mother in 1990 suggests cognitive decline: “Last week one day he had to skip breakfast time medication in order to take some motor skills [tests] at Washington University where he is in the Memory & Aging study. That night I awakened to the sound of his electric razor, looked at the clock & he was in the bathroom shaving at 2:30 A.M.”
Within a matter of months, Franzen’s dad was making so many mistakes and omissions that his wife concluded, correctly, that something was deeply wrong. One example: twice in one week, he had to summon AAA because of dead car batteries. Before long, his wife noted, “I really don’t like the idea of leaving him in the house for more than a short while.” And over the years ahead, as his fate slowly played out without any hope of a medical miracle, even his son’s remarkable writing skills were no help in this gloomy venue; the elder Franzen was left with only impotent words.

Friday, July 7, 2017

Summer reading

Three summers ago, The New Yorker published an essay, “A Place Beyond Words: The Literature of Alzheimer’s.” At that time, just the headline would have made me uncomfortable. Late in 2013 I was informed that I was at an elevated risk for dementia. Dementia? In my early fifties? At that time, the last thing I wanted to read about was dementia.
Now, of course, I am immersed in the topic. And recently I came across a useful survey of fiction about Alzheimer’s. The article, written by Stefan Merrill Block, states: “Because the full experience of Alzheimer’s is an account that fiction alone can deliver, it’s no surprise that the go-to book for caregivers and early-stage sufferers is a novel…. Nearly every novel I’ve read that attempts to depict the internal experience of Alzheimer’s also attempts to fit the disease’s retrogenic symptoms to one sort of sentimental trope: a reckoning with a repressed or unacknowledged truth that must come before acceptance is possible.” (Retrogenesis, loosely speaking, is the theory that in Alzheimer’s and similar diseases, symptoms appear in the reverse order of the normal aging process.)
One novel about Alzheimer’s that Merrill Block commented on was Debra Dean’s prize-winning The Madonnas of Leningrad. The book involves a survivor of the siege of Leningrad, which lasted for more than two horrific years. A novel that I read a few months ago—Not Me, by Michael Lavigne—offers some superficial parallels, with a key character who survived one of the Nazi death camps. But Lavigne is essentially a satirist, writing in an era—Not Me was published in 2005—in which novelists are free to write about history’s most hideous atrocities in a comic vein.
Among the other novels Merrill Block cited are Barbara Kingsolver’s 1990 Animal Dreams, in which Alzheimer’s forces to the surface the memories of a lost grandchild, and Samantha Harvey’s The Wilderness, described as reminiscent of the writing of Virginia Woolf. And I was pleased to see that Merrill Block included in his survey Jonathan Franzen, in my opinion the best living writer of my generation.
Franzen’s 2001 breakout novel, The Corrections, featured an elderly father who attempts suicide by plunging off the deck of a cruise ship. It’s a striking moment, one that has stayed with me. In the novel, the father is described as deeply depressed, and the disease in question is Parkinson’s. But Franzen’s father actually succumbed to Alzheimer’s; Franzen put some distance between his father and his fictional creation.
Next week I’ll discuss Franzen’s nonfiction account of his father’s decline and death.