Friday, September 11, 2015

The aura of football


The doctors talk about the brain as a mystery. What I realized in those sorrowful days is how holy the brain is. It is a temple that houses our fragile selfhood.
            —Steve Almond, author of Against Football: One Fan’s Reluctant Manifesto

Those of you who read my first blog post, “Should I feel cheated?,” may remember my litany of forebears who lived beyond eighty. This led me to ponder which grandparent might have carried the early-onset Alzheimer’s gene. I zeroed in on my paternal grandmother, an immigrant from Dalmatia who died young as a result of a childhood illness that weakened her heart. My dad recalled his mother as a sweet-tempered soul. But she seemed to be the only suspect.
Over the past several days, I have been pursuing another possibility, one that has nothing to do with my family tree and a great deal to do with the games and sports I played in my childhood and youth. My only diagnosed concussion occurred around age six, when I fell off the back of my brother’s bike and was knocked out. I woke up in the back seat of our Buick, en route to the doctor’s office. Could that single incident foreshadow my difficulties almost a half-century later? I doubt it.
But from an early age, my friends and I often played touch football on our school’s asphalt playgrounds. Being incredibly limber at that age, we endured the occasional painful collision with the ground. If there was sufficient snow on our playground – rare in Puget Sound – we tackled each other. In one snow-related mishap, I banged heads with another boy. I felt no pain, but I did notice a curious smell, similar to rubbing alcohol.
By sixth grade I was playing Boys Club football, and three years later I was a wide receiver and a defensive back on the freshman team at my high school.
It’s hard for me to overstate how important football and other sports were to me. I was an underachieving student and an overachieving athlete.
The annual football game between the city’s two high schools drew large crowds. I had been plagued by injuries during my sophomore and junior seasons, but as a senior I was a starter on both offense and defense.
If someone were to ask me the happiest day of my life, I would have a ready answer: September 21, 1979, the night we upset our football rivals by a single point. The short version is that I happened to play the best game of my life when it mattered most.
The other day I experienced an epiphany: My Alzheimer’s may not be hereditary after all. I have been in contact with a former high school teammate who also has been diagnosed with Alzheimer’s. He told me that one of his doctors believes that a serious concussion he sustained as an adult is responsible for the disease.
This led me to reflect on all the head-banging I did as a football player—in practices as well as games. Just before the start of the NFL season in 1978, Darryl Stingley, a receiver for the New England Patriots, sustained one of the most gruesome injuries in the league’s history. Stingley was almost entirely paralyzed. Very quickly, college and high school players were admonished not to lead with their heads when they tackled. But there were other ways in which we teenagers were at risk.
When I was playing football, there was much more “live hitting” (i.e., simulated game conditions) than is permissible now. The notion, it seemed, was that the more we banged heads in practice, the better prepared we would be for the real thing. As an old friend and teammate remarked recently, many of us, particularly offensive and defensive linemen, may have experienced countless “sub-concussions,” too subtle to detect without sophisticated technology.
My teammates and I, as teenagers, understood that concussions were serious matters. But many of us also felt a perverse pride when we hit with our helmets hard enough to see stars. After Stingley’s catastrophe, the directive was to slide our heads to one side when we made a tackle. But many teenage boys – then and, I presume, now – have a hard time acknowledging that they are mortal.
More than a decade later I wrote about this phenomenon in a fictional context:
The coach stood with both hands on the boy’s shoulders, peering through the rectangular gap in his face mask. It was the are-you-still-with-us look. Once, in turnout, the coach had to ask me who my girlfriend was, and I could not remember her name. I recall nothing of that collision, but there were other times when I’d get slowly to my feet and see a celestial shower at the margins of my vision, an aura of ephemeral light.
Keep in mind that this appeared in a work of fiction. But I did think it was cool to see stars, just as many of my friends did.
The weird thing is this: If I could live my life over, there are some things I would do differently. But in the case of that first-day-of-autumn football game in Bellingham, Washington, thirty-six years ago this month, I would want everything to turn out exactly as it did. The sport of football was so exhilarating for me that I would have been miserable had I been told I could not play.
My current mental condition, I concede, is a rather dear price for youthful glory – assuming that the sport is to blame. But let me be clear on one point: If I could do it all over again, I almost certainly would.
Maybe my dementia is more advanced than I thought.

Friday, September 4, 2015

A movie better than the book

Back when I was reviewing novels for a local newspaper, I sometimes offered the backhanded compliment that the book might work better as a movie. The implication was that while the book had a compelling plot, it lacked what I most value in novels: access to the consciousness of fictional characters, something films can manage only by resorting to clumsy techniques such as voice-overs.
But not all films are inferior to the books they are based on. Some – and this is the case with Still Alice, the movie about early-onset Alzheimer’s – improve on their source material. It was helpful, of course, to first read the book, which, within its 292 pages, provides a wealth of information about the disease while creating a compelling central figure – a renowned Harvard linguist in her early fifties, perplexed by her memory issues.
There is an obvious irony here. A woman who is an expert on language is reduced to struggling to remember the names of household items and even of her children. I read the book with much interest, but also some dismay. The author, Lisa Genova, never makes clear that Alice’s is a rare form of Alzheimer’s, one that can wreck a person’s life within two years. The book is divided into long sections, each identifying the month and year. Alice notices her first symptom – she forgets a key word in a lecture she had been giving for years – in September 2003. Three months later she is diagnosed.
She reacts much as I did, much as anyone would: “Time. How much time?”
In Alice’s case, it appears to be not much at all. Her decline is precipitous. Before long she is devising a suicide plan, to be carried out before she becomes too impaired to follow through. By March 2004, just six months after her first symptoms, she is having trouble finding her way when she walks to the Harvard campus, a walk she had been making for years. Three months after that, conversations on the phone “often baffled her.”
The closest Genova comes to explaining why Alice’s disease is progressing so quickly is when she comments, “Although Alzheimer’s tended to progress more quickly in the early-onset versus late-onset form, people with early-onset usually lived with the disease for many more years longer, this disease of the mind residing in relatively young and healthy bodies.” But the implication of this awkward sentence is that the fictional Alice is typical of early-onset Alzheimer’s.
There is a world of difference between being diagnosed with standard early-onset Alzheimer’s, where life expectancy can extend beyond a decade, and the much rarer form depicted in the book and movie.
Like the fictional character, I experienced a downturn in my professional work that baffled me. But in the two years since my symptoms became prominent, my decline has been relatively slow. I continue to read books, magazines, and newspapers as I have always done, though at a somewhat slower pace. My math skills, never great, have taken a hit, but with a calculator at hand, I can balance a checkbook.
I still love to engage in stimulating conversations, though I am more likely to lose my train of thought.
My point is not to congratulate myself. I understand that, barring a medical breakthrough, I, too, will eventually be severely disabled, unless something else kills me first. But the pace of the disease is the crucial distinction. It is one thing to be forgetful, as I am, and to misplace things, as I do. It is quite another thing to leave one’s cellphone in the freezer and not find it until several days later, as Alice does.
It’s understandable, of course, that a novelist would want to limit her novel’s timeframe for dramatic reasons. But in this case, Hollywood exercised the better judgment. The movie does make clear that Alice has a rare, fast-moving form of the disease.
The movie is also superior because of the presence of Julianne Moore. What a magnificent actress. Each stage of decline is depicted plausibly and powerfully.
And this movie’s brief love scene is unlike any I have seen. The mind weakens, fades, but love (and carnality) endure.
I just wish that this all-too-realistic movie had a happier ending. For that, we may be waiting a long time.


I would be remiss if I didn’t acknowledge the role of two people – one a stranger, the other my wife – who have made important contributions to this blog. Max Maclaren provided much-needed technical assistance in getting the blog to look the way I wanted it to. Paula Woolley, proofreader extraordinaire, is an expert at catching my mistakes.