Tuesday, November 17, 2009

The Two Towers: Trump

Rising out of the Great American Desert, two opposing metropolises: One known, rather affectionately, as Sin City, a place where indiscretions of all shapes and sizes are winked at, encouraged even, a sort of moral dead zone by civic decree. The other, shining with the false light of sanctimony, aggressively wholesome, anodyne, a seat of secretive religious power. Two cities, and two obverse faces of the American experiment. Out of nothing—wilderness, rock, sand—something. Out of a vast freedom, radically different visions of America and its people. A big tent in the desert, enclosing devoutness and dissolution. And in between, 424 miles of highway and an indescribably strange and beautiful landscape, defying belief.

The Great Basin, a vast region of internal drainage, covers more than 200,000 square miles of the western United States, encompassing parts of Oregon, California, Idaho, Utah, and more than half of Nevada. Its waters reach no ocean; they terminate in sinks—the Great Salt Lake, Pyramid Lake, the Humboldt Sink. Las Vegas, strictly speaking, lies beyond the Basin’s southern rim, in the watershed of the Colorado; hydrologically, it is no sink at all. In demographic terms, however, Las Vegas is the Great Sink into which all of the dregs of humanity drain. A quick survey of the Strip is instructive. Who does one encounter? The eternal frat boy, aged 17-60, staggering boozily between topless establishments, the front of his jeans soiled with the residue of a half-dozen lap dances; the graying barfly, heavily made up, hemline foreshortened, eyes sharpened to descry the felicitous intersection of money and undiscriminating lust; the sidewalk hustlers, always male, usually brown-skinned, passing hand-to-hand pocket-sized color advertisements for the city’s fleshpots, naked women staring with faux-seduction from thin, glossy cardboard (most end up underfoot); youngish ingénues of a calculating sort, teetering painfully in enormous heels and skirts just long enough to conceal their thong underwear when standing upright.

Everywhere falsity and vanity: hulking, steroid-fed men, arms, shoulders, and chests so thick with muscle that movement was constrained; women with impossibly buoyant breasts, displayed so as to invite maximum leering; skin tanned the unnatural copper of an old penny clutched in too many hands; garish outfits, especially among women, that defied description or logic (in an episode of The Simpsons, a marquee outside a boxing match admonishes: “Tasteful attire prohibited.” This dictum applies to the whole of Vegas). And so the Strip: everyone, everywhere, crying in one voice: “Look at me! Desire me!” Never was there a place so transparently superficial; beyond the neon travesty (Great Pyramid, Eiffel Tower, Italian villa), the casino, the club, and the stage, lies the uniform vacancy of the hotel room, where at last one must stare at the mirror and the hollowness within.

But Las Vegas was not without pathos. Once I overcame my initial repugnance (overcame perhaps isn’t the right word—set aside would be more apt), if not my feeling of superiority (yes, it was difficult to suppress), I began to feel a distinct pity for those around me, awash in the spurious glamour of the city. I pitied most of all the permanent or semi-permanent residents of the Strip, those cogs in a great machine of sleaze and degradation, rhinestones and chintz—the cocktail waitresses, bartenders, off-hours showgirls, sidewalk promoters, croupiers, card dealers, and the rest, even those invisible—for example, the janitorial staff who improbably ensured that I saw not a single pile of vomit, indoors or out, during my entire night on the town. Most, undoubtedly, had dreams, weren’t content with their lot, but were nonetheless bound as servants to a depraved and demoralizing master and his thousands of insolent, ill-mannered houseguests. And for that—if only that—they deserved pity.

Then the tourists: the well-behaved majority, who drank and gambled in great quantities and wanted nothing more than to drink and gamble in great quantities. It took some profound fathoming on my part to appreciate the character of these quiet thousands; I failed to touch bottom. Ultimately, I was incapable of seeing the world through their eyes—to me, Vegas could be nothing but falsehood and degradation, a place of exceedingly easy, manufactured pleasures. That it could be seen as an escape, a delight, that it could be anticipated, pined for, returned to, again and again, as to an old lover—this I found impossible to credit, and abhorrent. The world offered so much to the spirit: discovery and endless opportunity for exultation in beauty; Vegas was a negation of this. It wallowed in its ugliness and tackiness; it reveled in its invitation to debauch. Millions chose to pen themselves up in its hotels and casinos, small, artificial ecosystems of mean diversion, and this was a joy to them, to close themselves off and to forget everything save for personal indulgence of the cheapest sort. Empathy failed me; it saddened me beyond words.

Our plan, mercifully, demanded a quick exit. After a single night, we were on the road. My travel companions and I had vowed not to discuss the city until we were beyond its limits. The disgust erupted in torrents. We all agreed that we could not have borne another day spiraling in that Great Sink. The desert awaited.

Wednesday, October 28, 2009

An Argument in Favor of the Public Option

This hard truth—that to provide for universal coverage, health care costs have to be reduced—must be reflected in any serious proposal for system-wide reform. But the Baucus bill fails on this count as well. It proposes a series of half-measures to rein in expense, ignoring the systemic basis for a perpetual spiral. Partly innocently, partly not, the U.S. has established perfect-storm conditions for health care costs. First, and most fundamentally, both insurance and care are largely for-profit animals here—and when profits are the goal, prices naturally ascend to the greatest heights permitted by the market. Second, the health insurance industry is poorly regulated and opaque, as well as grossly uncompetitive (how it managed to wangle a federal anti-trust exemption in the ’40s and maintain it to the present day is some sort of marvel of influence peddling, regulatory and legislative myopia, or both). Finally, the prevalence of a perverse system in which doctors’ compensation is directly tied to the amount and cost of care provided—number of tests and procedures performed, follow-ups prescribed, etc.—means that American doctors have an undeniable financial incentive to prescribe the most expensive care possible. That isn’t to say that all, or even most, doctors consciously overprescribe in order to line their pockets; however, in the currently configured system, the justification for “just in case” care (e.g., an MRI “just in case” the kid who took a spill on his bike has a brain injury, though he presents no symptoms) is built in.

The Senate Finance Committee bill attempts to address only the second of these three issues, and unsurprisingly so—in spite of the fierceness and financial heft of the insurance industry lobby, it is the easiest to tackle. But it is also the least likely to reduce costs in the long term. The creation of insurance “exchanges” at the state level, the bill’s most prominent thrust at an uncompetitive industry, may very well decrease the cost of insurance plans in the short term (and repeal of the industry’s anti-trust exemption, which is currently being bruited, would work toward that end as well), but it does nothing to address the more fundamental sources of ever-buoyant costs in the sector. Changing the system of reimbursement for doctors by divorcing pay from quantity of care is an absolute necessity, but any effort to do this will be a slog, resisted tooth and nail by the American Medical Association and other interest groups. Encouragingly, models for a better, more rational system of compensation and care already exist in this country (Is there anything inherently strange about a salaried doctor?). Unfortunately, Congress has not yet reached a point where it is willing to acknowledge the importance of the fundamental link between cost and compensatory systems—it’s scarcely come up in the current debate—and it will likely be several more years before it does so.

This leaves Congress to tackle the sector’s profit-orientation, which is an absolute political non-starter. Yet this is precisely where the public option is an essential tool, and why leaving it out of any reform bill may very well doom us to decades more of cost explosion and the continuing shame of millions of uninsured Americans. The primary argument for a public option among Democrats—that it will create competition for the insurance industry and thereby drive down premiums for everyone—is nearly as disingenuous as all of the invocations of Mother Russia by Republicans. If a government-run insurance plan reimburses doctors and hospitals at Medicare-like rates, as it should, the insurance industry simply won’t be able to compete. This runs profoundly counter to our national free market ethos, but far from being the cataclysm that the right wing suggests it would be, the creation of a public health insurance option for all Americans is a first step toward a necessary realignment in our thinking. A public option would reveal that the government can ensure its citizens’ right to health care less expensively and more efficiently than the private sector, without a decline in the quality of care provided. It would extend coverage to all Americans regardless of their financial standing and medical history. And it could break the back of the perfidious insurance industry, paving the way for a future in which all Americans are insured either through the government or non-profit health care cooperatives.

In the early ’60s, prominent conservatives such as Barry Goldwater and Ronald Reagan condemned proposed Medicare legislation as “socialized medicine.” Sound familiar? Now conservatives defend Medicare with a vehemence more familiarly employed in Second Amendment debates. And why? Medicare proved that government-run health care worked best for the elderly, just as Medicaid proved that it worked best for the indigent. And including a public option in current legislation could offer the same proof regarding health care for all, or nearly all, Americans. News came last weekend that Senate Majority Leader Harry Reid would push anew for the inclusion of a public option in the Senate bill (albeit with states able to opt out—that is, bar their citizens from participating). If Reid is successful, the ensuing legislative battle will be epically pitched. Here’s hoping that Congress does the right thing this time—the welfare of millions of Americans, and that of their pocketbooks, depends upon it.

Saturday, October 24, 2009

The Baucus Bill: Simply Not Good Enough

An examination of the health care issue here is long overdue (okay, everything here is long overdue), and I confess that I began a post on the subject at least two months ago—a point in time when it appeared that disingenuous Republican outrage—“socialism,” “death panels,” “rationing,” and the like—threatened to derail passage of major reform legislation altogether. In the meantime, much of the negative momentum generated by that overheated rhetoric and undercooked reasoning has been halted by the feverish efforts (imperfect as they may be) of the Max Baucus-led Senate Finance Committee to craft a compromise bill, as well as Obama’s own willingness to stand up and be counted on the matter (in his characteristically pragmatic, noncommittal way). Now it appears probable that legislation will pass at some point in the months ahead, but absent a new government insurance plan paralleling Medicare and Medicaid (the so-called public option) or any measure that will significantly alter the manner in which insurance companies, hospitals, and doctors do business. At what point does compromise legislation become compromised?

The Senate bill, which is much closer to the bill that, given prevailing currents, will actually come before Obama for signature, is, according to the Congressional Budget Office (CBO), expected to expand coverage to nearly 30 million people primarily through direct government financial assistance to qualified individuals and families—those who aren’t eligible for an expanded Medicaid program but whose household income falls short of a predetermined threshold. Thirty million is nothing to sniff at—it represents a significant expansion of coverage, by most measures. And yet the shortfall is galling: the CBO estimates that under the Senate plan, 25 million would still be without health insurance in 2019, one third of whom would be unauthorized immigrants. Ultimately, 6% of Americans would be without coverage in ten years’ time, compared to 17% today—a vast improvement—and yet it simply isn’t good enough.

Health care, it seems evident, should be recognized as a human right (and if you disagree with me on this, best not to read further—we’ve got no grounds on which to debate). Every human being—including prisoners, terrorists, torturers, and Yankee fans—has the right to see a doctor and to be treated for illness, accident, and injury. Thus, every human being should have access to essential health care (including primary care) regardless of his or her ability to pay for it. But who guarantees that access? Who guarantees human rights? Certainly not markets, or the private sector. Clearly, it is governments who do this; it is an essential part of the social contract between states and their citizens. However, the way in which governments choose to guarantee those rights is more of an open question—in the case of health care, on the continuum between universal government-financed care and a system in which the private sector insures every citizen (a dream more fanciful than any Dan Brown novel), the possible proportional ratios of public to private coverage, and plans to achieve universal coverage, are virtually infinite.

The current American system, in which most Americans are covered by private insurers through employer-provided health plans and the government picks up significant slack for seniors (Medicare) and the indigent (Medicaid), is both anomaly and quirk. I am more interested in the former than the latter—unraveling how our system came to take the shape that it has would be instructive, but it is a diversion from my purpose. Forthwith the anomaly, oft repeated, but with good reason: The U.S. is the only “Western,” developed nation in which more than a negligible proportion of the population goes without health insurance. Forty-five million Americans—45 million!—more than the combined populations of Portugal, Senegal, Bolivia, Sweden, and Mongolia—are a single untimely accident or illness from financial ruin. Among a surfeit of national shames—the death penalty, Guantanamo Bay, inner-city public education, Wall Street cupidity, Transformers: Revenge of the Fallen—a health care system that fails so many must rank near the top. The manifest dysfunction of the system is neatly encapsulated by a single fraction—one-sixth—the proportion of Americans without insurance, and the proportion of the nation’s economy given over to health care. Cost is the other head of this health care hydra. Without reducing costs, the effort to insure all Americans—a moral imperative—becomes hopeless.

Sunday, July 19, 2009

Irrelevance

How does one deal with the fact of one’s irrelevance? Because with a broad enough view (and we needn’t get too broad here), face facts, we’re all deeply, irredeemably irrelevant. The greatest of the great men, those who have conspicuously altered the very currents of human history (think Alexander, Caesar, Steve Jobs), have no more importance, in historic, geologic, or astronomical terms, than you or me. The human race is a flukish arriviste by the earth’s reckoning, grossly fumbling its instant of fame. Human exceptionalism—What makes us different? Why are we better than the animals?—is a tragic canard. We’re dumb as dinosaurs and probably won’t be around nearly as long. The crocodile (200 million years old) and the shark (400 million years) rightly flaunt toothy grins; they know longevity. But even their longevity is a trifling, hapless, self-puffed sort; in earth-time they’re hardly more relevant than us (“You may have survived several major extinction events,” I shout at Jaws, “but we—haha!—we are changing the entire climate of this planet!”).

The bald fact of our irrelevance is sobering, or it should be. How do we reconcile the knowledge of our irrelevance with the fact that we are, that we do exist? For why exist at all, if we have no purpose, no greater relevance? Many reconcile these feelings by means of religion, no doubt. God gives life meaning, or the afterlife meaning, or perhaps life has meaning because of the existence of the afterlife. God does work, heavy lifting, to allay all manner of fears, of which existential irrelevance and meaninglessness must be close to the top. But for those who don’t believe—whether in God or some other divine order—irrelevance is a terrifying realization. What of the secularist, bedfellow of science, rationality, logic, empiricism, who proceeds to the logical terminus of the five-century arc of scientific inquiry? He is confronted with: 1. No God; 2. A meager lifespan on a planet 4.5 billion years old; and 3. That planet, a stripling itself, sailing through a universe incomprehensibly large and around 14 billion years old. Gulp. You, Mr. Secularist, are an instantaneous mote, a speck, a jot on the face of a single grain of sand in the sprawling Sahara of the comprehensible universe. That is a cold and endless desert.

And for those staring out into the desert, those willing to contemplate its vastness, its aridity, its terrific impassivity, whence comes succor? What comforts? What countervailing knowledge gives shape and meaning to the hours? Because, it seems, if one is perfectly honest with oneself, nothing shy of messianic delusion can offer the sparest hint that a single life, a billion lives, a billion lives for a billion years can be of consequence in this universe. Can it? Face facts, I wrote. Face them, truly face them, and despair, it seems, is all that is left to us. The final legacy of science, knowledge, and curiosity, then, is despair. In this world, ignorance must be an essential shield to deflect the eviscerating thrust of reality; thus, the first question, the most fundamental question facing the individual, is not whether to live or to die (to be or not to be?), but whether to choose ignorance of the bedrock truth of one’s existence (you are ineluctably irrelevant) over the inevitable despair deriving from the acceptance of that truth. Is it really so bleak as all of that?

Tuesday, April 21, 2009

Hunger

I’ve a film to recommend. It’s called Hunger, and it is the first cinematic feature from a British visual and video installation artist named Steve McQueen. (Ah, the irony of such a name when attached to such a film!) Hunger examines the existential condition of IRA prisoners in a Northern Irish prison in 1981. The prisoners, seeking political status, are in the midst of a no wash, no clothing strike; their prison behavior, not unlike their lives on the outside, is governed by a single principle: absolute defiance of authority. McQueen retains the consecration of image and visual composition that befits an artist, while largely jettisoning the experimentation at the heart of video installation. The result is a haunting, beautiful, and deeply affecting portrait of brutality and disobedience that both ennobles and critiques the spirit of men who elevated resistance and intransigence to the level of religious observance.

The film is replete with austere and revelatory imagery: a guard canted against an exterior prison wall, smoking a cigarette while snow swirls about, his face betraying the moral ambivalence he feels toward his work; an inmate anchored in the wan light of a broken window, languidly fretting a fly with his index finger, desperate for contact beyond the confines of his cell; the same guard’s hands, livid and swollen from the beatings he has administered, plunged into a sinkful of hot water, a thin cumulus of red rising in the water as he exhales with pain.

McQueen creates an aesthetic of discomfort and degradation that registers every instance of suffering and indignity in the prison as both affront to conscience and celebration of endurance. The inmates, through their capacity for suffering and refusal to be demoralized, hope to convince prison officials to grant them de facto political status (only Margaret Thatcher held that authority in the real world), thereby sanctioning the legitimacy of the IRA’s violent campaign. McQueen, perhaps, hoped to have a similar effect on viewers, forcing degradation upon them so as to inspire outrage—few films seek so plainly to upset their audiences. McQueen’s camera flinches at nothing, and Hunger walks a fine edge between the sort of strict realism where every detail, no matter how gruesome, must be documented and a fetishization of the grotesque and repellent.

At some point in the film, however, one begins to question the actions and motivations of the prisoners. Their defiance, so ingrained, becomes a sort of pathology, and the fundamental dignity of their resistance is degraded by an anarchical spirit. They seem to forfeit some measure of their humanity: smearing excrement on the walls, bestially reveling in squalor and filth and nakedness, and resorting to a reflexive violence in every encounter with prison staff. Did the prison reduce them to this state, or does the absolute rejection of human authority demand reversion to an animal consciousness?

If the prisoners’ defiance becomes a sort of disease that gradually rots their center, then Bobby Sands is that center, an apotheosis of the pathology that afflicts them. He is their leader: when prison officials agree to permit the IRA prisoners to wear civilian clothes, it is Sands who first bellows disapproval when the clothes they are handed prove to be a garishly colored and patterned mockery. His bellow ignites an orgy of destruction in the prisoners’ new, bargained-for, clean quarters; they trash the cells in a scene of kinetic fury reminiscent of Samuel Fuller’s Shock Corridor, another film about the violence of institutions. The prisoners return to nakedness and squalor.

Sands, frustrated with the ineffectiveness of other methods, calls the hunger strike that inspired the film’s title, and is its first casualty; the strike ultimately claimed the lives of nine other prisoners before its termination. He lasts sixty-six days before succumbing; in the last third of the movie, McQueen dutifully documents the excruciating decline of his corpus. Michael Fassbender, the actor portraying Sands, required both a doctor and a dietician to monitor his condition as he attempted to physically reproduce, onscreen, the effect of starvation on the human body. We watch Fassbender, as Sands, waste to a sunken, semi-conscious shadow. His breathing is labored, his backside mottled with oozing bedsores. A doctor rubs ointment into his sores, and he convulses with the pain. We convulse too, understanding the pain to be real.

Some critics have suggested that the film’s treatment of Sands is a sort of deification or canonization; in reality, McQueen casts a sober eye on his conduct. He is both sympathetic to and skeptical of Sands’ decision; it is heroic in its courage and infuriating in its obstinacy and selfishness. Sands’ parents are given quarters in the prison hospital in his last days. In the end they watch him die, baffled and devastated. Ultimately, Sands’ death proved politically futile, his grand statement doomed from the start. Though the deaths of the ten prisoners inspired international outrage, their compatriots were never granted political status, and few in the world needed further evidence that the British were bastards. It took seventeen more years to reach an agreement to end “The Troubles,” and longer still to implement it. Sands, who demanded a united Ireland, would not have supported the agreement, which left the North in British hands.

Is a man blindly devoted to principle, willing to die for that principle, a hero or a fool or worse? It is a question Hunger dramatizes poignantly: Are Sands and his fellow IRA inmates paragons of resolution and devotion, or dangerously deluded in their steadfastness? Their spirit and commitment to their tactics in the face of unspeakable brutality and repression are undeniably admirable and even inspiring; one wonders how their Protestant keepers can hold out any hope of ever subduing a movement so irrepressible and unshakably centered in its conviction. We are accustomed to the gilding of our cinematic men of principle; however, through Hunger we see the man of principle darkly. Sands didn’t understand or appreciate that political battles aren’t won through force of will, by single acts, no matter how impressive. They are won, rather, by compromise, and Sands was constitutionally incapable of acknowledging this. Gandhi, no stranger to the protracted fast, understood it, and he got an independent India, albeit a compromised one. Sands, courageous to the last, got streets named in his honor, but Northern Ireland is still part of the UK.

Tuesday, March 17, 2009

A Requiem for Madness: First Movement

March Madness, how I've loved you—ecstatically, faithfully, selflessly. When little in life could bestir a solemn boy's heart, you never failed to quicken my pulse with the first electric whisper of your siren song: I proclaimed, in great earnest, Selection Sunday my favorite day of the year. Tip-off of that first game in the early sweaty minutes of Thursday afternoon offered both glorious release and heavy promise—March Madness, like the tulips, had returned. And that first portentous flip of the ball presaged a later toss, when only two teams would remain to vie for the grandest of titles, on the grandest of stages, for the undying esteem of a nation of transfixed boys and girls, aged 1-92.

Earliest memories: 1987, I am seven and for the first time in my young life have assumed the athletic loyalties of my father. I have become, indivisibly, an Indiana University basketball fan. They are luminous this season, led by Steve Alford, their All-American guard. Entering the tournament as a number one seed, they play Duke in the Sweet Sixteen. We watch at home. In the first half, angered by a referee's call, my dad throws an old, blue-bound Oxford dictionary at the television.

Two days later, having vanquished Duke, IU is matched against an inferior LSU team, a berth to the Final Four in New Orleans at stake. Indiana plays poorly and trails by a significant margin late in the second half. It appears they will lose. My sister and I decide to escape to the outdoors to avoid the suffocating pall that settles over our family room. Some minutes later, we attempt to return and are driven back at the door by strident voices: Get out! Get out! We obey. Our mother joins us in the garage after a time, brimming with excitement. Indiana has won! With the team having embarked upon a stunning comeback at our departure (later, legendary Indiana coach Bobby Knight says that with five minutes remaining he worried that the game was lost: "Then I looked down the floor and saw Dale Brown [LSU's coach], and I knew, well, we had a chance."), our parents superstitiously, and to great effect, barred us from the premises.

The following Monday, Indiana plays Syracuse for the national title. On spring break, the Gaff family is visiting an aunt in St. Paul, Minnesota (sunshine little succored us). We watch the game from our Red Roof Inn room. I spend many minutes of the second half in the bathroom, door closed, too nervous to watch. With five seconds left in the game, Indiana guard Keith Smart hits a fade-away jumper from the left baseline to give the Hoosiers a one-point victory (I emerge in time to witness this splendid, improbable wonder). Pandemonium! The kids leap vertiginously on the disheveled hotel beds. Go ahead, our parents say. Just this once, go ahead.

March Madness, it was love at first acquaintance.

What ever happened to us?