Saturday, September 09, 2006

AlterNet Special



Imagine the Twin Towers Hadn't Fallen on 9/11


By Tom Engelhardt, The Nation and TomDispatch.com
Posted on September 9, 2006

We knew it was coming. Not, as conspiracy theorists imagine, just a few top officials among us, but all of us - and not for weeks or months, but for more than half a century before September 11, 2001.

That's why, for all the shock, it was, in a sense, so familiar. Americans were already imagining versions of September 11 soon after the dropping of the first atomic bomb on Hiroshima on August 6, 1945. That event set the American imagination boiling. Within weeks of the destruction of Hiroshima and Nagasaki, as scholar Paul Boyer has shown, all the familiar signs of nuclear fear were already in place - newspapers were drawing concentric circles of atomic destruction outward from fantasy Ground Zeroes in American cities, and magazines were offering visions of our country as a vaporized wasteland, while imagining millions of American dead.

And then, suddenly, one clear morning it seemed to arrive - by air, complete with images of the destruction of the mightiest monuments to our power, and (just as previously experienced) as an onscreen spectacle. At one point that day, it could be viewed on more than thirty channels, including some never previously involved with breaking news, and most of the country was watching.

Only relatively small numbers of New Yorkers actually experienced 9/11: those at the tip of Manhattan or close enough to watch the two planes smash into the World Trade Center towers, to watch (as some schoolchildren did) people leaping or falling from the upper floors of those buildings, to be enveloped in the vast cloud of smoke and ash, in the tens of thousands of pulverized computers and copying machines, the asbestos and flesh and plane, the shredded remains of millions of sheets of paper, of financial and office life as we know it. For most Americans, even those like me who were living in Manhattan, 9/11 arrived on the television screen. This is why what leapt to mind - and instantaneously filled our papers and TV reporting - was previous screen life, the movies.

In the immediate aftermath of the attacks, the news was peppered with comments about, thoughts about, and references to films. Reporters, as Caryn James wrote in the New York Times that first day, "compared the events to Hollywood action movies"; as did op-ed writers ("The scenes exceeded the worst of Hollywood's disaster movies"); columnists ("On TV, two national landmarks ... look like the aftermath in the film Independence Day"); and eyewitnesses ("It was like one of them Godzilla movies"; "And then I saw an explosion straight out of The Towering Inferno"). Meanwhile, in an irony of the moment, Hollywood scrambled to excise from upcoming big- and small-screen life anything that might bring to mind thoughts of 9/11, including, in the case of Fox, promotion for the premiere episode of 24, in which "a terrorist blows up an airplane." (Talk about missing the point!)

In our guts, we had always known it was coming. Like any errant offspring, Little Boy and Fat Man, those two atomic packages with which we had paid them back for Pearl Harbor, were destined to return home someday. No wonder the single, omnipresent historical reference in the media in the wake of the attacks was Pearl Harbor or, as screaming headlines had it, INFAMY, or A NEW DAY OF INFAMY. We had just experienced "the Pearl Harbor of the 21st Century," or, as R. James Woolsey, former CIA director (and neocon), said in the Washington Post that first day, "It is clear now, as it was on December 7, 1941, that the United States is at war. ... The question is: with whom?"

The Day After

No wonder that what came instantly to mind was a nuclear event. No wonder, according to a New York Times piece, Tom Brokaw, then chairing NBC's nonstop news coverage, "may have captured it best when he looked at videotape of people on a street, everything and everyone so covered with ash ... [and said] it looked 'like a nuclear winter in lower Manhattan.'" No wonder the Tennessean and the Topeka Capital-Journal both used the headline "The Day After," lifted from a famous 1983 TV movie about nuclear Armageddon.

No wonder the area where the two towers fell was quickly dubbed "Ground Zero," a term previously reserved for the spot where an atomic explosion had occurred. On September 12, for example, the Los Angeles Times published a full-page series of illustrations of the attacks on the towers headlined: "Ground Zero." By week's end, it had become the only name for "the collapse site," as in a September 18 New York Times headline, "Many Come to Bear Witness at Ground Zero."

No wonder the events seemed so strangely familiar. We had been living with the possible return of our most powerful weaponry via TV and the movies, novels and our own dream-life, in the past, the future, and even - thanks to a John F. Kennedy TV appearance on October 22, 1962, during the Cuban Missile crisis to tell us that our world might end tomorrow - in something like the almost-present.

So many streams of popular culture had fed into this. So many "previews" had been offered. Everywhere in those decades, you could see yourself or your compatriots or the enemy "Hiroshimated" (as Variety termed it back in 1947). Even when Arnold Schwarzenegger wasn't kissing Jamie Lee Curtis in True Lies as an atomic explosion went off somewhere in the Florida Keys or a playground filled with American kids wasn't being atomically blistered in Terminator 2: Judgment Day, even when it wasn't literally nuclear, that apocalyptic sense of destruction lingered as the train, bus, blimp, explosively armed, headed for us in our unknowing innocence; as the towering inferno, airport, city, White House was blasted away, as we were offered Pompeii-scapes of futuristic destruction in what would, post-9/11, come to be known as "the homeland."

Sometimes it came from outer space armed with strange city-blasting rays; other times irradiated monsters rose from the depths to stomp our cities (in the 1998 remake of Godzilla, New York City, no less). After Star Wars' Darth Vader used his Death Star to pulverize a whole planet in 1977, planets were regularly nuclearized in Saturday-morning TV cartoons. In our imaginations, post-1945, we were always at planetary Ground Zero.

Dystopian Serendipity

Increasingly, from Hamburg to Saudi Arabia to Afghanistan, others were also watching our spectaculars, our catastrophes, our previews; and so, as Hollywood historian Neal Gabler would write in the New York Times only days after 9/11, they were ready to deliver what we had long dreamed of with the kind of timing - ensuring, for instance, that the second plane arrived "at a decent interval" after the first so that the cameras could be in place - and in a visual language American viewers would understand.

But here's the catch: What came, when it came, on September 11, 2001, wasn't what we thought came. There was no Ground Zero, because there was nothing faintly atomic about the attacks. It wasn't the apocalypse at all. Except in its success, it hardly differed from the 1993 attack on the World Trade Center, the one that almost toppled one tower with a rented Ryder van and a homemade bomb.

OK, the truck of 1993 had sprouted wings and gained all the power in those almost full, transcontinental jet fuel tanks, but otherwise what "changed everything," as the phrase would soon go, was a bit of dystopian serendipity for Al Qaeda: Nineteen men of much conviction and middling skills, armed with exceedingly low-tech weaponry and two hijacked jets, managed to create an apocalyptic look that, in another context, would have made the special-effects masters of Lucas's Industrial Light & Magic proud. And from that - and the Bush administration's reaction to it - everything else would follow.

The tiny band of fanatics who planned September 11 essentially lucked out. If the testimony, under CIA interrogation techniques, of Al Qaeda's master planner Khalid Shaikh Mohammed is to be believed, what happened stunned even him. ("According to the [CIA] summary, he said he 'had no idea that the damage of the first attack would be as catastrophic as it was.'") Those two mighty towers came crumbling down in that vast, roiling, near-mushroom cloud of white smoke before the cameras in the fashion of the ultimate Hollywood action film (imagery multiplied in its traumatizing power by thousands of replays over a record-setting more than ninety straight hours of TV coverage). And that imagery fit perfectly the secret expectations of Americans - just as it fit the needs of both Al Qaeda and the Bush administration.

That's undoubtedly why other parts of the story of that moment faded from sight. On the fifth anniversary of September 11, there will, for instance, be no memorial documentaries focusing on American Flight 77, which plowed into the Pentagon. That destructive but non-apocalyptic-looking attack didn't satisfy the same built-in expectations. Though the term "ground zero Washington" initially floated through the media ether, it never stuck.

Similarly, the unsolved anthrax murders-by-mail of almost the same moment, which caused a collective shudder of horror, are now forgotten. (According to a LexisNexis search, between October 4 and December 4, 2001, 260 stories appeared in the New York Times and 246 in the Washington Post with "anthrax" in the headline. That's the news equivalent of a high-pitched scream of horror.) Those envelopes, spilling highly refined anthrax powder and containing letters dated "9/11/01" with lines like "Death to America, Death to Israel, Allah Is Great," represented the only use of a weapon of mass destruction in this period; yet they were slowly eradicated from our collective (and media) memory once it became clearer that the perpetrators were probably homegrown killers, possibly out of the very cold war U.S. weapons labs that produced so much WMD in the first place. It's a guarantee that the media will not be filled with memory pieces to the anthrax victims this October.

The 36-Hour War

Indulge me, then, for a moment on an otherwise grim subject. I've always been a fan of what-if history and, when younger, of science fiction. Recently, I decided to take my own modest time machine back to September 11, 2001; or, to be more exact, the IRT subway on several overheated July afternoons to one of the cultural glories of my city, the New York Public Library, a building that - in the realm where sci-fi and what-if history meld - suffered its own monstrous "damage," its own 9/11, only months after the A-bombing of Hiroshima.

In November 1945, Life magazine published "The 36-Hour War," an overheated what-if tale in which an unnamed enemy in "equatorial Africa" launched a surprise atomic missile attack on the United States, resulting in 10 million deaths. A dramatic illustration accompanying the piece showed the library's two pockmarked stone lions still standing, guarding a ground-zero scene of almost total destruction, while heavily shielded technicians tested "the rubble of the shattered city for radioactivity."

I passed those same majestic lions, still standing (as was the library) in 2006, entered the microfiche room and began reading the New York Times as well as several other newspapers starting with the September 12, 2001, issues. Immediately I was plunged back into a hellish apocalypse. Vivid Times words and phrases from that first day: "gates of hell," "the unthinkable," "nightmare world of Hieronymus Bosch," "hellish storm of ash, glass, smoke, and leaping victims," "clamorous inferno," "an ashen shell of itself, all but a Pompeii." But one of the most common words over those days in the Times and elsewhere was "vulnerable" (or as a Times piece put it, "nowhere was safe"). The front page of the Chicago Tribune caught this mood in a headline, "Feeling of Invincibility Suddenly Shattered," and a lead sentence, "On Tuesday, America the invincible became America the vulnerable." We had faced "the kamikazes of the 21st century" - a Pearl Harborish phrase that would gain traction - and we had lost.

A thought came to mind as I slowly rolled those grainy microfiches; as I passed the photo of a man, in midair, falling headfirst from a WTC tower; as I read this observation from a Pearl Harbor survivor interviewed by the Tribune: "Things will never be the same again in this country"; as I reeled section by section, day by day toward our distinctly changed present; as I read all those words that boiled up like a linguistic storm around the photos of those hideous white clouds; as I considered all the op-eds and columns filled with all those instant opinions that poured into the pages of our papers before there was even time to think; as I noticed, buried in their pages, a raft of words and phrases - "preempt," "a new Department of Pre-emption [at the Pentagon]," "homeland defenses," "homeland security agency" - already lurking in our world, readying themselves to be noticed.

Among them all, the word that surfaced fastest on the heels of that "new Day of Infamy," and to deadliest effect, was "war." Senator John McCain, among many others, labeled the attacks "an act of war" on the spot, just as Republican Senator Richard Shelby insisted that "this is total war," just as the Washington Post's columnist Charles Krauthammer started his first editorial that first day, "This is not crime. This is war." And they quickly found themselves in a milling crowd of potential war-makers, Democrats as well as Republicans, liberals as well as conservatives, even if the enemy remained as yet obscure.

On the night of September 11 the President himself, addressing the nation, already spoke of winning "the war against terrorism." By day two, he used the phrase "acts of war"; by day three, "the first war of the twenty-first century" (while the Times reported "a drumbeat for war" on television); by week's end, "the long war"; and the following week, in an address to a joint session of Congress, while announcing the creation of a Cabinet-level Office of Homeland Security, he wielded "war" twelve times. ("Our war on terror begins with Al Qaeda, but it does not end there.")

What If?

So here was my what-if thought. What if the two hijacked planes, American Flight 11 and United 175, had plunged into those north and south towers at 8:46 and 9:03, killing all aboard, causing extensive damage and significant death tolls, but neither tower had come down? What if, as a Tribune columnist called it, photogenic "scenes of apocalypse" had not been produced? What if, despite two gaping holes and the smoke and flames pouring out of the towers, the imagery had been closer to that of 1993? What if there had been no giant cloud of destruction capable of bringing to mind the look of "the day after," no images of crumbling towers worthy of Independence Day?

We would surely have had blazing headlines, but would they have commonly had "war" or "infamy" in them, as if we had been attacked by another state? Would the last superpower have gone from "invincible" to "vulnerable" in a split second? Would our newspapers instantly have been writing "before" and "after" editorials, or insisting that this moment was the ultimate "test" of George W. Bush's until-then languishing presidency? Would we instantaneously have been considering taking what CIA Director George Tenet would soon call "the shackles" off our intelligence agencies and the military? Would we have been reconsidering, as Florida's Democratic Senator Bob Graham suggested that first day, rescinding the Congressional ban on the assassination of foreign officials and heads of state? Would a Washington Post journalist have been trying within hours to name the kind of "war" we were in? (He provisionally labeled it "the Gray War.") Would New York Times columnist Tom Friedman on the third day have had us deep into "World War III"? Would the Times have been headlining and quoting Deputy Defense Secretary Paul Wolfowitz on its front page on September 14, insisting that "it's not simply a matter of capturing people and holding them accountable, but removing the sanctuaries, removing the support systems, ending states who sponsor terrorism." (The Times editorial writers certainly noticed that ominous "s" on "states" and wrote the next day: "but we trust [Wolfowitz] does not have in mind invading Iraq, Iran, Syria and Sudan as well as Afghanistan.")

Would state-to-state "war" and "acts of terror" have been so quickly conjoined in the media as a "war on terror" and would that phrase have made it, in just over a week, into a major presidential address? Could the Los Angeles Daily News have produced the following four-day series of screaming headlines, beating even the President to the punch: Terror/Horror!/"This Is War"/War on Terror?

If it all hadn't seemed so familiar, wouldn't we have noticed what was actually new in the attacks of September 11? Wouldn't more people have been as puzzled as, according to Ron Suskind in his new book The One Percent Doctrine, was one reporter who asked White House press secretary Ari Fleischer, "You don't declare war against an individual, surely"? Wouldn't Congress have balked at passing, three days later, an almost totally open-ended resolution granting the President the right to use force not against one nation (Afghanistan) but against "nations," plural and unnamed?

And how well would the Bush administration's fear-inspired nuclear agenda have worked, if those buildings hadn't come down? Would Saddam's supposed nuclear program and WMD stores have had the same impact? Would the endless linking of the Iraqi dictator, Al Qaeda, and 9/11 have penetrated so deeply that, in 2006, half of all Americans, according to a Harris Poll, still believed Saddam had WMD when the U.S. invasion began, and 85% of American troops stationed in Iraq, according to a Zogby poll, believed the US mission there was mainly "to retaliate for Saddam's role in the 9-11 attacks"?

Without that apocalyptic 9/11 imagery, would those fantasy Iraqi mushroom clouds pictured by administration officials rising over American cities or those fantasy Iraqi unmanned aerial vehicles capable of spraying our East Coast with chemical or biological weapons, or Saddam's supposed search for African yellowcake (or even, today, the Iranian "bomb" that won't exist for perhaps another decade, if at all) have so dominated American consciousness?

Would Osama bin Laden and Ayman al-Zawahiri be sitting in jail cells or be on trial by now? Would so many things have happened differently?

The Opportunity of a Lifetime

What if the attacks on September 11, 2001, had not been seen as a new Pearl Harbor? Only three months earlier, after all, Disney's Pearl Harbor (the "sanitized" version, as Times columnist Frank Rich labeled it), a blockbuster made with extensive Pentagon help, had performed disappointingly at the multiplexes. As an event, it seemed irrelevant to American audiences until 9/11, when that ancient history - and the ancient retribution that went with it - wiped from the American brain the actual history of recent decades, including our massive covert anti-Soviet war in Afghanistan, out of which Osama bin Laden emerged.

Here's the greatest irony: From that time of triumph in 1945, Americans had always secretly suspected that they were not "invincible" but exceedingly vulnerable, something both pop culture and the deepest fears of the cold war era only reinforced. Confirmation of that fact arrived with such immediacy on September 11 largely because it was already a gut truth. The ambulance chasers of the Bush administration, who spotted such opportunity in the attacks, were perhaps the last Americans who hadn't absorbed this reality. As that New Day of Infamy scenario played out, the horrific but actual scale of the damage inflicted in New York and Washington (and to the U.S. economy) would essentially recede. The attack had been relatively small, limited in its means and massive only in its daring and luck - abetted by the fact that the Bush administration was looking for nothing like such an attack, despite that CIA briefing given to Bush on a lazy August day in Crawford ("Bin Ladin Determined To Strike in US") and so many other clues.

Only the week before 9/11 the Bush administration had been in the doldrums with a "detached," floundering President criticized by worried members of his own party for vacationing far too long at his Texas ranch while the nation drifted. Moreover, there was only one group before September 11 with a "new Pearl Harbor" scenario on the brain. Major administration figures, including Vice President Dick Cheney, Defense Secretary Donald Rumsfeld and Deputy Defense Secretary Wolfowitz, had wanted for years to radically increase the power of the President and the Pentagon, to roll back the power of Congress (especially any Congressional restraints on the presidency left over from the Vietnam/Watergate era) and to complete the overthrow of Saddam Hussein ("regime change"), aborted by the first Bush administration in 1991.

We know as well that some of those plans were on the table in the 1990s and that those who held them and promoted them, at the Project for the New American Century in particular, actually wrote in a proposal titled "Rebuilding America's Defenses" that "the process of transformation [of the Pentagon], even if it brings revolutionary change, is likely to be a long one, absent some catastrophic and catalyzing event - like a new Pearl Harbor."

We also know that within hours of the 9/11 attacks, many of the same people were at work on the war of their dreams. Within five hours of the attack on the Pentagon, Rumsfeld was urging his aides to come up with plans for striking Iraq. (Notes by an aide transcribe his wishes this way: "best info fast. Judge whether good enough hit S.H. [Saddam Hussein] at same time. Not only UBL [Osama bin Laden]. ... Go massive. Sweep it all up. Things related and not.")

We know that by the 12th, the President himself had collared his top counterterrorism adviser on the National Security Council, Richard Clarke, and some of his staff in a conference room next to the White House Situation Room and demanded linkages. ("'Look under every rock and do due diligence.' It was a very intimidating message which said, 'Iraq. Give me a memo about Iraq and 9/11.'") We know that by November, the top officials of the Administration were already deep into operational planning for an invasion of Iraq.

And they weren't alone. Within the Pearl Harbor/nuclear attack/war nexus that emerged almost instantly from the ruins of the World Trade Center, others were working feverishly. Only eight days after the attacks, for instance, the complex 342-page Patriot Act would be rushed over to Congress by Attorney General John Ashcroft, passed through a cowed Senate in the dead of night on October 11, unread by at least some of our Representatives, and signed into law on October 26. As its instant appearance indicated, it was made up of a set of already existing right-wing hobbyhorses, quickly drafted provisions and expansions of law enforcement powers taken off an FBI "wish list" (previously rejected by Congress). All these were swept together by people who, like the President's men on Iraq, saw their main chance when those buildings went down. As such, it stands in for much of what happened "in response" to 9/11.

But what if we hadn't been waiting so long for our own thirty-six-hour war in the most victorious nation on the planet, its sole "hyperpower," its new Rome? What if those pre-existing frameworks hadn't been quite so well primed to emerge in no time at all? What if we (and our enemies as well) hadn't been at the movies all those years?

Movie-Made Planet

Among other things, we've been left with a misbegotten "billion dollar" memorial to the attacks of 9/11 (recently recalibrated to $500 million) planned for New York's Ground Zero and sporting the kinds of cost overruns otherwise associated with the occupation of Iraq. In its ambitions, what it will really memorialize is the Bush administration's oversized, crusading moment that followed the attacks. Too late now - and no one asked me anyway - but I know what my memorial would have been.

A few days after 9/11, my daughter and I took a trip downtown, as close to "Ground Zero" as you could get. With the air still rubbing our throats raw, we wandered block after block, peering down side streets to catch glimpses of the sheer enormity of the destruction. And indeed, in a way that no small screen could communicate, it did have the look of the apocalyptic, especially those giant shards of fallen building sticking up like - remember, I'm a typical movie-made American on an increasingly movie-made planet and had movies on the brain that week - the image of the wrecked Statue of Liberty that chillingly ends the first Planet of the Apes film, that cinematic memorial to humanity's nuclear folly. Left there as it was, that would have been a sobering monument for the ages, not just to the slaughter that was 9/11 but to what we had awaited for so long - and what, sadly, we still wait for; what, in the world that George Bush has produced, has become ever more, rather than less, likely. And imagine our reaction then.

Safer? Don't be ridiculous.

Tom Engelhardt, editor of Tomdispatch.com, is co-founder of the American Empire Project and author of The End of Victory Culture.

© 2006 Independent Media Institute. All rights reserved.
View this story online at:
http://www.alternet.org/story/41425/



Slow Food Nation

By Alice Waters, The Nation
Posted on September 9, 2006

It turns out that Jean Anthelme Brillat-Savarin was right in 1825 when he wrote in his magnum opus, The Physiology of Taste, that "the destiny of nations depends on the manner in which they are fed." If you think this aphorism exaggerates the importance of food, consider that today almost 4 billion people worldwide depend on the agricultural sector for their livelihood. Food is destiny, all right; every decision we make about food has personal and global repercussions. By now it is generally conceded that the food we eat could actually be making us sick, but we still haven't acknowledged the full consequences - environmental, political, cultural, social and ethical - of our national diet.

These consequences include soil depletion, water and air pollution, the loss of family farms and rural communities, and even global warming. (Inconveniently, Al Gore's otherwise invaluable documentary An Inconvenient Truth has disappointingly little to say about how industrial food contributes to climate change.) When we pledge our dietary allegiance to a fast-food nation, there are also grave consequences to the health of our civil society and our national character. When we eat fast-food meals alone in our cars, we swallow the values and assumptions of the corporations that manufacture them. According to these values, eating is no more important than fueling up, and should be done quickly and anonymously. Since food will always be cheap, and resources abundant, it's OK to waste. Feedlot beef, french fries and Coke are actually good for you. It doesn't matter where food comes from, or how fresh it is, because standardized consistency is more important than diversified quality. Finally, hard work - work that requires concentration, application and honesty, such as cooking for your family - is seen as drudgery, of no commercial value and to be avoided at all costs. There are more important things to do.

It's no wonder our national attention span is so short: We get hammered with the message that everything in our lives should be fast, cheap and easy - especially food. So conditioned are we to believe that food should be almost free that even the rich, who pay a tinier fraction of their incomes for food than has ever been paid before in human history, grumble at the price of an organic peach - a peach grown for flavor and picked, perfectly ripe, by a local farmer who is taking care of the land and paying his workers a fair wage! And yet, as the writer and farmer David Mas Masumoto recently pointed out, pound for pound, peaches that good still cost less than Twinkies. When we claim that eating well is an elitist preoccupation, we create a smokescreen that obscures the fundamental role our food decisions have in shaping the world. The reason that eating well in this country costs more than eating poorly is that we have a set of agricultural policies that subsidize fast food and make fresh, wholesome foods, which receive no government support, seem expensive. Organic foods seem elitist only because industrial food is artificially cheap, with its real costs being charged to the public purse, the public health and the environment.

The contributors to this forum have been asked to name just one thing that could be done to fix the food system. What they propose are solutions that arise out of what I think of as "slow food values," which run counter to the assumptions of fast-food marketing. To me, these are the values of the family meal, which teaches us, among other things, that the pleasures of the table are a social as well as a private good. At the table we learn moderation, conversation, tolerance, generosity and conviviality; these are civic virtues. The pleasures of the table also beget responsibilities - to one another, to the animals we eat, to the land and to the people who work it. It follows that food that is healthy in every way will cost us more, in time and money, than we pay now. But when we have learned what the real costs of food are, and relearned the real rewards of eating, we will have laid a foundation for not just a healthier food system but a healthier twenty-first-century democracy. - Alice Waters

Michael Pollan

Every five years or so the President of the United States signs an obscure piece of legislation that determines what happens on a couple of hundred million acres of private land in America, what sort of food Americans eat (and how much it costs) and, as a result, the health of our population. In a nation consecrated to the idea of private property and free enterprise, you would not think any piece of legislation could have such far-reaching effects, especially one about which so few of us - even the most politically aware - know anything. But in fact the American food system is a game played according to a precise set of rules that are written by the federal government with virtually no input from anyone beyond a handful of farm-state legislators. Nothing could do more to reform America's food system - and by doing so improve the condition of America's environment and public health - than if the rest of us were suddenly to weigh in.

The farm bill determines what our kids eat for lunch in school every day. Right now, the school lunch program is designed not around the goal of children's health but to help dispose of surplus agricultural commodities, especially cheap feedlot beef and dairy products, both high in fat.

The farm bill writes the regulatory rules governing the production of meat in this country, determining whether the meat we eat comes from sprawling, brutal, polluting factory farms and the big four meatpackers (which control 80 percent of the market) or from local farms.

Most important, the farm bill determines what crops the government will support - and in turn what kinds of foods will be plentiful and cheap. Today that means, by and large, corn and soybeans. These two crops are the building blocks of the fast-food nation: A McDonald's meal (and most of the processed food in your supermarket) consists of clever arrangements of corn and soybeans - the corn providing the added sugars, the soy providing the added fat, and both providing the feed for the animals. These crop subsidies (which are designed to encourage overproduction rather than to help farmers by supporting prices) are the reason that the cheapest calories in an American supermarket are precisely the unhealthiest. An American shopping for food on a budget soon discovers that a dollar buys hundreds more calories in the snack food or soda aisle than it does in the produce section. Why? Because the farm bill supports the growing of corn but not the growing of fresh carrots. In the midst of a national epidemic of diabetes and obesity our government is, in effect, subsidizing the production of high-fructose corn syrup.

This absurdity would not persist if more voters realized that the farm bill is not a parochial piece of legislation concerning only the interests of farmers. Today, because so few of us realize we have a dog in this fight, our legislators feel free to leave deliberations over the farm bill to the farm states, very often trading away their votes on agricultural policy for votes on issues that matter more to their constituents. But what could matter more than the health of our children and the health of our land?

Perhaps the problem begins with the fact that this legislation is commonly called "the farm bill" - how many people these days even know a farmer or care about agriculture? Yet we all eat. So perhaps that's where we should start, now that the debate over the 2007 farm bill is about to be joined. This time around let's call it "the food bill" and put our legislators on notice that this is about us and we're paying attention.

Peter Singer

There is one very simple thing that everyone can do to fix the food system. Don't buy factory-farm products.

Once, the animals we raised went out and gathered things we could not or would not eat. Cows ate grass, chickens pecked at worms or seeds. Now the animals are brought together and we grow food for them. We use synthetic fertilizers and oil-powered tractors to grow corn or soybeans. Then we truck it to the animals so they can eat it.

When we feed grains and soybeans to animals, we lose most of their nutritional value. The animals use it to keep their bodies warm and to develop bones and other body parts that we cannot eat. Pig farms use six pounds of grain for every pound of boneless meat we get from them. For cattle in feedlots, the ratio is 13:1. Even for chickens, the least inefficient factory-farmed meat, the ratio is 3:1.

Most Americans think the best thing they could do to cut their personal contributions to global warming is to swap their family car for a fuel-efficient hybrid like the Toyota Prius. Gidon Eshel and Pamela Martin of the University of Chicago have calculated that typical meat-eating Americans would reduce their emissions even more if they switched to a vegan diet. Factory farming is not sustainable. It is also the biggest system of cruelty to animals ever devised. In the United States alone, every year nearly 10 billion animals live out their entire lives confined indoors. Hens are jammed into wire cages, five or six of them in a space that would be too small for even one hen to be able to spread her wings. Twenty thousand chickens are raised in a single shed, completely covering its floor. Pregnant sows are kept in crates too narrow for them to turn around, and too small for them to walk a few steps. Veal calves are similarly confined, and deliberately kept anemic.

This is not an ethically defensible system of food production. But in the United States - unlike in Europe - the political process seems powerless to constrain it. The best way to fight back is to stop buying its products. Going vegetarian is a good option, and going vegan, better still. But if you continue to eat animal products, at least boycott factory farms.

Winona LaDuke

It's Manoominike Giizis, or the Wild Rice Making Moon, here on the White Earth reservation in northern Minnesota. The sound of a canoe moving through the wild rice beds on the Crow Wing or Rice lakes, the sound of laughter, the smell of wood-parched wild rice and the sound of a traditional drum at the celebration for the wild rice harvest link a traditional Anishinaabeg or Ojibwe people to a thousand years of culture and the ecosystem of a lake in a new millennium. This cultural relationship to food - manoomin, or wild rice - represents an essential part of what we need to do to repair the food system: We need to recover relationship.

Wild rice is the only North American grain, and today the Ojibwe are in a pitched battle to keep it from getting genetically engineered and patented. A similar battle is under way in Hawaii between Native Hawaiians and the University of Hawaii, which recently agreed to tear up patents on taro, a food sacred to Native Hawaiians. At one point "agriculture" was about the culture of food. Losing that culture - in favor of an American cultural monocrop, joined with an agricultural monocrop - puts us in a perilous state, threatening sustainability and our relationship to the natural world.

In the Ojibwe struggle to "keep it wild," we have found ourselves in an international movement of Slow Food and food sovereignty activists and communities who are seeking the same - the recovery or sustaining of relationship as a basic element of our humanity and as a critical strategy. In the Wild Rice Making Moon of the North Country, we will continue our traditions, and we will look across our lakes to the rice farmers of the rest of the world, to the taro farmers of the Pacific and to other communities working to protect their seeds for future generations, and we will know that this is how we insure that those generations will have what they need to be human, to be Anishinaabeg.

Vandana Shiva

Humanity has eaten more than 80,000 plant species through its evolution. More than 3,000 have been used consistently. However, we now rely on just eight crops to provide 75 percent of the world's food. With genetic engineering, production has narrowed to three crops: corn, soya, canola. Monocultures are destroying biodiversity, our health and the quality and diversity of food.

In 1998, India's indigenous edible oils, made from mustard, coconut, sesame, linseed and groundnut and processed in artisanal cold-press mills, were banned, with "food safety" used as the excuse. The restrictions on import of soya oil were simultaneously removed. Ten million farmers' livelihoods were threatened. One million oil mills in villages were closed. And millions of tons of artificially cheap GMO soya oil continue to be dumped on India. Women from the slums of Delhi came out in a movement to reject soya and bring back mustard oil. "Sarson bachao, soyabean bhagao" (save the mustard, drive away the soyabean) was the women's call from the streets of Delhi. We did succeed in bringing back mustard through our "sarson satyagraha" (non-cooperation with the ban on mustard oil).

I was recently in the Amazon, where the same companies that dumped soya on India - Cargill and ADM - are destroying the Amazon to grow soya. Millions of acres of the Amazon rainforest - the lung, liver and heart of the global climate system - are being burned to grow soya for export. Cargill has built an illegal port at Santarém in Brazil and is driving the expansion of soya in the Amazon rainforest. Armed gangs take over the forest and use slaves to cultivate soya. When people like Sister Dorothy Stang oppose the destruction of the forests and the violence against people, they are assassinated.

People in Brazil and India are being threatened to promote a monoculture that benefits agribusiness. A billion people are without food because industrial monocultures robbed them of their livelihoods in agriculture and their food entitlements. Another 1.7 billion are suffering from obesity and food-related diseases. Monocultures lead to malnutrition - for those who are underfed as well as those who are overfed. In depending on monocultures, the food system is being made increasingly dependent on fossil fuels - for synthetic fertilizers, for running giant machinery and for long-distance transport, which adds "food miles."

Moving beyond monocultures has become an imperative for repairing the food system. Biodiverse small farms have higher productivity and generate higher incomes for farmers. And biodiverse diets provide more nutrition and better taste. Bringing back biodiversity to our farms goes hand in hand with bringing back small farmers on the land. Corporate control thrives on monocultures. Citizens' food freedom depends on biodiversity.

Jim Hightower

In the very short span of about fifty years, we've allowed our politicians to do something remarkably stupid: turn America's food-policy decisions over to corporate lobbyists, lawyers and economists. These are people who could not run a watermelon stand if we gave them the melons and had the Highway Patrol flag down the customers for them - yet, they have taken charge of the decisions that direct everything from how and where food is grown to what our children eat in school.

As a result, America's food system (and much of the world's) has been industrialized, conglomeratized and globalized. This is food we're talking about, not widgets! Food, by its very nature, is meant to be agrarian, small-scale and local.

But the Powers That Be have turned the production of our edibles away from the high art of cooperating with nature into a high-cost system of always trying to overwhelm nature. They actually torture food - applying massive doses of pesticides, sex hormones, antibiotics, genetically manipulated organisms, artificial flavorings and color, chemical preservatives, ripening gas, irradiation...and so awfully much more. The attitude of agribusiness is that if brute force isn't working, you're probably just not using enough of it.

More fundamentally, these short-cut con artists have perverted the very concept of food. Rather than being both a process and product that nurtures us (in body and spirit) and nurtures our communities, food is approached by agribusiness as just another commodity that has no higher purpose than to fatten corporate profits.

There's our challenge. It's not a particular policy or agency that must be changed but the most basic attitude of policy-makers. And the only way we're going to get that done is for you and me to become the policy-makers, taking charge of every aspect of our food system - from farm to fork.

The good news is that this "good food" movement is already well under way and gaining strength every day. It receives little media coverage, but consumers in practically every city, town and neighborhood across America are reconnecting with local farmers and artisans to de-industrialize, de-conglomeratize, de-globalize - de-Wal-Martize - their food systems.

Of course, the Powers That Be sneer at these efforts, saying they can't succeed. But, as a friend of mine who is one of the successful pioneers in this burgeoning movement puts it: "Those who say it can't be done should not interrupt those who are doing it."

Look around wherever you are and you'll find local farmers, consumers, chefs, marketers, gardeners, environmentalists, workers, churches, co-ops, community organizers and just plain folks who are doing it. These are the Powers That Ought to Be - and I think they will be. Join them!

Alice Waters is the founder of Chez Panisse Restaurant and director of the Chez Panisse Foundation in Berkeley, California.

© 2006 Independent Media Institute. All rights reserved.
View this story online at:
http://www.alternet.org/story/41131/



War-Mongering America Terrorizes the World

By Howard Zinn, AlterNet
Posted on September 9, 2006

There is something important to be learned from the recent experience of the United States and Israel in the Middle East: that massive military attacks, inevitably indiscriminate, are not only morally reprehensible, but useless in achieving the stated aims of those who carry them out.

The United States, in three years of war, which began with shock-and-awe bombardment and goes on with day-to-day violence and chaos, has been an utter failure in its claimed objective of bringing democracy and stability to Iraq. The Israeli invasion and bombing of Lebanon has not brought security to Israel; indeed it has increased the number of its enemies, whether in Hezbollah or Hamas or among Arabs who belong to neither of those groups.

I remember John Hersey's novel, "The War Lover," in which a macho American pilot, who loves to drop bombs on people and also to boast about his sexual conquests, turns out to be impotent. President Bush, strutting in his flight jacket on an aircraft carrier and announcing victory in Iraq, has turned out to be much like the Hersey character, his words equally boastful, his military machine impotent.

The history of wars fought since the end of World War II reveals the futility of large-scale violence. The United States and the Soviet Union, despite their enormous firepower, were unable to defeat resistance movements in small, weak nations - the United States in Vietnam, the Soviet Union in Afghanistan - and were forced to withdraw.

Even the "victories" of great military powers turn out to be elusive. Presumably, after attacking and invading Afghanistan, the president was able to declare that the Taliban were defeated. But more than four years later, Afghanistan is rife with violence, and the Taliban are active in much of the country.

The two most powerful nations after World War II, the United States and the Soviet Union, with all their military might, have not been able to control events in countries that they considered to be in their sphere of influence - the Soviet Union in Eastern Europe and the United States in Latin America.

Beyond the futility of armed force, and ultimately more important, is the fact that war in our time inevitably results in the indiscriminate killing of large numbers of people. To put it more bluntly, war is terrorism. That is why a "war on terrorism" is a contradiction in terms. Wars waged by nations, whether by the United States or Israel, are a hundred times more deadly for innocent people than the attacks by terrorists, vicious as they are.

The repeated excuse, given by both Pentagon spokespersons and Israeli officials, for dropping bombs where ordinary people live is that terrorists hide among civilians. Therefore the killing of innocent people (in Iraq, in Lebanon) is called accidental, whereas the deaths caused by terrorists (on 9/11, by Hezbollah rockets) are deliberate.

This is a false distinction, quickly refuted with a bit of thought. If a bomb is deliberately dropped on a house or a vehicle on the grounds that a "suspected terrorist" is inside (note the frequent use of the word suspected as evidence of the uncertainty surrounding targets), the resulting deaths of women and children may not be intentional. But neither are they accidental. The proper description is "inevitable."

So if an action will inevitably kill innocent people, it is as immoral as a deliberate attack on civilians. And when you consider that the number of innocent people dying inevitably in "accidental" events has been far, far greater than all the deaths deliberately caused by terrorists, one must reject war as a solution for terrorism.

For instance, more than a million civilians in Vietnam were killed by US bombs, presumably by "accident." Add up all the terrorist attacks throughout the world in the 20th century and they do not equal that awful toll.

If reacting to terrorist attacks by war is inevitably immoral, then we must look for ways other than war to end terrorism, including the terrorism of war. And if military retaliation for terrorism is not only immoral but futile, then political leaders, however cold-blooded their calculations, may have to reconsider their policies.

Howard Zinn is a professor emeritus at Boston University and the author of the forthcoming book, "A Power Governments Cannot Suppress" (City Lights Books, Winter 2007).

© 2006 Independent Media Institute. All rights reserved.
View this story online at:
http://www.alternet.org/story/41430/





Wind Power Is Energy for Optimists

By Charles Komanoff, Orion Magazine
Posted on September 9, 2006

It was a place I had often visited in memory but feared might no longer exist. Orange slabs of calcified sandstone teetered overhead, while before me, purple buttes and burnt mesas stretched over the desert floor. In the distance I could make out southeast Utah's three snowcapped ranges - the Henrys, the Abajos, and, eighty miles to the east, the La Sals, shimmering into the blue horizon.

No cars, no roads, no buildings. Two crows floating on the late-winter thermals. Otherwise, stillness.

Abbey's country. But my country, too. Almost forty years after Desert Solitaire, thirty-five since I first came to love this Colorado River plateau, I was back with my two sons, eleven and eight. We had spent four sun-filled days clambering across slickrock in Arches National Park and crawling through the slot canyons of the San Rafael Reef. Now, perched on a precipice above Goblin Valley, stoked on endorphins and elated by the beauty before me, I had what might seem a strange, irrelevant thought: I didn't want windmills here.

Reprint Notice:

This article appears in the September-October 2006 issue of Orion magazine, 187 Main Street, Great Barrington, MA 01230, 888/909-6568, ($35/year for 6 issues). A free copy of the magazine can be obtained through Orion's website at oriononline.org.

Not that any windmills are planned for this Connecticut-sized expanse - the winds are too fickle. But wind energy is never far from my mind these days. As Earth's climate begins to warp under the accumulating effluent from fossil fuels, the increasing viability of commercial-scale wind power is one of the few encouraging developments.

Encouraging to me, at least. As it turns out, there is much disagreement over where big windmills belong, and whether they belong at all.

Fighting fossil fuels, and the machines powered by them, has been my life's work. In 1971, shortly after getting my first taste of canyon country, I took a job crunching numbers for what was then a landmark exposé of U.S. power plant pollution, The Price of Power. The subject matter was drier than dust - emissions data, reams of it, printed out on endless strips of paper by a mainframe computer. Dull stuff, but nightmarish visions of coal-fired smokestacks smudging the crystal skies of the Four Corners kept me working 'round the clock, month after month.

A decade later, as a New York City bicycle commuter fed up with the oil-fueled mayhem on the streets, I began working with the local bicycle advocacy group, Transportation Alternatives, and we soon made our city a hotbed of urban American anti-car activism. The '90s and now the '00s have brought other battles - "greening" Manhattan tenement buildings through energy efficiency and documenting the infernal "noise costs" of Jet Skis, to name two - but I'm still fighting the same fight.

Why? Partly it's knowing the damage caused by the mining and burning of fossil fuels. And there's also the sheer awfulness of machines gone wild, their groaning, stinking combustion engines invading every corner of life. But now the stakes are immeasurably higher. As an energy analyst, I can tell you that the science on global warming is terrifyingly clear: to have even a shot at fending off climate catastrophe, the world must reduce carbon dioxide emissions from fuel burning by at least 50 percent within the next few decades. If poor countries are to have any room to develop, the United States, the biggest emitter by far, needs to cut back by 75 percent.

Although automobiles, with their appetite for petroleum, may seem like the main culprit, the number one climate change agent in the U.S. is actually electricity. The most recent inventory of U.S. greenhouse gases found that power generation was responsible for a whopping 38 percent of carbon dioxide emissions. Yet the electricity sector may also be the least complicated to make carbon free. Approximately three-fourths of U.S. electricity is generated by burning coal, oil, or natural gas. Accordingly, switching that same portion of U.S. electricity generation to nonpolluting sources such as wind turbines, while simultaneously ensuring that our ever-expanding arrays of lights, computers, and appliances are increasingly energy efficient, would eliminate 38 percent of the country's CO2 emissions and bring us halfway to the goal of cutting emissions by 75 percent.

To achieve that power switch entirely through wind power, I calculate, would require 400,000 windmills rated at 2.5 megawatts each. To be sure, this is a hypothetical figure, since it ignores such real-world issues as limits on power transmission and the intermittency of wind, but it's a useful benchmark just the same.

What would that entail?

To begin, I want to be clear that the turbines I'm talking about are huge, with blades up to 165 feet long mounted on towers rising several hundred feet. Household wind machines like the 100-foot-high Bergey 10-kilowatt BWC Excel with 11-foot blades, the mainstay of the residential and small business wind turbine market, may embody democratic self-reliance and other "small is beautiful" virtues, but we can't look to them to make a real dent in the big energy picture. What dictates the supersizing of windmills are two basic laws of wind physics: a wind turbine's energy potential is proportional to the square of the length of the blades, and to the cube of the speed of the wind driving them. I'll spare you the math, but the difference in blade lengths, the greater wind speeds higher off the ground, and the sophisticated controls available on industrial-scale turbines all add up to a market-clinching five-hundred-fold advantage in electricity output for a giant General Electric or Vestas wind machine.
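For readers who would rather not be spared the math, here is a minimal sketch of that scaling. The 11-foot and 165-foot blade lengths come from the text; the assumption that the wind at the industrial turbine's hub height blows about 30 percent faster is supplied purely for illustration:

```python
# Relative-output sketch: output scales with blade length squared and
# wind speed cubed. Blade lengths are from the text; the 1.3x wind-speed
# ratio for the taller tower is an assumption for illustration only.

def relative_output(blade_length_ft: float, wind_speed_ratio: float = 1.0) -> float:
    """Output relative to a reference turbine in reference wind."""
    return blade_length_ft ** 2 * wind_speed_ratio ** 3

household = relative_output(11)          # Bergey-class machine, 11-ft blades
industrial = relative_output(165, 1.3)   # 165-ft blades, ~30% faster wind aloft (assumed)

print(f"Blade length alone: {(165 / 11) ** 2:.0f}x")              # 225x
print(f"With faster wind aloft: {industrial / household:.0f}x")   # roughly 500x
```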

How much land do the industrial turbines require? The answer turns on what "require" means. An industry rule of thumb is that to maintain adequate exposure to the wind, each big turbine needs space around it of about 60 acres. Since 640 acres make a square mile, those 400,000 turbines would need 37,500 square miles, or roughly all the land in Indiana or Maine.

On the other hand, the land actually occupied by the turbines - their "footprint" - would be far, far smaller. For example, each 3.6-megawatt Cape Wind turbine proposed for Nantucket Sound will rest on a platform roughly 22 feet in diameter, implying a surface area of 380 square feet - the size of a typical one-bedroom apartment in New York City. Scaling that up by 400,000 suggests that just six square miles of land - less than the area of a single big Wyoming strip mine - could house the bases for all of the windmills needed to banish coal, oil, and gas from the U.S. electricity sector.
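Both land-area figures can be reproduced with a few lines of arithmetic, using only the numbers already given above (60 acres of spacing per turbine, a 22-foot platform diameter, 400,000 machines):

```python
import math

# Reproducing the two land-area figures given above: spacing of about
# 60 acres per turbine versus the physical footprint of a 22-ft platform.

TURBINES = 400_000
SPACING_ACRES = 60              # industry rule of thumb per turbine
ACRES_PER_SQ_MILE = 640
SQ_FT_PER_SQ_MILE = 5280 ** 2   # 27,878,400

spacing_sq_miles = TURBINES * SPACING_ACRES / ACRES_PER_SQ_MILE
print(f"Spacing requirement: {spacing_sq_miles:,.0f} sq mi")   # 37,500 -- roughly Indiana or Maine

platform_sq_ft = math.pi * (22 / 2) ** 2                       # ~380 sq ft per turbine
footprint_sq_miles = TURBINES * platform_sq_ft / SQ_FT_PER_SQ_MILE
print(f"Physical footprint: {footprint_sq_miles:.1f} sq mi")   # about 5.5, i.e. 'six square miles'
```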

Of course, erecting and maintaining wind turbines can also necessitate clearing land: ridgeline installations often require a fair amount of deforestation, and then there's the associated clearing for access roads, maintenance facilities, and the like. But there are also now a great many turbines situated on farmland, where the fields around their bases are still actively farmed.

Depending, then, on both the particular terrain and how the question is understood, the land area said to be needed for wind power can vary across almost four orders of magnitude. Similar divergences of opinion are heard about every other aspect of wind power, too. Big wind farms kill thousands of birds and bats...or hardly any, in comparison to avian mortality from other tall structures such as skyscrapers. Industrial wind machines are soft as a whisper from a thousand feet away, and even up close their sound level would rate as "quiet" on standard noise charts...or they can sound like "a grinding noise" or "the shrieking sound of a wild animal," according to one unhappy neighbor of an upstate New York wind farm. Wind power developers are skimming millions via subsidies, state-mandated quotas, and "green power" scams... or are boldly risking their own capital to strike a blow for clean energy against the fossil fuel Goliath.

Some of the bad press is warranted. The first giant wind farm, built in the early 1980s with six thousand small, fast-spinning turbines placed directly in Altamont Pass, northern California's principal raptor flyway, rightly inspired the epithet "Cuisinarts for birds." The longer blades on newer turbines rotate more slowly and thus kill far fewer birds, but bat kills are being reported at wind farms in the Appalachian Mountains; as many as two thousand bats were hacked to death at one forty-four-turbine installation in West Virginia. And as with any machine, some of the nearly ten thousand industrial-grade windmills now operating in the U.S. may groan or shriek when something goes wrong. Moreover, wind power does benefit from a handsome federal subsidy; indeed, uncertainty over renewal of the "production tax credit" worth 1.9 cents per kilowatt-hour nearly brought wind power development to a standstill a few years ago.

At the same time, however, there is an apocalyptic quality to much anti-wind advocacy that seems wildly disproportionate to the actual harm, particularly in the overall context of not just other sources of energy but modern industry in general. New York State opponents of wind farms call their website "Save Upstate New York," as if ecological or other damage from wind turbines might administer the coup de grâce to the state's rural provinces that decades of industrialization and pollution, followed by outsourcing, have not. In neighboring Massachusetts, a group called Green Berkshires argues that wind turbines "are enormously destructive to the environment," but does not perform the obvious comparison to the destructiveness of fossil fuel-based power. Although the intensely controversial Cape Wind project "poses an imminent threat to navigation and raises many serious maritime safety issues," according to the anti-wind Alliance to Protect Nantucket Sound, the alliance was strangely silent when an oil barge bound for the region's electric power plant spilled ninety-eight thousand gallons of its deadly, gluey cargo into Buzzards Bay three years ago.

Of course rhetoric is standard fare in advocacy, particularly the environmental variety with its salvationist mentality - environmentalists always like to feel they are "saving" this valley or that species. It all comes down to a question of what we're saving, and for whom. You can spend hours sifting through the anti-wind websites and find no mention at all of the climate crisis, let alone wind power's potential to help avert it.

IN FACT, many wind power opponents deny that wind power displaces much, if any, fossil fuel burning. Green Berkshires insists, for example, that "global warming [and] dependence on fossil fuels ... will not be ameliorated one whit by the construction of these turbines on our mountains."

This notion is mistaken. It is true that since wind is variable, individual wind turbines can't be counted on to produce on demand, so the power grid can't necessarily retire fossil fuel generators at the same rate as it takes on windmills. The coal- and oil-fired generators will still need to be there, waiting for a windless day. But when the wind blows, those generators can spin down. That's how the grid works: it allocates electrons. Supply more electrons from one source, and other sources can supply fewer. And since system operators program the grid to draw from the lowest-cost generators first, and wind power's "fuel," moving air, is free, wind-generated electrons are given priority. It follows that more electrons from wind power mean proportionately fewer from fossil fuel burning.
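(The dispatch logic is simple enough to sketch in code. The plant names, capacities, and costs below are invented purely for illustration; the point is only that a zero-fuel-cost source, once available, gets drawn on first, and every megawatt it supplies is a megawatt the fuel burners don't.)

def dispatch(demand_mw, generators):
    """Meet demand by drawing on generators in merit order: cheapest marginal cost first."""
    output = {}
    remaining = demand_mw
    for name, capacity, cost in sorted(generators, key=lambda g: g[2]):
        take = min(capacity, remaining)
        output[name] = take
        remaining -= take
    return output

# Hypothetical fleet: (name, capacity in MW, marginal cost in $/MWh)
calm = dispatch(1000, [("coal", 800, 25), ("gas", 600, 60), ("wind", 0, 0)])
windy = dispatch(1000, [("coal", 800, 25), ("gas", 600, 60), ("wind", 300, 0)])

print(calm)   # {'wind': 0, 'coal': 800, 'gas': 200}
print(windy)  # {'wind': 300, 'coal': 700, 'gas': 0}

When three hundred megawatts of wind shows up, the two fossil plants together back off by exactly three hundred megawatts.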

What about the need to keep a few power stations burning fuel so they can instantaneously ramp up and counterbalance fluctuations in wind energy output? The grid requires this ballast, known as spinning reserve, in any event both because demand is always changing and because power plants of any type are subject to unforeseen breakdowns. The additional variability due to wind generation is slight - wind speeds don't suddenly drop from strong to calm, at least not for every turbine in a wind farm and certainly not for every wind farm on the grid. The clear verdict of the engineers responsible for grid reliability - a most conservative lot - is that the current level of wind power development will not require additional spinning reserve, while even much larger supplies of wind-generated electricity could be accommodated through a combination of energy storage technologies and improved models for predicting wind speeds.

With very few exceptions, then, wind output can be counted on to displace fossil fuel burning one for one. No less than other nonpolluting technologies like bicycles or photovoltaic solar cells, wind power is truly an anti-fossil fuel.

I made my first wind farm visit in the fall of 2005. I had seen big windmills up close in Denmark, and I had driven through the San Gorgonio wind farm that straddles Interstate 10 near Palm Springs, California. But this trip last November had a mission. After years of hearing industrial wind turbines in the northeastern United States characterized as either monstrosities or crowns of creation, I wanted to see for myself how they sat on the land. I also wanted to measure the noise from the turning blades, so I brought the professional noise meter I had used in my campaign against Jet Skis.

Madison County occupies the broad middle of New York State, with the Catskill Mountains to the south, Lake Ontario to the northwest, and the Adirondacks to the northeast. Its rolling farms sustain seventy thousand residents and, since 2001, two wind farms: the twenty-windmill Fenner Windpower Project in the western part of the county and the seven-windmill Madison Windpower Project twenty miles east.

At the time of my visit Fenner was the state's largest wind farm, although that distinction has since passed to the 120-windmill Maple Ridge installation in the Tug Hill region farther north. It was windy that day, though not unusually so, according to the locals. All twenty-seven turbines were spinning, presumably at their full 1.5-megawatt ratings. For me the sight of the turning blades was deeply pleasing. The windmills, sleek, white structures more than three hundred feet tall sprinkled across farmland, struck me as graceful and marvelously useful. I thought of a story in the New York Times about a proposed wind farm near Cooperstown, New York, in which a retiree said that seeing giant windmills near your house "would be like driving through oil derricks to get to your front door." To my eye, the Fenner turbines were anti-derricks, oil rigs running in reverse.

For every hour it was in full use, each windmill was keeping a couple of barrels of oil, or an entire half-ton of coal, in the ground. Of course wind turbines don't generate full power all the time, because the wind doesn't blow at a constant speed. The Madison County turbines have an average "capacity factor," or annual output rate, of 34 percent, meaning that over the course of a year they generate about a third of the electricity they would produce if they always ran at full capacity. But that still means an average of three thousand hours a year of full output for each turbine. Multiply those hours by the twenty-seven turbines at Fenner and Madison, and a good 200,000 barrels of oil or 50,000 tons of coal were being kept underground by the two wind farms each year - enough to cover an entire sixty-acre farm with a six-inch-thick oil slick or pile of coal.
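(For anyone inclined to check that arithmetic, the sketch below uses a typical fossil steam-plant heat rate of about 10,000 Btu per kilowatt-hour and round-number heat contents for oil and coal; those conversion factors are my own assumptions, not figures from the wind farms.)

TURBINES = 27
RATING_KW = 1500                        # 1.5 megawatts each
CAPACITY_FACTOR = 0.34

HEAT_RATE_BTU_PER_KWH = 10_000          # typical fossil steam plant
BTU_PER_BARREL_OIL = 5_800_000
BTU_PER_TON_COAL = 25_000_000

full_output_hours = CAPACITY_FACTOR * 8760               # ~3,000 hours a year
annual_kwh = TURBINES * RATING_KW * full_output_hours    # ~121 million kWh
displaced_btu = annual_kwh * HEAT_RATE_BTU_PER_KWH

print(round(displaced_btu / BTU_PER_BARREL_OIL))   # ~208,000 barrels of oil
print(round(displaced_btu / BTU_PER_TON_COAL))     # ~48,000 tons of coal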

The windmills, spinning easily at fifteen revolutions per minute - that's one leisurely revolution every four seconds - were clean and elegant in a way that no oil derrick or coal dragline could ever be. The nonlinear arrangement of the Fenner turbines situated them comfortably among the traditional farmhouses, paths, and roads, while at Madison, a grassy hillside site, the windmills were more prominent but still unaggressive. Unlike a ski run, say, or a power line cutting through the countryside, the windmills didn't seem like a violation of the landscape. The turning vanes called to mind a natural force - the wind - in a way that a cell phone or microwave tower, for example, most certainly does not.

They were also relatively quiet. My sound readings, taken at distances ranging from one hundred to two thousand feet from the tower base, topped out at 64 decibels and went as low as 45 - the approximate noise range given for a small-town residential cul-de-sac on standard noise charts. It's fair to say that the wind turbines in Madison County aren't terribly noisy even from up close and are barely audible from a thousand feet or more away. The predominant sound was a low, not unpleasant hum, or hvoohmm, like a distant seashore, but perhaps a bit thicker.

Thinking back on that November day, I've come to realize that a windmill, like any large structure, is a signifier. Cell-phone towers signify the intrusion of quotidian life - the reminder to stop off at the 7-Eleven, the unfinished business at the office. The windmills I saw in upstate New York signified, for me, not just displacement of destructive fossil fuels, but acceptance of the conditions of inhabiting the Earth. They signified, in the words of environmental lawyer and MIT research affiliate William Shutkin, "the capacity of environmentalists - of citizens - to match their public positions with the private choices necessary to move toward a more environmentally and economically sustainable way of life."

THE NOTION OF CHOICES points to another criticism of wind turbines: the argument that the energy they might make could be saved instead through energy-efficiency measures. The Adirondack Council, for example, in a statement opposing the ten-windmill Barton Mines project on a former mountaintop mine site, writes, "If the Barton project is approved, we will gain 27 to 30 megawatts of new, clean power generation. Ironically, we could save more than 30 megawatts of power in the Adirondack Park through simple, proven conservation methods in homes and businesses."

The council's statement is correct, of course. Kilowatts galore could be conserved in any American city or town by swapping out incandescent light bulbs in favor of compact fluorescents, replacing inefficient kitchen appliances, and extinguishing "vampire" loads by plugging watt-sucking electronic devices into on-off power strips. If this notion sounds familiar, it's because it has been raised in virtually every power plant dispute since the 1970s. But the ground has shifted, now that we have such overwhelming proof that we're standing on the threshold of catastrophic climate change.

Those power plant debates of yore weren't about fuels and certainly not about global warming, but about whether to top off the grid with new megawatts of supply or with "negawatts" - watts that could be saved through conservation. It took decades of struggle by legions of citizen advocates and hundreds of experts (I was one) to embed the negawatt paradigm in U.S. utility planning. But while we were accomplishing that, inexorably rising fossil fuel use here and around the world was overwhelming Earth's "carbon sinks," causing carbon dioxide to accumulate in the atmosphere at an accelerating rate, contributing to disasters such as Hurricane Katrina and Europe's 2003 heat wave, and promising biblical-scale horrors such as a waning Gulf Stream and disappearing polar icepacks.

The energy arena of old was local and incremental. The new one is global and all-out. With Earth's climate, and the world as we know and love it, now imperiled, topping off the regional grid pales in comparison to the task at hand. In the new, ineluctable struggle to rescue the climate from fossil fuels, efficiency and "renewables" (solar and biomass as well as wind) must all be pushed to the max. Those thirty negawatts that lie untapped in the kitchens and TV rooms of Adirondack houses are no longer an alternative to the Barton wind farm - they're another necessity.

In this new, desperate, last-chance world - and it is that, make no mistake - pleas like the Adirondack Council's, which once would have seemed reasonable, now sound a lot like fiddling while the Earth burns. The same goes for the urgings by opponents of Cape Wind and other pending wind farms to "find a more suitable site"; those other suitable wind farm sites (wherever they exist) need to be developed in addition to, not instead of, Nantucket Sound, or Barton Mines, or the Berkshires.

There was a time when the idea of placing immense turbines in any of these places would have filled me with horror. But now, what horrifies me more is the thought of keeping them windmill free.

Part of the problem with wind power, I suspect, is that it's hard to weigh the effects of any one wind farm against the greater problem of climate change. It's much easier to comprehend the immediate impact of wind farm development than the less tangible losses from a warming Earth. And so the sacrifice is difficult, and it becomes progressively harder as rising affluence brings ever more profligate uses of energy.

Picture this: Swallowing hard, with deep regret for the change in a beloved landscape formerly unmarked in any obvious way by humankind, you've just cast the deciding affirmative vote to permit a wind farm on the hills outside your town. On the way home you see a new Hummer in your neighbor's driveway. How do you not feel like a self-sacrificing sucker?

Intruding the unmistakable human hand on any landscape for wind power is, of course, a loss in local terms, and no small one, particularly if the site is a verdant ridgeline. Uplands are not just visible markers of place but fragile environments, and the inevitable access roads for erecting and serving the turbines can be damaging ecologically as well as symbolically. In contrast, few if any benefits of the wind farm will be felt by you in a tangible way. If the thousands of tons of coal a year that your wind farm will replace were being mined now, a mile from your house, it might be a little easier to take. Unfortunately, our society rarely works that way. The bread you cast upon the waters with your vote will not come back to you in any obvious way - it will be eaten in Wyoming, or Appalachia. And you may just have to mutter an oath about the Hummer and use your moral imagination to console yourself about the ridge.

But what if the big push for wind power simply "provides more energy for people to waste," as Carl Safina, an oceanographer who objects to the Cape Wind project, asked me recently? Safina is unusual among Cape Wind opponents, not just because he is a MacArthur Fellow and prize-winning author (Song for the Blue Ocean, Voyage of the Turtle), but because he is completely honest about the fact that his objections are essentially aesthetic.

"I believe the aesthetics of having a national seashore with a natural view of the blue curve of the planet are very important," he wrote in an e-mail from coastal Long Island, where he lives. "I think turbines and other structures should be sited in places not famed for natural beauty" - a statement that echoed my feelings about Utah's Goblin Valley.

"Six miles is a very short distance over open water," Safina continued, referring to the span from the public beach at Craigville on Cape Cod to the closest proposed turbine, "and a group of anything several hundred feet high would completely dominate the view." While the prominence of the turbines when seen from the shore is open to debate (the height of a Cape Wind tower from six miles would be just two-thirds of one degree, not quite half the width of your finger held at arm's length), there is no question that the wind turbines would, in his words, "put an end to the opportunity for people to experience an original view of a piece of the natural world in one of America's most famously lovely coastal regions."

Yet for all his fierce attachment to that view, Safina says he might give it up if doing so made a difference. "If there was a national energy strategy that would make the U.S. carbon neutral in fifty years," he wrote, "and if Cape Wind was integral and significant, that might be a worthwhile sacrifice." But the reality, as Safina described in words that could well have been mine, is that "Americans insist on wasting energy and needing more. We will affect the natural view of a famously beautiful piece of America's ocean and still not develop a plan to conserve energy."

Safina represents my position and, I imagine, that of others on both sides of the wind controversy when he pleads for federal action that could justify local sacrifice for the greater good. If Congress enacted an energy policy that harnessed the spectrum of cost-effective energy efficiency together with renewable energy, thereby ensuring that fossil fuel use shrank starting today, a windmill's contribution to climate protection might actually register, providing psychic reparation for an altered viewshed. And if carbon fuels were taxed for their damage to the climate, wind power's profit margins would widen, and surrounding communities could extract bigger tax revenues from wind farms. Then some of that bread upon the waters would indeed come back - in the form of a new high school, or land acquired for a nature preserve.

It's very human to ask, "Why me? Why my ridgeline, my seascape, my viewshed?" These questions have been difficult to answer; there has been no framework - local or national - to guide wind farm siting by ranking potential wind power locales for their ecological and community suitability. That's a gap that the Appalachian Mountain Club is trying to bridge, using its home state of Massachusetts as a model.

According to AMC research director Kenneth Kimball, who heads the project, Massachusetts has ninety-six linear miles of "Class 4" ridgelines, where wind speeds average fourteen miles per hour or more, the threshold for profitability with current technology. Assuming each mile can support seven to nine large turbines of roughly two megawatts each, the state's uplands could theoretically host 1,500 megawatts of wind power. (Coastal areas such as Nantucket Sound weren't included in the survey.)
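(The 1,500-megawatt figure follows directly from those assumptions; here is a quick check, using the seven to nine turbines per mile and roughly two megawatts apiece quoted above.)

CLASS4_MILES = 96
MW_PER_TURBINE = 2

low = CLASS4_MILES * 7 * MW_PER_TURBINE     # 1,344 MW
high = CLASS4_MILES * 9 * MW_PER_TURBINE    # 1,728 MW
print(low, high)                            # the range brackets the ~1,500 MW estimate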

Kimball's team sorted all ninety-six miles into four classes of governance - Appalachian Trail corridor or similar lands where development is prohibited; other federal or state conservation lands; Massachusetts open space lands; and private holdings. They then overlaid these with ratings denoting conflicts with recreational, scenic, and ecological values. The resulting matrix suggests the following rankings of wind power suitability:

1. Unsuitable - lands where development is prohibited (Appalachian Trail corridors, for example) or "high conflict" areas: 24 miles (25 percent).

2. Less than ideal - federal or state conservation lands rated "medium conflict": 21 miles (22 percent).

3. Conditionally favorable - conservation or open space lands rated "low conflict," or open space or private lands rated "medium conflict": 27 miles (28 percent).

4. Most favorable - unrestricted private land and "low conflict" areas: 24 miles (25 percent).

Category 4 lands are obvious candidates for wind farm development. Category 3 lands could also be considered, says the AMC, if wind farms were found to improve regional air quality, were developed under a state plan rather than piecemeal, and were bonded to assure eventual decommissioning. If these conditions were met, then categories 3 and 4, comprising approximately fifty miles of Massachusetts ridgelines, could host four hundred wind turbines capable of supplying nearly 4 percent of the state's annual electricity - without grossly endangering wildlife or threatening scenic, recreational, or ecological values (e.g., critical habitat, roadless areas, rare species, old growth, steep slopes).

Whether that 4 percent is a little or a lot depends on where you stand and, equally, on where we stand as a society. You could call the four hundred turbines mere tokenism against our fuel-besotted way of life, and considering them in isolation, you'd be right. But you could also say this: Go ahead and halve the state's power usage, as could be done even with present-day technology, and "nearly 4 percent" doubles to 7-8 percent. Add the Cape Wind project and other offshore wind farms that might follow, and wind power's statewide share might reach 20 percent, the level in Denmark.

Moreover, the windier and emptier Great Plains states could reach 100 percent wind power or higher, even with a suitability framework like the AMC's, thereby becoming net exporters of clean energy. But even at 20 percent, Massachusetts would be doing its part to displace that 75 percent of U.S. electricity generated by fossil fuels. If you spread the turbines needed to achieve that goal across all fifty states, you'd be looking to produce roughly eight hundred megawatt-hours of wind output per square mile each year - just about what Massachusetts would be generating in the above scenario. And the rest of New England and New York could do the same, affording these "blue" states a voice in nudging the rest of the country greenward.
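(Here is how those percentages hang together. The 30 percent capacity factor, the roughly 55 terawatt-hours of annual Massachusetts electricity use, the 4,000 terawatt-hours of total U.S. generation, and the 3.8 million square miles of U.S. land are my own mid-2000s round numbers, not figures from the AMC study.)

turbines, mw_each, capacity_factor = 400, 2, 0.30
ma_use_twh = 55          # assumed annual Massachusetts electricity consumption

wind_twh = turbines * mw_each * capacity_factor * 8760 / 1_000_000
print(f"{wind_twh:.1f} TWh, {wind_twh / ma_use_twh:.1%} of state use")        # ~2.1 TWh, ~3.8%
print(f"{wind_twh / (ma_use_twh * 0.5):.1%} if statewide use were halved")    # ~7.6%

us_fossil_mwh = 4000 * 0.75 * 1_000_000       # 75 percent of assumed U.S. generation
print(f"~{us_fossil_mwh / 3_800_000:.0f} MWh of wind per square mile per year")   # ~789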

So goes my notion, anyway. You could call it wind farms as signifiers, with their value transcending energy-share percentages to reach the realm of symbols and images. That is where we who love nature and obsess about the environment have lost the high ground, and where Homo americanus has been acting out his (and her) disastrous desires - opting for the "manly" SUV over the prim Prius, the macho powerboat over the meandering canoe, the stylish halogen lamp over the dorky compact fluorescent.

Throughout his illustrious career, wilderness champion David Brower called upon Americans "to determine that an untrammeled wildness shall remain here to testify that this generation had love for the next." Now that all wild things and all places are threatened by global warming, that task is more complex.

Could a windmill's ability to "derive maximum benefit out of the site-specific gift nature is providing - wind and open space," in the words of aesthetician Yuriko Saito, help Americans bridge the divide between pristine landscapes and sustainable ones? Could windmills help Americans subscribe to the "higher order of beauty" that environmental educator David Orr defines as something that "causes no ugliness somewhere else or at some later time"? Could acceptance of wind farms be our generation's way of avowing our love for the next?

I believe so. Or want to.

Charles Komanoff, an economic policy analyst and environmental activist, is the author of Power Plant Cost Escalation. He lives in New York City and advocates for energy efficiency, bicycle transportation, and urban revitalization.

© 2006 Independent Media Institute. All rights reserved.
View this story online at:
http://www.alternet.org/story/41426/
