Wednesday, December 13, 2006

That's no lady . . .

Christopher Hitchens, who in a September 12, 2005, essay labeled the Iraq disaster a “war to be proud of,” proved in a recent issue of Vanity Fair that his imbecility is not limited to foreign policy. Hitchens constructs a ridiculous stereotype of the entire human race, male and female, in an article called “Why Women Aren’t Funny.”

If you don’t have a strong stomach for sexist simple-mindedness and decline to read the piece, here’s his argument in a nutshell: Women don’t need humor to attract men, because men will chase after them regardless. Men as a group are less appealing than women and evolved humor as a way to get laid.

Some observations here: Some women would be amazed to find out that attracting men is so easy. Perhaps Hitchens could have talked to women who are shunned because they are too smart, too assertive, or because they weigh two ounces beyond the anorexic, disempowering standard of American culture. Or perhaps, in his extensive research, Hitchens could have spoken to the women who starve themselves to reach Nicole Richie disproportions and then find that men are universally revolted.

His article reminds me of the Lenny Bruce routine responding to the racist cliché about miscegenation, "Would you want one of them to marry your daughter?" In this sketch, Lenny imagines forcing the Grand Dragon of the Ku Klux Klan to say whether he would rather marry a black, black woman or a white, white woman, when the black woman is Lena Horne and the white woman is Kate Smith.

"If we want to get down to basics, let's persecute the ugly people," Lenny said.

That is the issue here, regardless of gender. Unattractiveness is pain in this culture. Humor is an excellent tool for coping with pain. It is also a devastating weapon of self-defense. If you are small and weak, what better way is there to keep from getting beaten up than to tell a joke that makes the bully laugh? If you are being tormented, what better way to change the subject than to mock someone else's shortcomings?

All women in American culture are made to feel unappealing and bullied. I have to wince when I have my students read Betty Friedan’s “The Feminine Mystique” as she analyzes the vapidity of women’s magazines in the 1950s and early 1960s, because not a point of IQ has been added to the content of such publications since. In one of my lectures, I flip through an issue of "Good Housekeeping" and point out that there’s a colorful picture of a rich, calorie-laden cake on the cover while half the articles inside are about dieting. The rest of the articles tell women they face numerous health disasters with every passing moment of age, that their relationships are emotional Baghdads and that it’s up to them to revive the “spark,” and that it's almost impossible to balance home and career although everyone will expect them to. Every girl growing up in this country, whether she looks like Angelina Jolie or Tammy Faye Bakker, is told by the media that she is fat, sick, mismanaging her time, or equally unimaginative in the bedroom and the kitchen.

Smart women respond to feelings of inadequacy the way men do. I don’t know what alternate universe Hitchens lives in, but I reside in a world of hilarious women, from my wife, to my funny and beautiful sisters-in-law, to my close friends Mary and Susan. Growing up, I recall laughing out loud at the writing of Dorothy Parker and Flannery O'Connor, the acting of Mae West, Myrna Loy (in the "Thin Man" movies) and Audrey Meadows (in "The Honeymooners"), and the 1980s standup of Carol Leifer (later a regular contributor to “Seinfeld”). I think that Tina Fey is one of the funniest people on television today, male or female. Even the decidedly attractive but tragically abused Marilyn Monroe had impeccable comic timing, though it was too often less noticed than her cleavage. Wit emerges from inner demons, not from gonads.

I'm not surprised that Hitchens, who has suffered from testosterone poisoning since 9/11, turns a discussion of humor into a sexist claim of male comedic supremacy. He's not exactly the smartest bear in the zoo, even if he writes for a self-important magazine like Vanity Fair. He should go back to writing laughably wrong-headed screeds about Iraq.


Michael Phillips has authored the following:

White Metropolis: Race, Ethnicity and Religion in Dallas, Texas, 1841-2001 (Austin:  University of Texas Press, 2006)

(with Patrick L. Cox) The House Will Come to Order: How the Texas Speaker Became a Power in State and National Politics. (Austin: University of Texas Press, 2010)

“Why Is Big Tex Still a White Cowboy? Race, Gender, and the ‘Other Texans’” in Walter Buenger and Arnoldo de León, eds., Beyond Texas Through Time: Breaking Away From Past Interpretations (College Station: Texas A&M Press, 2011)

“‘The Current is Stronger’: Images of Racial Oppression and Resistance in North Texas Black Art During the 1920s and 1930s,” in Bruce A. Glasrud and Cary D. Wintz, eds., The Harlem Renaissance in the West: The New Negroes’ Western Experience (New York: Routledge, Taylor and Francis Group, 2011)

“Dallas, 1989-2011,” in Richardson Dilworth, ed. Cities in American Political History (Washington, D.C.: CQ Press, 2011)

(With John Anthony Moretta, Keith J. Volanto, Austin Allen, Doug Cantrell and Norwood Andrews), Keith J. Volanto and Michael Phillips, eds., The American Challenge: A New History of the United States, Volume I (Wheaton, Il.: Abigail Press, 2012).

(With John Anthony Moretta and Keith J. Volanto), Keith J. Volanto and Michael Phillips, eds., The American Challenge: A New History of the United States, Volume II (Wheaton, Il.: Abigail Press, 2012).

(With John Anthony Moretta and Carl J. Luna), Imperial Presidents: The Rise of Executive Power from Roosevelt to Obama  (Wheaton, Il.: Abigail Press, 2013). 

“Texan by Color: The Racialization of the Lone Star State,” in David Cullen and Kyle Wilkison, eds., The Radical Origins of the Texas Right (College Station: University of Texas Press, 2013).

He is currently collaborating, with longtime journalist Betsy Friauf, on a history of African American culture, politics and black intellectuals in the Lone Star State called God Carved in Night: Black Intellectuals in Texas and the World They Made.

Wednesday, December 06, 2006

George Orwell and George W. Bush

No irony of George W. Bush’s misbegotten presidency ranks as strange and surprising as his impact on the English language. Few men so terminally tongue-tied have so shaped political discourse. Phrases like “compassionate conservatism,” “faith-based initiatives,” “war on terror,” “axis of evil,” “evil-doers,” “shock and awe,” “mission accomplished,” “heckuva job, Brownie,” and “food insecurity” (the administration’s new term for “hunger”) have tumbled from the lips of this chief executive and molded public debate in the past half-decade.

Previous presidencies, like Franklin Roosevelt’s or John Kennedy’s, nudged the usual pedestrian blather of Washington, D.C., towards poetry. Bush’s impact on American English is most extraordinary because of its lack of musicality, its clunky simplicity, and its utter dishonesty. There has been no more sinister example of “Bush-speak” than the Pentagon policy labeled “extraordinary rendition.”

This maddeningly opaque double talk concealed a deep betrayal of American law and moral standing. As reported in a February 2005 issue of the New Yorker, Maher Arar, a Syrian-born engineer, endured the sadistic experience of extraordinary rendition. Officials arrested Arar at John F. Kennedy Airport in September 2002, as the McGill University graduate returned from a vacation in Tunisia en route to his Canadian home. Officials had placed Arar on the U.S. watch list of suspected terrorists not because of anything he had done specifically, but because he had worked with the brother of another suspect. Airport security handed Arar over to a “Special Removal Unit,” which flew the young man to Syria, one of our purported enemies in the Iraq War and in Lebanon. Once there, Syrian secret police whipped Arar’s hands with two-inch-thick electric cables and kept him in a windowless underground cell Arar later likened to a grave. After slightly more than a year of interrogation and torture, Arar was released without charges.

The United States has not rendered prisoners to secret locations in Syria alone. Another choice destination for such detainees during the first three years of the war on terror was Egypt, where, according to one State Department document, prisoners are frequently “stripped and blindfolded; suspended from a ceiling or doorframe with feet just touching the floor; beaten with fists, whips, metal rods, or other objects; subjected to electrical shocks; and doused with cold water [and] sexually assaulted.” Terror suspects, never given a trial, have been hanged by Egyptian officials on behalf of the United States, while one told later investigators that he had suffered electric shocks to his genitals administered by Egyptian guards and had been hung upside down by his limbs and left in a cell in filthy water up to his knees. Our torture subcontractors in Egypt make the Army amateurs responsible for the human rights abuses at Abu Ghraib look like Red Cross volunteers.

To be fair, the policy of extraordinary rendition was not invented by the Bush White House, but actually had its origins in the mid-1990s as part of the Clinton administration’s failed attempt to destroy Al Qaeda. The Bush White House, however, fully embraced the tactic, with a shockingly candid Dick Cheney rationalizing such methods as a trip to what he called, evoking Star Wars’ Darth Vader, the “dark side,” necessitated by a ruthless enemy unbound by any ordinary moral constraints. It was, tragically, also a dangerous policy that gave a green light to the United States’ battlefield enemies to torture American soldiers. Extraordinary rendition also rested on the fatally flawed notion that usable information could be obtained through torture. In the mid-1300s, during an outbreak of bubonic plague, Gentiles in several German communities tortured hundreds of Jews into confessing that they had caused the Black Death through magic. Similarly, torture over the ages has coerced confessions of witchcraft, sexual liaisons with the devil and other outlandish supernatural fantasies from victims eager to tell their tormentors what they want to hear.

The story of extraordinary renditions actually represents a rare triumph for journalism in the days of Bush. It took publications like the Washington Post and the aforementioned New Yorker to reveal the policy and call it what it was: outsourcing torture. Renditions endangered American democracy not just because they represented a complete abandonment of due process, but also because the phrase itself represented an assault on truth and meaning.

Language provides a frustratingly imprecise instrument even when people communicate with the purest of intentions. Otherwise, debates such as those over the meaning of the constitutional ban on “cruel and unusual punishment” would never have been necessary. A clear and present danger arises, however, when the powerful use language not to express ideas but to cloud reality. Again, this is not a tactic invented by George Bush and his minions. Bush’s predecessor in the White House once famously quibbled over the meaning of “is.” Bush’s battles with the English language, however, have too often been true acts of war, with real body counts, such as when he claims to “defend life” in the stem cell controversy while condemning Parkinson’s patients and similarly tortured souls to an early, agonizing, and unnecessary death.

The case of extraordinary renditions aside, too often the press in the past five years has collaborated with Bush’s Orwellian obfuscations. On January 22, 2004, the CIA warned the administration that Iraq was sliding into civil war. In early February 2004, 109 Iraqi Kurds died in bombings in Irbil, most likely set off by Iraqi Sunni insurgents. A wave of suicide bombings undertaken by Iraqis against their fellow citizens took 104 lives in Basra on April 21 of that year. February 22, 2006, saw the bombing of the Golden Mosque, a highly revered Shiite shrine in Samarra, by Sunni extremists.

With the exception of little-read liberal bloggers, few voices dared to call this war by its true name. With painful slowness, media outlets like the Los Angeles Times, the Christian Science Monitor, and Time Magazine quietly started using "civil war" to describe the conflict in Iraq. The same month as the Golden Mosque bombing, even Fox News, an administration echo chamber, backhandedly acknowledged reality, though with typical mind-bending News Corp spin. During a debate on Iraq’s sectarian violence, viewers saw the on-screen caption: “All-Out Civil War in Iraq: Could It Be a Good Thing?” Meanwhile, the bloodshed spun out of control, with 1,666 bombs piercing the Iraqi sky in July alone. In spite of this intense, continuous violence between Sunnis, Kurds, and Shiites within Iraq, it took three years for the first major American broadcast news outfit, NBC, to call events in Iraq a “civil war.”

Why the endless dithering over this label? The tenth edition of Merriam-Webster’s Collegiate Dictionary gives a breathtakingly simple definition of “civil war”:


A war between opposing groups of citizens of the same country.


By this simple, blunt criterion, Iraq was in civil war less than a year after the American invasion. Yet widespread use of the term did not follow. This was not for reasons of linguistic precision. As James Poniewozik of Time Magazine admitted in a December 3 column, the mainstream media rolled over and accepted the administration’s doublespeak on the war because:



the media business was, and is, existentially scared. TV audiences and print readerships are shrinking, along with media payrolls; nightly newscasts and newspapers wonder how much longer they will exist, much less thrive. The Administration has played on that fear of irrelevance, freezing out big institutions in favor of friendly local outlets and allies. A Bush aide told reporter Ron Suskind that journalists were an ineffectual ‘reality-based community.’ Were the mainstream media dying? The ebullient Bushies seemed to answer, “They're already dead!”



What is mind-boggling is that this media cowardice remained even after Bush’s incompetence stood in its full, sorry nakedness after Katrina. Thus, the president was recently able to mask the destruction of habeas corpus rights, first enshrined in English law in the Magna Carta almost eight centuries ago, under the innocuously named “Military Commissions Act” of 2006. This law gave the president the right to declare American citizens “enemy combatants” and to hold them without trial as part of the war on terror. My hometown newspaper, the Austin American-Statesman, thought the death of an 800-year-old civil liberty so irrelevant that it crammed the story reporting the president’s signing of the Military Commissions Act onto page six.

As Poniewozik sadly confesses, the media only found the chutzpah to call a spade a spade, and a civil war a civil war, after Bush and his Republican army suffered their Waterloo in this past November’s election. This sudden discovery of courage, however, also bodes ill for those who pray that the media will serve as an independent institution that relies neither on government directives nor on temporary swings in public opinion for guidance. Today’s Washington press corps is an unworthy peer of the journalists who have died covering the war in Iraq and a sorry inheritor of the mantle once held by Elijah Lovejoy, an Illinois abolitionist who spoke truth not just to power but also to fickle, panic-stricken mobs, a stand that cost Lovejoy his life.

Lovejoy’s neighbors in the town of Alton felt the subject of slavery was too dangerous to discuss openly and threatened the publisher’s life unless he kept silent. On the night of November 7, 1837, 20 of Lovejoy’s supporters kept company with him as he guarded a brand-new press to be installed at the offices of his newspaper, The Observer. The crowd drew the attention of Lovejoy’s angry opposition, which chunked rocks at the warehouse windows. When Lovejoy’s friends responded by throwing earthenware pots at the mob, gunfire erupted. One man climbed a ladder planning to set the warehouse on fire. As Lovejoy and a friend tried to stop the arson, someone in the crowd fatally shot the publisher. The mob then rushed inside the warehouse, shattered the press and threw the parts into the river.

Today’s reporters face far less danger within the United States than Lovejoy did, but where he once bravely stood, his modern peers quaver. What James Madison once described as the “tyranny of the majority” threatens freedom no less than a deceptive, power-hungry White House does. How much comfort can we take from a press that meekly whispered truth to power because of a momentary shift in the polls? Sixty-three percent of that same public told pollsters after the 9/11 attacks that they believed they would need to give up “some personal liberties in order to feel safe,” according to the Roper Center. How long will the press corps’ tentative courage stand should there be another terror attack and the voting public again rallies almost unanimously to the banner of this president or some other future demagogue willing to manipulate fear to achieve political advantage?

Journalists can’t be fully blamed for failing to check the power of a president who sees language as a tool not to enlighten, but to conceal and frighten. But they should at least be counted on to be guardians of the language. Free speech can survive only if we agree that words have agreed-upon meanings, that war isn’t peace, that freedom isn’t slavery, that ignorance isn’t strength. How odd it is that Charlie Chaplin, the star of so many silent movies, showed the power of language dedicated to truth at the close of his 1940 classic The Great Dictator.

In the movie, Chaplin plays a Jewish barber who eerily resembles the title character, a fascist despot clearly patterned on Adolf Hitler. Through a series of mishaps, the barber has switched places with the dictator and is given a chance to speak to the conquering army during a national radio address. In the script, Chaplin addresses soldiers, but his words could as easily be aimed at journalists today. He says, in part:



Don’t give yourselves to brutes, men who despise you, enslave you, who regiment your lives, tell you what to do, what to think and what to feel: who drill you . . . treat you like cattle, use you as cannon fodder. Don’t give yourselves to these unnatural men, machine men with machine minds and machine hearts. You are not machines. You are men. You have the love of humanity in your hearts. You don’t hate: only the unloved hate, the unloved and the unnatural.

. . . Don’t fight for slavery. Fight for liberty. In the seventeenth chapter of Saint Luke, it is written that “the kingdom of God is within man” — not one man, but in all men, in you. You have the power: the power to create machines, the power to create happiness. You the people have the power to make this life free and beautiful, to make this life a wonderful adventure.

Then, in the name of democracy, let us use that power. Let us all unite. Let us fight for a new world, a decent world that will give men a chance to work, that will give youth a future and old age a security. By the promise of these things, brutes have risen to power, but they lie! They do not fulfill their promise; they never will. Dictators free themselves, but they enslave the people. Now let us fight to fulfill that promise. Let us fight to free the world, to do away with national barriers, to do away with greed, with hate, with intolerance. Let us fight for a world of reason, a world where science and progress will lead to men’s happiness . . . In the name of democracy, let us all unite.



That’s a tall order. Sadly, we bequeath you a world as corrupt and violent as the one we inherited. Against the journalists who would fight for Chaplin’s contrasting vision of justice stand elites who have on their side wealth, police power, and institutional inertia.

But you, who will be broadcast and print reporters in the near future, come to the table not altogether empty-handed. You come into this world knowing that you and your children deserve so much better. And on your side are arrayed youth, energy and time. Earlier journalists, battling slavery, the disenfranchisement of women, and the rapacious exploitation of robber baron capitalism in the late 19th century, lived in a more sinister world and faced bigger challenges, and they re-created that society.

The changes they wrought were not always revolutionary, but each reform made life more decent and livable. Your time, however, is fleeting, and the window of opportunity for you to change even a small part of this unjust world is rapidly closing.



Sunday, November 26, 2006

Bobby's Shadow on Today's Democrats

In an interview with ABC News, Emilio Estevez, the director and writer of the new biopic “Bobby,” suggested that the assassination of Sen. Robert Kennedy in June 1968 marked a key turning point in American history, a time when the nation lost its collective innocence. "I believe that the death of Bobby Kennedy was, in many ways, the death of decency in America, the death of formality and manners, and the death of poetry," he said. Elsewhere he quotes his father, Martin Sheen, as describing the Ambassador Hotel in Los Angeles, where the murder took place, as “where the music died.”

Back in 1988, on the 20th anniversary of the assassination, historian Arthur Schlesinger was more specific, and more sweeping, in his assessment of what America lost when Sirhan Sirhan killed Kennedy. Schlesinger wrote in a Newsweek column that, had Kennedy not been murdered, he would have catapulted from his victory in the California primary to the Democratic nomination. He then would have beaten Richard Nixon in the November presidential election. Winning the White House, Kennedy would have ended the Vietnam War much sooner, cutting in half the number of names now on that tragic Vietnam Memorial in Washington.

President Robert Kennedy, Schlesinger speculated, would have continued the reform tradition of the New Deal and New Frontier, might have achieved racial reconciliation between whites and blacks and, by defeating Nixon, would have prevented the national malaise ushered in by Watergate and the later failed presidencies of Gerald Ford and Jimmy Carter.

That’s a huge, messianic burden for a one-term U.S. attorney general and four-year senator from New York to bear, similar to the one that has been thrust upon the shoulders of Bobby’s similarly martyred brother, John F. Kennedy. The idolization of both Kennedys represents a prime exhibit of what historians refer to, usually with derision, as the “Great Man” theory of history — the notion that the times are shaped not by larger forces like industrialization or racism, but by bold individuals of unique vision who rise above the moment and bend the world to their will.

Undoubtedly, the Kennedys are so widely mourned, and so much idealistic fantasy is still projected upon their memories, because of their youth at the time of their deaths, and their soaring eloquence. The Kennedy legend has mutated into a second, uniquely American historical myth, a “Garden of Eden” legend that America was innocent and infused with youthful energy before the martyrdom of Jack and Bobby, and that these deaths, along with the killing of civil rights leader Martin Luther King, Jr., jolted the nation into a terrible, destructive trajectory.

Ironically, given the populist nature of Bobby Kennedy’s anti-Vietnam War presidential campaign, such lionization is dangerously anti-democratic and disempowering. Depicting the American past at any point as innocent can only leave one paralyzed in the decidedly violent, corrupt, imperfect world we live in today. Such a view is also a lie. There was nothing pure about the America Bobby Kennedy inhabited in his 42 years — not the America of segregation or of the McCarthy hearings (of which he was a supportive side player as a member of the red-baiting Wisconsin senator’s legal staff). There was nothing innocent in the America that in 1955 murdered 14-year-old Emmett Till for whistling at a white girl or that butchered three civil rights workers in 1964 for trying to make the theoretical right to vote a reality for African Americans. By the time Bobby died, the United States had dropped atom bombs on Hiroshima and Nagasaki and American soldiers had already slaughtered more than 300 unarmed men, women and children in the Vietnamese village of My Lai.

There’s reason to think that not much would have changed had Bobby Kennedy lived. If he had reached the White House and fulfilled his campaign promise to withdraw from Vietnam, Republicans and conservative Democrats would have pilloried him as the man who “lost Southeast Asia,” much as Harry Truman had been condemned as the man who supposedly lost China in 1949. There likely would have been a post-war recession, as happened under Nixon, when defense spending inevitably declined. White Americans would still have been frightened by the rise of assertive, black nationalist groups like the Black Panthers, and liberal judges would still probably have ordered school busing in places like Boston, sparking the kind of white backlash that has defined American politics for nearly four decades.

Bobby alone could not have healed the Arab-Israeli divide, which might still have sparked the Yom Kippur War in 1973 and the resulting Arab oil embargo, an event that exposed the United States as economically vulnerable. In any case, the deindustrialization of the Northeast and the growing support for free trade in both parties would have eroded the strength of the union vote so essential to the Kennedy family’s national political ambitions.

Still, give Bobby his due. Here stood an unusually intelligent man whose command of the language approached Shakespearean grace. No better speech has ever been made by an American politician, except maybe by Abraham Lincoln at Gettysburg, than the one Kennedy delivered spontaneously to an African American crowd one April 1968 night in Indiana after he learned of Rev. King’s assassination. His growth as a politician, from the harsh and intimidating bully who managed his brother’s presidential campaign in 1960 to the gentle man who caressed a Mississippi child sickened by malnutrition during a 1967 Senate investigation of poverty, only becomes more poignant and inspiring because he lived not in the Garden, but in a world that for too many was nasty, brutish and short.

His embrace of the sick, underfed child in that Mississippi shack stands out even more starkly because, by the 1990s, both parties had completely abandoned the poor. The image of that starving American faded, to be replaced by the ugly, Reagan-era cartoon of the “Welfare Queen.” The “New Democrats” of the Clinton era felt there was no percentage in reaching out to people too hungry and desperate to vote, when rich electoral rewards awaited those who promised “to end welfare as we know it.” The platform of the new Democratic House leadership — long overdue raises in the minimum wage, reduced interest rates on student loans, and such — stands anemically beside the earlier, bold visions of wars on poverty embraced by Democrats of Bobby Kennedy’s time.

Since Clinton, the left has clung to the Democrats mostly out of the fear of lost abortion rights and the threat of a Republican-led theocracy. Progressives settled for mediocrities like Clinton because giants like the Kennedy brothers don’t come along every generation. Hero myths like the Kennedy legend ultimately poison democracy because they bind ideas that should transcend time to the stiff chains of mortality. When such heroes inevitably die, left behind is a demoralized public awaiting another knight on horseback to replace the slain idol.

Bobby’s ultimately doomed quest for the presidency, however, is less a story about how a young politician could have saved the world but for a gunman’s bullet than about how ordinary, faceless and nameless Americans, many of them working class and not college educated, turned against the Vietnam War and demanded a new direction. Kennedy’s last political crusade may ultimately have been deferred for four years, but it did not die in that kitchen in the Ambassador Hotel. Kennedy lived and died in a world without easy victories and without Christ-like heroes, a world of people who wrestled with and slowly came to terms with the truth and did their best to live accordingly. To mythologize the Kennedy past in Hollywood fashion, sadly, is to deny the late senator’s true bravery.



Tuesday, October 31, 2006

Caps, Gowns and White Sheets

Earlier this year, former left-wing radical turned right-wing extremist author David Horowitz achieved minor fame with the publication of his book The Professors: The 101 Most Dangerous Academics in America. Horowitz, who receives generous corporate funding, charges that American universities are centers of left-wing, anti-American propaganda and has campaigned for what he calls an “Academic Bill of Rights,” which he hopes will be adopted by state legislatures across the country. This so-called bill of rights would allow students to file complaints with universities about professors they consider politically biased and would allow schools to discipline scholars who bring supposedly irrelevant political discussion into the classroom.

Horowitz makes wild claims in his book. He writes that left-wing radicals among faculty members used to outnumber right-wingers by a 9-1 ratio, but now these campus commies prevail by a 30-1 ratio. Actually, the most comprehensive study of political viewpoints held by scholars, one conducted by UCLA, revealed that professors holding views from mainstream liberalism to socialism outnumber conservatives by only a 3-1 ratio. This trend says nothing about the quality of these scholars or their fairness towards students with whom they disagree.

It remains unclear, beyond giving students the power to silence political opinions they don’t like in the classroom, what solution Horowitz advocates short of some sort of ideological affirmative action. Of course, what Horowitz fails to acknowledge is that bias varies by department. Bolsheviks, for instance, don’t tend to dominate business schools, where many faculty members draw handsome endowments from Fortune 500 companies. The same is the case with petroleum engineering departments and in the sciences, where many professors’ financial comfort is tied to defense spending or big oil.

Horowitz’s book was sloppy, with the author labeling as “dangerous” professors he disagreed with on the Iraq war, race relations, or affirmative action. To Horowitz, his opponents don’t simply hold bad ideas; they are bad people. One target, Caroline Higgins, made the list for teaching classes on peace and social justice at Earlham College, a Quaker school. Quakers are pacifists by theology, and teaching a class on peace at such a school is the equivalent of teaching about Jesus at Baylor.

University of Texas at Arlington political science professor Jose Angel Gutierrez got (black)listed because of his student activism in La Raza Unida back in the 1960s and 1970s and a book he wrote as a young man 32 years ago called “A Chicano Manual on How to Handle a Gringo,” a work Horowitz decried as racist.

Far more interesting and revealing than whom Horowitz listed as dangerous was whom he left out. What Horowitz deliberately ignores is a disturbing trend in academic disciplines such as psychology and sociobiology across the United States and Canada. In these fields, many academics have embraced not merely right-wing politics, but overt racism. A shockingly large number of professors in these disciplines argue that real biological and intellectual differences exist between races and that, because of these differences, African Americans and Mexican Americans are less intelligent than whites and more prone to crime, welfare dependency and out-of-wedlock childbirth.

Recent genetic research confirms that race is an illusion. It seems certain that the first human inhabitants of Europe shared dark skin with their African ancestors. Recent DNA mapping demonstrates that all present-day Homo sapiens are more than 99.9 percent genetically identical. Yet, upon this infinitesimal degree of genetic difference, some white supremacists within the academy have built claims of immense differences between whites and their black and brown neighbors in terms of intelligence and character.

This belief is not limited to a marginal group of cranks. In one 1980s survey, 50 percent of physical anthropologists and 73 percent of animal behaviorists accepted the notion that biologically distinct races exist within the human species. Meanwhile, “[p]sychologists tend to base their racial classifications on categories defined socially, rather than by physical or biological anthropology,” author Marek Kohn has noted. “Yet their findings have evidently persuaded many of them that races are both real and intellectually unequal.”

The resurgence of racism, particularly in the field of psychology, came to public attention with the publication of the surprise bestseller of 1994, The Bell Curve: Intelligence and Class Structure in American Life by Richard Herrnstein and Charles Murray. The late Herrnstein served as a tenured Harvard psychology professor while Murray worked as a richly compensated researcher and polemicist for right-wing think tanks. The Bell Curve argued, among other highly debatable positions, that a 15-point gap existed between the mean IQs of whites and blacks, that lower IQ scores meant that blacks had a higher tendency towards crime, illegitimate births and welfare dependency, that lower black IQs stemmed from genetics and not social conditions, and that social programs aimed at improving black life in America, such as affirmative action, were pointless and doomed by biological destiny to failure.

Murray and Herrnstein’s thesis rested on the dubious notion that races are real and definable with something approaching scientific precision. It furthermore depended on the problematic notion that a phenomenon as complex as intelligence can be reduced to a single number assigned by a test that measures almost exclusively verbal and mathematical skills while leaving other forms of intelligence, such as artistic creativity, muscle memory and reflex, and strategic skill, unmeasured.

Even if one accepts the questionable methodology employed in IQ tests, recent research on intelligence demonstrates that the alleged IQ gap between blacks and whites, a gap that incidentally does not exist between whites and blacks in countries like England, has rapidly closed. This clearly suggests that Murray and Herrnstein, who blamed supposedly lower black and brown IQ scores on genetics, were flat wrong, and that poverty, hunger, discrimination and a poor school environment can influence how an individual does on a standardized test.

Nevertheless, The Bell Curve overwhelmed readers with more than 100 pages of appendices, a dense array of graphs and abstruse discussion of high-level statistical methods such as regression analysis. Thus, it received a shockingly positive response from publications like the supposedly liberal New Republic and the New York Times, whose reviewers felt incompetent to judge the book’s deficient science and felt no moral impulse to condemn its politics.

The Bell Curve did bring a storm of protests, almost exclusively from the political left outside the mainstream media, prompting a December 13, 1994, letter in The Wall Street Journal defending major contentions made by the book. Written by 52 academics, including seven professors from Texas universities, the letter asserted that blacks on average scored 15 points lower than whites, with “the bell curve for American blacks roughly around 85,” only 15 points above the threshold of retardation. The letter declared heredity to be the primary cause of IQ differences between blacks and browns on the low end of the intelligence spectrum and Anglos and Asians on the high end, and asserted that “black 17-year-olds perform, on the average, more like white 13-year-olds in reading, math, and science, with Hispanics in between . . .”

Among the signatories were five psychology professors at the University of Texas at Austin: David B. Cohen, Joseph M. Horn, John C. Loehlin, Del Theissen and Lee Willerman. Willerman joined the American Eugenics Society in 1974. Eugenics is the pseudo-science of race betterment through selective breeding. He helped direct the Texas Adoption Project, which purported to demonstrate the genetic basis of intelligence and was funded in part by the eugenicist Pioneer Fund, founded by a wealthy Nazi sympathizer in the 1930s, which on its webpage describes most of its funded researchers as “race-realists [who] view race as a natural phenomenon to observe, study, and explain. They believe that human race is a valid biological concept, similar to sub-species or breeds or strains.”

Whether or not all the signers of the Wall Street Journal letter can be accurately described as racist, it would be hard to find any other label for one of the signatories, J. Philippe Rushton, a tenured psychology professor at the University of Western Ontario. Murray and Herrnstein used Rushton as a major source for The Bell Curve. Rushton claims that human races developed as a result of different evolutionary strategies. He claims, for instance, that whites have bigger brains than blacks (which he says correlates with higher intellect) and that blacks have larger genitals (information he derives in part from responses to “surveys” he conducted of unsuspecting blacks in shopping malls). Higher IQs allowed whites to survive in cold climates, Rushton has written, while larger genitals allowed less intelligent blacks to survive because of higher rates of reproduction. “It's a trade-off: more brain or more penis,” Rushton once told Rolling Stone Magazine. “You can't have everything.”

Rushton, a recurring guest writer on racial issues for the “mainstream” conservative magazine National Review, also claims that the average nation has a collective IQ of 90 (with only five nations holding a supposed national IQ near Great Britain’s score of 100). Some of Rushton’s dubious information on African intelligence comes from IQ tests administered to black South African children in the era of apartheid. From such data, Rushton claims that African nations possess an average IQ of 70. “An IQ of 70 suggests mental retardation [and] . . . is equivalent to a mental age of about 11 years,” Rushton wrote. “. . . [T]he low African IQ of 70 remains hard for many to accept. One reason for the disbelief: Africans — and African Americans — display high levels of social competence. They are outgoing, talkative, sociable, warm, and friendly . . . It is this ‘winning personality’ among Blacks, I believe, that makes it hard for so many to accept the validity of their failing tests of abstract reasoning ability.”

In attacking affirmative action, University of Texas law professor Lino Graglia drew on the work of academic racists like Richard Herrnstein. On September 10, 1997, at a press conference noting the one-year anniversary of the United States Fifth Circuit Court of Appeals ruling in the Hopwood v. Texas case, which temporarily voided UT’s affirmative action program (a decision later overturned by the 2003 Supreme Court case Grutter v. Bollinger), Graglia, a former United States Justice Department attorney under Dwight Eisenhower who began teaching at UT in 1966, declared that “Blacks and Mexican-Americans are not academically competitive with whites in selective institutions. It is the result primarily of cultural effects. They have a culture that seems not to encourage achievement. Failure is not looked upon with disgrace.”

Of course, the veteran right-wing legal scholar needed no prompting from Herrnstein and Murray to express contempt for African Americans and Mexican Americans. The long-active Republican had been considered by the Reagan administration in 1986 as a finalist for a seat on the Fifth Circuit Court of Appeals (the same body that later made the Hopwood ruling), but his appointment fizzled as news became widely known that he had urged Austin residents in his writings to defy a court-ordered busing plan and that he had referred to black students in his class as "pickaninnies."

During his September 10, 1997, press conference, Graglia poured contempt on top of ridicule, claiming that minority students who couldn’t compete in real academic classes “insist that the game be changed. Let’s study something else. Let’s have black studies instead of chemistry.” The following day, in an interview with the Austin American-Statesman, Graglia said he had urged parents in Austin to resist a busing order because “I don't know that it's good for whites to be with the lower classes ... (because) ... they perform less well in school. They tend towards greater violent behavior.”

David Horowitz, so deeply wounded by UTA professor Gutierrez’s alleged Chicano racism of decades earlier, hailed Graglia for his brave honesty. In Texas, Graglia’s words sparked a firestorm of controversy, with student groups and members of the Texas Legislature calling for his dismissal or resignation and with Jesse Jackson leading a protest on campus that culminated in a sit-in at the university’s law school. While some professors condemned his words, others worried about the chilling effect on academic freedom that would result from any censure. As such, many of Graglia’s peers at UT refused to take the professor at his word, insisted, all evidence to the contrary, that he was not a racist, and rallied behind him, albeit in a lukewarm fashion. Horowitz tried to elevate Graglia to martyrdom, calling him a victim of an academic lynching, but Graglia kept his job, and the vigorous defense of the privileges of tenure mounted by his liberal colleagues makes ridiculous Horowitz’s portrayal of American universities as drowning in political correctness. Graglia certainly didn’t feel stifled. In a 1999 debate on affirmative action held at UT, Graglia proved unrepentant. “If (African Americans and Hispanics) were competitive, there wouldn't be preferences,” Graglia said to a crowd of 300.


Not all academic racists are crude Negrophobes like Graglia. Some are raving anti-Semites. Another psychology professor, Kevin MacDonald, who teaches at California State University-Long Beach, earned a master’s degree in evolutionary biology and holds a doctorate in bio-behavioral sciences. In three books, MacDonald has argued that Jews have dominated societies in which they represent a tiny minority because they use a “group evolutionary strategy” that involves “backing democracy, equality, socialism and the like in order to weaken the dominant ethnic group . . . ,” as the anti-racist Southern Poverty Law Center has noted.

Sadly, if Rushton and MacDonald represent extreme examples, their differences with many of their colleagues are of degree, not of kind. David Horowitz is right about one issue: ideas matter, and what is said at universities often shapes American society, for better or worse. An earlier generation of academic racists, for instance, successfully lobbied the United States Congress to pass the harsh Immigration Restriction Act of 1924, which essentially closed immigration to the Italians, Poles, Greeks, Russians and Jews deemed by the science of that era to be intellectually and culturally inferior to Northern European “Nordics.” The law tragically shut off one avenue of escape for Jews fleeing the Holocaust in the 1930s and 1940s.

Similarly, the words of modern-day academic racists echoed in the 1990s effort to “end welfare as we know it,” with works such as The Bell Curve brandished by conservatives as hard evidence supporting their Social Darwinist agenda. Such crude biological determinism can also be heard in hysterical, bigoted denunciations by present-day nativists such as Pat Buchanan, the immigrant-phobic CNN news anchor Lou Dobbs, and members of the Minuteman movement, who attack Mexican immigrants as tending towards crime and having intellects and cultures that will dangerously “dumb down” the country.

Such talk of innate difference, legitimized by academics, no doubt lies behind the move to build an ugly Berlin Wall across the Mexican border, a plan that reveals a nation that has returned to a 1924 view of the world as peopled by inherently distinct tribes unable to communicate with and adapt to each other, but programmed by genes to perpetual cultural warfare.

A danger lurks at American universities, but it does not stem from the academic left. It comes from right-wingers who believe that ideas matter less than DNA and that chromosomes make the citizen. The Jerry Springer show has deceived us into thinking that racists all wear white sheets and live in trailer parks. Some don caps and gowns and teach at the highest reaches of academia.



Thursday, October 26, 2006

My October 24 speech before the Dallas Historical Society and Dallas Heritage Village

When most people think of race relations, they associate the term with black and white battles over civil rights, the Anglo relationship to Hispanic people, or the contemporary, frequent conflicts between African Americans and Mexican Americans. Categories like “white,” “black” and “brown” are taken as a given and the general public uses these terms without reflection on the vagueness of the categories or the general haziness surrounding the concept of race itself.

Most Americans would probably be surprised to find out that groups lumped together today by the United States Census Bureau under the term “Caucasian,” such as the Irish, Jews, and Italians, were considered non-whites by English-descended elites in the late 19th and the early 20th century. Because of this misunderstanding, many miss the historical drama of race relations in cities like Dallas, where many racial struggles took place between powerful Anglos and marginalized “probationary whites” denied a place in the city’s power structure.

The late Harvard paleontologist Stephen Jay Gould and most of the scientific community today have argued that race has no real scientific meaning. There is more genetic variation — deviations in skin pigment, hair texture, inherited disorders, etc. — within the arbitrary racial boxes used to divide humanity than between the categories themselves.

Since miscegenation has proved as certain in human history as death, war and taxes, and since the purity of each group is a fiction, the definitions of these supposedly distinct categories change each time a child is born. As sociologist Howard Winant points out, "[I]n the United States, hybridity is universal: most blacks have ‘white blood,’ and many millions of whites have ‘black blood.’ . . . colonial rule, enslavement, and migration have dubious merits, but they are all effective ‘race mixers.’"

Nevertheless, the idea of race arose in Western society around the time of Columbus and served to rationalize the slavery of Africans and the extermination of Native Americans from the 1500s to the 1800s. Creating a race hierarchy in which one gained membership in an elite caste, and avoided enslavement, simply based on skin color admirably served the self-interests of the wealthy and powerful. Millions of whites in the antebellum South lived lives of desperate poverty and little or no political influence, yet they could claim, at least, a superior status to the black-skinned “property,” who often toiled endlessly beside them.

With the abolition of slavery in 1865, however, racial politics became infinitely more complex. The post-Civil War 13th, 14th, and 15th amendments to the U.S. Constitution abolished slavery and granted citizenship rights, including the right to vote, to African American men. The caste privilege enjoyed by otherwise powerless poor whites had disappeared, and with it their assumed vested interest in the power structure erected in cities like Dallas.

The former Confederate postmaster general and Dallas state senator John H. Reagan saw danger in poor whites, who now might perceive that they had lost status as a result of the new civil liberties granted freedmen. Just after the Civil War, Reagan proposed to the Texas political leadership that the state grant voting rights to a limited number of literate freedmen in order to appease Northerners and keep them from pursuing a more aggressive program of black civil rights. At the same time, he wanted to eliminate voting rights for what he feared might be a more restless and radicalized population of “white trash.” Calling for voters to be required to pass literacy or intelligence tests, Reagan wrote to Texas Governor J.W. Throckmorton that any intelligence test that "would only affect the negroes, and would allow whites of a less degree of intelligence . . . to vote, would do no good towards securing the great ends we desire to attain."

That great end was the political extinction of poor men and women viewed as off-white. What haunted the imaginations of Dallas-area elites in the late 19th century was the prospect that poor whites might discover that, skin color aside, they had more in common with African Americans than they did with wealthy white elites. This prospect arose as a frightening reality in the late 19th century with the rise of the Populist Movement in Texas and the rest of the South and Midwest.

The seeds of Dallas working-class radicalism had been sown in the agricultural misery of the late nineteenth century. Rural refugees suffering from a farm depression raging since the 1870s made up much of Dallas County’s new population. Hammered by a tight money supply, falling cotton prices aggravated by overproduction, and excessive interest charged by creditors, farmers in the Dallas area found they worked harder only to sink deeper into debt. Desperate, many of these farmers, "poverty-stricken men, women and little children," fled to Dallas and other cities in search of a better life but instead found "petty money wages, almost nothing an hour, and no limit to the hours," as Dallas socialist George Clifton Edwards said. These rural refugees found hope in the reform agenda of the Populist movement and the Socialist Party, which demanded higher wages, an eight-hour work day, an end to child labor and the right of workers to organize unions.

Led largely by middle-class, well-educated men and women, the Populist and socialist movements tapped into agrarian and working class culture in building their movements, "holding ‘Encampments’ which were more like Methodist camp meetings" where socialist ideology was discussed, Edwards recalled. Carl Brannin, another Dallas radical, said radical political meetings at the turn of the century gave “people a vision and a desire for the kingdom of heaven on earth, where justice between mankind will prevail and where unemployment, crime, disease and unrighteous acts will be unknown . . ."
The boom-and-bust economy from the 1870s to the 1930s deepened Dallas County's class divide. Radicalized by hardship, farmers formed the Texas People’s Party, better known as the Populists, in the 1890s. The movement drew on a bold political program adopted in 1886 in Cleburne, Texas, 50 miles southwest of Dallas. The so-called "Cleburne Demands" called for a vast expansion of the nation’s money supply and for the government to provide direct credit to farmers as a way to cut out greedy middlemen.

Although the Populists kept their movement segregated, the fact that white members collaborated with parallel black Populist organizations led Texas Democrats to charge the People’s Party with undermining white supremacy. By the end of the 1890s, because of violence, intimidation, and the charge that it advocated "race mixing," the Populist movement collapsed. Nevertheless, a Dallas "Labor-Populist alliance" elected union painter Patrick H. Golden as the county’s state representative in 1892, butcher Max Hahn to the Dallas City Council from 1898 to 1900, and union musician John W. Parks to the Texas state legislature from 1912 to 1918.

Under pseudo-Populists like Texas Governor Jim Hogg, the state Democratic Party absorbed the People’s Party by the end of the 1890s, and Populism virtually disappeared as a political alternative for farmers and workers. "Texans had responded to the Populist agitation rather warmly but the Populists were not very far from the Democrats in principle . . . and the movement faded more quickly than it arose," Edwards said.

The sight of black and white political cooperation during the Populist era raised for traditional elites the terrifying prospect of class revolt. The power structure in Dallas and elsewhere in the South moved to erect a new system separating the races to serve the same function that had once been filled by slavery: to make poor whites feel invested in a power structure that still left them economically and politically powerless. It was in the late 19th and early 20th centuries that Jim Crow segregation arose in Dallas. Elites amended the city charter in 1907 to provide for segregation in schools, churches and public amusement venues. In 1916, by a referendum vote of 7,613 to 4,693, Dallas became the first city in Texas to allow racial housing segregation by law.

The law created three categories of neighborhoods – white, black and open. Neighborhoods already exclusively occupied by one race would be closed to the other. Open blocks, made up of poor and working class families, were already integrated and would remain so. The Texas Supreme Court invalidated the ordinance in 1917, but in 1921 the Dallas City Council passed a new law by which residents of a neighborhood could request that their block be designated as white, black or open. Once a designation was made, only a written request by three-fourths of the residents in that block could change the neighborhood's racial assignment.

Frightened by marginal and potentially revolutionary poor whites, Dallas' prosperous classes, meanwhile, retreated into protected enclaves on Ross Avenue and in The Cedars, an affluent neighborhood south of downtown "enclosed by a natural thicket of cedar trees that blocked out much of the noise and confusion of the city." Landscape architect Wilbur David Cook developed Highland Park in 1907 as a refuge from an increasingly diverse city. Completely surrounded by Dallas, Highland Park incorporated as a separate town in 1913 and bitterly resisted attempts at annexation by its urban neighbor. Highland Park became the residence of "the executives of big businesses, utility companies and bankers" who founded the mini-city as a congenial tax dodge. Residents, protected "from the depredations of the minorities," avoided higher city taxes while Dallas provided them with water at much lower cost even as rates climbed for city residents. The city limits of in-burbs like Highland Park and University Park, with their own school systems and police departments, became moats, and the residents eagerly raised the drawbridges to keep away frightening African Americans, Mexican Americans and white radicals.

As law professor Ian F. Haney López argues, segregation made concrete the racial and, by extension, the class differences asserted by elite ideology. In other words, through their control of urban planning, elites limited city services in black, brown and poor white neighborhoods. Such neighborhoods became crowded because segregation law limited housing options for people of color and because low wages limited the housing available to impoverished whites. The near-monopoly of the wealthy on political power also guaranteed that unhealthy developments like dumps and liquor stores would always be located within impoverished, disenfranchised communities. The resulting crowding, poor maintenance and filth provided proof in the elite mind of the inferiority of the poor, the colored, and the marginally white masses. This was true across the United States, and Dallas certainly was no exception.

The process of class segregation accelerated in the 1920s, with the white working class concentrating in the southern end of East Dallas while a middle class community formed near Baylor Hospital. By 1925, 60 percent of elites lived in Highland Park or North Dallas and 25 percent along tony Swiss Avenue in East Dallas. Only 14 percent still lived in South Dallas, with the remaining one percent holding out in the strongly blue-collar Oak Cliff community.

Almost simultaneously, elites began to mythologize themselves. It was in this era that Dallas leadership assembled the components of what would come to be known in the late twentieth century as the “Origin Myth.” The first histories of Dallas, produced in the late 19th century, focused on the supposedly exclusive role played by white businessmen in creating Dallas. The city’s first "historian," newspaper editor, author, state legislator, mayor of Galveston and then of Dallas, John Henry Brown, began this dominant Whiggish tradition in Dallas historiography. Starting with Brown's History of Dallas County, Texas: From 1837 to 1887, such "boosters" portrayed the city’s history as beginning with the arrival of John Neely Bryan, the first Anglo settler, in 1841. This narrative thus erases some 14,000 years of Native American history in the Dallas area. Dallas emerged, Brown said, when the first Texans conquered not only Indians, but Mexicans, "a nation of mixed blooded people, who had been held, for three hundred years, in abject subjection to a foreign, absolute monarchy . . ."
Brown already fully articulated a central feature in Dallas mythology, that whites represented progress and that people of color represented savagery.

In the twentieth century, local historians continued to use race as one explanation for the success of white colonizers in Dallas. "The Anglo-Saxon element, which penetrated the North East section of Texas, was of the same strain as King Alfred the Great, who wrote the charter of English liberty, and of the same blood that coursed through the veins of Oliver Cromwell," wrote one Dallas chronicler, Mattie Jacoby Allen. Allen, like Brown, saw liberty as grounded in whiteness. Allen described independence as innate in the conquering Anglo-Saxons. "These pioneers, who blazed the way into a savage and unfriendly country . . . relied on . . . the physical strength which they possessed, and their own individualism."

Brown’s themes were reiterated 22 years later by the second major local historian, prominent attorney Philip Lindsley. In his 1909 work A History of Greater Dallas and Vicinity, Lindsley described the Anglo conquest of Texas in 1836 as "the reassertion of the inherent superiority of the Anglo-Saxon over the Latin races." The hatred of "oppression and misrule" and "the universal longing for freedom" all derived from Anglo-Saxon racial traits, Lindsley argued. The city’s earliest chroniclers, like Brown and Lindsley, crafted most elements of the city’s Origin Myth. A fertile land lay wasted in the hands of colored peoples. A group of brave and determined white businessmen took the crude but favorable elements and fashioned from them a meritocracy on the plains.

Dallas’ success and growth rested on the accomplishments of white men, according to the city’s first historians. Its future, some elites believed, could only be imperiled by the political empowerment of poor and working class whites. This viewpoint found voice most clearly, and harshly, in the writings and speeches of Dallas attorney Lewis Meriwether Dabney, a Virginia transplant and son of a University of Texas at Austin philosophy professor. Dabney opened his Dallas law practice in 1888, soon befriending some of the most powerful men in the city.

Dabney urged other Dallas leaders to restrict immigration and to deny the right to vote to all but the most "qualified" white men. Dabney's comments came in the context of mass German and Jewish immigration into Dallas in the late nineteenth century, the arrival of Mexican Americans in large numbers between 1910 and 1930, and the development of a growing Sicilian community in the first three decades of the twentieth century. He blamed the growth of those communities on Anglo-Saxons who had created such a comfortable civilization that now even the racial dregs of the world thrived. "As society has advanced from the primitive to the semi-civilized . . . its functioning has been biologically adverse to the best strains and favorable to the worst," Dabney said in an address to Dallas' influential Critic Club in December 1922.

Dabney feared the rise not just of the "African Hottentot" in multi-cultural districts like Deep Ellum, but also white racial deterioration. American cities had filled with inferior whites, such as "mongrelized Asiatics, Greeks, Levantines, Southern Italians, and sweepings of the Balkans, of Poland and of Russia," Dabney complained. Dabney told one friend that he did not, for the most part, regret the South losing the Civil War except that "as the negroes put it, 'the bottom rail got on top.'" The real tragedy of the Confederacy's defeat more than half a century earlier was "the emerging of these 'half-strainers' from the bottom to the top. These the war liberated much more than it did the Africans. This is the day of the poor white in the South . . .”

Like African Americans and Mexican Americans, the white working class was seen as carrying racially impure blood and was thus deemed incapable of civilization. These poor whites, like their black and Mexican peers, were seen as a disease on the body politic that had to be quarantined and, if possible, eliminated. Elites went further than physically segregating marginal whites. They sought to increase the breeding of better whites and to keep inferior whites from reproducing. In 1914, a "Better Baby Contest" proved one of the most popular events at the Texas State Fair in Dallas. A committee of doctors measured the skulls and other traits of the 500 entrants, with $15 given to the parents of the "best" child of any class, and $5 for the best twins and best triplets. The children in such contests, as historians have noted, were judged much as prize "cattle, chickens and pigs" received blue ribbons elsewhere on the fairgrounds.

Winners of the “Better Baby” contests were inevitably white, flaxen-haired and the scions of elite families. Run by elites, these contests confirmed in the minds of the powerful their own superiority. Elites, however, hoped to convince the masses as well of the superiority of the rich. Hoping to evangelize the Dallas crowds to the gospel of better breeding, A. Caswell Ellis of the University of Texas flattered the crowd, declaring "Texas babies are better babies than the babies of any other state," before he "lightly touched on eugenics."

The chief lesson of eugenics was that the dysgenic white threatened the republic as much as the enfranchised black. This fear reverberated throughout America in the first three decades of the twentieth century. Like John H. Reagan before him, leading American eugenicist Madison Grant feared the voting power of poor whites as much as he did extending the franchise to other races. The advance of democracy, Grant argued, led to "the transfer of power from the higher to the lower races, from the intellectual to the plebian class . . .," with the universal franchise resulting in the political triumph of the mediocre. Around the same time, an Alabama doctor warned of the threat posed by so-called white trash and proposed at a state medical convention a final solution to prevent a racial apocalypse. Genetically inferior poor whites, he said, "ought not to be allowed to get married, and men who persist in [degenerate behavior] ought to be confined in reformatory institutions, or have their testicles removed, so that it would be impossible for them to propagate." Dallas School Superintendent Justin Kimball also worried about the influence of a lower-class electorate who might be unfit for full citizenship. "Ignorant or corruptible citizens can always be counted on to vote, although they usually vote wrong," he wrote.

Dallas school textbooks in the early twentieth century echoed Kimball’s suspicion of lower-class whites and the superintendent’s discomfort with democracy. The wisdom of the Constitution, according to the authors of the 1935 Record of America, lay in its protection of elite rule against the shortsighted demands of the unpropertied. Leaving political power in the exclusive hands of the rich was the American way, Dallas schools taught the city’s children. The Founding Fathers, the Dallas textbook said, "had little faith in the ability of people as a whole to maintain self-control and wisdom in government. They had no confidence in the man without property . . . a man who had failed to [accumulate property] . . . would be regarded as shiftless, lazy, or incompetent, and not deserving a voice in the government of others."

Lewis Dabney, meanwhile, echoed Madison Grant's sentiments about the dangers of mass politics. Dabney regretted the rising power of the poor white and prophesied the collapse of American civilization if those inferiors attained too much influence. Democracy, he warned, "by its very nature rejects the best and seeks the worst and is stumbling down into the mire."

Later in the twentieth century, Dallas elites acted on their terror of marginal whites. The 1920s and 1930s would see the rise in this city of the Open Shop Association, which crushed union organizing, and the Dallas Citizens Council, an elite cabal so autocratic that a candidate favored by the group could win a city council seat without making a single speech or issuing a single statement on the issues. Democracy withered in the city with scarcely a whimper.

The white supremacist ideas embraced by the Dallas city leadership, and shared by wealthy elites across the Western world, would reach their tragic, violent, but logical conclusion in the gas ovens of Auschwitz and Buchenwald. The horrors of the concentration camps forced a tactical retreat on the part of organized elite racists in Europe and the United States, but by then white elite racial ideology had claimed millions of victims and made a travesty of democracy in Europe, the United States, and here in Dallas.

After federal courts mandated black and brown representation in city government beginning in the 1970s, and open, often messy debate finally began in city hall and the county courthouse, one could immediately hear the sighing nostalgia of elites longing for the good old days when our Anglo superiors made our decisions for us. Dictatorship would be preferable, the sentiment seemed to be, to the squalid shouting matches at City Hall, the school board and the county commissioners court.

Racism, of course, never disappeared and old ideas of eugenics lurk behind modern discussions of the supposed IQ gap between whites and blacks and the debate over whether we should build a Berlin Wall across the Mexican border to keep out dark-skinned people deemed incapable of becoming productive American citizens.

As the elite voices from the late 19th and early 20th century I have quoted perhaps unintentionally teach us, racism is a deadly toxin for a democracy and relentlessly seeks new victims. A society that seeks to redefine black and brown people as less than human has never stopped there and must constantly seek new objects of fear as elites scramble to justify their ever-greater acquisition of wealth and power. Elite rule depends on the politics of divide and conquer. Genuine democracy, by contrast, can only thrive in a color-blind society. The battle to create that society is not just an issue of fairness. It is a matter of survival.


Michael Phillips has authored the following:

White Metropolis: Race, Ethnicity and Religion in Dallas, Texas, 1841-2001 (Austin: University of Texas Press, 2006)

(with Patrick L. Cox) The House Will Come to Order: How the Texas Speaker Became a Power in State and National Politics (Austin: University of Texas Press, 2010)

“Why Is Big Tex Still a White Cowboy? Race, Gender, and the ‘Other Texans’” in Walter Buenger and Arnoldo de León, eds., Beyond Texas Through Time: Breaking Away From Past Interpretations (College Station: Texas A&M Press, 2011)

“‘The Current is Stronger’: Images of Racial Oppression and Resistance in North Texas Black Art During the 1920s and 1930s,” in Bruce A. Glasrud and Cary D. Wintz, eds., The Harlem Renaissance in the West: The New Negroes’ Western Experience (New York: Routledge, Taylor and Francis Group, 2011)

“Dallas, 1989-2011,” in Richardson Dilworth, ed. Cities in American Political History (Washington, D.C.: CQ Press, 2011)

(with John Anthony Moretta, Keith J. Volanto, Austin Allen, Doug Cantrell and Norwood Andrews), Keith J. Volanto and Michael Phillips, eds., The American Challenge: A New History of the United States, Volume I (Wheaton, Il.: Abigail Press, 2012)

(with John Anthony Moretta and Keith J. Volanto), Keith J. Volanto and Michael Phillips, eds., The American Challenge: A New History of the United States, Volume II (Wheaton, Il.: Abigail Press, 2012)

(with John Anthony Moretta and Carl J. Luna), Imperial Presidents: The Rise of Executive Power from Roosevelt to Obama (Wheaton, Il.: Abigail Press, 2013)

“Texan by Color: The Racialization of the Lone Star State,” in David Cullen and Kyle Wilkison, eds., The Radical Origins of the Texas Right (College Station: Texas A&M University Press, 2013)

He is currently collaborating, with longtime journalist Betsy Friauf, on a history of African American culture, politics and black intellectuals in the Lone Star State called God Carved in Night: Black Intellectuals in Texas and the World They Made.