Wednesday, April 30, 2014

Sometimes doctrines just vanish, once they appear as naked as the proverbial emperor in his new clothes.

Something like that seems now to be happening with affirmative action. Despite all the justifications for its continuance, polling shows the public still strongly disagrees with the idea of using racial criteria for admissions and hiring.

Its dwindling supporters typically include those who directly benefit from it, or who are not adversely affected by it. Arguments for its continuance tend to be half-hearted, which may explain why some supporters descend into name-calling directed at those who dare question its premises.

The Supreme Court, by a 6-2 majority, recently upheld the decision by Michigan voters that their state would neither favor nor discriminate against applicants to the state’s public universities on the basis of race.


Recently, a group of liberal Asian-American state lawmakers in California — a state that is more than 60 percent non-white — successfully blocked a proposed return to racial considerations in college admissions.

Asian-American students are now disproportionately represented in the flagship University of California system, at nearly three times their share of the state’s general population. If race were reintroduced as a consideration for admission, Asian-Americans would see their numbers radically reduced in favor of other ethnic-minority students, regardless of their impressive, ethnically blind grades and test scores.

Expect more such pushback.

In the 1950s, when the country was largely biracial — about 88 percent so-called white and 10 percent black — and when the civil rights movement sought to erase historical institutionalized bias in the South against blacks, affirmative action seemed to be well-intentioned and helpful.

More than a half-century later, though, and in a vastly different multiracial America, affirmative action has been re-engineered as something perpetual and haphazardly applicable to a variety of ethnicities.

Class divisions are mostly ignored in admissions and hiring criteria, but in today’s diverse society they often pose greater obstacles than race. The children of “1 percenters” such as Beyonce and Jay-Z will have doors opened to them that are not open to those in Pennsylvania who, according to President Obama, “cling to guns or religion.”

Race itself also is increasingly a problematic concept in 21st-century America. The more we talk about Hispanics, blacks, Asians and others as if they were easily distinguishable groups, the less Americans fit into such neat rubrics. In an age of intermarriage, assimilation and global immigration, almost every American family has been redefined by members who are one-half this or one-quarter that.

Yet if verifiable hyphenation is to be the touchstone of career or academic identity, how do we certify minority status in an increasingly intermarried and multiracial society where there soon will be, as in California, no majority ethnic group? Are we to wear DNA badges to certify the exact percentages of our racial pedigrees — to prevent another Elizabeth Warren or Ward Churchill from gaming the system?

Affirmative action once was defended as redress for the odious sins of slavery and Jim Crow segregation. But almost 150 years after the end of slavery, and a half-century after the passage of civil rights legislation, it is hard to calibrate the interplay between race, relative past oppression and the need for compensatory action.

In a zero-sum, multiracial society, how do we best weigh past suffering? How do we compare the Jewish-American whose grandparents were wiped out in the Holocaust with the grandchildren of Japanese-Americans who were interned during World War II?

If compensation is not historically based, what then are the criteria that calibrate ongoing victimization? Would a European-Argentinian immigrant with a Hispanic name better qualify for affirmative action than a Bosnian Muslim refugee?
