The Great Land Grab and the New Colonialism by Hugh Hamilton

November 18, 2009

You can’t eat money — which partly explains why rich countries and private investors have been on a spending spree in much of the developing world, buying up every square foot of farmland they can lay their hands on from Cambodia to the Congo. It’s the latest international land grab – the new colonialism — under which nearly 50 million acres of farmland have either changed hands in the past three years, or are currently under negotiation. Billions of dollars are at stake.

For relatively rich food-importing countries like Saudi Arabia, Kuwait, South Korea and China, it’s all about food security. By buying up farmland in less affluent nations like Paraguay, Pakistan, Madagascar, Ethiopia and Sudan, they are able to guarantee future food supplies for their own people at affordable cost by outsourcing the production to countries where land and labor are comparatively cheap. Proponents say it’s a win-win situation, in which struggling economies get a financial shot-in-the-arm from the much-needed investment, and the investors can rest easy knowing where the next meal is coming from.

But land is a finite resource, and critics warn that much of the land being traded in these transactions is being taken away from small-scale farmers who have no other way of supporting themselves or feeding their families. Indeed, a new report from the San Francisco-based Oakland Institute argues that the win-win scenario deliberately sidelines the issue of food security for the poor, and the plight of the independent farmers who are being displaced from their land or turned into plantation workers. According to the report, an estimated 1.02 billion people – one sixth of humanity – suffer from chronic hunger; and in one of the world’s cruelest ironies, 70 percent of that starving population live and work on small-scale farms in rural areas. Those are the very people who stand to lose the most in this new form of “food colonialism.”

But that’s only part of the story.

Food Security vs. Financial Speculation

The other part of the story was first revealed just over a year ago in a groundbreaking report by the international non-governmental organization GRAIN. Titled, Seized: The 2008 Land Grab for Food and Financial Security, the report identified “two parallel agendas driving two kinds of land grabbers.”  On the one hand are those rich countries that rely on food imports and worry about future security amid the threat of soaring prices and crippling shortages on the international food market. But on the other hand are those international investors and speculators looking to maximize their financial gain at any cost. If there are any bad guys in this business, these guys would be the leading contenders. According to GRAIN:

“For a lot of people in power, the [2008] global food crisis laid bare an overarching problem: that no matter where you look, climate change, soil destruction, the loss of water supplies and the plateauing of monocultured crop yields are bearing down as big threats on our planet’s future food supplies. This translates into forecasts of tight markets, high prices and pressure to get more from the land. At the same time, the finance industry, which has gambled so much on squeezing money from debt and lost, is looking for safe havens. All these factors make agricultural land a smart new toy to make profits with.”

In short, the food crisis, coupled with the broader financial crisis, has turned control over land into an important new magnet for private investors.

For these investors, the formula goes like this: food has to be produced, prices will likely remain high, and cheap land is currently available in poor countries, therefore an investment in farmland is bound to pay off. It was against this backdrop that GRAIN first sounded the alarm on how “an army of investment houses, private equity funds, hedge funds and the like have been snapping up farmlands throughout the world,” aided and abetted in the process by supposedly reputable international agencies like the World Bank, its International Finance Corporation and the European Bank for Reconstruction and Development. All these agencies are now in the business of “greasing the way for this investment flow” by pressuring governments in poor countries to change land ownership laws in favor of foreign investors.

Some of the biggest names in private investment – from Deutsche Bank and Goldman Sachs to Morgan Stanley and BlackRock Inc. – were among last year’s leading land grabbers. The countries they targeted for farmland acquisitions included Malawi, Senegal, Nigeria, Brazil, Paraguay, Ukraine, Uzbekistan, Russia, Georgia and Australia.

But it is the role of the international financial agencies and institutions in this process that many consider to be particularly odious. When I discussed this issue last spring with Anuradha Mittal, executive director of the Oakland Institute, she described it as nothing short of “colonization all over again.”

“Instead of talking about land reform and constructive measures like land redistribution that would enable poor people to engage in a useful profession by growing food for their families and their communities, what we find these agencies promoting is the sale of land as an investment opportunity,” she explained. “What makes this trend especially dangerous is that poorer countries which are facing food insecurity, and where the poor are going hungry, are now seeing their land taken away for meeting the needs of the rich in the rich countries. And the entire process is accompanied by an enormous amount of arm-twisting in the name of aid. We have to go after that trend and highlight it, and drag this Dracula into the sunshine.”

That’s exactly what her Oakland Institute has attempted to do in its latest report released this fall. Titled, The Great Land Grab: Rush for World’s Farmland Threatens Food Security for the Poor, the report warns that throughout history, wherever corporate agribusiness has established itself in developing countries, the effect has been either to drive independent farmers off their land, or to reduce them to servitude through plantation labor. It documents multiple case studies from Latin America and the Caribbean to the Philippines, in which such transactions have fuelled social unrest, deepened socio-economic inequities, spawned political upheaval and generated even greater food insecurity.

“The history of foreign direct investment in agriculture belies the claim that the current land acquisitions will positively impact the development of poor nations,” says Shepard Daniel, lead author of the new report and a fellow at the Oakland Institute. “No matter how convincing the claim that these massive international acquisitions will bring much needed agricultural investment to poor countries, evidence shows there is simply no place for the small farmer in the vast majority of these land-grab situations.”

That’s precisely the message that small farmers and their supporters took to Rome this week, where they convened a People’s Food Sovereignty Forum to coincide with the 2009 World Food Summit sponsored by the United Nations Food and Agriculture Organization. The summit was attended by dozens of world leaders. Let’s hope the world leaders were listening.



Chris Hedges, the End of Literacy, and the Triumph of Spectacle by Hugh Hamilton

August 30, 2009

Chris Hedges speaks with almost evangelical fervor in describing the ambition behind his new book, “Empire of Illusion: The End of Literacy and the Triumph of Spectacle.”

“It’s a kind of wake-up call,” he tells me. “If we don’t grasp the disparity between what we are and what we think we are, we are going to commit an act of collective self-annihilation.”

And who exactly do we think we are? For that, go directly to his chapter on “The Illusion of America,” wherein Hedges warns that Americans “embrace the dangerous delusion that we are on a providential mission to save the rest of the world from itself, to impose our virtues – which we see as superior to all other virtues – on others, and that we have the right to do this by force.” It is a misguided belief, he writes, that has corrupted both Republicans and Democrats, and one that signals our inability to distinguish between illusion and reality. It is this tragic flaw that has characterized the dying gasp of every previous empire – from the Aztecs to the Austro-Hungarians. And now it may be our turn.

In “Empire of Illusion,” Hedges introduces us to an America he portrays as floundering in a state of “moral and physical decay.” It is an America trapped in an epidemic of functional illiteracy: nearly a third of the nation’s population is illiterate or barely literate – a figure that is growing by more than 2 million a year; a third of high school graduates never read another book for the rest of their lives, and neither do 42 percent of college graduates.

“We are a culture that has been denied, or passively given up, the linguistic and intellectual tools to cope with complexity,” he writes. “Propaganda has become a substitute for ideas and ideology. And in this precipitous decline of values and literacy … fertile ground for a new totalitarianism is being seeded.”

There are moments in this book that blur the already uncertain boundaries between prognostication, prophecy and apocalypticism. But Hedges is uncompromising in the certitude of his pronouncements, as when he warns that “[a]t no period in American history has our democracy been in such peril or the possibility of totalitarianism as real. Our way of life is over. Our profligate consumption is finished. Our children will never have the standard of living we had. This is the bleak future. This is reality.”

Perpetual Infantilism

In a conversation lasting just over an hour last Tuesday, I asked Hedges to discuss some of the conditions that have enabled this triumph of illusion over reality.

“The best example would be this perverted ethic that we can have everything that we want if we just dig deep enough within ourselves,” he told me. “It is the idea that if we [believe] we are truly exceptional, reality will never be an impediment to what we desire. That basic message is peddled by the consumer culture, by corporatism, by the Christian Right, by the entertainment industry. And as that chasm widens between who we think we are and what we actually are, it becomes more and more dangerous.”

The pervasiveness of illusion, he explained, has the effect of keeping us in a state of “perpetual infantilism or childishness,” so that when eventually we are forced to confront reality – whether because of a home foreclosure or expired unemployment benefits – we react as children. That’s when we begin “looking for demagogues and saviors, for revenge, for moral renewal. And we can already see signs of that in these proto-fascist movements leaping up around the fringes of American society.”

At 232 pages, Empire of Illusion is not a hefty volume. But the linguistic and intellectual tools for which Hedges is widely renowned are very much in evidence throughout this compact and rigorously crafted work. In chapters separately documenting the pervasiveness of illusion in areas ranging from Literacy, Love and Wisdom, to Happiness and ultimately, America itself, he decries the “bankruptcy of our economic and political systems,” and warns of a looming economic and political Armageddon unless we develop the courage and wherewithal to confront and reverse the stranglehold of the corporate state.

Hedges’ chapter on the Illusion of Love, for example, is an X-rated exposé on the excesses of the multi-billion-dollar pornographic industry. He travels to Las Vegas for the annual Adult Video News Expo and delivers a blow-by-blow commentary on what passes for life behind and beyond the cameras. Las Vegas, he writes, is the “corrupt, willfully degenerate heart of America,” a city built on illusions and one that lends itself to the commodification of human beings as depicted in the horrifying degradations of the porn industry. Porn, he writes, is about reducing women to corpses; it is about necrophilia.

“When I would interview the women who are on the porn set, they talked about swallowing handfuls of painkillers, being completely black-and-blue by the time they were finished, about vaginal and anal tears that require surgery,” he told me. “Having suffered from post-traumatic stress disorder myself, it became instantly clear when I interviewed these women that I was dealing with people who were victims of trauma. This commodification of women is just one more example of the commodification of everything within our culture. It is about the loss of the sacred — about the loss of a belief that human beings, like the rest of the natural world, have an intrinsic value beyond a monetary value.”

But Hedges seems to harbor a particular resentment for the Illusion of Wisdom.

“The multiple failures that beset our country, from our mismanaged economy to our shredding of Constitutional rights to our lack of universal health care to our imperial debacles in the Middle East, can be laid at the door of institutions that produce and sustain our educated elite,” he writes. He rails against the retreat by elites into “specialized ghettos,” the assault on the humanities, and the transformation of the nation’s universities into “glorified vocational schools for the corporations.”

Hedges — himself the holder of a Master of Divinity from Harvard University – brings to this analysis the perspective of an insider who has both studied and taught at some of the country’s most prestigious academic institutions.

“I have taught at some of these schools – including Princeton, Columbia and NYU,” he told me. “They churn out systems managers. They reward a very peculiar kind of intelligence – essentially an analytical intelligence. The danger of this is that people who have emotional, intuitive or creative intelligence are excluded. So what you end up with are essentially drones – people who have the capacity to do a prodigious amount of work but lack the ability or the moral autonomy to question assumptions and structures.”

Real intelligence, he argues, is by its very nature subversive. It is about challenging cultural, political, societal and economic assumptions. But that ability is not taught at universities, and those few professors who do try to teach those values are attacked as being “liberal.”

But Hedges believes that “liberal” is itself a code word. “What they’re really attacking,” he says, is “moral autonomy, the capacity to challenge and think about structures in a new way. And by silencing that ability, by refusing to teach – even among those within elite universities – those broader questions of meaning, purpose, dislocation, inequity and power, they end up producing wave after wave of ‘systems managers’ who lack the capacity to do anything but serve a dying system.”

Roosevelt, Milosevic or Hitler

The road ahead is grim, Hedges warns, and we have few tools left to dig our way out of the looming crisis. He hedged his bets when asked what form the crisis will take – maybe environmental, maybe economic, maybe a confluence of both. But he is concerned that even more important than the crisis itself could be the kind of reaction it engenders if the people are unprepared.

“When societies are unprepared for what is happening around them, then they react as children react. And I think we are already seeing signs of that reaction that are very frightening,” he told me.

“You can in a time of crisis end up with a Franklin Delano Roosevelt, or you can end up with a Slobodan Milosevic or an Adolph Hitler, and I think those are the choices that we face. We have very virulent and powerful forces of hate – many of whom we dismiss naively as buffoons. But in a moment of crisis, to enraged, bewildered and confused people, these demagogues will talk a language of violence that may be all that dispossessed and confused people will be able to understand.”

Ultimately, Hedges has concluded that the only way to avert the threat of collective self-annihilation is to retreat from the rampant consumerism, militarism, and the cult of individualism that have come to define our way of life. It will require that we manage our expectations differently, but it’s the only prescription he has to offer.

“Not consuming at the levels we are consuming and not killing at the levels that we kill may diminish our power and diminish the material goods around us, but it won’t necessarily diminish the capacity for a life of meaning,” he told me. “Learning a new humility, learning to live with less, learning to rebuild a community that has been destroyed by the consumer and commodity culture which perpetuates this cult of the self, can actually create a lifestyle that will be enhanced even though it will be materially deprived.”

A simpler, more modest future, perhaps.

But a future, nonetheless.


“There … but for the Grace of God…” by Hugh Hamilton

July 21, 2009

It was the moment of empathy we’ve all been waiting for: addressing delegates to the centennial anniversary convention of the NAACP this week, President Barack Obama came clean with a confession that was instantly recognizable to almost every conscious Black male of a certain age in America.

“When I drive through Harlem and I drive through the South Side of Chicago and I see young men on the corners,” Obama remarked, “I say, ‘there – but for the Grace of God – go I.’”

It was a noteworthy admission from a president who in the past has attracted considerable criticism for his willingness (some even say propensity), to chastise Black men in public, without a commensurate acknowledgment of the historical legacies that circumscribe their prospects in contemporary America.

Even as a candidate in last year’s election, Obama ignited widespread controversy over a Father’s Day speech in which he took absentee Black fathers to task while ignoring the totality of circumstances that often render many of those fathers unable to fulfill their paternal obligations. As author and political analyst Earl Ofari Hutchinson observed at the time:

“[T]his kind of over the top, sweeping talk about alleged black father irresponsibility from Obama isn’t new…Whether Obama is trying to shore up his family-values credentials with conservatives, or feels the need to vent personal anger from the pain and longing from being raised without a father is anybody’s guess. Or maybe he criticizes black men out of a genuine concern about the much media-touted black family breakup. But Obama clearly is fixated on the ever media-popular notion of the absentee black father. And that fixation for whatever reason is fed by a mix of truth, half truths and outright distortion.”

Indeed, with the notable exception of his famous Philadelphia speech last year, and his often exasperating tendency to pontificate on the ostensible failures of Black fatherhood, Obama hitherto has displayed a marked reluctance to engage publicly on the complexities of race. So much so that when offered the chance by ABC News reporter Ann Compton last March to comment on the role that race has played so far in his presidency, he fumbled the opportunity and squandered what should have been a “teachable moment.” (In my earlier post on this issue, I outlined in some detail an alternative response that the president might have offered to Compton’s question, beginning with the incontestable proposition that “…this recession — painful as it is for all of us — has exacted a heavier toll on some than on others. And for those who historically have been marginalized and discriminated against on the basis of race, that toll has been heaviest of all.”)

This time, though, Obama got it right. Maybe it was the historic significance of the occasion: America’s first Black president addressing the 100th anniversary convention of  the nation’s oldest civil rights organization. Whatever the reason, he delivered an oration that not only challenged Black Americans to seize control of their destiny; he also took appropriate notice of the context in which that struggle would be waged and the barriers that remain. Consider the following under-reported excerpts from the president’s NAACP speech:

  • “We know that even as our economic crisis batters Americans of all races, African Americans are out of work more than just about anyone else;
  • “We know that even as spiraling health-care costs crush families of all races, African Americans are more likely to suffer from a host of diseases but less likely to own health insurance than just about anyone else;
  • “We know that even as we imprison more people of all races than any nation in the world, an African-American child is roughly five times as likely as a white child to see the inside of a jail;
  • “We also know that prejudice and discrimination are not even the steepest barriers to opportunity today. The most difficult barriers include structural inequalities that our nation’s legacy of discrimination has left behind; inequalities still plaguing too many communities and too often the object of national neglect.”

The corporate media have been largely silent on this aspect of Obama’s remarks, choosing instead to highlight the bit about your destiny being in your hands. That may be true enough – at least for some — and most African Americans already know that. What they need from a leader who purports to understand their concerns and represent their interests is a little bit of empathy. And at long last, that’s what Obama finally delivered last Thursday.


Will the Voting Rights Act Become a Victim of Its Own Success? by Hugh Hamilton

April 27, 2009

When the Supreme Court hears arguments this Wednesday (April 29) in a case challenging one of the key components of the federal Voting Rights Act, it will be doing so amid a political landscape that is far less openly hostile to Blacks than when the law was first enacted 44 years ago. But the court should not infer from this that the landmark legislation is no longer needed. On the contrary, many of the political gains that our so-called “racial, ethnic and language minorities” now enjoy are directly attributable to the vigorous application of the law. The success of the law therefore argues in favor of its retention, rather than any retreat from its enforcement.

At issue is the “pre-clearance” provision of the Act, which requires that before making any changes to current electoral procedures, states and other jurisdictions with a history of practices that restrict minority voting rights must first obtain permission from the Justice Department or federal court. In adopting the requirement – commonly known as Section 5 – Congress sought to remedy a history of discriminatory electoral practices targeting Blacks in direct violation of the 15th Amendment. (Such practices included an assortment of restrictive voting requirements, literacy tests, poll taxes, intimidation and often, outright violence). As President Lyndon Johnson remarked in 1965 upon signing the bill into law:

“This act flows from a clear and simple wrong. Its only purpose is to right that wrong. Millions of Americans are denied the right to vote because of their color. This law will ensure them the right to vote. The wrong is one which no American, in his heart, can justify. The right is one which no American, true to our principles, can deny.”

The pre-clearance requirement applies to nine states – Alabama, Alaska, Arizona, Georgia, Louisiana, Mississippi, South Carolina, Texas and Virginia – and to various counties and towns in seven others. In New York, three of the five boroughs are covered by the provision. Since its initial enactment, the law has been extended four times – most recently in 2006 for a 25-year period.

The legal architect of this latest challenge to the law is Gregory S. Coleman, a former Texas solicitor general who brought the suit on behalf of the Northwest Austin Municipal Utility District No. 1 – a tiny utilities district in Austin that is covered by the law. As The Washington Post reports, Coleman is a politically active lawyer who once clerked for Justice Clarence Thomas and testified before Congress in opposition to extending the Voting Rights Act in 2006. He recruited the utilities district for a test case after Congress enacted its latest extension.

Not surprisingly, the election last fall of Barack Obama to the presidency features prominently in Coleman’s brief to the court:

“The America that has elected Barack Obama as its first African American president is far different than when Section 5 was first enacted in 1965,” he wrote. “There is no warrant for continuing to presume that jurisdictions first identified four decades ago as needing extraordinary federal oversight through Section 5 remain uniformly incapable or unwilling to fulfill their obligations to faithfully protect the voting rights of all citizens in those parts of the country.”

I put that argument last January to Laughlin McDonald, director of the Voting Rights Project of the American Civil Liberties Union, and he was having none of it. Here’s part of his response from the Wednesday, January 14, 2009 edition of Talkback:

“The Supreme Court and an appellate court can indeed take judicial notice of recent elections, but there are a lot of cases out there which say that the mere election of a minority to office is not enough to defeat a Section 5 claim. The courts have said that the relevant standard under the Voting Rights Act is whether a system usually diminishes the ability of minorities to elect candidates of their choice. So I don’t think the mere fact that Barack Obama was elected would be dispositive. In addition, you only have to look at the actual results to see that six of the states that are covered in whole or in part by Section 5 did not go for Obama – they went for McCain. In those jurisdictions we do not see the abnegation of history, racial attitudes and the persistent pattern of racial block voting; it still exists.”

According to McDonald’s analysis, the white vote for Obama in several of the covered states actually declined in 2008 compared to that for the white Democratic candidate John Kerry four years earlier. For example, Kerry got 19 percent of the white vote in Alabama in 2004, while Obama got just 10 percent in 2008. In Louisiana, Kerry got 24 percent of the white vote in 2004, while Obama got only 14 percent in 2008. In Mississippi, Kerry got 14 percent of the white vote, and Obama 11 percent.

And regarding the idea that this latest reauthorization of the Voting Rights Act was based on decades-old evidence, McDonald was equally dismissive, noting that Congress extended Section 5 after holding more than 20 public hearings and examining a record that exceeded 15,000 pages. Legislators noted the hundreds of Section 5 objections that had occurred since the last extension in 1982, as well as the hundreds of lawsuits that successfully challenged racially discriminatory voting practices during the same period. The 2006 extension of the Act was passed by an overwhelming majority of the House (390-33) and by a unanimous vote of the Senate. (The ACLU, which is among several civil rights organizations that have intervened in the case, discusses the legislative record at considerable length in its brief to the court).

While it is true that African American voters — as well as other racial, ethnic and language minorities – are no longer hamstrung by some of the more overt restrictions and exclusionary electoral machinations of the past, that does not mean that the lessons of the past are no longer instructive in our present circumstances. In his book, Stealing Democracy: The New Politics of Voter Suppression, George Washington University law professor Spencer Overton likens our vast and complex electoral system to a “matrix” that requires continuous attention “so that it more fairly empowers all voters rather than simply privileging the insiders who know how to manipulate it.” As he explains:

“Contrary to conventional perception, American democracy is not an organic, grassroots phenomenon that mirrors society’s preferences. In reality, the will of the people is channeled by a predetermined matrix of thousands of election regulations and practices that most people accept as natural: the location of election-district boundaries, voter-registration deadlines, and the number of voting machines at a busy polling place. This structure of election rules, practices and decisions filters out certain citizens from voting and organizes the electorate. There is no ‘right’ to vote outside of the terms, conditions, hurdles and boundaries set by the matrix.”

That’s why we need a robust Voting Rights Act as much now as we ever did: to safeguard the rights and secure the interests of those whom history has shown to be the likeliest targets for marginalization by the matrix.


For Obama, a Missed Opportunity for a Teachable Moment by Hugh Hamilton

March 29, 2009

Since taking office, Barack Obama has restored the art of rhetorical fluency to the presidential podium. His command of detail in the finer points of public policy has been refreshing, to say the least. He may not always get it right, but it’s clear that this is a president who does his homework. So it was all the more surprising that when called upon this week by ABC News reporter Ann Compton to comment on the role of race in his presidency thus far, President Obama fumbled the opportunity for what could have been a “teachable moment.”

Perhaps he was caught flat-footed by the question – after all, it is not often that “race” makes it to the top-ten list of topics selected for public discussion in polite society. But coming so soon after his own attorney general chided the American public for their reluctance to confront the issue, it was especially disappointing that the president did not seize the moment for a more expansive discourse on the intersection of race and public policy. We know that he is capable of rising to the occasion – as evidenced by his landmark speech on race delivered in Philadelphia one year ago this month. Maybe he thought that in light of the prevailing economic recession, this was not the right time to revisit the issue. But I would argue that this is precisely the moment to do so, as it offers a unique and invaluable opportunity to examine how our tormented legacy of race maintains a stranglehold on the lives of millions of contemporary Americans.

Ignoring the problem will not make it go away. So in the spirit of public service, I am offering herewith a primer on what the president might have said had he chosen to address Compton’s question with the expansiveness it deserved. First, here’s Compton’s actual question:

Compton: “Sir … Could I ask you about race?”

Obama: “You may.”

Compton: “Yours is a rather historic presidency and I’m just wondering whether in any of the policy debates that you’ve had within the White House, the issue of race has come up, or whether it has in the way you feel you’ve been perceived by other leaders or by the American people. Or have the last 64 days been a relatively color-blind time?”

Now here’s the president’s actual response:

Obama: “I think that the last 64 days has been dominated by me trying to figure out how we’re going to fix the economy, and that’s [as it] affects black, brown and white. And you know, obviously, at Inauguration I think there was justifiable pride on the part of the country that we had taken a step to move us beyond some of the searing legacies of racial discrimination in this country. But that lasted about a day. And you know, right now the American people are judging me exactly the way I should be judged and that is: are we taking the steps to improve liquidity in the financial markets, create jobs, get businesses to reopen, keep America safe? And that’s what I’ve been spending time thinking about.”

Not bad.

But here’s some of what I think the president also might have said, thereby elevating the public consciousness on this critical question:

Obama: “Let me be clear: as president of the United States, I am fully committed to securing and advancing the best interests of every single American, regardless of race, color or creed. But as president I must also acknowledge the facts as they are and confront them accordingly. And the fact is that this recession — painful as it is for all of us — has exacted a heavier toll on some than on others. And for those who historically have been marginalized and discriminated against on the basis of race, that toll has been heaviest of all.

It’s not just me saying this; as MarketWatch reported just recently, the recession began before it began for some workers. With about 3.6 million jobs lost since the recession officially began just over a year ago, the reeling job market is not hitting all demographic groups equally. Among Blacks, the seasonally adjusted unemployment rate reached 12.6 percent in January, compared with the national rate of 7.6 percent. In fact, Black unemployment has been well above our current national rate since 2001.

And Blacks are not alone in this regard; among Hispanics, their unemployment rate of nearly 10 percent is also well above the national level.

In times of severe economic hardship such as we are all experiencing today, some are able to cushion the blow by tapping into whatever reserves they might have accumulated in times of relative prosperity. But here again, Blacks are uniquely disadvantaged. The most recent research on wealth disparity in this country reveals that on average, people of color possess less than ten cents for every dollar of white wealth; only 14 percent of people of color have retirement accounts, compared to 43 percent of whites; and nearly 30 percent of Blacks have zero or negative net worth, compared to 15 percent of whites.

Indeed, the African American community has been trapped in a recession since the turn of the century. Last September, the Economic Policy Institute sounded the alarm in a report titled, Reversal of Fortune, wherein the authors noted that since 2000, Blacks have been steadily losing what modest economic gains they acquired during the preceding decade. On all major economic indicators – income, wages, employment and poverty – African Americans were worse off in 2007 than they were seven years earlier.

The sub-prime mortgage crisis has made things even worse. For much of the past century African Americans were excluded as a matter of public policy from the opportunity to build wealth through government-subsidized home ownership. Low-interest mortgages guaranteed by the Veterans Administration and the Federal Housing Administration enabled an entire generation of Americans to become middle-class homeowners – except if you were Black. As explained in the award-winning book, The Color of Wealth: “Overall, less than 1 percent of all mortgages went to African Americans between 1930 and 1960. Bankers received the FHA Underwriting Manual which included a ban on lending in integrated neighborhoods. Millions of African Americans who moved north after the war encountered opposition from white developers, lenders, realtors, local officials and white mobs determined to keep them out of white areas. Restrictive covenants were attached to deeds to require white owners to sell only to white buyers.”

Of course things have changed since then – both as a matter of law and public policy. But not before an entire generation of hard-working, patriotic, tax-paying African Americans was shut out from the opportunity to benefit from one of our country’s most significant periods of prosperity and growth. By every economic measure, those who on the basis of race were denied the chance to buy a home with government help, and to give their children a leg up in life by passing on that equity to the next generation, are lagging behind their counterparts to this day.

Many of these very families were doubly victimized by the predatory lending practices that were a prominent feature of the recent sub-prime mortgage crisis. According to available data, people of color were more than three times as likely as whites to be steered into sub-prime loans. Now that the bubble has burst, one recent report estimates that the sub-prime crisis will constitute the greatest loss of wealth for people of color in modern American history.

As president, I carry with me the burden of that history and its enduring consequences. But I also carry with me the hope that we can construct in this country a framework for reparative justice that will enable us to begin the process of healing our legacy of past discrimination. I do not yet know what form that framework will take. But I do know that it cannot be color-blind any more than it can be gender-neutral. And I look forward to convening a national conversation with the American people on how best we can undertake this challenge in the period ahead.”


Decades of Disparity, Soaring Costs, Take Toll In War On Drugs… by Hugh Hamilton

March 15, 2009

If Blacks and whites in America use illegal drugs at roughly comparable rates, then why are Blacks arrested at rates several times higher than whites for the same drug-related offenses?

I put the question recently to Jamie Fellner, senior counsel at the U.S. Program of Human Rights Watch. Fellner is also author of a new report titled, Decades of Disparity: Drug Arrests and Race in the United States. Here’s what she said:

“From its very beginning, the U.S. war on drugs has been colored by race; race is the lens through which drug problems are defined and through which the responses are chosen. I do not believe that having mandatory sentences would have been implemented or maintained to this day if it were whites being sent to prison at the rates that Blacks are being sent to prison. There would have been huge pressure for change. But since it’s Blacks who are disproportionately bearing the burden of law enforcement and mandatory minimum sentences, there’s been less political pressure.”

Using data obtained from the FBI, Fellner’s research revealed that for nearly three decades (1980 to 2007 – the last year for which complete data were available), adult African Americans were arrested on drug charges at rates five times as high as those for whites. An analysis of state-by-state data showed that the disparity was even greater in some jurisdictions: in Minnesota, for example, Blacks were arrested at rates 11 times higher than whites for drug offenses in 2006. As Fellner explained, “Jim Crow may be dead, but the drug war has never been color-blind. Although whites and Blacks use and sell drugs at comparable rates, the heavy hand of the law is more likely to fall on Black shoulders.”

Noting that these racial disparities reflect a history of complex political, criminal-justice and socio-economic dynamics, Fellner argued that the disproportionate burden imposed on Black families and neighborhoods has exacted a social, economic and political toll on them that is “as incalculable as it is unjust.” As long as urban communities of color remain the central focus of drug enforcement efforts, while suburban whites in gated communities get a free pass, these unwarranted disparities will persist.

But Fellner also cautioned that while reducing the disparities is imperative, it should not be accomplished simply by increasing the rate of white drug arrests. Instead, she calls for a “rethinking of the drug-war paradigm,” with more emphasis on prevention and substance abuse treatment, and less on drug enforcement. She favors the use of community-based sanctions for drug offenses and the elimination of mandatory minimum sentences for them.

Alternatives to Incarceration

Legal scholars, substance-abuse experts and social-justice advocates have long inveighed against mandatory minimum sentences for first-time and low-level drug offenders. Their arguments may be as varied as the disciplines they represent, but they come to rest ultimately at the same conclusion: that mandatory sentences are ineffective and inappropriate as a matter of public policy. Some, like clinical psychologist Dr. Bruce Levine, even point to what they see as a double standard in the way we treat consumers of prescription versus street drugs. According to Levine:

“When we recognize that psychotropic prescription drugs are chemically similar to illegal psychotropic drugs, and that all these substances are used for similar purposes, we see two injustices. First, we see the classification of millions of Americans as criminals for using certain drugs, while millions of others, using essentially similar drugs for similar purposes, are seen as patients. Second, we see a denial of those societal realities that compel increasing numbers of Americans to use psychotropic drugs.”

The Drug Policy Alliance Network, which advocates for policy alternatives based on science, health and human-rights standards, is equally blunt in maintaining that “mandatory minimums have worsened racial and gender disparities, contributed greatly to prison overcrowding, and are both costly and unjust.”

Just how costly was underscored in a new study released this month by the Pew Center on the States, which reported that corrections last year was the fastest-expanding major segment of state budgets; over the past two decades, its growth as a share of state expenditures has been second only to Medicaid. According to the study, state corrections costs now exceed $50 billion annually and consume one of every 15 discretionary dollars.

Over the past three decades, states have been putting so many people behind bars that last year, one of every 100 adults in America was in prison or jail. But with far less notice, as the study reports, the number of people on probation or parole has also skyrocketed — to more than 5 million. As the authors note, “this means that 1 in every 45 adults in the United States is now under criminal supervision in the community, and combined with those in prison or jail, a stunning 1 in every 31 adults is under some form of correctional control. The rates are drastically elevated for men (1 in 18) and Blacks (1 in 11).”

Here again, Blacks bear a disproportionate burden: the Pew study (One in 31: The Long Reach of American Corrections) found that Black adults were four times as likely as whites to be under correctional control; one in every 11 Black adults – 9.2 percent – was under correctional supervision at the end of 2007.

But the sheer cost of maintaining the world’s largest prison population and its ancillary correctional enterprises is beginning to take a toll on the states. Indeed, there is some evidence that the current economic downturn may yet prove to be the proverbial “ill wind” that blows some good for those who have long campaigned in favor of a more humane approach to dealing with low-level drug offenders – many of whom would be better served by treatment rather than incarceration. Nationwide, drug offenders account for 20 percent of all prison inmates (not counting those in jails).

Several states are already struggling with the oppressive costs of mass incarceration. In California, which maintains the country’s largest corrections system, federal judges ruled last month that conditions in the state’s 33 adult prisons had become so overcrowded that they violate the constitutional rights of inmates, subjecting them to cruel and unusual punishment that is causing at least one death a month. To ensure minimal health and safety standards are met, the judges ruled that as many as one-third of the state’s inmates may have to be set free on early release or parole by 2012. There is just not enough money to build more prisons.

Moreover, as Guy Adams reported for the U.K. Independent, “the prison crisis is not limited to California. In Des Moines, Iowa, county officials plan to start charging prisoners for toilet paper. Michigan will release 4,000 prisoners who have served their minimum sentences. New Jersey and Vermont are putting drug-addicted offenders into treatment rather than prison. Louisiana, which has one of the highest incarceration rates in the developed world, is hoping to reform a system that spends more on prisons than on higher education.”

In a pithy editorial conclusion to its report, the Independent noted wryly that “these measures are controversial in a nation that views prison as a place for retribution rather than rehabilitation.”

The current economic crisis is forcing the United States to revisit that mindset. Perhaps some small good will yet result from the ill wind of this recession, after all.


Why Commerce Matters…. by Hugh Hamilton

February 15, 2009

When did the Department of Commerce become such a hotbed of political controversy? The withdrawal last week of Judd Gregg — a little-known Republican senator from New Hampshire nominated by President Barack Obama to head the Commerce Department — has focused considerable and unaccustomed public interest on the agency. (The president’s initial choice for the post – New Mexico Governor Bill Richardson – also withdrew last month amid some controversy). This is the type of excitement ordinarily reserved for portfolios like Treasury, State and Defense. But Commerce? Since when did Commerce matter this much?

Since it became responsible for administering the Census – that’s when.

It all goes back to the turn of the last century, when Congress authorized the establishment of a permanent census office. In 1903, the office was transferred from the Interior Department to the Department of Commerce and Labor. Ten years later, when Commerce and Labor were restructured as separate entities, the Census Bureau was retained at Commerce, where it has remained in a state of relative obscurity ever since. Yet, of all the myriad functions carried out by the Department of Commerce, none is more politically volatile or more fundamental to our system of governance than the census. For starters, it is among the relatively few government functions explicitly mandated in the U.S. Constitution.

Article I, Section 2 of the Constitution requires that a decennial census be conducted of everyone living in the United States. The numbers are used to determine reapportionment in the House of Representatives and redistricting for legislative seats at the state and local levels. In addition to the actual enumeration of persons conducted every 10 years, the census is also used to construct a demographic profile of the country, including such characteristics as race and ethnicity, age, education, employment, income and home ownership, to name just a few. The government then uses that information to determine the allocation of billions of dollars in federal funds to local communities; USA Today estimated those funds last year at some $300 billion.

With that much money and power at stake, the census has evolved to become the largest peacetime mobilization of personnel and resources in the United States, involving as many as 860,000 temporary workers the last time around and a projected cost of some $14 billion for 2010.

And it’s all administered by the Department of Commerce.

Original Undercount

Of course, the census has always courted controversy; the process of enumeration was contentious from the start. Among the early conundrums confronting the framers of the Constitution was this: if political power would be based on legislative representation, and legislative representation was to be determined by the process of enumeration, how should the population of enslaved Africans be measured in this equation? The framers agreed to settle their differences by adopting the infamous “three-fifths clause,” whereby enslaved Blacks would be counted for this purpose as three-fifths of a person; Native Americans not subject to taxation were excluded altogether.

Consider this the original “undercount” — a problem that has dogged virtually every census since 1790. For while the three-fifths clause was repealed with ratification of the 14th Amendment in 1868, the census has always struggled for any semblance of accuracy. In the census of 1870, for example, in which all inhabitants were counted as whole persons, Asian Americans were assigned their own racial classification for the first time. But they were collectively categorized as Chinese. The introduction of statistical sampling techniques and computerized tabulations in the 20th century was also accompanied by new challenges and controversies over the accuracy of the count.

By 2000, the cost of the undercount was being measured in billions of dollars worth of federal aid to mainly poor and working-class urban communities of color. PricewaterhouseCoopers, in a report commissioned by the Presidential Members of the U.S. Census Monitoring Board, estimated in 2001 that the bulk of those losses would be felt in 58 of the nation’s largest counties, including Los Angeles, Brooklyn and the Bronx. Gilbert Casellas, Presidential Co-chair of the Monitoring Board, described the findings as “the most compelling evidence of the potential harm caused by the 2000 census undercount [which] will cost state and local governments billions of dollars in funds that are earmarked for the programs that largely serve our nation’s most disadvantaged.”

In the nearly ten years since then, additional challenges have emerged. The population today is larger and more diverse than at any time in our nation’s history. There are fears that in the aftermath of September 11, 2001, many Arab and Muslim Americans may be reluctant to participate because they have grown distrustful of the federal government. So, too, have many immigrant communities who feel they have been unfairly targeted and stigmatized.

These anxieties are not altogether misplaced. Indeed, some Republican lawmakers, like Candice Miller of Michigan, are reviving old arguments that seek to exclude non-citizens altogether from the census, on the ground that their presence alone caused nine seats in the House to change hands between the states in 2000. Supporters of this restrictive or exclusionary view argue that California gained six seats it would not have had otherwise, while Texas, New York, and Florida each gained one seat. Meanwhile, Indiana, Michigan, Mississippi, Oklahoma, Pennsylvania, and Wisconsin each lost a seat and Montana, Kentucky, and Utah each failed to receive a seat they would otherwise have gained.

Needless to say, if this nativist view were to prevail, millions of lawful permanent residents and other non-citizens who work hard and pay taxes would be left with no political representation at all, as their numbers would not be counted in the reapportionment or redistricting process. That sounds like a classic case of “taxation without representation” to me.

An Independent Census?

That the census has been a political football from its inception is clear. But might it function more efficiently if it were elevated to an independent agency unfettered by the constraints of electoral politics? It’s an attractive proposition.

Last September, New York Congresswoman Carolyn Maloney introduced legislation that would remove the Census Bureau from the Commerce Department and establish it as an independent agency — much like NASA. She plans to reintroduce the bill before the current Congress, with the idea of having the newly independent entity take effect in 2012 – after the next census. Her effort has been endorsed by all of the living former Census Directors, who collectively served seven presidents from Richard Nixon to George W. Bush. As Maloney explained:

“After three decades of controversy surrounding the decennial census, the time has come to recognize the Census Bureau as one of our country’s premier scientific agencies and it should be accorded the status of peers such as NASA, the National Institutes of Health and the National Science Foundation.

Nearly every economic statistic reported in the news and relied upon by Americans is derived from data collected day in and day out by career professionals at the Census Bureau. Yet the average American would be hard pressed to find this vital agency even on the Commerce Department’s own organizational chart on the government’s website, where it is buried in the basement of 32 boxes on the chart.”

She’s right. I tried. See for yourself.
