Putting Flesh and Bones on the Dream
A blog about race, politics, and people
"The name of the game is to put some economic flesh and bones on Dr. King's dream."
—Arthur Fletcher, 1969.
Free Speech Is Not an Academic Value
In the wake of student protesters' shouting down (and even physically assaulting) those with disagreeable academic opinions, this essay in the Chronicle of Higher Education reminds us of some of the core values of academia.
"Free Speech Is Not an Academic Value," by Stanley Fish
Like everyone else these days, I’ve been reading and thinking about what’s happening on campuses when invited speakers are shouted down by student protesters. And my mind keeps drifting back to a statement of principles issued last March by a faculty committee at the University of Minnesota. Principle No. 1 reads: "A public university must be absolutely committed to protecting free speech, both for constitutional and academic reasons." This statement is at best insufficiently nuanced and at worst false.
The constitutional status of free speech at public universities has been worked out in a series of court decisions. The jurisprudence is a bit complicated, but it boils down to a key distinction between speech on a matter of public concern and speech that is personal or internal to the operations of the unit (e.g., a district attorney’s office or an academic department). If the speech at issue falls under the first category, it is constitutionally protected; if it falls under the second, it can be regulated in the same way any employer can regulate speech that disrupts the core business of the workplace.
Justice Thurgood Marshall described the adjudicative task. We must, he said, "arrive at a balance between the interests of the teacher as a citizen in commenting on matters of public concern, and the interest of the State, as an employer, in promoting the efficiency of the public services it performs" (Pickering v. Board of Education, 1968). So, in what might seem to be a paradox, the public university is "absolutely committed to protecting free speech" only when the speech produced is nonacademic. When it is academic speech that is being produced the interest of the employer is paramount and speech is permitted only when it serves that interest.
But isn’t that interest centered on speech because, as the Minnesota faculty put it in their draft recommendations, the university’s "larger normative commitment [is] to the free exchange of ideas"? No, it isn’t. The university’s normative commitment is to freedom of inquiry, which is quite a different thing. The phrase "free exchange of ideas" suggests something like a Hyde Park corner or a town-hall meeting where people take turns offering their opinions on pressing social matters. The right to speak is held by all; no requirements (of rank, intelligence, professional standing, etc.) limit the number of those who have access to the microphone. (Limits of course may attach to time, manner, and place.)
The course of free inquiry in universities is not like that at all. Before one can speak, in a classroom or in the research seminar or in a journal publication, one will have been subjected to any number of vetting procedures — votes, auditions, presentations — designed largely to determine those who will not be allowed to speak. Whether it is a department, a college, a dean, a provost, a learned-journal editor, it is the business of the university to silence voices, not to license them indifferently. To put it another way, the free exchange of ideas between persons who want in on the conversation is a democratic ideal; but the university is not a democracy; it is (or is supposed to be) a meritocracy, one in which those who get to put their ideas forward are far outnumbered by those who don’t. The process is more Darwinian than democratic.
This leads me to a conclusion implicit in the previous paragraphs: Freedom of speech is not an academic value. Accuracy of speech is an academic value; completeness of speech is an academic value; relevance of speech is an academic value. Each of these values is directly related to the goal of academic inquiry: getting a matter of fact right. The operative commonplace is "following the evidence wherever it leads." You can’t do that if your sources are suspect or nonexistent; you can’t do that if you only consider evidence favorable to your biases; you can’t do that if your evidence is far afield and hasn’t been persuasively connected to the instant matter of fact.
Nor can you follow the evidence wherever it leads if you are guided by a desire that it reach a conclusion friendly to your political views. If free speech is not an academic value because it is not the value guiding inquiry, free political speech is positively antithetical to inquiry: It skews inquiry in advance; you get where you wanted to get from the get-go. It is political speech if, when the material under consideration raises political/ethical questions, you believe it is your task to answer them, to take them seriously rather than academically. Any number of topics taken up in a classroom will contain moral and political issues, issues like discrimination, inequality, institutional racism. Those issues should be studied, analyzed, and historicized, but they shouldn’t be debated with a view to fashioning and prosecuting a remedial agenda. The academic interrogation of an issue leads to an understanding of its complexity; it does not (or should not) lead to joining a party or marching down Main Street.
That is what I mean by saying that the issue shouldn’t be taken seriously; taking it seriously would require following its paths and byways to the point where one embarks upon a course of action; taking it academically requires that one stop short of action and remain in the realm of deliberation so long as the academic context is in session; action, if it comes, comes later or after class.
So neither free speech — speech uttered by anyone who has something to say — nor political speech — speech intended to nudge students in one direction or the other — is a legitimate part of the academic scene. But both are part of the extracurricular scene: the rallies, workshops, panel discussions, and lectures about which we hear so much today. In those contexts partisan views are front and center, and they are aired by anyone and everyone in the room or the quad or the auditorium. And these views are being taken seriously. Speakers are not merely reflecting on the alternatives; they are strongly urging the alternatives, sometimes in apocalyptic terms: Unless we divest from fossil-fuel stocks, the environment will be destroyed; unless we speak out against Israel, a new Nazi-ism will triumph; unless we stand up against microaggressions, racism will run rampant. Passions run high, the stakes are felt to be enormous, the fate of the republic hangs in the balance.
It’s all so exciting, so exhilarating, so serious. But it is not a seriousness to which the university is a party. My contention that moral/political seriousness has no place in the university holds even in those areas in which moral/political seriousness is being performed to a fare-thee-well; for while that conversation (often very heated) is occurring within university precincts, the university is not actively presiding over it; rather, the university is, or should be, managing it, much as the proprietors of a sports stadium manage the crowds they invite in or as the proprietors of a Broadway theater manage the audiences they labor to attract. It’s show business! The university lets this stuff go on, but it doesn’t have a dog in the hunt; it neither affirms nor repudiates any of the positions that vie for attention in the circus it allows on its grounds; it doesn’t take those positions seriously, and it shouldn’t, for if it did so (by divesting from fossil fuels or policing microaggressions or declaring the entire campus a free-speech zone) it would no longer be in the education business; it would be in the partisan-politics business.
Not all universities understand the difference between curricular and extracurricular activities and the different responsibilities attendant on each. They are confused in both directions: They think that the partisan passion of the extracurricular sideshow has a place in the classroom, and they think that something genuinely academic is going on when speakers invited precisely because they are controversial become the occasion for controversy. They don’t see that it is the administration’s job, first, to ensure that the classroom is a safe space for intellectual deliberation (that’s the only safe space I’m interested in), and, second — a very distant second — to maintain control of the energies that have been let loose once the decision to have a lecture or mount a panel discussion or allow a rally has been made.
I put it that way so as to emphasize the fact that nothing requires the making of that decision; nothing requires that there be extracurricular activities at all. A university would still be one if all it contained were classrooms, a library, and facilities for research. A university would not be one if all it contained was a quad with some tables on it, a student union with a food court, an auditorium and a bowling alley, a gymnasium with a swimming pool and some climbing walls. You could take away all those things, and along with them the student newspaper, the fraternities, the sororities, the concerts, the athletic events, the dances and everything else administered by the office of student affairs (which you could get rid of too), and the core of the university would be intact.
So if you’re a college or a university, you don’t have to saddle yourself with any of those extras. But once you’ve decided to add them on, it’s your job to see that they work, which means, mostly, ensuring that events go smoothly and no one gets hurt. If that’s the assignment, many colleges and universities deserve a failing grade.
Consider an example much in the news these days: Middlebury College. The facts are well known. The controversial sociologist Charles Murray, co-author of The Bell Curve, was invited by the American Enterprise Institute Club to speak at Middlebury about his 2013 book Coming Apart. The event was co-sponsored by the political science department and one of its members, Allison Stanger, was scheduled to engage Murray in dialogue after his talk. That never happened, because as soon as Murray rose to speak student protesters turned their backs on him and began a nonstop serial chant featuring slogans like "Racist, sexist, anti-gay, Charles Murray go away" and "Your message is hatred; we will not tolerate it." After 20 minutes a university administrator announced that the event would be moved to another location where Murray would give his talk, and that he and Professor Stanger would engage in a live-streamed conversation. That did happen, but as Murray and Stanger were exiting the new venue they were harassed and assaulted; Stanger suffered a neck injury and spent a short time in a hospital.
What happened here? Well, according to many commentators, something disturbing and dangerous happened. That is the suggestion of an article headline in The Atlantic: "A Violent Attack on Free Speech at Middlebury." But whose free speech was attacked? If you’re thinking First Amendment (inapplicable to a private school like Middlebury anyway), no government or government agency prevented Murray from speaking. If you’re thinking First Amendment values like the value of a free exchange of ideas, that’s not what the students wanted, and it was their show (after they took it away from the AEI club). And if it is what the Middlebury administration wanted, as President Laurie L. Patton said it was, then it was up to the administration to take the steps necessary to bring about the outcome it desired.
If you were to ask me, "What would those steps be?" I would reply that I don’t know, but it’s not my job to know; it’s the job of the Middlebury administrators, and they failed to do it. In its account of the affair, Inside Higher Ed reports that "College officials said the size and intensity of the protest surprised them." Really? What planet were they living on? Didn’t they read the job description when they signed up?
Some Middlebury faculty and many outside observers blamed the students for the debacle, and there is no doubt that their actions and ideas were unattractive enough to qualify them for the position of whipping boy. When an earnest representative of the AEI Club told the students that he looked forward to hearing their opinions, one of them immediately corrected him: "These are truths." In other words, you and Charles Murray have opinions, but we are in possession of the truth, and it is a waste of our time to listen to views we have already rejected and know to be worthless. Now that’s a nice brew of arrogance and ignorance, which, in combination with the obstructionism that followed, explains why the students are getting such a bad press. They are obnoxious, self-righteous, self-preening, shallow, short-sighted, intolerant, and generally impossible, which means that they are students, doing what students do.
What they don’t do is police themselves or respect the institution’s protocols or temper their youthful enthusiasm with a dash of mature wisdom. That, again, is what administrations are supposed to do and what they are paid to do: Set up procedures for establishing, maintaining, and managing the various enterprises, academic and nonacademic, that fall within their purview. Pillorying the students while muttering something about the decline of civility and truth-seeking in a radical PC culture makes good copy for radio, TV, and newspaper pundits; however, it misses the point, which is not some piously invoked abstraction like free speech or democratic rational debate, but something much smaller and more practically consequential: the obligation of college and university administrators to know what they are supposed to do and then to actually do it. How’s that for a plan?
My advice to administrators: Stop thinking of yourselves as in-house philosophers or free-speech champions or dispensers of moral wisdom, and accept your responsibility as managers of crowd-control, an art with its own history and analytical tools, and one that you had better learn and learn quickly.
Stanley Fish is a professor of law at Florida International University and visiting professor of law at Cardozo Law School. He is the author, most recently, of Winning Arguments: What Works and Doesn’t Work in Politics, the Bedroom, the Courtroom, and the Classroom (Harper, 2016).
The America We Lost When Trump Won
This op-ed from the New York Times fairly sums up how I've been feeling since the election. It serves as a sad counterpoint to my very first blog entry on this page, when so many of us felt flush with the victory and hope expressed in November 2008.
"The America We Lost When Trump Won," by Kevin Baker
NO, I’m not over it.
On Election Day I felt as though I had awakened in America and gone to sleep in Ecuador, or maybe Belgium. Or Thailand, or Zambia, or any other perfectly nice country that endures the usual ups and downs of history as the years pass, headed toward no particular destiny.
It’s different here, or at least it was. America was always supposed to be something, as much a vision as a physical reality, from the moment that John Winthrop, evoking Jerusalem, urged the Massachusetts Bay Colony to “be as a city upon a hill.” To be an American writer meant being able to share that sense of purpose, those expectations, and to flatter yourself that you were helping to shape it. Nobody expects anything out of Belgium.
More than any other country, I think, America has been a constant character in the work of its writers. Not only those writers who celebrate it ecstatically, like Walt Whitman, who made his life’s work one long ode to our young nation, or Nathaniel Hawthorne, or Toni Morrison, or E. L. Doctorow, who have picked more critically through its past. It applies as well to those who have scourged it, and exposed the worst of its contradictions and betrayals; a Richard Wright or a Ralph Ellison, or John Reed. It remained a vivid entity even in the work of those who have left it for one reason or another, Henry James or Edith Wharton, F. Scott Fitzgerald or Ernest Hemingway, or John Dos Passos.
Their love for it, and their disappointments, all have the same roots, which are those expectations and those dreams. Even at our lowest, we believe with Langston Hughes’s wish to “let America be America again/The land that never yet has been, and yet must be”; with the Rev. Dr. Martin Luther King Jr.’s overworked but ever more necessary claim that the moral arc of the universe may be long but that here, at least, it bends toward justice. Even its sternest critics agreed: America was going places!
I know that it may sound naïve, even childish, to think that any nation has a special destiny. It’s the kind of thing that dictators and demagogues like to tell their people. I doubt if many of the other writers I know would admit they believe in such a big, vague concept as “American exceptionalism.” But we do, most of us. It’s inescapable, considering what we are: the first republic of the modern age, a nation of immigrants, haven to so many peoples from around the world. We have, like no other country, for better and for ill, dominated the modern world through both our hard power and our soft, our weapons but also our ideas.
I can tell you all of the worst things we have done. The annihilation of the peoples who lived here before we did, and how much of America was built on the backs of enslaved Africans. The things we have done to other nations weaker than ours, the death squads and the C.I.A. schemes, and all the squalid little wars we’ve waged to grab land or save face. The exploitation and the bigotry, and the withering greed, and how we let the vastness of this continent fool us into believing that no matter how big a mistake we make, we can always start over — that we can endlessly root up and tear down, and move unmindfully through the world.
I have written about many of these things, but that was in a greater cause, too. The absolute conviction, in the end, that I, too, was caught up in the great work; that I was helping us to get to some higher place and fulfill our promise.
Geoffrey Ward, the brilliant American historian and the writer of many of Ken Burns’s documentaries, told me with a sense of wonder, a few days after the election: “I just turned 76 and had naïvely assumed that issues I thought resolved when I was a young man — voting rights, abortion, the ongoing enrichment immigration provides our country — would remain resolved.”
Nothing is settled anymore in America, and it appears that so many of the gains we have fought so hard to win over the years are about to be rolled back by our new president and the party that has so cravenly backed him, even when it knows better. Obamacare, which millions of us — myself included — depend upon, is already under assault, and Medicare may not be far behind. Who knows what established rights the cadres of far-right justices who will now fill the federal benches for a generation may strike down?
Yet when I say that I have lost the America I knew, I’m not talking about policy, or even fundamental rights, disorienting as their loss would be. I mean a greater, almost spiritual faith that I had in my fellow citizens and their better instincts, something that served as my north star in all I wrote and all I did.
When I watched the debates and the conventions this year, my thoughts kept going back to my parents, neither of whom lived to see this election. They would have been staggered by the sheer, pounding vulgarity of it all. They were both political moderates, who voted Republican as well as Democratic, and who like most of us never paid all that much attention to politics outside the few weeks before an election. But the phenomenon of Donald J. Trump — a man who says he has never asked God for forgiveness, who refers to the Eucharist with characteristic humility (“I drink my little wine, which is about the only wine I drink, and have my little cracker”), who mocks our military heroes, who lumbers about a stage proclaiming, “I alone can fix it!,” who dismissed a working man after the election with a tweet that read in part, “Spend more time working — less talking” — would have been incomprehensible to them. They would have thought themselves transported to some other time and country, maybe another dimension. As do I.
I have listened to all the blame foisted on the Clinton campaign for doing this or that wrong, or the media for not exposing Mr. Trump, or for giving him too much airtime. I don’t buy it. Hillary Clinton’s campaign wasn’t that bad, and Mr. Trump was exposed enough for any thinking adult to see exactly what he is.
From assorted commentators I have heard that it is unfair or condescending to say that all Trump voters were racists, or sexists, or that they hated foreigners. All right. But if they were not, they were willing to accept an awful lot of racism and sexism and xenophobia in the deal they made with their champion, and demanded precious few particulars in return. Lately Mr. Trump has endorsed the comparison of his personal populist movement with Andrew Jackson’s, and it is true that there was much that was racist and ignorant at the heart of Jacksonian democracy. For their love, the followers of Old Hickory demanded the destruction of Native American civilization in the South, and the furthering of slavery westward. This cruel bargain won Jackson voters land, and thus the vote. What have those who embraced “Mr. I Alone Can Fix It” obtained, save for the vague, grandiose promise, renewed in his inaugural, that they will soon “start winning again, winning like never before”? Or — worse — Mr. Trump’s vow to end “political correctness” and make this, at least rhetorically, the same white man’s America it was in Jackson’s time?
I know that Mr. Trump was elected, in part, because too many people were still hurting in this economy, from the terrible disruptions of their lives and their communities over the last 25 years. I have been poor and desperate myself, and I know what that feels like. In their giddy rush to globalization and the paper economy, too many liberal — and conservative — leaders have made the same mistake that they made in Vietnam, when they tried to palm that misbegotten conflict off on the poor and the working class. They have forgotten — again — that this great nation will endure and will prosper only if we all prosper together.
Yet that is no excuse for what we did last November.
Throughout our history, Americans have encountered economic shocks much worse than anything we know today, and with many fewer resources at their disposal. American working people have agency, they are plenty educated, and in past crises they rejected the extremism that other nations turned to. Even in the Great Depression they did not succumb to the ideologies of Fascism and Communism sweeping the world. When the system seemed broken in the past, when the elites and the major parties seemed irretrievably corrupt and deaf to their appeals, their response was to build true democratic movements from the ground up, and to push them on to victory even if that took decades.
The populists after the Civil War, faced with the collapse into peonage of American farmers — then about half the population — built nationwide lecture and correspondence networks, and eventually won the reforms they needed, even though it took them more than 60 years. The first wave of feminists fought for more than 70 years to win their biggest demand; Susan B. Anthony and Elizabeth Cady Stanton were dead by the time women got the vote. African-Americans battled ceaselessly, in every way they could, against their enslavement and Jim Crow, training their own lawyers to take their cases to the Supreme Court. The struggles for labor rights, gay rights, Hispanic rights, civil liberties, religious toleration, women’s control over their own bodies — all these battles and more took decades to win. They are the glory of our civilization.
Today’s passive, unhappy Americans sat on their couches and chose a strutting TV clown to save us.
What they have done is a desecration, a foolish and vindictive act of vandalism, by which they betrayed all the best and most valiant labors of our ancestors. We don’t want to accept this, because we cannot accept that the people, at least in the long run of things, can be wrong in our American democracy. But they can be wrong, just like any people, anywhere. And until we do accept this abject failure of both our system and ourselves, there is no hope for our redemption.
A couple of days after the election I watched on CNN as red-faced Russian apparatchiks in Moscow toasted one another on their great success. “Hurrah!” I thought. “No more American exceptionalism! We have joined up with the drunken idiot of history!” Once Russians, too, and especially Russian writers, were certain that there was a special destiny for the Russian soul. But a century of disastrous choices and their consequences seems to have disillusioned them. They have so much to teach us.
Kevin Baker is a novelist and essayist and the author, most recently, of “America the Ingenious.”
Letter to Electors
I signed the following statement, republished from this site:
We, a bipartisan coalition of Americans including Electors, scholars, officials, and concerned citizens write to you in the spirit of fellowship, out of our sense of patriotism, and with great urgency.
There are times in the life of a nation when extraordinary circumstances call for extraordinary measures. Now is such a time, and your courage and leadership are required.

Never in our Republic’s history has there been a President-apparent comparable to Donald Trump. His inauguration would present a grave and continual threat to the Constitution, to domestic tranquility, and to international stability:
He has threatened the freedom of speech by condoning violence at public events, and suggesting criminal penalties and even revocation of citizenship to punish political expression;
He has threatened the freedom of press by vowing to revoke First Amendment protections for journalists;
He has threatened the freedom of religion by proposing to bar entry to the country and force the registration of members of certain faiths;
He has entangled himself with foreign interests through his personal business dealings, and refused to provide records of his taxes, which could allay suspicions;
He has indicated a willingness to condone torture, in contravention of the Constitution and our international treaties, which carry the force of law;
He is uncomfortably close to the regime of Russia, which has interfered in the election;
He has shown reckless disregard for diplomacy, communicating impulsively, in public forums, regarding matters of national security, and allowing personal emotions to interfere with reasoned judgment, calling into question his fitness as Commander-in-Chief of the Armed Forces and the nuclear capabilities of the United States;
He has, unlike every previous Commander-in-Chief, never served in any public position, whether elected or appointed, civilian or military, thereby bringing no experience or proven judgment on behalf of The People, or evidence of a character suited to high office.
For these reasons, his assumption of office endangers the Constitution, the freedoms it protects, and the continued prosperity and welfare of the United States.
You, Electors, possess the power to prevent this outcome. You are not bound to cast your vote for the candidate of your party – and, as he won neither a majority nor even a plurality of the popular vote, there can be no question of undermining the will of The People.
The Constitution empowers Electors to exercise judgment and choice. If your role were only ceremonial, our Founders would not have required the states to elect you, or that you cast ballots by your own hand. State laws notwithstanding, you are free to vote your conscience. You have a mandate, like all officials, to protect and defend the Constitution.
And you have the right and responsibility to investigate those who stand for this office, and to deliberate before casting your vote.
We place country before party in imploring you, our fellow Americans, to investigate and deliberate. We stand with you as you exercise your conscience and give profound consideration to the consequences of your vote. We affirm your right and your duty to do so free from intimidation, and urge you to cast your ballot for a person with the temperament, integrity and commitment to Constitutional principles necessary in a President.
In doing so, know that you enjoy the support of millions of Americans.
Thank you for your service to our country.
View the original statement and list of prominent signatories here.
Collective Statement by Scholars in U.S. History and Related Fields on Civil Rights and Liberties in Dangerous Times
I signed the following statement, republished from this site:
As scholars of United States history and related fields, we have experienced concern and alarm as we went from a divisive campaign season to the election of Donald Trump as our president-elect. On the eve of a new administration whose key players have traded in hateful rhetoric and emboldened the harassment of various targets, we urge Americans to be vigilant against a mass violation of civil rights and liberties that could result if such troubling developments continue unchecked. Looking back on World War II and the Cold War, we recognize how easily the rights of people have been suspended during times of great uncertainty. A key lesson of such ordeals has been to never again repeat these mistakes, and so we issue a call to recognize and act upon the critical links between historical knowledge, informed citizenship, and the protection of civil and human rights.
During World War II, in the wake of Japan’s attack on Pearl Harbor, U.S. officials justified the imprisonment of 120,000 Japanese Americans--U.S. and foreign born, most of whom were U.S. citizens--with the argument of “military necessity.” In the ensuing early Cold War era, the McCarthyist “witch hunts,” House Un-American Activities Committee (HUAC) investigations, and registry of communists and their sympathizers destroyed the lives of countless teachers, artists, politicians, and others.
These historical persecutions have since been widely repudiated by scholars and elected leaders. For instance, the 1988 Civil Liberties Act signed by President Reagan and passed by a bipartisan Congress acknowledged internment as a mistake fueled by racism. It stated, “The internment of the individuals of Japanese ancestry was caused by racial prejudice, war hysteria, and a failure of political leadership.” The Cold War witch hunts associated with Joseph McCarthy were roundly denounced by Republicans and Democrats during the 1954 Army-McCarthy Senate hearings. Decades of research, debate, and analysis have brought historians to a consensus that episodes like internment and McCarthyism were misguided and immoral, and should never be repeated.
Internationally the United States has been engaged in a war on terror, and domestically the election made clear that we are in an ideological “culture war.” And while we find ourselves in a distinct moment compared to World War II and the Cold War, we are seeing the return of familiar calls against perceived enemies. Alarmingly, justifications for a Muslim registry have cited Japanese American imprisonment during World War II as a credible precedent, and the Professor Watchlist—which speciously identifies “un-patriotic professors”--is eerily similar to the communist registry of the McCarthy era. Looking back to history provides copious lessons on what is at stake when we allow hysteria and untruths to trample people’s rights. We know the consequences, and it is possible, with vigilance and a clear eye on history, to prevent tragedy before it is too late.
It is not just that we are at the cusp of what may be a massive rollback of civil rights and liberties, but our culture is also mired in confusion about facts vs. misinformation and a rebellion against knowledge and critical thinking. This makes our present moment doubly dangerous. Though we cannot know for certain what is ahead, we urge all Americans to enter the coming months with lessons of the past in mind, and to draw upon them to stand up for the protection of all people’s civil rights and liberties.
We sign this statement as individual scholars. Institutions are listed for identification purposes only.
View the original statement and signatures here.
Leave a Comment (for Dr. Golland's blog)
||Horrified but Unsurprised
The American people have often shown a capacity for evil.
However, our people have just as often shown a capacity for love, kindness, goodness, and compassion. For every Taney we have produced a Douglass, for every Ford a Gompers, for every Wallace a Dr. King, for every Delano Grape a Cesar Chavez. And yes, for every W., an Obama.
So let us be not defeated but energized, not afraid but brave.
||Questions for a Post-Trump America
If current polls hold up, in early November we will be celebrating the defeat of the presidential candidacy of Donald J. Trump. We will have deflected from the presidency the racism and other forms of intolerance he has espoused. The important question then--and I think we need to ask it now--is what next?
The horrible conflation of white male paranoia with the decline of the working class (and much of the rest of the middle class) which allowed Trump to win the Republican nomination is nothing new. But the lesson for the remaining leaders of the party, sadly, is that one needs to win the Trump constituency to win the nomination. Democrats may see this as cause to rejoice: the suicide of the Republican Party. But the pattern of electoral success for Republicans will remain the same: tack to the right in the primaries and to the center in the general election. Trump will be seen as having succeeded in the former task but failed in the latter. His successor--quite possibly Paul Ryan--will perhaps be more adept at this. Trump may not have succeeded in winning the presidency, but he may have won it for Paul Ryan in 2020.
The election of 1964 saw a similar pairing to the current Trump-Clinton matchup. Republican Senator Barry Goldwater was a right-wing extremist, like Trump, but with a modicum of government experience and views only slightly less outlandish. He tapped into the ennui of the white conservative of the time, and won the GOP nomination, but like Trump he suffered from major organizational flaws and was handily defeated by the old pro, his opponent Lyndon Johnson. Four years later, the GOP fielded a nominee nearly as conservative, but much more politically adept, and Richard Nixon became president. The winning combination, an appeal to the growing conservative base during the primaries followed by a tack to the center in the fall, proved Watergate-proof: Nixon's unelected successor, Gerald Ford, used it to come within a hair's breadth of winning in 1976 despite his pardon of the disgraced former president, and in 1980 the adroit politician Ronald Reagan, actor-turned-governor, won in a landslide.
From the Southern Strategy to Philadelphia, Mississippi, to Willie Horton, the Goldwater fiasco of 1964 gave the Republicans the game plan to win five of the next six elections--and thereby brought Goldwater extremism into the mainstream. We continue to deal with the consequences.
Trump may be unable to learn the lessons of history, but his opponents should do better. When they get over the euphoria of electing the first woman president--a welcome rebuke to centuries of male dominance in high American office no less impressive than Barack Obama's historic election eight years ago--Democrats need to get to work. They must reject fiscal policy that makes the rich richer and everyone else poorer, and implement social policy that restores confidence in our government. Otherwise, in the long run, their victory may prove as temporary as Lyndon Johnson's.
||The Most Qualified Presidential Nominee
Political positions aside, looking only at the resumes of the candidates, scholars have pointed to George H.W. Bush as the best-qualified presidential nominee in recent history. A success in business and in politics, he served two terms as a moderate Republican Congressman in conservative Democratic Houston, Texas, re-elected by a majority-white district despite his support of President Johnson's civil rights bills. After that he led the U.S. delegation to the United Nations, served as chairman of the Republican National Committee, was the American envoy to Red China, and even headed the C.I.A. Despite two failed attempts to win a senate seat, his first victory in an election beyond his House district came when he ran for and won the vice presidency in 1980, a job which he held for two terms before winning the G.O.P. nomination (and the presidency itself) in 1988.
I would argue, however, that despite her shorter list of actual government jobs, Hillary Clinton is in fact a more-qualified presidential nominee than was Bush in 1988--indeed that she is the best-qualified presidential nominee since the Civil War (when the job was very different than it is today).
A brief list of her accomplishments may not quite make the point. She has been counsel to the House Judiciary Committee during its deliberation of articles of impeachment against President Nixon; First Lady of Arkansas; First Lady of the United States; United States Senator; and Secretary of State. Impressive, if somewhat shorter than Poppy Bush's resume.
While the first Bush had the longer list of individual jobs, Hillary Clinton has had more years in (or married to the holder of) public office. But not every year in the same job comes with the equivalent amount of experience. It seems to me there is a declining benefit as one continues in the same job: successfully navigating the first year as a senator, vice president, cabinet secretary or committee chair is, I would argue, a better qualification for the presidency than the second or third year in that job, and certainly more important than the ninth or tenth. George H.W. Bush and Hillary Clinton, as nominees, also brought both domestic and international governing experience to the table; clearly that's a better qualification than domestic experience alone.
To determine the relative qualifications of the major-party presidential nominees I developed an algorithm that rates the prior elected and appointed government service of nominees, with a declining value for each year in a particular job, and bonus points for certain special types of service. I've made some judgment calls on the relative value of jobs, so different numbers would produce different results: feel free to download my spreadsheet HERE and play around with it to see what you get.
I gave the highest scores to the first year of each individual job, and set a "sunset" year, after which no points would be awarded. Here are the specific scores awarded for each job, the schedule for additional service, and bonus points:
First group (scored with First Year Points and Additional Year Points):
- Cabinet (per department)
- Vice President / U.S. First Lady / Ambassador (per country) / Director (per agency) / Supreme Court Justice
- Subcabinet (per job)

Second group (scored with First Year Points and Additional Year Points):
- Senate Majority/Minority Leader
- Speaker of the House
- Senate Committee Chair
- House Committee Chair

Bonus points:
- More Than Three Jobs: 5 per additional job
- Any Government Service (if less than 5 total points): enough to bring the total points to 5
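Mechanically, the declining-value scheme is simple. Here is a minimal sketch in Python of how the scoring works; the point values passed in are made-up placeholders for illustration, not the actual schedule (that lives in the spreadsheet):

```python
# Sketch of the scoring algorithm: each job has a first-year value, a smaller
# flat value for each additional year, and a "sunset" year after which no
# points accrue. All numbers fed in here are illustrative assumptions.

def job_score(first_year, additional_year, years, sunset=10):
    """Full value for year one, a smaller value for each later year,
    and nothing after the sunset year."""
    if years <= 0:
        return 0
    counted = min(years, sunset)
    return first_year + additional_year * (counted - 1)

def nominee_score(jobs, bonus=0):
    """Sum the job scores, then apply the two bonus rules:
    5 points per job beyond three, and a floor of 5 for any service."""
    total = sum(job_score(*job) for job in jobs)
    if len(jobs) > 3:
        total += 5 * (len(jobs) - 3)   # "More Than Three Jobs" bonus
    total += bonus                     # e.g., international service
    if jobs and total < 5:
        total = 5                      # "Any Government Service" floor
    return total
```

For example, a hypothetical nominee with four one-year jobs worth 10 first-year points each would score 45: 40 for the jobs plus the 5-point more-than-three-jobs bonus.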
In terms of scope, I looked at every presidential candidate who earned electoral votes in the general election rather than only official party nominees, since the earliest elections took place before parties made nominations and some third-party nominees have earned electoral votes; the premise is that earning electoral votes indicates statistically significant support. And I excluded nominees seeking re-election to a second contiguous full term (although these can be found in the spreadsheet, highlighted yellow).
One could reasonably question my decision to equate service as U.S. First Lady with service as Vice President. On the one hand, First Ladies are not elected and hold no formal government role. On the other hand, Vice Presidents likewise have a minimal constitutional role: to preside over the Senate (and deliver tie-breaking votes when necessary). First Ladies and Vice Presidents are given policy portfolios only at the pleasure of the president. Some do nothing, while others play a more active role in decision-making. As it happens, the only First Lady in the survey played an active role in the Clinton Administration's domestic policy.
Based on this algorithm, it is unsurprising that Hillary Clinton, with a score of 107, is the best-qualified first-time major-party presidential nominee since the Civil War. She earned 25 points as Secretary of State, 45 points as Senator, 27 points as First Lady, and 10 bonus points for international service. Despite his impressive resume, George H.W. Bush came in only sixth since the Civil War, with 88 points. He earned 27 points as VP, 21 points for service as delegate to the UN, envoy to China, and director of the CIA; 10 points as a U.S. Congressman, and 5 points as RNC chair. He earned 10 bonus points for international service and 15 for the quantity of his jobs. In addition to Hillary Clinton, he was beaten out by Lyndon Johnson (1964, Democrat, 103 points), Bob Dole (1996, Republican, 98 points), Teddy Roosevelt (1912, Progressive, 92 points), and "Fighting Bob" La Follette (1924, Progressive, 89 points).
At the opposite end of the spectrum, tied for least-qualified nominee--both since the Civil War and all-time--are Wendell Willkie (1940, Republican) and of course Donald Trump (2016, Republican), each with exactly zero points for no prior government service whatsoever. None of the bottom nine won their elections (unless Trump wins this one).
An important thing for Hillary Clinton to consider is that qualifications do not necessarily translate into victory: of the twenty best-qualified nominees, only eight won their elections. Of those, only one went on to win a second full term: Richard Nixon (1968, Republican, 52 points); and as she well knows, he didn't finish it.
How do the best-qualified nominees stack up as presidents? Those eight best-qualified winners include an excellent president: Harry Truman (1948, Democrat, 75 points). He, like Lyndon Johnson (1964, Democrat, 103 points), inherited the presidency after his predecessor's death, and after winning election in his own right chose not to seek a second full term.
However, relatively poor preparation is not necessarily an indicator of failure as president. The lowest-ranking winners include Woodrow Wilson (1912, Democrat, 15 points). Elected with only two years' experience as governor of New Jersey, he was re-elected to a second full term and is considered one of our better presidents. Another president usually rated highly for his performance in office, Ronald Reagan (1980, Republican, 45 points), had eight years' experience as governor of California.
Prior qualifications are good, but not necessarily indicative of performance in office. Abraham Lincoln, usually considered the best president in American history, ranks near the bottom (1860, Republican, 11 points). His predecessor James Buchanan is usually considered the worst, yet he ranks very close to the top on prior service (1856, Democrat, 133.5 points).
And, in case you're wondering, the best-prepared nominee in American history was John Quincy Adams (1824, Democratic-Republican, 151.5 points). Seven years in the cabinet, five-and-a-half in the senate, and fourteen as an ambassador to four different countries. Elected without winning the popular vote, he lost his re-election bid to Andrew Jackson (1828, Democrat, 45 points).
||A Nation of Racism
Remarks at Understanding Ferguson, Governors State University
We are a nation of racism and we have been since at least 1705, when Virginia formalized its racial slave code. We were founded by slaveholders, and when we agreed to end slavery most whites favored second-class status for African Americans. Each arriving wave of European immigrants was gradually encouraged to jump into the melting pot, eventually mixing themselves into whiteness by becoming as racist as those who came before, and this continues with more recent non-African migrant groups. Whether we pass laws to keep Blacks in a separate, inferior space, as we did in the South from the Civil War to the Civil Rights Era, or just do it without using the law, as we did then in the North and do now everywhere in the country, we have built on this history by establishing in our very marrow the notion of separate and unequal. My African-American colleague, Professor Henry Louis Gates, Jr., was arrested when he forgot his keys and had to break into his own home, after a white neighbor called the cops. If I lose my keys, the neighbors, the locksmith, and maybe even the police will bend over backwards to help me get back in. My local locksmith didn't even ask for my ID when I asked him to change the locks after I bought my house. I doubt Professor Gates would enjoy the same privilege.
But by the same token, we are a nation of great promise. Our founding document was written by a slaveholder who said all men are created equal. Our northern states were the first societies in human history to outlaw slavery, and we fought our bloodiest war--with more deaths than in all other American wars combined--to end that pernicious, peculiar institution, setting an example that resulted in the end of legal slavery worldwide only two decades later. For every two steps back, after all, we've taken one step forward, and this year we are commemorating the fiftieth anniversary of yet another shining moment in our struggle: the signing of the Civil Rights Act of 1964.
The events in Ferguson have elements of both sides of American history. On the one hand we have the continued manifestation of this terrible racism from which we still suffer, the killing of an unarmed Black man by a white officer who saw Michael Brown as a thug, inhuman, animalistic--rather than as a young man who had been accused of petty theft and minor assault, deserving of some minor punishment, yes, but also deserving of sympathy, attention, education, nurturing, and respect--like any white eighteen-year-old.
But the reaction to the killing and to the grand jury's subsequent decision not to indict--the expressions of frustration and pain and anger and sadness--also reminds me of a time when we came together as a nation. To heal, yes, but also to learn, to improve, to turn personal tragedy into community triumph. From the killing of Chaney, Goodman, and Schwerner, we got the Voting Rights Act of 1965, which made possible the election of President Obama. We are living in a historical moment again. We have the opportunity to use this moment in ways that will make our children and our grandchildren proud.
||Press Narratives, Marion Barry, and the Fletcher Alternative
The press narrative of the life of Marion Barry, longtime Mayor of Washington, D.C. who died last week, portrays Mr. Barry as having been a disappointment. A bright civil rights leader in the 1960s, he descended into the morass of politics and became the worse for it, leading to accusations of corruption and his eventual conviction on drug-related charges. The city became a dirty cesspool of crime, drug abuse, and despair, and its mayor was emblematic.
But Washington was no dirtier or more dangerous than other major cities in the 1980s, and white politicians of both parties have also been accused (and forgiven) of cocaine use. Democratic Texas Representative Charlie Wilson, while in office, and possibly even Republican former president George W. Bush, in his youth, were accused of using cocaine, as were Senator Ted Kennedy and Hamilton Jordan, an aide to President Jimmy Carter. None of these white politicians suffered politically, yet Marion Barry was made a laughingstock on late-night TV, and the voters of Washington, D.C., who eventually re-elected him to the mayor's office, were derided as fools.
Let me be clear: Marion Barry should have avoided corruption and drugs alike. As part of the Civil Rights Era's first generation of African-American politicians, Barry should have known that he would need to work at least twice as hard as his white counterparts--and that his foibles would garner double the scrutiny. But the narrative is unfair. It plays into traditional white expectations of stereotypical African-American behavior, dating back at least as far as the myth of the corrupt Reconstruction-era politician. It is a convenient myth in that it allows whites to justify their racism (be it overt, unconscious, or institutional). It is a myth we must all fight, and so the current press narrative of Marion Barry is one we must reject.
As an illustration of this, let's engage in a little conjecture on what might have happened if Barry had lost the general election for mayor in 1978. His Republican opponent was Arthur Fletcher, who had most recently been an aide to President Gerald Ford. I know a little something about Arthur Fletcher; he played an important role in the events which form the core of my first book, and he is the subject of my second (forthcoming) book. The New York Times obituary of Marion Barry neglected to mention Fletcher; the Washington Post obit did, but in both cases the implication was that in this overwhelmingly Democratic city the winner of the Democratic primary was a shoo-in for the general election. (This need not have been the case: in 1978 columnist William Raspberry reported that stronger support of Fletcher's candidacy by the Republican National Committee might have heralded the beginning of a viable two-party system in the District; however, this would have signaled a change of direction the Republican Party was then--and still apparently now--unwilling to take.)
The press narrative of Fletcher runs along the opposite lines of that of Marion Barry (but still contains strong elements of racism and post-Reconstruction racist mythology): in this narrative, Fletcher was a corrupt Kansas politician in the 1950s who was "saved" by the Civil Rights Movement (and harsh personal experiences) and became an unselfish leader, advising four presidents, leading the United Negro College Fund, and becoming known as "the father of affirmative action." There are huge problems with this narrative, just as there are huge problems with the press narrative for Marion Barry. But these narratives allow us to consider what might have happened had things turned out differently at the moment when these stories collided--the mayoral election of 1978.
Had lightning somehow struck and resulted in an election victory for Fletcher (in fact he polled far better than expected but didn't break 40% in any ward except the 3rd, in Northwest D.C.), the press narratives would have us imagine that Washington, D.C. would have somehow been idyllic during the 1980s. True, Fletcher probably wouldn't have run an administration quite as corrupt as Barry's, and it's even more likely that he would have avoided an arrest on drug (or any other) charges (Fletcher was never arrested in his life). But the idea that a Fletcher mayoralty during the 1980s would have allowed the city to avoid the rampant crime and drug abuse then experienced by so many other American cities is ridiculous. Fletcher might have used his experience as a presidential advisor on urban affairs to good effect and probably would have worked out privatization deals with corporations eager to win sponsorship opportunities in the District, but he would still have had to face the overwhelming problem that D.C. has a minuscule tax base and remains beholden to a largely hostile Congress. Reaganites shared Fletcher's party affiliation but little else; Vice President Bush would have offered help, but only to the limited effect of which any vice president is capable. Even with major financial support from the RNC, Fletcher's re-election bid in 1982 would probably have fallen victim to deepening urban blight and a Democratic midterm resurgence.
So on the occasion of Marion Barry's death, let's be careful how we frame the narrative. Is it possible that Mayor Barry was a kind soul with a good heart and a commitment to social justice who was undermined by a racist society that took advantage of his public mistakes, and an unfortunate addiction to drugs and alcohol? Is it possible that he was a tragic figure worthy of sympathy more than ridicule?
||Another Day That Will Live in Infamy
Two days ago the Egyptian army overthrew the elected president, Mohammed Morsi, suspended the constitution, and installed an interim government. President Morsi, political leader of the Muslim Brotherhood, an Islamist group with a plurality of support among the electorate, had marginalized the secular opposition and proven unresponsive to protests. We can have a variety of opinions on the one-year administration of Morsi, and indeed on his fitness to serve a second term had he not been ousted in a military coup d'etat. But it seems to me that Americans who applaud the coup and support the generals are paternalistic at best and fascist at worst. This is not simply, as David Brooks called it, a question of process over substance ("Defending the Coup," New York Times, July 5, 2013). Nor does it portend the decline of political Islamism, as Tom Friedman would have it: after all, Islamists have been kept down by secular military power before ("Egypt's Revolution, Part II," New York Times, July 4, 2013). This is a question of democratic legitimacy. At issue are our republican values and the idea that people of all beliefs and opinions should look to the ballot box rather than down the barrel of a gun.
The claim that we should support the coup because we favor substance over process, because Mohammed Morsi was arrogating to himself and his party inordinate power for a democracy, and because the Muslim Brotherhood holds beliefs antithetical to democratic ideals, rests on a dangerous assertion. This is the crux of the "substance over process" argument: that it is better to have a military dictatorship of secular generals with whom we agree than a democracy with Islamist elected leaders with whom we disagree.
Those of us who approve of the military coup in Egypt must ask ourselves under what circumstances we would approve of a military coup in the United States. What do we do here when we oppose the policies of the incumbent? If we see his actions as unconstitutional and illegal? If we see him as arrogating to himself inordinate power for a democracy? Many of us felt this way about the Bush administration, and many others likewise feel this way about the Obama administration. But should any American advocate a military coup d'etat in the United States, that person would be labeled a crackpot. After all, our system holds that once elected, a president can hold power for a full term. It also holds that a president can be removed for "high crimes and misdemeanors." So when we oppose the policies of the incumbent, we Americans wait until the next national election. And if the situation is so dire and our constitutional concerns so severe that we can't wait, there is always the option of impeachment.
I can already hear my readers veritably screaming "but that's us! Egyptians don't have the same constitutional protections!" Actually, they do (or rather, they did before this coup).
According to Article 133 of the 2012 Egyptian Constitution (translated by Nivien Saleh of the Thunderbird School of Global Management), "The President of the Republic is elected for a four-year term.... He may be reelected once." Article 152 lays out procedures for impeachment remarkably similar to those found in the United States Constitution. The voters chose Morsi and his party. Once elected, Morsi had the right to govern for his full term unless impeached. That's the social contract inherent in the ballot: when we lose an election, we support the right of the winner to govern, and can remove him only in accordance with the rules we agreed to follow by casting our votes.
Most of those who favor this military coup would never consider legitimate a similar takeover in the United States. This is hypocritical, of course, but it is also paternalistic: such people are stating that Egyptians (or non-westerners) are not ready for democracy--and that we are. When pressed, such people explain that Egyptians (and non-westerners) suffer from a lack of education, an over-prevalence of religious extremism, and a weak civil society, as compared to American voters. This is nonsense, of course. There is no evidence that the average educational attainment of the Egyptian voter is significantly lower than that of her American counterpart. The popularity of the Tea Party and ultra-Catholic candidates like Rick Santorum demonstrates that religious extremists have an outsized voice in American politics (and we wouldn't have advocated a military coup if Santorum had been elected president). The civil society charge also does not hold up. Americans have been decreasingly involved in civic organizations, as demonstrated in studies like Robert Putnam's Bowling Alone and others. While we have a stronger history of civil society in the United States than they do in Egypt, if anything our civil society is weakening while theirs is strengthening.
Also inherent in such an argument is the fascist notion that the Egyptian military is somehow legitimate, that it is to them that the Egyptian people should turn when they feel that their government is behaving unconstitutionally. That an unelected cabal of generals should serve as kingmakers is as ridiculous as the idea that the Egyptians are not ready for democracy. But the modern world has already had its share of military dictatorships, from Chile to Hungary to Libya to Vietnam. In all cases the rights of the people were trampled in the name of "order." None of these governments were legitimate, and none were fundamentally different than medieval monarchies. Modern, enlightened people cannot support military states. Democracy, however distasteful it may appear at the moment, is the only legitimate form of government.
Finally, by supporting the coup, what message are we sending to the Islamists? Clearly, we are telling them that we will support any democratic outcome as long as it does not involve their victory at the polls. There can be only one result from this: the Islamists will exit the democratic process and seek success through violence and terrorism. That is horrible. Occasional Islamist electoral victories create the environment that will allow Islamists to accept occasional electoral defeats (much as the Bush victories girded most Republicans to accept the Obama administration and the Obama victories are girding Democrats for the eventual Republican return to power).
Mohammed Morsi's Muslim Brotherhood-dominated government was a poor one. He appears to have behaved undemocratically, to say the least. But his opponents should have pursued impeachment or waited until the end of his term. His ouster by a military cabal should be mourned. Wednesday was a dark day for democracy.
||Presidential Pet Peeve
Barack Obama is not the 44th President of the United States.
There's a line guaranteed to attract attention. But read it closely: I'm not saying he isn't the president. I'm just saying he isn't the 44th. Sounds crazy? Maybe, if you've been led to believe that there have been exactly 43 presidents before him. But as any good US historian knows, that isn't true.
I'm not talking about the presidents of the Continental or Confederation Congress, who preceded George Washington (those presidents did not preside over the nation, they presided over congress, much the way Vice President Joe Biden presides over the senate in his capacity as President of the Senate). Washington was indeed our nation's first president. Nor am I referring to Jefferson Davis, the President of the Confederate States of America: Abraham Lincoln was our 16th president and Andrew Johnson our 17th.
I am referring to the ridiculous yet widely-accepted notion that Grover Cleveland was both the 22nd and the 24th president of the United States because his two terms in office were non-sequential. He succeeded the 21st president, Chester Arthur, and the 23rd president, Benjamin Harrison, it is true, but that fact didn't make him two presidents any more than it made him two people. During Cleveland's second term, there had been a total of 23 men who had served as president, and the next president, William McKinley, was the 24th man to hold the office, not the 25th.
One might argue against this by noting that Cleveland's were the 22nd and 24th presidential administrations, and therefore he should be considered the 22nd and 24th presidents. But every two-term president is seen as presiding over two administrations; with re-election, presidents expect pro forma resignations from their cabinet officers, and it has been considered honorable to leave government service with the start of the boss' new term since Secretary of State Thomas Jefferson did it in 1793. By that standard, Cleveland presided over the 28th and 30th presidential administrations and President Obama is presiding over the 62nd presidential administration (that's counting every succession and every re-election).
The president who was re-elected after four years out of office should not be counted any differently than the fifteen who were re-elected while holding the office. So let's get it straight: Cleveland, the 22nd man to hold the office, was the 22nd and only the 22nd President of the United States, and Barack Obama is not the 44th, but the 43rd.
||Toe-to-Toe with the Imposter
Arriving in Oxford
I suffered badly from imposter syndrome while in grad school and as a postdoc. Although I managed, I think, to cover it up with bravado and gregariousness, I regularly doubted my ability to pass my comprehensive exams, research, write, and defend a dissertation, and secure a job as a professor. It took the publication of my book and my appointment to a tenure-track position for me to overcome it, but it took this past weekend’s conference at the University of Oxford to realize I had done so.
When I started at the University of Virginia I quickly became miserable, personally and professionally. My girlfriend was still in New York, and every other weekend either I would head north or she would come south. Having been an award-winning history major at Baruch College, where I was a big fish in a small pond, I soon found myself a small fish in a big pond. My work habits were difficult to maintain, my studies suffered, and I earned more Bs than As. Although my advisor Michael Holt, Grace Hale, and possibly Brian Balogh had faith in my prospects, it would appear that I failed to suitably impress Ed Ayers and Gary Gallagher, in retrospect with good reason. I left after completing the MA.
With the recession still on and CUNY offering me a stipend, I decided to give history another shot, but immediately suffered another setback. Noting that I had an MA, my new advisor, Jim Oakes, recommended I take the written comprehensive before I even started classes, and I failed.
With this background it should be easy to understand why I suffered from imposter syndrome. And although I passed the writtens on my second try a year later, I earned a B on my second major paper.
I took refuge in student government, a pursuit which brought early success. Elected to represent my department at the Doctoral Students’ Council, I was immediately elected to represent the DSC at the University Student Senate, where I quickly became Vice Chair for Graduate Affairs, was appointed to represent the students on a committee of the Board of Trustees, and found myself regularly rubbing elbows with senior administrators. Although I denied it to myself at the time, it was a way to avoid not only my studies but also that constant, small voice inside that told me that I wasn’t up to the task of doctoral work.
But even then there were signs of the success to come. I published reviews in my minor and major and I earned a rare A from notoriously difficult grader Laird Bergad in his class on the history of Brazil. And I earned an A on my “outside” paper on a topic unrelated to my planned dissertation on industrialization and slavery. Sadly, I spent more time focusing on my doubts than on these successes.
Still, I plugged on. I passed my orals and switched my thesis topic and advisor, moving into the twentieth century and the history of civil rights (which had been the topic of that “outside” paper). Firmly back on track, I researched, wrote, and defended my dissertation.
But now it was 2008, and the job market had tanked. I thought that what remaining academic jobs there were would go to the “elite” scholars—those who had stayed at UVa or worked with pedigree advisors at Ivy League schools. I couldn’t compete with people like Wayne Hsieh, who finished at UVa after four years and immediately landed a tenure-track job at Yale, his undergraduate alma mater. Wanting to stay in New York for the sake of my wife and daughter—where the chances of landing a position seemed particularly abysmal—I focused on applying for administrative jobs, trading on the connections I had made through my student government work.
My advisor, Clarence Taylor, as well as Brian Purnell, one of my readers, urged me to submit my dissertation to presses. It doesn’t cost anything, I thought; nothing ventured, nothing gained. A year after my defense I was under contract with the University Press of Kentucky. I decided to go on the academic job market in earnest.
After two years on the market, however, I once again began to doubt myself. Most of the rejections didn’t faze me, but the ones where I came very close—especially those in New York—were hard to handle. After not getting a job offer at BMCC, where I was the inside candidate, I didn’t even get an interview at BCC, where I was an adjunct and former sub. Then I was rejected for an adjunct job. Perhaps my book contract was just luck, I thought. I doubted my skills as a historian, I doubted my abilities as a teacher, I doubted I had the personality to win over a search committee.
And then, suddenly, it was over. My book was published and I had a job.
I didn’t think about my imposter syndrome again—by then I had long known the term—until this past weekend when I attended a conference at St. Anne’s College, Oxford. There, toe-to-toe with serious scholars (about a third of whom were from Oxford or Cambridge, either as grad students or faculty), I held my own, engaging in incisive discussions inside and outside of panels with the likes of Gareth Davies, Kevin Yuill, Catherine Clinton, and even a team from UVa—two ABDs and a recent PhD. (Particularly noteworthy were my interactions with Clinton, a seemingly imposing figure during my senior year at Baruch when she was a visiting full professor, who now, as holder of a chair at Queens University Belfast, is “Catherine.”)
But what I found most remarkable was the imposter syndrome in evidence among Oxford and Cambridge grad students. It was then, while commiserating with them, that I finally realized that I was “cured.” I had arrived. I no longer have any doubts about where I stand in our field. I can hold my own with some of the finest scholars in the world.
I didn’t write this to brag. (Much of my career, as you can see, has been nothing to brag about.) Rather, I wrote it to encourage those grad students who may be experiencing similar setbacks to those I faced. My last post contained an unduly harsh admonition to those who have been procrastinating, but I want you all to know just how successful you can be in our field despite seemingly debilitating failures. Focus on the positive, stay flexible, keep working at a good pace, and remember that you are capable of doing great work. If I can do it, you can too—if you really want to.
Leave a Comment
| || |
||The Abysmal Job Market, Revisited
Last summer I posted a comparison between my experience and that of Joe Sramek, a friend and fellow CUNY PhD who went on the academic job market two years before me (Advice for the Abysmal Job Market, 8/9/10). Joe had written a piece for the Graduate Center Advocate in the fall of 2007 in which he laid out his path to the tenure-track job he now holds: over 150 applications, more than a dozen interviews at AHA Atlanta, three on-campus interviews, and one job offer. But between his Advocate piece and my doctoral defense a year later, the market tanked, and it has not recovered. Last year I blogged about my experience during my second post-doctoral year and my first in earnest on the academic job market. Consciously channeling Joe’s piece, I noted my nearly one hundred applications, exactly zero interviews at AHA San Diego, two telephone interviews and one on-campus interview. I had received no job offers for tenure-track positions but I was looking forward to starting my second semester as a full-time sub at CUNY, the “inside candidate” for a tenure-track job at Borough of Manhattan Community College. Under the circumstances, I considered it a successful year in that I had become a full-time faculty member and had, as I put it, the chance to try again.
As the insider at BMCC I considered the entire semester one long job interview, and was interviewed by the committee and then, as a finalist, by the provost. When I ultimately did not get the job, and found myself scrambling for adjunct work in January—again with no interviews scheduled for the AHA (Boston)—I considered writing this “revisited” piece. Ultimately I decided against it, worrying that I would come off as bitter at an academic market that had let me down. After all, my credentials were about as solid as they could be: a book in press, full-time teaching experience, copious committee experience, strong references. Things soon went from bad to worse: in March I didn’t even make the interview cut at Bronx Community College, which had two tenure-track jobs available, and in April I was rejected for an adjunct job—an adjunct job, for Pete’s sake—at New York City College of Technology.
But yesterday I received my third full-time job offer in as many weeks, and it now seems as if every three to four days I withdraw from a search after being offered an interview. As I prepare for the move to my new tenure-track job at Governors State University, an upper-division college with graduate and doctoral students outside Chicago where I will be designing the undergraduate history major and new MA program in history education, this seems like a good opportunity to reflect on the market and offer some fresh advice to recent grads and ABDs at the CUNY Grad Center.
À la Joe Sramek, let’s start with the stats. Over the past two years, I applied for a total of 281 academic positions: 181 tenure-track (or otherwise permanent), 58 replacement (visiting, sub, etc.), 36 postdoc, and 6 public historian (i.e. in-house historian at the Department of Defense). I had (or was offered) 15 first-round interviews (8 of which were for tenure-track or permanent jobs) and 7 on-campus interviews (4 for tenure-track/permanent). The result: five job offers, two of which were for tenure-track or permanent positions and three for visiting positions (including the two visiting positions I accepted last year).
(A word about “permanent” non-tenure track: this is a category of job about which I was unaware prior to my search. Perhaps it is a new phenomenon, as colleges seek to retain their options to lay off faculty in the wake of ever-tighter budgets. But I interviewed for two of these positions and was offered one, which ultimately I did not take because fortunately within days I had a tenure-track offer. Announcements for these jobs typically contain the word “renewable” but otherwise read like announcements for replacement jobs; the applicant should ask during the interview just how renewable they are. The two that resulted in interviews for me each involved three-year contracts, permanently renewable.)
Last year, my main regret was that I hadn’t gone on the job market in the semester of my defense—fall 2008. I still regret that, as the market was better then than it would be the following year. But my current regret is that I didn’t promote myself adequately as the author of a book until it was actually published. When I say adequately I mean listing the book on the first page of the CV, centered, below the address, above all other content. I had been listing the book under “publications,” on page two of the CV. Once I made it more prominent I received a flurry of interview offers, and that’s the key in this job market: you have to have a stand-out CV and cover letter. These are the tools for getting from the application to the interview—and to do that you will need to beat the longest odds of the process.
In this abysmal job market, search committees are receiving upwards of 200 applications for each position. It’s easy to understand why: just read Perspectives—the AHA monthly magazine—and you’ll see that since 2008 the number of PhDs granted in history has exceeded the number of job openings. From these 200-odd applications, search committees must select 10-12 candidates for an initial interview. So it’s basically twenty-to-one against getting an interview. The odds get significantly better after that: 3-4 candidates get on-campus interviews, and if the top choice turns down the job (as I did for one permanent job and one temporary job) you only need to be in the top two to get it. In short, you need to get out of the general pool of applicants and into the much smaller pool of interviewing candidates. To do that you need a solid cover letter and a stand-out CV.
Your cover letter should be between 1.5 and 2 pages. The body should include a lengthy paragraph on your dissertation (followed by a short paragraph on your current scholarship if you have moved beyond your dissertation), a paragraph on your teaching experience, a paragraph on your teaching philosophy, and a brief statement about your service experience if you have any. (While at the Graduate Center it pays to serve on two or three committees; ideally you’ll serve on at least one that is college-wide. Ask your DSC rep to put you in touch with the students serving on the Committee on Committees so that you can get on a college-wide committee.) The introduction should briefly list your major accomplishments as they relate to the particular job (“with my dissertation now under contract and with copious experience teaching both American history and Western Civilization, I am very excited by the opportunity to apply for the assistant professorship at Beaufunk State University, Juneau”) and the conclusion should briefly restate the fact that your quals make you a good fit for the job and that you are available for a telephone or Skype interview and will be attending the AHA. Scholarship should come first and teaching second for a research-university job; reverse the order for a community-college job; use your own best judgment for the many jobs in between.
Everyone in this market has a Ph.D. or soon will (even for community college jobs), and everyone who’s in serious contention for the jobs you want has a list of awards, good references, and a solid transcript. You will stand out with a book (again, aggressively promoted on the CV from the moment you get a contract) and with full-time teaching experience. Is it possible to get a job in this market with neither? Sure. But the odds are worse than abysmal. So you need to get these things. The full-time temporary teaching experience will come through persistent applications and a willingness to travel (especially now that CUNY has cut virtually all substitute lines). The book contract will be a harder trick, so you need to put a lot of energy into that. It’s not just luck: you need to write an excellent dissertation on a topic that fills a gap in the existing literature or is otherwise very sexy. But you also need to have a good relationship with your advisor, who will recommend appropriate presses; you need to write your dissertation in a style that is conducive to becoming a book; you need to develop a strong book proposal; and you need to respond to the anonymous peer criticism with grace and thoughtfulness, showing the editors that you’re someone they can work with comfortably.
Here comes the tough love. If you’ve been working on your PhD for more than a decade, or you’re having problems getting your advisor or committee members to agree to schedule a defense, stop. Just stop. Get out and find another career. You’re not going to get a book published and you’re not going to get a full-time job—so you can forget about joining the professoriate. No college is looking for a failed historian, and the sooner you accept that you need to get out, the better it will be for you. And if you can’t count on good references you are in for continual frustration. I know this is harsh, but it’s reality, and the sooner you accept it the happier you’ll be. If you don’t get out now you can expect a career on the adjunct circuit—and even adjunct jobs are getting harder to come by. If you haven’t already done so, file for your MPhil and move on.
But if all indications are that you’ve got a shot at this, you need to get on the market with gusto, starting with the fall semester of the academic year in which you will defend. Get an Interfolio account, line up your references, get them to upload their letters directly to Interfolio, and ask them to renew their letters each year you’re still on the market. Use your Interfolio references for EVERYTHING: even online applications, which ask for your referees’ contact information, can be given individual Interfolio e-mail addresses (instructions at Interfolio). Apply for every single job for which you feel qualified, based on your scholarship and your teaching experience—including temporary jobs outside of your scholarship as long as you’ve taught in that field. Check the listings every week at higheredjobs.com, historians.org, H-Net, and cuny.edu. Apply for each job well ahead of the deadline—and be conscious of whether the search calls for delivery by e-mail, online, or post. Plan on attending the AHA, but don’t be dejected if you don’t get scheduled interviews for it (I never did). At the AHA, check the job board and apply. I applied for two jobs at AHA Boston and got one interview while I was there. Keep checking and applying into the spring. Both of my tenure-track/permanent job offers were the result of applications with spring deadlines.
If you can get into the interview pool, you’ve got an excellent chance of landing a good job. And the more you interview, the better you’ll get at showing what’s great about you—and the more your chances improve. Because once you’re in that pool, it’s much more about personality and delivery than paper credentials. But there are some things you’ll need to keep in mind long before you get to the interview stage.
Most importantly, you need to start regularly reflecting on your teaching. You don’t need to be an excellent teacher to get a job, even at a teaching institution like a community college, but you do need to demonstrate that you think about your teaching and are looking for ways to improve it. It’s generally better to talk about the mistakes you’ve made as a teacher—as long as you can discuss how you overcame them—than it is to recite all the latest pedagogical techniques you’ve been trying.
Give some serious thought—preferably long before the interview—to why you want to be a teacher.
Once you’re offered an interview, research the school and think carefully about why you want to work there, why you think you’re a good fit, and what you want to know from the interviewers. Yes, I know, you want to work there because you want a job—any job—and all you really want to know from them, in this market, is whether or not they’ll hire you. But part of the process is to pretend you don’t feel that way. So do as much research as you can in advance, take notes, and write down questions. The questions, by the way, are as much about showing you’re serious as about getting more information. After all, if you only get one offer, you’re going to take it. But if you get more than one, you may not have a lot of time to make up your mind, so it pays to know as much as you can. You should ask about the teaching load and opportunities for research funding, service commitments, where the faculty tend to live, etc. But the question I always lead with is “tell me more about the students.”
You’ll probably be asked about using technology in the classroom. You might also be asked about online courses. Be honest. Talk about what technologies you like, and where you feel you could use more training. If you haven’t taught an online course (I haven’t), it pays to be able to say you set up a blog for your students and posted questions on it each week for them to discuss. But as with everything else, don’t just do it for the sake of being able to say you did it; reflect on the results and be prepared to discuss them.
Don’t ever say “pedagogy” in an interview; for that matter, don’t use all that many GRE words at all: you’ll come off as out of touch, incapable of relating to your students. You should speak in a relaxed manner, using plain English. Answer questions directly, and don’t be evasive; just like in your orals, don’t be afraid to say “I don’t know.” Hopefully you won’t have to say it that much in each interview.
Be positive; don’t be glib, trite, or flip; take the interview seriously even while showing that you don’t take yourself so seriously. I was once asked about my experience teaching junior high school. [I gave the wrong answer.] Needless to say, I didn’t get that job. Only in retrospect did I realize that they were asking how I could relate to students pursuing degrees in education. But more importantly, if I wasn’t prepared to discuss something on my CV, I shouldn’t have put it on my CV. A far better answer, obviously, would have been something like “it taught me just how hard teachers work and gave me a tremendous respect for people pursuing that calling.”
Be prepared with a list of courses you’d like to teach, not just the courses you’ve already taught. As a full-timer you will likely teach at least one or two courses a year beyond the introductory surveys. In interviews for two of the jobs I was ultimately offered, I was asked about my willingness to teach in my minor, so you should think about how you would design such courses. For your surveys, be prepared to list the textbooks you’ve assigned.
Try to think about the process from the interviewers’ perspective. They read more than 200 cover letters and CVs, then meet to discuss them and choose interviewees, then debate the choice of finalists. But that’s just the logistics: they’re often very anxious about making the right choice. They’ll be working with the successful candidate for at least seven years, if not many more; that’s a big commitment, and they don’t want to hire someone with whom they won’t get along. If you can relate to that experience and find a way to show that empathy during the interview, it will serve you well. People who can relate to different perspectives make good colleagues.
Don’t be put off by what appears to be a bad attitude on the part of the interviewers. You may think they don’t like you, but they don’t know you, and they might be role-playing to test your responses. You also might rub someone the wrong way; be cool, and you might win over everyone else. (That person you’re rubbing the wrong way might actually be disliked by everyone else.)
The more you interview, the more you’ll find that your answers seem (to you) rehearsed. Just remember that this is the first time your interviewers are hearing them, so behave accordingly. This holds true even with multiple interviews on the same campus: the dean hasn’t heard your spiel with the committee, so just say it all again. You’ll develop a few catch phrases, and that’s great: it will build your confidence, and confidence (as long as it’s not arrogance) comes off well in an interview.
Full-day campus interviews will include a meal, but late in the day your energy will start to flag. You might have breakfast with the committee chair, a meeting with the full committee, lunch with committee members, a teaching demo or job talk (to which the entire community might be invited), a walking tour of the campus, and one-on-one meetings with the chair, dean, and/or provost. Bring granola bars to keep your energy up; eat them when you’re getting tired, rather than waiting until you’re hungry. Because of adrenaline and anxiety you might not actually get hungry until much later.
With the teaching demo or job talk, style and substance both matter. As with the orals, if you don’t know the answer, don’t fake it; admit it candidly and explain that you use such moments in your teaching as an opportunity to show how professors aren’t omniscient, and that you turn it around and ask the students what they think the answer is. Try to teach towards your specialty, both because that’s where your expertise is and also because it shows how you relate your scholarship to your teaching. In one demo I was asked to lecture on the social and political changes of the Age of Jackson. As a scholar of African-American history, I focused on the growth of abolitionism during that era and the political changes regarding the expansion of slavery. I didn’t even mention Andrew Jackson himself until the Q&A. I got the job.
A good way to prepare for the teaching demo: teach (duh). But here’s a good way to prepare for the job talk: convert a dissertation chapter into a fifteen-minute presentation, then start delivering it at conferences. You can find a list of Calls for Papers at H-net; find the conferences that seem most likely to accept your paper, write a proposal, and send it off. After you’ve delivered the paper a few times, the job talk will be second nature—and all those presentations are lines on the CV!
When you get a job offer, ask how long you have before they need a decision. Then IMMEDIATELY inform any other search committees for whom you have interviewed but not been told whether or not you got those jobs. If it’s true, say “I’d really rather work with you, but I need to make a decision soon.” I did that, and within two days I had a better job offer—which I accepted.
Try to avoid pulling out of jobs which you have already accepted, but you can exploit the job-value hierarchy. It is understood that you can and will back out of adjunct commitments for a full-time job, and you can back out of a temporary job for a permanent job. This is not unethical as long as you have not signed a contract (and may be ethical in any event when it comes to backing out of adjunct contracts—just be sure you’re not breaking the law). But if you have signed a contract for a temporary full-time job, you must meet that commitment. You also may not ethically back out of one job for a better job in the same tier, i.e. backing out of a tenure-track 5-5 for a tenure-track 4-4, or backing out of an adjunct job with an hour-long commute for one closer to home. In this regard a verbal acceptance is ethically binding, so it pays to ask for as much time as possible and not commit until you’re sure you’re ready. Just don’t miss out! It’s better to turn down a better offer in the same tier later than to lose one job hoping for a better offer that may never come.
You can imagine just how excited and relieved I am to be off the market. But like Joe Sramek before me, I felt that I could not leave for my future on the tenure track without imparting what I have learned to my fellow CUNY PhDs and ABDs. See you at AHA Chicago!
Leave a Comment
| || |
||AAAHRP Conference, Seattle, Washington
I'm sitting in my room at the Arctic Club Hotel in downtown Seattle, enjoying my last morning of Pacific Time. Yesterday's presentation at the Northwest African-American Museum was a success. I delivered my latest paper on Art Fletcher, the father of affirmative action, and faced interesting questions. King County Councilmember Larry Gossett, referring to Art's 1967 election to the Pasco City Council, asked about the minority and general population of Pasco at the time. A Washington State University Ph.D. candidate named Marc Robinson asked for more information about the Pasco drug trade. In both cases I did not have the answers, so I'm glad the questions were asked.
I also had the good fortune to meet Patsy Fletcher, Art's daughter-in-law (Paul Fletcher's ex-wife), who chaired a panel later in the day; and Nat Jackson, Art's protege and successor at the East Pasco Self-Help Coop, who drove up from his home in Olympia. Nat and I spent most of the afternoon together; he had brought a VCR to play me a tape he had made in 1995 in which Art announced his presidential run.
Tacoma Mayor Marilyn Strickland--who has the distinction of being Tacoma's first African American woman mayor and first Asian-American woman mayor (her father was black and her mother is Korean-American)--delivered an excellent keynote address, discussing (and lamenting) the continued importance of race in American politics since the election of Barack Obama.
My favorite presentations included that of University of Montana Professor Tobin Miller Shearer, who compared the aggressive use of prayer at civil rights protests in the South with displays of guns by black militants later in the period; and that of Western Carolina University Professor Pamela M. Harris, who analyzed the dearth of newspaper accounts of Irene Morgan's Supreme Court decision in an attempt to determine why Morgan, who refused to give up her seat on a segregated bus in the 1940s, is barely remembered in comparison with Rosa Parks.
Leave a Comment
| || |
||Mayor Bloomberg's "Let 'em Eat Cake" Moment
On Monday, the first day after the snowfall, Mayor Mike Bloomberg, speaking from his perfectly plowed, salted, and cleared East 79th Street, suggested New Yorkers "relax and take in a Broadway show." I'd love to take in a Broadway show, but not only are ticket prices far beyond what most New Yorkers consider an affordable evening's entertainment (one current show has tickets from $76.50 to $289.00) but there were no trains running anywhere near my neighborhood, and my street hadn't been plowed, so I couldn't drive to another neighborhood with underground trains. On Tuesday morning, the F train started running again, and on Wednesday afternoon my street was plowed. (As of Thursday morning, we still haven't had any mail delivery.) Mind you, the snow stopped falling early Monday morning. To add insult to injury, the mayor then told New Yorkers that we were to blame for driving in the snowstorm and getting stuck. But the coup de grace came when he said that "people’s perceptions were based largely on whether their own streets were clear." Yes, and that's most of us! On Tuesday night, when my local F train became Brooklyn's only above-ground line back in service, I went to Park Slope. I walked along 8th Avenue from 9th Street to Garfield Place. Every residential street (with the exception of the one next to the hospital, thankfully) looked exactly like my own in Gravesend: unplowed.
What this episode indicates is something that many New Yorkers already knew about our plutocrat mayor: that he is not one of us. I don't mean that he's from Boston; most New Yorkers originally hail from elsewhere. I mean that he is not representative of us, he doesn't get us, he hardly even pretends. He doesn't value our opinions, and when he hears them, as with the snow, he attributes them to venal self-interest (which says more about him than it does about us). I can't say I disagree with all of his signature policies; his successful smoking ban in bars and restaurants, his push for calorie counts on menus, his expansion of bike lanes and pedestrian-only plazas, and his unsuccessful attempt to decrease the number of cars in lower Manhattan all strike me as improvements to New York City and our way of life. It's the way he does it, running roughshod over our feelings and opinions. Another recent example, besides the poor way he handled the blizzard, was his appointment of a media executive to head the Board of Education. Tone-deaf.
And so I invoke a sad episode in history for comparison. What made Marie Antoinette's statement historically significant was that it showed just how cut off she was from her people, just how little she understood life for everyday Parisians. For her, when the royal pantry was out of bread, she could simply have cake (another type of bread). But for the hungry of France, there was no such option. She could hardly conceive of such a life. And "Mayor Mike," while obviously not as distant (he did, after all, have a middle-class upbringing), is sufficiently distant nonetheless.
Marie Antoinette went to the guillotine. Will Bloomberg's political career suffer the same fate?
Leave a Comment
| || |
||65,000 African American Soldiers in the Confederate Army? I Think Not
Historian James R. Grossman recently posted this blog entry on a current elementary or secondary school textbook in use in the state of Virginia.
Obviously the historical profession needs to work harder to ensure that children at all levels are assigned quality history textbooks, and another problem is the ability of partisans of historical "pseudofacts" to use the internet to spread false messages about the past. But as an internet user and sometime blogger, I'd prefer to meet the challenge head-on. So here's my rebuttal to the notion that there may have been as many as 65,000 African-Americans in the Confederate Army:
Certainly we can dismiss out of hand the notion that black soldiers served under General Stonewall Jackson, who died in May of 1863, nearly two years before the Confederate government authorized the recruiting of black troops. Further, the fact that blacks were not allowed to serve in the Confederate Army until weeks before the end of the war poses a logistical problem for the claim. Could the ragtag remnants of the Confederate government and army even have mustered in (let alone trained) 65,000 volunteers of any color in that short time span? And if they did, wouldn't they have been able to win a few battles and prolong the war into at least the early summer of 1865? With 65,000 fresh recruits, why would General Lee have surrendered when he did?
So clearly we're not talking about actual soldiers, legally recruited and trained. What then are we talking about? Assuming that this figure of 65,000 was not made out of whole cloth, is there any other way to arrive at it?
If we include, in addition to the handful of actual black soldiers who did enlist to fight for the South during the Confederacy's final days, the blacks (most if not all of whom, presumably, were slaves) who served support functions for southern white soldiers and officers during the war, perhaps we could get to that figure, but I doubt it. But what if we counted all the slaves owned by Confederate army officers? If we consider these human beings as personal property, as slave-owners certainly did (and modern Confederate apologists certainly still do), then being owned by a soldier in the Confederate army would make them "part" of the army in the manner in which a soldier's canteen is "part" of the army. These people may never have seen a battlefield; many of them may even have abandoned their owners after hearing news of the Emancipation Proclamation; nevertheless, Confederate apologists might still consider them somehow part of the army, and when added together, they might meet or even exceed the figure of 65,000.
But this seems to me the very definition of comparing apples with oranges. Even if 65,000 slaves were coerced (with offers of freedom or more direct modes of coercion) into the Confederate ranks, how does that compare to the 180,000 former slaves and free blacks who willingly and eagerly fought for their freedom on the Union side?
Look, some black southerners served in the Confederate army in the final days of the Civil War. OK. And there's a nut down in Asheville, North Carolina, a black man who marches to the town square every morning in full Confederate regalia. But it just isn't important enough to record beyond a footnote--certainly not in comparison to the service of black soldiers in the Union army. The attempt to inflate the figure is an attempt to make black Confederate soldiers more relevant than they were--and thereby justify the false claim that the Civil War was not about slavery.
What do you think?
||Thirty Years from Independence Plaza
As I start my new job here at BMCC, I can't help but feel as if I am returning after nearly thirty years. It was in 1982, twenty-eight years ago, that I graduated from P.S. 234, the Independence School, at the end of 5th grade. My morning walk from the train to my office at BMCC today passes the drop-off point for the school bus I took every morning from Greenwich Village to Greenwich Street.
Tribeca was a different neighborhood back then. De-industrialization had left this once-thriving commercial neighborhood, with its numerous piers on the Hudson, virtually an empty shell. Dark warehouses and loading docks piqued the curiosity of the elementary-school student traveling in and out every day. Independence Plaza--designed as an attempt to remake the neighborhood as a residential gem--was more of a dangerous housing project than an urban oasis.
And of course, there were the twins. Recently completed--but for all I knew as a seven-year-old, longstanding fixtures--these two shining edifices of glass and steel represented the anchor of a neighborhood desperately trying to recover. Strength amidst squalor. How excited I was--we all were--when the school bus driver would ask, on a morning with particularly light traffic, if we wanted to go see them! Instead of turning on Harrison Street, he'd stay on the West Side Highway--then a dark nest of drugs and prostitution under the old viaduct--and pass the gleaming World Trade Center, the "twin towers," before making a U-turn at Battery Park.
But most mornings were mundane--in a Greenwich Village sort of mundanity. My stop was at University Place and East 13th Street. There was a corner deli where I learned to steal candy and subsequently learned not to steal candy--still there, with different owners--and there was a second-floor window where a fat man would parade around naked--also still there, but presumably with a different tenant. Stromboli Pizza--which we called Amadeo's after the boxer-turned-restaurateur who owned it and made every pizza by hand, flipping the dough in the air--is also still there, but Amadeo himself is long gone. (His picture remains, but the owners are not related.)
We'd line up and board the bus, and it would take us south on University Place--University was a two-way street then, and New York University then gave as much to the neighborhood as it took--then around Washington Square Park and down LaGuardia Place into Soho. Then it was west on Broome and north on Hudson to drop off the P.S. 3 students and pick up my friend Uri Feiner, who today is godfather to my daughter. Then west on Christopher to the West Side Highway. Finally, east on Harrison and south on Greenwich to what was originally called P.S. 3 Annex, a small elementary school at the top of what seemed like an incredibly long flight of stairs to a plaza with a playground beside a very tall building.
Mundane also meant an early exposure to Pink Floyd. The driver blasted The Wall--that paean to adolescent male angst--and the whole bus would sing along in the afternoons when he'd reach "Another Brick in the Wall, Part Two." Our favorite line, naturally, was "We don't need no education."
One day, during recess, Shawn Adams and I took advantage of the lax security among our teachers, and ran off along the promenade on Independence Plaza's western edge, crossing above Harrison Street and exiting the Plaza at Franklin Street for an unsupervised slice of pizza and a turn or three at "Space Invaders" or "Asteroids." The pizzeria may have also had a bar; it was certainly dark, and knowing what I now know about the neighborhood then, I consider us lucky to have gotten back to school unscathed, let alone our absence undetected.
One horrible morning, Etan Patz didn't get on the bus. His stop was on West Broadway and Prince Street. We weren't friends; he was one year behind me, and a year is huge in elementary school. But everyone on the bus soon knew just about everything there seemed to be to know about him, with pictures plastered all over downtown Manhattan. And after that, one of my parents always walked me to the bus stop.
Graduation came in 1982 and it was off to junior high without a second thought. High school, military service, my twenties, college and graduate school--all have come and gone without visits to Independence Plaza. I've sped by many times on West Street--now without the viaduct--toward the Battery--now without the twins.
After thirty years, Tribeca is a different place. Yes, Independence Plaza and the warehouse buildings are still here, but the Plaza is now the posh urban oasis its builders envisioned--and far too expensive for a BMCC professor. The warehouses, however, are more accessible, with trendy restaurants, bars, and coffee houses. The piers have been replaced by the World Financial Center and Stuyvesant High School. There's less room for imagination, but there's also less danger.
I'm glad to be back.
Edith Pajarito, 11/14/10
I wad going to leave a comment about the 65,000 african-American that fought in the civil war but there is not way to leave a comment. Instead, i will leave a commet about 30 years after independence plaza. Is interesting how independence plaza has change through the years and more about how this area has become expensive but safeter but missing the Twins that were a great loss to this wonderful city.
||Advice from the Abysmal Job Market
Three years ago my friend Joe Sramek defended, went on the job market with over 100 applications, had about a dozen interviews at the 2007 Atlanta AHA (and turned down a few more that he couldn't find time for), had three on-campus interviews, and was offered and accepted one tenure-track job. He was both lucky and talented. Since then, as you know (unless you've been blissfully nose-deep in the books these past few years, which is entirely possible), we've entered the worst academic market possibly since the Middle Ages—but definitely since the 1970s. Under these circumstances, it’s very difficult to hold ourselves up to Joe’s standard. Having just completed a year on that market with very different results, my advice may prove useful to grad students who will be defending in the next year or two.
First, to parallel Joe's story, let me tell you about my year. I went on the market with over 100 applications. I had two telephone interviews for tenure-track jobs and two local campus interviews for replacement faculty jobs (in my case, CUNY full-time substitute positions), but exactly zero interviews at the 2010 San Diego AHA (although I enjoyed attending panels in the balmy weather). I was also placed on a pre-interview shortlist for one tenure-track job but didn't make the interview cut. All four of the preliminary interviews resulted in second-round on-campus interviews. One of the tenure-track jobs resulted in a rejection; the other ended in a cancelled search (for administrative reasons; I've been encouraged to re-apply next month when the search is reopened). I was offered both replacement jobs. I still don't have a tenure-track job, but under the circumstances it was a successful year. I am a full-time faculty member with a full-time salary and summers off--and the opportunity to try again.
Some will advise you not to go on the market the semester you defend. Others will say you won't get a tenure-track job in this market without a current replacement job and a book contract. I say you should apply for anything and everything for which you are qualified. Serendipity, synchronicity, and dumb luck are incredibly important and underrated factors—and completely beyond your control. But if you don't apply, you definitely won't have a shot. The financial outlay is minimal—the cost of postage for those departments that are still using snail mail, and the cost of Interfolio dossier service. The energy outlay, however, is significant. Be prepared for 8-12 hours per week in the fall, and 1-2 hours per week in the spring—not counting interviews. The more you do in advance, the easier the individual applications will be.
Use Interfolio's dossier service. Do you really want to bother your referees for every application? Get three trusted faculty members to write strong confidential letters of reference, and have them upload them to your Interfolio account. You should also order a copy of your doctoral transcript to be sent to Interfolio. Once your dossier is complete, you can order Interfolio to send the documents in any combination--by e-mail or post--directly to each search committee. Interfolio will send your documents for $6 per application. You can also upload your CV to Interfolio, but I don't recommend that, because you'll want to change your CV too often for the service to be useful.
Some more documents you’ll need, in varying combinations, for most applications: A research agenda (what you plan to do with your dissertation, and what you’re thinking about doing next); a research philosophy (how you approach the task); a teaching philosophy (ditto); three writing samples (a full chapter, a truncated chapter, and something 2-3 pages long--like a book review); your dissertation table of contents as a separate file; sample syllabi (from courses you’ve already taught as well as for 2-3 courses you’d like to teach as electives); and your teaching evaluations (scan your copies and combine them into a single document). If you plan to apply for post-docs, you should prepare a research proposal. And if you plan to apply for public history jobs with the federal government and are a military veteran, scan your DD-214.
And then there’s the cover letter. I have five draft versions and I still tailor them for each application (beyond de rigueur address and date changes). My main cover letter is for research universities. Then I have one for small colleges and community colleges; one for replacement jobs; and one for replacement jobs at small or community colleges. (Small and community colleges prize teaching flexibility—if you’ve taught in your minor or in other disciplines that’s a plus—and replacement searches usually aren’t looking for a permanent colleague). Finally, I have a cover letter for post-docs (sometimes they require it). The cover letters for jobs should include a section on your scholarship and a section on your teaching. I put scholarship first for the big colleges and teaching first for the small colleges. But read the call for applications carefully and add or subtract based on what each search committee wants.
Check out these three articles: “How to Make Your Application Stand Out,” by Rob Jenkins (Chronicle of Higher Education, November 23, 2009); “Dodging the Anvil,” by Thomas H. Benton (Chronicle of Higher Education, January 4, 2010); and “Subtle Cues Can Tell an Interviewer ‘Pick Me,’” by Phyllis Korkki (New York Times, September 13, 2009).
The fall is application season, with deadlines usually beginning on October 1 but sometimes stretching into late December. During the late fall telephone interviews begin, and most primary interviews are conducted in the early winter, often at the AHA. In the late winter and early spring there will be on-campus interviews and applications for late searches and replacement jobs (these are called substitutes at CUNY and visiting professors elsewhere). Don’t give these short shrift; until you have a job offer, you should remain actively on the market, and being a replacement is far better than adjuncting.
Where are the jobs listed? Once a week I check historians.org, H-Net, and CUNYFirst. And tell the office administrator to keep you on the grad student listserv--it's how I first heard about all my actual jobs except my current one.
Lastly, don’t forget to keep up with your scholarship. Your break after your defense should be no longer than twice the length of the break you took after your orals. Get to work on your book proposal. There’s no better way to cheer yourself in a horrible job market than with the news of a book contract (and it’s helpful in landing those jobs, too)!
||On Gates, Crowley, and Race
It was only a matter of time before I blogged on this. I am a historian of African-American history, after all. But so far I've been content to let people like Bob Herbert cover it. Having had a month to mull it over, I figured this was an apt time to weigh in.
Some background. On the night of July 16th, Harvard scholar, activist, and TV personality Henry Louis Gates, Jr.--who is Black--returned to his home to find that he couldn't get in through the front door. And so he entered through the rear, prompting a concerned neighbor to call 911. Sergeant Crowley of the Cambridge Police Department--who is white--arrived on the scene with another officer, who is Black, and possibly other officers. Gates came to the door, produced ID to prove that it was his own residence, and was arrested for disorderly conduct. The charges were dropped. President Obama, asked about the incident at a press conference on health care, called it "stupid." The president later apologized and invited Gates and Crowley to the White House for a beer.
First, Gates. Gates is incredibly smart. He knows that as a Black man he is in more danger of unwanted police attention in America than a white man, and he has used this knowledge to avoid angering law enforcement officials in the past. He does this, like other smart men, by engaging in some variation of the humiliating "shuck, jive, and grin" routine that puts racist whites at ease--the whites move the unknown Black man from the "possible dangerous criminal" category to the "happy slave" category. Smart Blacks do this because they know that statistically they are more likely to be targeted by police officers than whites. They do this because they'd rather be humiliated in the short term than endure the treatment of Amadou Diallo, Abner Louima, or Sean Bell. But after proving that he was the lawful resident of his home--and having endured a lifetime of occasional humiliation because of his race--Gates got angry and apparently insulted Sergeant Crowley (he may have said something about the sergeant's mother). Excusable? Perfectly. Legal? That too.
Next, Crowley. Sergeant Crowley is what we in race relations call an unconscious racist. That means when he thinks intellectually about race, he believes in equal rights, and in his better moments he demonstrates what President Obama might call a "post-racial" mentality. He has apparently led racial sensitivity seminars in the Cambridge Police Department. But as a white man in America he is the product of generations of racial conditioning which have not been erased in the half-century since Brown. (By this definition, everyone in America--including this author as well as Gates himself--is at least an unconscious racist--with some of us genuinely trying to overcome it and others, on the opposite end, who are conscious or overt racists--people who believe in segregation or white supremacy, for instance.) Statistically, Crowley was more likely to ignore Gates' anger and insults if Gates was white. But he didn't ignore it. As a small crowd began to gather, apparently Crowley decided that Gates' behavior, if left unanswered, would lower the respect of the community for the police. Or he took it personally and decided to get tough. Either way, he made an arrest that--while technically legal--was absolutely wrong. Because as a police officer, he's supposed to de-personalize the situation. He's supposed to be able to weather a few insults here and there. He's supposed to recognize that anger seemingly directed at him may actually be directed at the system he represents. In short, he's supposed to have a thick skin.
I am glad the police exist. I believe that a free society needs a strong constabulary to maintain peace. I served for nine years as an Auxiliary Police Officer with the NYPD. The police provide an essential service for which I am grateful. But that doesn't blind me to the institutional racism found throughout American police departments; nor does it blind me to the individual psychology which prompts youngsters to become police officers rather than, say, firefighters or emergency medical technicians. Further, police officers accept a certain degree of risk as part of their jobs. In exchange for that risk, we honor our police officers much the way we honor our military veterans. And police officers extend a professional courtesy to each other--far too much, I think--by not writing speeding tickets or, in more extreme cases, by protecting each other from criminal charges (the so-called thin blue line). Ultimately, however, regardless of how they (illegally) treat one another, police officers who arrest civilians for what amounts to disrespectful behavior are wrong to do so. Even if they are afraid for their own safety, their fear--especially when unsubstantiated--does not outweigh our rights.
Next, the Black officer. Apparently the Black officer or officers on the scene have backed up Sergeant Crowley's account of the incident and support his decision to arrest Gates. This is not surprising. There is a well known saying in Harlem: "fear the white cop, but fear the Black cop more." Black cops are, for the most part, incredibly self-conscious with white cops, constantly trying to prove themselves. And they usually do so by adopting the attitudes and behaviors of their white counterparts, in the extreme. In short, the Black cop will typically side with the white cop before he sides with the innocent Black civilian in order to avoid being labeled as untrustworthy by the white cops. He is leaving behind one community as fast as he can in order to be embraced by another community. He is often successful; today many Black police officers enjoy a tremendous amount of respect from their white colleagues. But the price is that they have turned their backs on an important aspect of justice: they have ignored institutional racism (except when it affects them personally).
Finally, President Obama. The president was right to label the incident "stupid." It was stupid. That's not the same as labeling Sergeant Crowley stupid--which he did not do. Both the president and the American people know the difference between a stupid act (of which we have all been guilty) and a stupid person. When tensions get high, smart people do stupid things. They often immediately regret their behavior (although if the punishment for this one is a beer and a family tour at the White House, I doubt Crowley has any regrets). But the press, trying to sell the story, turned an accurate, honest comment into a scandal. Of course, that's what they do: they're trying to sell papers. Unfortunately, the president decided to play their game and apologized. Politics is politics, and that's what the "stupid" incident and beer moment were all about.
What we're left with, in the end, is what we began with: the continued refusal of the American people--even with a Black president--to have a serious discussion about race. Oh, from time to time we'll have a "town hall meeting"--and these are useful in that people can give voice to the way they feel, and viewers get to see different sides. But they tend to be dominated by the uneducated, with typical statements like "I don't see why I need to be held responsible for something that happened four hundred years ago" (Ted Koppel's Nightline Town Meeting following the lynching of James Byrd, Jr.). We, the educated, need to do a better job informing the public that slavery didn't end 400 years ago (about 146, actually) and has legacies that continue today, and that whites--even poor whites--continue to benefit from those legacies.
I guess it starts here.
konrad kozieja, 12/07/09
I actually found this incident very interesting, first of all i believe that Mr. Gates should of pressed heavy charges against officer Crowley. Unconditioned arrest is unfortunately becoming a very common "thing". People especially in NYC when approached by police believe that the cop has a right to do ask them and do anything. To be honest this is not just a racial issue , even though being African-American can categorize an individual as an immediate "suspect" police brutality and what one can call "unfairness" is a wide spreading issue. Two years ago my hair was longer than one of an avarage female and i went through countless encounters with police, for example i've been pulled over a lot of times by police officers trying to search me and my belongings, at first I was careless and i would let them do "their thing" but as i progressed in my knowledge of law and police procedures I realized they had NO RIGHT to condone their searches on me. So after a while I started to refuse being randomly searched on the street. What I am trying to conclude is that people need to simply understand their rights as a citizen of this country. The fear of the police badge has to be avoided...
Flamur Nikaj, 09/03/09
This is the first that I'm hearing of this story and I'm not surprised to hear about a police officer mistreating a black man. The fact that Gates was arrested for showing a little emotion after being embarrassed in front of a crowd of people is despicable. As an educated man I would believe Gates is high on his pride and intelligence. I'm sure he felt humiliated and disgusted because he is being viewed by his neighbors as a criminal. It is normal for a person to react the way he did, especially if it has happened to him in the past. Officer Crowley tried making an example of Gates by arresting him and that is unjust. He didn't want to show weakness by allowing Gates to walk away seemingly unpunished. Gates' reputation has forever tarnished after this incident and it was all over something that the president calls "stupid."
||What Value Life?
Let's be clear: I cannot influence in any way the war in Gaza, Israeli public opinion, or the official American response. I blog because people I know--many of them, anyway--seem to care what I think. Also, I am a teacher who feels very strongly that politics and matters of opinion do not belong in the classroom, but who believes that he has a responsibility to enlighten on matters about which he cares deeply. And so, I blog on the Gaza War.
First, I should state that I am a secular Jew. I was raised in the Village Temple and had a bar mitzvah. Although my father was born a Jew, my mother converted to Judaism after I was born, so the orthodox rabbis of Israel might have some qualms about granting me citizenship were I to apply (and, barring unforeseen circumstances, I have no intention of applying). My wife was born Jewish and so, therefore, was my daughter. But I am not currently a member of a temple. I do not practice, although my daughter underwent a Reform naming ceremony at my father's temple. I do not fast on Yom Kippur, although I will attend a break-fast for the whitefish salad. But we light a menorah on Chanukah nights when we are both home, and I have been known to lead a seder or two.
Second, I should say that I am a Zionist, albeit in the mildest sense of the word. I see the existence of the state of Israel as a beacon of world Judaism, an important re-affirmation of our legitimacy as a people and as a nation, especially important in the shadow of the Holocaust. Anti-semitism will continue, but as long as there is a state of Israel--as a Jewish state--then there will be at least one place where Jewish refugees will never be turned away. Israel's form of government also states to the world that Jews--wherever they live--are a modern, democratic people--notwithstanding our authoritarian, chauvinist rump.
Since I believe in the legitimacy of the state of Israel, it follows that I believe that Israel has the right--indeed the responsibility--to defend its citizens and its borders. I do. Israel has the right and responsibility to defend itself against existential threats on the battlefield--as it did in the 20th century against Egypt, Syria, and Jordan. And Israel has the right and responsibility to defend itself against suicide bombers in its midst and rocket attacks from territories under its control.
When Israel was fighting state-based existential threats in the 20th century, like those presented by Egypt, Jordan, and Syria, the response was proportionate to the threat. The threat was the end of the state of Israel. Israel did not conquer the invading countries, but it fought conventional battles and, when successful, occupied buffer sections of those countries so as to attain geographical security: the occupation of the Sinai peninsula and Gaza Strip limited Egypt's strategic control of the Suez Canal and Gulf of Aqaba, from which it launched its earlier attacks; occupation of the West Bank geographically "thickened" Israel to make it more difficult for Jordan to try to cut it in two. When Israel later signed treaties with Egypt and Jordan, the Sinai--which Egypt wanted--was returned, but Gaza and the West Bank--which Egypt and Jordan did not want for demographic reasons--were not. (The rulers of Egypt and Jordan were concerned that incorporation of those areas into their own countries would dangerously increase the proportion of Palestinians to Egyptians and Jordanians. Based on their response to the current war, they still are.)
The threat since then has come not from the neighboring states--notwithstanding Iraq's Scud missile attacks during the Gulf War and Syria and Iran's continuing animosity--but from the people of the Gaza Strip and the West Bank, the Palestinians. (One of the causes of this threat is, of course, the historical and continuing mistreatment of Palestinian civilians by the Israeli military, but the terrorists have always been--and remain--a minority.) But unlike the earlier state-based threat--and notwithstanding the rhetoric of Hamas--this newer threat is not existential. The existence of the state of Israel, at least for now, is not threatened by the Palestinian people in Gaza or the West Bank, not by Hamas, and not by the launching of Qassam rockets.
This is not to imply that Hamas' behavior is good, or that the suffering of the Israelis affected by the launching of Gaza Qassams is not tragic. Quite the contrary. These acts are evil and illegal. But the Israeli response must be proportionate, as it was against the existential state-based threats of the 20th century. Which brings me to the crux of the argument: the Israeli response has not been proportionate; it has been criminal. Yes, criminal. Israeli civil and military leaders are committing war crimes.
Whether or not civilians are being deliberately targeted, large numbers of civilians (numbers that dwarf those of Israeli casualties from the Qassams, before and during this war) are being killed in targeted structures. When you are ordered to bomb a target in which innocent civilians are known to be present, you cannot press the button--you cannot give the order--unless the threat from that target is greater than the potential loss of innocent life. As far as we know, such threats (nuclear weaponry, for instance) do not exist in Gaza. Those Palestinian civilian lives are of equal value to the Israeli civilian lives, regardless of which party they voted for in the last election. And the behavior of Hamas does not justify the behavior of the Israeli civil and military leaders in issuing and executing these orders.
One argument made frequently by those who applaud the bombing in Gaza is that Hamas in particular and the Gazan Palestinians in general are like the Nazis--that even though most Gazans are not members of Hamas, the party's anti-Israel demagogues exercise a degree of control over Gaza similar to that held by the Nazis in Germany in the 1930s. There is some validity to this, if only in the fact that the economy of Gaza is similar to that of depression-era Germany, and the humiliation of the Gazans is similar to that of the Germans after World War One. But unlike interwar Germany, Gaza is not now (and has virtually no chance of becoming) a major military-industrial power capable of conquering its neighbors and launching a genocide in Israel. Iran has that potential, but Israel is not bombing Iran (that would be a mistake too, but for other reasons). The fact that the now-moderate Fatah retains political control in the West Bank means that the Gazans might also de-radicalize with an improved standard-of-living (which is probably impossible until Gaza's borders and seaports are re-opened).
Gaza is not a "concentration camp," as a Vatican spokesman recently labeled it. But Gaza is a ghetto, like the Warsaw Ghetto (albeit much larger), into which many of its residents were forced as refugees, and from which there is currently no exit. Nor are Israeli civil and military leaders Nazis, as some West Bank protesters recently called them. But some of the decisions being executed by the Israeli military are similar to some of those executed by Nazis against civilian populations--especially Jews--during World War Two.
I do not pretend to have all (or any) of the answers to the continuing crisis in the Middle East. But what I do know is that a history of persecution does not give the Jewish people--my people--the right to persecute. The civilian lives of Palestinians are just as valuable as those of Israelis. And until we all understand that simple principle--that all human life has value--the tragedy will continue.
Shaquanna Cole, 11/10/10
Hello professor I know I don't talk much in your class but i found your blog entries to be very deep. You seem to care deeply about the issues not only in America but all around the world as well. To me it interesting to know that there is a ghetto in Gaza where al these refuges camps are.
Vadim Popov, 01/02/10
Well, I wish I knew where to start. I would not be here if I did not have a lot of respect for you, Mr. Golland; that would explain my interest in you and your career! I had the great honor of being one of your students, and I was very, very lucky to have you as a teacher!!! (That would explain my coming back to your website.)
I am a Russian-speaking Americanized Jew... hated by most of the free world, and by self-hating Jews as well!
I am not sure you are right in saying that you are just stating your opinion. That is not true: as a great teacher you influence the opinions of generations!!! So you had better watch what you are saying, and as a history teacher do your research on history before you state something.
I understand you need to be liberal and other b.s., and you have to be liked by the majority of the world, and that requires you to be less than honest about Jews and the State of Israel....
But being a teacher, a great teacher, as I can say from my own experience, you really need to do your homework about Israel.
This country was born in 1948 and fought against Arab countries that were trying to destroy it from day one! It has survived against all odds, and it does more good for the world than most Arab countries put together (but who cares about that, as the world hates Jews!).
There are 22 Arab countries in the world, and they don't want to take the Palestinians. Egypt could take Gaza and Jordan could take the West Bank, but then they couldn't blame the Jews for Arab problems (and by the way, those territories were once part of those countries).
You need to understand that in 1929 there was no Jewish state, yet Arabs killed Jews in the future State of Israel for no good reason back then. What was the excuse then? Please advise??????
Arabs go out and celebrate death... why? When one of theirs goes out and kills Jews or Christians, they celebrate (WTF, why?). What mother can praise her own child for being blown up while killing other human beings?
When September 11 happened, some Muslims were really celebrating in this country... why?
I tell you why they hate Jews, and Christians, and Hindus, etc.: they celebrate death! I have been to Israel and seen seasoned soldiers cry, cry when their fellow soldiers were killed by the so-called peaceful terrorists you so proudly defend!
It is a clash of cultures and morals. Being liberal, most people support the Arabs... as they are scared to say the truth... Jews cry and feel pain for every single loss of their fellow Jews around the world, while Arabs take great joy in death, the deaths of their own and of Christians and/or Jews!
Maybe you should go to Israel and see what those bad, evil Jews are doing there...
P.S. I am honored to have had you as a teacher; you are a great teacher!!! You touch a lot of hearts and minds, yet before you influence minds you must know what you are saying about Israel....
I am a student of Dr. Golland, and found this blog from the site our class uses. Dr. Golland, this blog is an amazing piece and has definitely changed my view of you. I have come to look at you now not just as a teacher, but also as a thoughtful scholar. I am thankful for this, as I am certain it has only enhanced my inclination to think more deeply and at length about your lecture topics, and thusly enhanced my own education and understanding of the subject matter. For some reason, students have a tendency to de-humanize their professors and in some cases in doing so, create a relationship of opposition (sometimes only in their mind) between themselves and their educators. They don't realize, however (as I do not, until I call myself out on it internally) that it really does them nothing but disservice, and cripples their education. I will admit that I am guilty of this to some extent, with some professors I have had more than with others: students tend to think of their teachers as blind pushers of assigned subject matter than scholars who are genuinely taken with the subject that they've (in most cases) devoted their adult life's studies to. I am grateful and happy to have a professor I can look up to not just as an educator but as a scholar as well.
Now, to the subject at hand:
"I have the same fear - of being labeled an anti-Semite, a Jew-hater, etc. All for what? For being a person with a conscience?" -Zeeshan
I am very relieved to see more of this type of thinking. Being engaged to a Jewish man who (quite vocally) shares much of the same opinions you have displayed here, I have followed his example and become deeply interested in this situation. My interest and my willingness to share my opinions and engage in discussion regarding this matter was brought to a grinding halt when I was called both of the things the poster above me stated that he was afraid to be called. One of the people engaged in what I felt was a very mature, intellectual discussion suddenly reverted to calling me a "Jew hater" with "disgustingly anti-Semitic" opinions: and all for stating what was basically in line (and certainly not outside of) what you just stated above. I was aghast with shock and a strange sense of hurt, it truly is a terrible thing to be accused of. I started to question my own views: Was I wrong? Did my disagreement with the way Israel had conducted themselves really make me a "Jew hater"? Did dissension turn me into a bigot? Shortly after this, I witnessed a very heated (and nearly physical!) exchange between an elderly Jewish woman and a Palestinian exchange student in the commons at Hunter. The screaming made the argument nearly unintelligible, but one could make out the name-calling, at least. It took the security men an uncomfortably long time (in which a crowd gathered to watch, most of them open-mouthed in disbelief) to separate the two arguing women. In that time I realized that the emotion of the situation was quickly turning people into irrational monsters, incapable of placing themselves into the "shoes" of others, and that maybe disagreement didn't automatically make me into a bigot, but it certainly did enable otherwise rational people to become blazing, unstoppable locomotives of hate and judgment.
In any case, the insecurity and uncertainty I feel from being accused of hatred for simple disagreement was somewhat lightened after reading this. I really sincerely hope you continue to post, as I will continue to read.
Overall, this is a solid and balanced assessment. And I say that as an Israeli who comes from a family with strong roots in Zionism and the Israeli military.
I would like to add a few points:
1. The mass media reports to the world that Israel has been getting hit by rockets on a regular basis prior to the start of this war. While true, it leaves out that the motivation for those attacks has been the complete sealing of Gaza from the outside world by Israel's military forces. This means no trade, no supplies, no real ability to have an economy. Hamas and Gazans view this as an act of war, as would most in the United States if, say, Canada were to do that to us.
2. It is important to understand that this war is no surprise. By sealing up Gaza, Israeli politicians such as Olmert, Livni, and Barak, ensured that the pressure cooker of Gaza was going to burst. And they knew that once it burst, it would give them an opportunity to enter Gaza and prove that "Israel is tough."
3. In the context of the mistakes made in the 2006 Hezbollah war and the subsequent loss of confidence among the Israeli voting public, Olmert & Co., especially Defense Minister Ehud Barak, have much to gain by showing the Israeli public that they can win a war against rocket-launching guerrillas, especially with elections coming in February and the outgoing American administration about to be replaced by one less likely to be so amenable to the cause. This one is at least as transparent as Bush's Iraq war scheme.
4. As proof of point #3: Livni's and the Kadima party's poll ratings immediately began to increase as the war began.
5. Ironically, Israel was in support of--or at least was happy to ignore--Hamas in its earlier years. When secular Arafat and the PLO were enemy #1, Israel was delighted at the idea of radical Islamic Hamas gaining power and threatening Arafat's control. Does this remind you of something? Taliban? Bin Laden? Same old story.
Now, the flip side. The advertisement mentioned earlier in the comments actually does make some sense. What would happen if Canada launched rockets into Buffalo every day? There would be an all-out war of massive proportions, and it would not be proportionate to the attack.
I think it is reasonable to conclude that proportional use of force is not the only consideration in determination of reasonable use of force. The problem is that the use of force is not just a response, but a deterrent. To be effective as a deterrent, the deterrent must inflict sufficient damage as to cause the other side to reassess the value of the attacks.
And then there is the question of time. If Israel is hit by 3 rockets a day for 6 months (total of 540 rockets), is it proportional to respond with the same? Can they respond with the equivalent of 540 rockets of firepower in a week and call it a day?
There is a lot to the argument of proportional use of force that can make it very difficult to rationalize. Which ultimately leads to difficulties rationalizing war in the first place. What is clear is that each side has been escalating the conflict.
Unfortunately, each side has gains to be made by the escalation: For Israel, this is redemption of its tough reputation after the 2006 Lebanon war, and improved chances for political candidates that prior to this were sure to lose. For Hamas, it's about world attention that shifts from that of a terrorist organization to that representing a people unfairly attacked and in humanitarian crisis, and increased standing among other Arab nations and people.
I would reckon that most people on both sides just want peace and to be done with it. My hope is that eventually, after enough of these conflicts, the Israeli population begins to recognize in large enough numbers that it is being manipulated in the name of "security," and that the Gazans realize that rockets and suicide bombs are not the best approach, and might do better to hire a good PR firm to create a different kind of pressure--a humanitarian appeal--that puts them more in the company of the abused population of Darfur and less in that of Al Qaeda.
My final point is that the Palestinian population--that of Gaza and the West Bank combined--is inching ever closer to a population majority in Israel. If they can succeed in presenting themselves to the world as a peaceful and repressed population over the next several years, their eventual population growth will yield them the bargaining position of a true apartheid situation. Likewise, if the Israeli population wakes up in a way similar to what transpired in the United States this past fall, and stops rewarding politically motivated warfare, there is the potential to truly move forward in resolving these conflicts. Otherwise, as my father, a former officer in the IDF, used to say, it's the same story it's been for the last 40 years.
What can we say... it is one giant fuster cluck. We in the Western world think, "wow... 60 years," and to this part of the world that is just about the time it takes for a sand dune to move 80 feet. Jews expelled from Judea for 400 years... no problem, we will be back. The Ottoman Empire rose and fell... don't worry, we will get it back. Both sides can be at war with each other for years, and as you point out, adjoining countries have their own rationales and reasonings, which just makes things more complicated.
The last and worst part of this is how the modern world brings the bloodshed straight to our TVs. Why is it that our grandparents, to this day, still won't talk about the images they saw in war?
Dave, well said.
Gaza is in ruins. Hamas is ever stronger. Morale is at its lowest. And Israel thinks peace will come out of such desperate acts?
My thoughts are now with the innocent victims - victims of an avoidable humanitarian crisis.
There was no need to give that lengthy background/introduction to what you stated later in the post. Was it a defense mechanism so that your co-religionists don't label you a sell-out? I have the same fear - of being labeled an anti-Semite, a Jew-hater, etc. All for what? For being a person with a conscience?
But so the world goes...
I'm glad you took some time to post your thoughts and reflections.
Prayers for peace,
||Some Post-Election Thoughts
Today our nation was confronted with a profound question: to choose hope or to choose fear. I am proud to say we chose hope.
For too many years now people like me have been made to feel like outsiders in our own country. We have watched as our president exploited our fear of terrorism to keep himself in power, embroil us in a never-ending war for no good reason, and shred our Constitution. He has allowed us to drown in hurricanes and collapsing bridges, and responded by cutting domestic programs. And he has presided over the worst economic disaster since the Great Depression.
I love America. I am a veteran. I was a Boy Scout. I have voted in eighteen elections. I have served on jury duty. I pay my taxes as an act of civic pride. In short, I believe in what we stand for as a nation. And tonight, after eight years of being told I was wrong, that America wasn't what I thought it was—my nation told me that in fact I was right.
Tonight, our new President-Elect linked our generation of Americans with our forebears in a way that no president has done in a long time. Like the generation that fought the Revolution, who stood up to the tyranny of the King and the tyranny of the mob; like the women and men who responded to the impossible situation of human slavery and national disunion and did what had never been done before, re-uniting the nation and freeing the slaves, setting an example that the remaining slaveholding nations would soon follow; like the progressives, who tackled the worst excesses of capitalism; like our grandparents during the New Deal and World War Two, who built dams and homes and unions and planted victory gardens and endured rationing—rationing!—and stood up to the overwhelming dark night of fascism and genocide; and like our parents in the Civil Rights Era, who overturned decades of Jim Crow and ended another ill-advised war. Our next president reminded us that we too have been called to do the impossible.
The task before us may seem overwhelming, and the cynics among us will surely scoff, as they did in those earlier periods of trial. We have to repair the regulatory system that prevented economic depression for 70 years. We have to bring an end to the war that we needed to fight, by stabilizing Afghanistan and defeating Al-Qaeda, and the one that we didn't, by reasonably withdrawing from Iraq. We have to close down our known confinement facility at Guantanamo Bay, Cuba, and our unknown confinement facilities around the world, putting the guilty on trial in the United States and freeing the innocent who were caught in too wide a net. We need to restore our credibility as peacemakers and viability as world leaders. We have to ensure quality medical and psychological care for our returning veterans, and give them the same educational opportunities we gave to their grandparents after World War Two. We have to sponsor cleaner, sustainable fuels, and drastically cut consumption, not only to save ourselves a few dollars at the pump but to save our planet as well. And the list goes on.
Tonight, President-Elect Barack Hussein Obama—what a wonderful thing it says about our nation that we could not only elect a Black man to the presidency but that he should have such an, ahem, international name—called on the better angels of our nature. He paraphrased John Winthrop, who saw America as a "City on a Hill"; the framers of the Constitution, who called for "a more perfect union"; Sam Cooke, who sang "It's been a long time coming, but I know a change is gonna come"; and Abraham Lincoln, who said at the hallowed ground of Gettysburg that "government of the people, by the people, for the people, shall not perish from the earth."
"…Shall not perish from the earth." What a daring promise. In recent years I have become increasingly worried that our great republic was doomed to go the way of Athens, Rome, and all the republics that had come before. As a Jew raised in the shadow of the Holocaust, I have been worried that we are a City on a Hill no longer, that like my immigrant forebears I would soon be forced to seek a new safe haven. In recent years the executive branch has become a single Orwellian department of doublespeak and Congress—even under the Democrats—has compliantly rolled over. I have been especially dismayed by the behavior at the Sarah Palin rallies, where racial epithets and xenophobia became the norm. Palin has made clear that she feels to be a "real" American one must live in a small town, vote Republican—and be white. Sadly, the party of Lincoln is whiter than ever before. Thankfully, today our nation rejected that idea—resoundingly. And tonight I have hope—not fear, but hope—that our great republican experiment will go on throughout my lifetime and my daughter's lifetime.
As the crowd at the McCain campaign headquarters booed every mention of President-Elect Obama and called out epithets, the gracious Obama crowd cheered the patriotism of Senator McCain. McCain is an American hero, and he deserves to be thanked for his service, both in Vietnam and in Congress (however much we may disagree with his opinion about that war or his votes in the Senate). Seizing the historical moment, Obama might have risked two direct quotes, the first from Ronald Reagan, the second from Lincoln: "It's morning in America," and "With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in, to bind up the nation's wounds, to care for him who shall have borne the battle and for his widow and his orphan, to do all which may achieve and cherish a just and lasting peace among ourselves and with all nations."
But I think the one presidential quote that does the most justice to our historical moment comes from the accidental president who left us only two years ago: Gerald Ford. "My fellow Americans, our long national nightmare is over."
Leave a Comment
Shaquanna Cole, 11/10/10
I think as a nation America has come a long way with a lot of the social issues affecting it.
Kary Deoleo, 11/08/10
Hello, it is a great post. Sometimes I wonder what is in the minds of those politicians, that they do not understand each other. How is it possible that they work only for their own convenience?
Irene Diaz, 11/07/10
I really enjoy your post.
Denise Baum, 11/07/08
After reading your beautifully phrased, articulate words reflecting on last evening's presidential outcome, I was incredibly moved and brought to tears. How well you put so many people's feelings into words is something to be very proud of. I commend you for being able to express what so many people feel at such a poignant time in our lives.
May we now go from strength to strength and may Barack Hussein Obama find the right path to lead America down....
Thank you again for all your beautiful insight into what we all hope to be a better, honest America.
Joe Sramek, 11/06/08
Well put! I could not agree more with what you have said. As I said to many of my colleagues at SIU yesterday, I've never felt as proud of my country as I do right now! What a nice thing that we won't be hearing about the "Bradley effect" or "white voters will never elect a black man," etc. While many of the "yellow dog Democrat" counties here in Southern Illinois voted for McCain on Tuesday (as did many in neighboring Kentucky and the Bootheel of Missouri), they were counteracted by even bluer suburbs: DuPage County, for example, up in the Chicago suburbs, voted nearly 60-40 for Obama! And while, to be sure, there were rural whites down here still too racist to ever vote for a black man, they were thankfully vastly outnumbered by folks like my Republican-leaning parents, who voted Obama on Tuesday. In the future, I will look forward to Democratic politicians never again having to pander to such people, whether we call them "Reagan Democrats" or "NASCAR Dads" or whatever silly name our pundits dream up.
Now, let's all get to work in helping to enact the progressive America that we have been dreaming of for so long. Indeed, our long national nightmare is over!
Thank you for sharing in words what we, too, feel so deeply (and this should be an Op-Ed piece for the New York Times; we encourage you to pursue this). Last night, the words of the spiritual that I'm sure you know came to mind and echoed when I (Laurie) read your last line: "Free at last, free at last. Thank the Lord, I'm free at last." This is why we have optimism about the future of our country: if you represent the next generation of leaders, we are confident about what lies ahead. May we share this with friends and family, both here and abroad? We couldn't have expressed our own feelings of relief and hope, with a perspective on history, any better than you did so beautifully here. This is fantastic and should be read by every American citizen. We are proud to have you and your family as our friends.
Disclaimer: Dr. Golland is not responsible for any opinions expressed in the comments, which appear in small type font. Comments are unedited but offensive comments are not posted.
3/20/17: Free Speech Is Not an Academic Value
1/21/17: The America We Lost When Trump Won
12/14/16: Letter to Electors
12/13/16: Collective Statement by Scholars in U.S. History
11/9/16: Horrified but Unsurprised
9/1/16: What Next? Questions for a Post-Trump America
8/25/16: The Most Qualified Presidential Nominee
12/04/14: A Nation of Racism: Remarks at Understanding Ferguson
11/25/14: Press Narratives, Marion Barry, and the Fletcher Alternative
07/05/13: Another Day That Will Live in Infamy
03/02/12: Presidential Pet Peeve
07/12/11: Toe-to-Toe with the Imposter: Arriving in Oxford
06/15/11: The Abysmal Job Market, Revisited
02/06/11: AAAHRP Conference, Seattle, Washington
12/30/10: Mayor Bloomberg's "Let 'em Eat Cake" Moment
11/02/10: 65,000 African American Soldiers in the Confederate Army? I Think Not
09/29/10: Thirty Years from Independence Plaza
08/09/10: Advice from the Abysmal Job Market
08/13/09: On Gates, Crowley, and Race
01/14/09: What Value Life?
11/05/08: Some Post-Election Thoughts
Last updated 11 April, 2017 (DHG)