History, American Democracy, and the AP Test Controversy
July/August 2015 | Volume 44, Number 7/8
Wilfred M. McClay
University of Oklahoma
Wilfred M. McClay is the G.T. and Libby Blankenship Professor in the History of Liberty at the University of Oklahoma. He has also taught at the University of Tennessee at Chattanooga, Tulane University, Georgetown University, and Pepperdine University, and he served for eleven years as a member of the National Council on the Humanities. His books include The Masterless: Self and Society in Modern America, The Student’s Guide to U.S. History, and Figures in the Carpet: Finding the Human Person in the American Past. He received his Ph.D. in history from Johns Hopkins University.
The following is adapted from a talk delivered on July 10, 2015, at Hillsdale College’s Allan P. Kirby, Jr. Center for Constitutional Studies and Citizenship in Washington, D.C., as part of the AWC Family Foundation Lecture Series.
Historical study and history education in the United States today are in a bad way, and the causes are linked. In both cases, we have lost our way by forgetting that the study of the past makes the most sense when it is connected to a larger, public purpose, and is thereby woven into the warp and woof of our common life. The chief purpose of a high school education in American history is not the development of critical thinking and analytic skills, although the acquisition of such skills is vitally important; nor is it the mastery of facts, although a solid grasp of the factual basis of American history is surely essential; nor is it the acquisition of a genuine historical consciousness, although that certainly would be nice to have too, particularly under the present circumstances, in which historical memory seems to run at about 15 minutes, especially with the young.
No, the chief purpose of a high school education in American history is as a rite of civic membership, an act of inculcation and formation, a way in which the young are introduced to the fullness of their political and cultural inheritance as Americans, enabling them to become literate and conversant in its many features, and to appropriate fully all that it has to offer them, both its privileges and its burdens. To make its stories theirs, and thereby let them come into possession of the common treasure of its cultural life. In that sense, the study of history is different from any other academic subject. It is not merely a body of knowledge. It also ushers the individual into membership in a common world, and situates him or her in space and time.
This is especially true in a democracy. The American Founders, and perhaps most notably Thomas Jefferson, well understood that no popular government could flourish for long without an educated citizenry—one that understood the special virtues of republican self-government, and the civic and moral duty of citizens to uphold and guard it. As the historian Donald Kagan has put it, “Democracy requires a patriotic education.” It does so for two reasons: first, because its success depends upon the active participation of its citizens in their own governance; and second, because without such an education, there would be no way to persuade free individuals of the need to make sacrifices for the sake of the greater good. We now seem to think we can dispense with such an education, and in fact are likely to disparage it reflexively, labelling it a form of propaganda or jingoism. But Kagan begs to differ with that assessment. “The encouragement of patriotism,” he laments, “is no longer a part of our public educational system, and the cost of that omission has made itself felt” in a way that “would have alarmed and dismayed the founders of our country.”
Why has this happened? Some part of the responsibility lies within the field of history itself. A century ago, professional historians still imagined that their discipline could be a science, able to explain the doings of nations and peoples with the dispassionate precision of a natural science. But that confidence is long gone. Like so many of the disciplines making up the humanities, history has for some time now been experiencing a slow dissolution, a decline that now may be approaching a critical juncture. Students of academic life express this decline quantitatively, citing shrinking enrollments in history courses, the disappearance of required history courses in university curricula, and the loss of full-time faculty positions in history-related areas. But it goes much deeper than that. One senses a loss of self-confidence, a fear that the study of the past may no longer be something valuable or important, a suspicion that history lacks the capacity to be a coherent and truth-seeking enterprise. Instead, it is likely to be seen as a relativistic funhouse, in which all narratives are arbitrary and all interpretations are equally valid. Or perhaps history is useless because the road we have traveled to date offers us only a parade of negative examples of oppression, error, and obsolescence—an endless tableau of Confederate flags, so to speak—proof positive that the past has no heroes worthy of our admiration, and no lessons applicable to our unprecedented age.
This loss of faith in the central importance of history pervades all of American society. Gone are the days when widely shared understandings of the past provided a sense of civilizational unity and forward propulsion. Instead, argues historian Daniel T. Rodgers, we live in a querulous “age of fracture,” in which all narratives are contested, in which the various disciplines no longer take a broad view of the human condition, rarely speak to one another, and have abandoned the search for common ground in favor of focusing on the concerns and perspectives of ever more minute subdisciplines, ever smaller groups, ever more finely tuned and exclusive categories of experience. This is not just a feature of academic life, but seems to be an emerging feature of American life more broadly. The broad and embracing commonalities of old are no more, undermined and fragmented into a thousand subcultural pieces.
* * *
This condition has profound implications for the academy and for our society. The loss of history, not only as a body of knowledge but as a distinctive way of thinking about the world, will have—is already having—dire effects on the quality of our civic life. It would be ironic if the great advances in professional historical writing over the past century or so—advances that have, through the exploitation of fresh data and new techniques of analysis, opened to us a more expansive but also more minute understanding of countless formerly hidden aspects of the past—were to come at the expense of a more general audience for history, and for its valuable effects upon our public life. It would be ironic, but it appears to be true.
As historian Thomas Bender laments in a recent article, gloomily entitled “How Historians Lost Their Public,” the growth of knowledge in ever more numerous and tightly focused subspecialties of history has resulted in the displacement of the old-fashioned survey course in colleges and universities, with its expansive scale, synthesizing panache, and virtuoso pedagogues. Bender is loath to give up any of the advances made by the profession’s ever more intensive form of historical cultivation, but he concedes that something has gone wrong: historians have lost the ability to speak to, and to command the attention of, a larger audience, even a well-educated one, that is seeking more general meanings in the study of the past. They have indeed lost their public. They have had to cede much of their field to journalists, who know how to write much more accessibly and are willing to explore themes—Tom Brokaw’s celebration of “the greatest generation,” for example—that strike a chord with the public, but which professional historians have been trained to disdain as ethnocentric, triumphalist, or uncritically celebratory. Professional historians complain that such material lacks nuance and rigor, and is prone to repackage the past in terms that readers will find pleasing to their preconceptions. They may be right. But such works are at least being read by a public that is still hungry for history. The loss of a public for history may be due to the loss of a history for the public.
Instead, it seems that professional historiography is produced mainly for the consumption of other professional historians. Indeed, the very proposition that professional historiography should concern itself in fundamental ways with civic needs is one that most of the profession would find suspect, and a great many would find downright unacceptable—a transgression against free and untrammeled scholarly inquiry. Such resistance is understandable, since conscientious historians need to be constantly wary of the threat to their scholarly integrity posed by intrusive officials and unfriendly political agendas.
There can be no doubt that the professionalization of the field has brought a remarkable degree of protection for disciplinary rigor and intellectual freedom in the framing and pursuit of historical questions. But must the abandonment of a sense of civic responsibility come in tandem with that professionalization? If history can no longer generate a plausible organizing principle from its own resources, that is a problem not only for the public but for the study of history itself.
* * *
Consider in this regard our startling incapacity to design and construct public monuments and memorials. Such edifices are the classic places where history and public life intersect, and they are by their very nature meant to be rallying points for the public consciousness, for affirmation of the body politic, past, present, and future, in the act of recollection and commemoration, and of recommitment to the future. There is a profundity, approaching the sacramental, in the atmosphere created by such places, as they draw together generations of the living, the dead, and those yet unborn in a bond of mutuality and solidarity. The great structures and statuary that populate the National Mall in Washington, D.C.—such as the Lincoln Memorial and the Washington Monument—do this superbly well, as does the solemnity of Arlington National Cemetery. There is a sense, too, that cemeteries honoring fallen soldiers of the Confederacy somehow deserve our general respect, even if the cause for which they fell does not. But these structures were a product of an earlier time, when the national consensus was stronger. Today, as illustrated by the endless deadlock over the design and erection of a memorial to Dwight D. Eisenhower in Washington, a drama that has become a fiasco, we seem to find the construction of monuments almost impossibly difficult. And in a different but not unrelated way, the sudden passion to cleanse the American landscape of any and all allusions to the Confederacy or slaveholding—a paroxysm more reminiscent of Robespierre than of Lincoln—also suggests the emergence of a public that is losing meaningful contact with its own history.
Why has this happened? In the case of the Eisenhower memorial, it happened because the work of designing the memorial was turned over to a fashionable celebrity architect who proved incapable of subordinating his monumental ego to the task of memorializing a great American hero. But more generally, it has happened because the whole proposition of revering and memorializing past events and persons has been called into question by our prevailing intellectual ethos, which cares little for the authority of the past and frowns on anything that smacks of hero worship or piety toward our forebears. The past is always required to plead its case before the bar of the present, where it generally loses. That ethos is epitomized in the burgeoning academic study of “memory,” a term that refers in this context to something vaguely suspect.
“Memory” designates the sense of history that we all share, which is why monuments and other instruments of national commemoration are especially important in serving as expressions and embodiments of it. But the systematic problematizing of memory—the insistence on subjecting it to endless rounds of interrogation and suspicion, aiming precisely at the destabilization of public meanings—is likely to produce impassable obstacles to the effective public commemoration of the past. Historians have always engaged in the correcting of popular misrenderings of the past, and that is a very important and useful aspect of their job. But “memory studies” tends to carry the debunking ethos much further, consistently approaching collective memory as nothing more than a willful construction of would-be reality rather than any kind of accurate reflection of it. Scholars in the field examine memory with a jaundiced and highly political eye, viewing nearly all claims for tradition or for a worthy past as flimsy artifice designed to serve the interests of dominant classes and individuals, and otherwise tending to reflect the class, gender, and power relations in which those individuals are embedded. Memory, argues historian John Gillis, has “no existence beyond our politics, our social relations, and our histories.” “We have no alternative,” he adds, “but to construct new memories as well as new identities better suited to the complexities of a post-national era.”
The audacity of this agenda could not be clearer. It is nothing less than a drive to expel the nation-state from public consciousness and to reconstitute that consciousness around a radically different idea of the purpose of history. It substitutes a whole new set of loyalties, narratives, heroes, and notable events—perhaps directed to some post-national entity, or to a mere abstraction—for the ones inhering in civic life as it now exists. It would mean a complete rupture with the past, and with all the admired things formerly associated with the idea of the nation, including the sacrifices of former generations. Ernest Renan argued that a nation was “a large-scale solidarity, constituted by the feeling of the sacrifices that one has made in the past and of those that one is prepared to make in the future,” as part of a “clearly expressed desire to continue a common life.” That solidarity, that quest to continue a common life—all would surely be placed in jeopardy by the agenda Gillis proposes.
* * *
It is at precisely this point that the recent controversy over the new Advanced Placement (AP) U.S. History framework comes into play. Not that the College Board—the private New York-based organization that administers the AP exam to American high school students—openly espouses such a radical agenda. Instead, the College Board argues that its 2014 revision of the AP exam has sought to make the exam reflect more closely the contents of a typical collegiate introductory survey course in American history. On the surface this would seem to make sense, since the avowed purpose of AP is to provide a shortcut to college-level credit. But it is also a huge problem, since, as Thomas Bender himself has observed, the introductory survey course, once the glorious entryway to a college history department, is now its neglected and unwanted stepchild.
The Advanced Placement exam has become a fixture in American education since it was introduced in the years immediately after the Second World War, and many colleges and universities in the U.S. (and more than 20 other countries) grant credits or advanced placement based on students’ AP test scores. For many American students, the AP test has in effect taken the place of the required U.S. history survey course in colleges and universities. This makes its structure and makeup a matter of even greater importance from the standpoint of civic education, since many of these students will never take another American history course. The test’s pervasive use has had many sources, but its widespread adoption is surely testimony to the general trust that has so far been reposed in it. The test has retained this trust by striking a sensible balance among different approaches to the American past. In addition, rather than issuing detailed guidelines, the College Board until very recently made do with a brief five-page document outlining the test’s general framework for teachers, leaving the distribution of teaching emphases to them. This was a reasonable, respectful, and workable arrangement.
In this light, the 134-page framework in the 2014 iteration of the test represents a radical change and a repudiation of that earlier approach. It marks a lurch in the direction of more centralized control, as well as an expression of a distinct agenda—an agenda that downplays comprehensive content knowledge in favor of interpretive finesse, and that seeks to deemphasize American citizenship and American world leadership in favor of a more global and transnational perspective. The new framework is organized around such opaque and abstract concepts as “identity,” “peopling,” and “human geography.” It gives only the most cursory attention to traditional subjects, such as the sources, meaning, and development of America’s fundamental political institutions, notably the Constitution, and to the narrative account of political events such as elections, wars, and diplomacy.
Various critics have noted the political and ideological biases inherent in the 2014 framework, as well as structural innovations that will result in imbalance in the test and bias in the course. Frankly, the language of the framework is sufficiently murky that such charges might be overstated. But the same cannot be said about the changes in the treatment of American national identity. The 2010 framework treated national identity, including “views of the American national character and ideas about American exceptionalism,” as a central theme. The 2014 framework grants far more extensive attention to “how various identities, cultures, and values have been preserved or changed in different contexts of U.S. history, with special attention given to the formation of gender, class, racial, and ethnic identities.” The change is very clear: the new framework represents a shift from national identity to subcultural identities. Indeed, the new framework is so populated with examples of American history as the conflict between social groups, and so inattentive to the sources of national unity and cohesion, that it is hard to see how students will gain any coherent idea of what those sources might be. This does them, and all Americans, an immense disservice. Instead of combating fracture, it embraces it.
If this framework is permitted to take hold, the new version of the test will effectively marginalize traditional ways of teaching about the American past, and force American high schools to teach U.S. history from a perspective that self-consciously seeks to decenter American history. Is this the right way to prepare young people for American citizenship? How can we call forth the acts of sacrifice that our democracy needs, not only on the battlefield but also in our daily lives—the acts of dedication to the common good that are at the heart of civilized life—without training up citizens who know about and appreciate that democracy, care about the common good, and feel themselves a part of their nation’s community of memory? How can we expect our citizens to grapple intelligently with enduring national debates—such as over the role of the U.S. Constitution, or about the reasons for the separation of powers and limited government—if they know nothing of the long trail of those particular debates, and are instead taught to translate them into the one-size-fits-all language of the global and transnational?
* * *
We often speak these days of global citizenship, and see it as a form of advanced consciousness to which our students should be made to aspire. But global citizenship is, at best, a fanciful phrase, abstract and remote, unspecific in its requirements. Actual citizenship is different, since it entails membership in the life of a particular place. It means having a home address. Education does young people no favors when it fails to equip them for that kind of membership. Nor does it do the rest of us any favors. We will not be able to uphold our democracy unless we know our great stories, our national narratives, and the admirable deeds of our great men and women. The new AP U.S. History framework fails on that count, because it does not see the civic role of education as a central one.
As in other areas, we need an approach to the past that conduces most fully to a healthy foundation for our common, civic existence—one that stoutly resists the culture of fracture rather than acceding to it. This is not a call for an uncritical, triumphalist account of the past. Such an account would not be an advance, since it would fail to give us the tools of intelligent and morally serious self-criticism. But neither would it be an advance to adopt an approach that, in the name of post-national anti-triumphalism, reduces American history to the sum of a multitude of past injustices and oppressions, without bringing those offenses into their proper context—without showing them as elements in the great story of a longer American effort to live up to lofty and demanding ideals. Both of these caricatures fail to do what we have a right to expect our history to do. Nor, alas, will professional historians be much help, since their work proceeds from a different set of premises.
Historians will find their public again when the public can find its historians—historians who keep in mind that the writing of our history is to be for that public. Not “for” in the sense of fulfilling its expectations, flattering its prejudices, and disguising its faults. Not “for” in the sense of underwriting a particular political agenda. But “for” in the sense of being addressed to them, as one people with a common past and a common future, affirmative of what is noblest and best in them, and directed towards their fulfillment. History has been a principal victim of the age of fracture. But it can also be a powerful antidote to it.