Visiting the 9/11 Memorial Museum

Beyond the characteristic of self-awareness, which is shared by other species, humans have an obsession with commemoration. Our focus on legacy extends well beyond the desire of our genes to procreate, afflicting us in a way that drives our linear existence during the relatively short time we enjoy what is called life.

A product of this obsession is the memorial, which manifests itself in myriad forms across the landscape. From wall graffiti to wilting flowers on a roadside, or something more ephemeral like a candlelight vigil, memorials are designed to evoke remembrance by connecting our personal experiences to the object we are commemorating. Whether or not we actually remember the subject of the memorial, we leave the experience feeling something that is personal and satisfying.

With that in mind I visited the 9/11 Memorial Museum recently with the concern that the experience would leave me emotionally drained and intellectually wanting. The exhibit’s mission, “to bear solemn witness to the terrorist attacks of September 11, 2001 and February 26, 1993,” suggests a role that is more memorial than museum. Mixing memorial and museum creates a number of thorny issues. While many have complained about the gift shop — profiting from commerce in a place of solemnity — that is a minor matter compared to the systemic design issues. The memorial is clearly, if not singularly, designed to evoke a strong emotional response and encourage visitors to personalize the experience. “Where was I on that morning?” “What did I feel?” This is great for generating empathy but interferes with the process of understanding that the museum should foster. A museum owes its visitors a public history experience, one which includes the perspective, knowledge, and narration of the curators. If visitors personalize the attacks, then their ability to incorporate the narrative and learn is degraded.

The mood is set immediately upon entering, as you walk down a corridor full of projected images and audio testimonies taken from around the globe. People describe how they felt upon hearing the news and watching the event unfold. Everyone is encouraged to place themselves within the exhibit, and it is difficult not to participate. I was immediately transported back to my work commute, listening to Bob Edwards on Morning Edition come to the realization that the first plane crash was not an accident. In the space of a few footfalls images from throughout the day flashed through my mind, and associations long since dispersed became tactile once again: the smell of a closed office building, the harsh glow of fluorescent lighting, the co-workers that I never really liked yet commiserated with on that day, and finally the image of smoke rising above Manhattan as my plane descended into Newark airport three weeks later.

Certainly the most evocative space is a small remembrance chapel set in the center of the memorial. Bench seating is provided for those who wish to reflect upon the stories of each person lost on that day. Names and vital statistics are projected on the walls while surviving friends and relatives recount anecdotes about the victims (or perhaps the survivors are the victims?). As I sat and listened, I asked myself where these feelings came from. I did not know anyone who perished in the attacks. What was it about the exhibit that would provoke the kind of intense response I experienced? The purpose of the memorial — to draw upon my own experiences to create something unique — weaves shared threads throughout the culture that allow us to connect at a social level without possessing commonalities.

Exhibit curators encourage personalization through their focus on mundane artifacts. Computer diskettes and a Home Depot receipt are displayed to remind us that we could easily have been victims. “There but for the grace of God go I.” This emphasis on the mundane is problematic, since it cannot be relied upon to provide context for the rest of the exhibit. It is, essentially, empty calories: a little bit is satisfying, but too much leaves one malnourished.

Interestingly, the Smithsonian caused a tremendous uproar in the 1990s when it included mundane items from Hiroshima in its Enola Gay exhibit design. These items — which included a burnt doll — were considered inflammatory and political. Veterans’ groups and members of Congress protested loudly, and despite the museum’s effort to radically redesign the exhibit, NASM director Martin Harwit resigned in the face of Congressional hearings. The public’s response to the 9/11 Memorial Museum indicates that a political message is more palatable when we are the victim and not the aggressor.

How does one not politicize a political act? The museum is hobbled by the memorial’s mission of solemnity. By the time one reaches the museum — a separate area beyond the memorial — the experience is so personalized that the hope of gaining understanding of events is dashed. Unvarnished introspection cannot occur, nor can we consider a framework in which al Qaeda is a rational actor (despite their terrorist tactics.) But if we are to gain a greater understanding of the world and America’s role as a geopolitical leader, that is exactly what we must do. The greatest opportunity this exhibit misses is allowing visitors to transcend provincialism and become cosmopolitan, if for only a few hours. Instead, we are only offered the chance to question why anyone would want to hurt us, a people who have never harmed others.

I left the exhibit with many questions. Foremost is how younger people who have no direct experience of the day will view the exhibit. For them it can only become the proverbial statue in the park, a plaque bearing the names of those who perished a century ago, an occasion for projecting their own experiences onto historical events that bear no resemblance to days past. Perhaps striking “museum” from the name would help. That would certainly solve the problem of relying upon mundane artifacts to manipulate visitors while leaving them without a cohesive perspective. Unfortunately, it doesn’t meet the higher expectations of solid public history. The 9/11 Memorial Museum suffers from an identity crisis: triangulating the needs and desires of survivors, mythologizing the role of the Twin Towers, and claiming to present a scholarly history of related events. Predictably, it fails to rise above the banal.

The conversation continues on Twitter at #BeingAmerican. For other perspectives about the Enola Gay controversy visit the AHA’s “Historians Protest New Enola Gay Exhibit.”

Identity as a Zero-Sum Game

I visited the Guggenheim Museum last weekend and viewed the Carrie Mae Weems retrospective. Weems is a photographer and videographer who deftly manipulates her instrument with what is deservedly called a powerful voice. If you live in the area, go see it now: it closes in mid-May.

My thoughts this week continue to coalesce around the idea that forming and achieving American identity is a zero-sum game, that those who hold an image of #BeingAmerican must struggle to maintain their identity real estate, and that becoming more inclusive means diluting what people have. I know this sounds counter-intuitive, which is the point. We have all been raised with the cultural myth of the melting pot: disparate populations coming together to form that savory, balanced stew that nourishes the most exceptional nation in history. I suspect that, in our moments of sobriety, all of us recognize that the myth and the reality are quite divergent.

Of all the moving images I saw, there are two in particular that I wish to consider here. They might not be the most memorable or popular images, but they carried the message of zero-sum identity politics.

The first image — a still-life of a living room end table — appears in an early portfolio, Family Pictures and Stories. The print itself is stunning, a dark image with a brilliant range of tone and balanced composition, lustrous wood contrasted by the stark light of the side lamp. On the table, next to the lamp, is a pair of “Chinaman” figurines. My initial reaction was very mixed. I am almost as old as Weems and therefore grew up in the same era, familiar with similar items in either my home or those of my extended family. But orientalism has always made me uncomfortable: it suppresses a true understanding of other cultures and allows one to avoid confronting prejudice by transferring that behavior to what is not real. So why were these types of icons so prevalent? Establishing a hierarchy of identities through stereotyping allows us to claim and hold that identity real estate we need to be American, to prove we are integrated into the whole. Instead of the melting pot being inclusive, we seek to join and remain American by excluding others.

I am not making this observation to excuse it, but merely to identify a weakness that we all share.

The second image — or images — were so powerful they brought me to tears. Weems overlaid a quadriptych of antebellum slave portraits with the words House, Kitchen, Yard, Field. Yes, we are to feel shame. Yes, Weems is making a point about dehumanization. But I also think there is a contrary force in motion here: the base labels strip away a basic value from all of us, forcing us to confront the consequences of an exclusive identification process throughout our history. To take something away imparts value to it. Denying people identity recognizes its importance. Even [especially] today, the exclusive politics of #BeingAmerican creates a hurtful, counterproductive, and unnecessary process of cultural assimilation.

If you want to know how important a cohesive American identity is to people, look at how white Protestants are responding to the demographic shifts in the United States (don’t forget one of my “favorites”, David Barton.) By seeking to deny American identity to people who are not white, male, and Protestant — okay, maybe I’ll give you Christian — we force the perpetuation of the zero-sum identity game. A game at which Carrie Mae Weems has proven a formidable opponent.

On #BeingAmerican

My current project examines the relationships between national, social, and individual identities, and how we view ourselves as American. Identity is like the Evil Queen’s Magic Mirror: it has defined boundaries and function, but everyone approaches it with different expectations and leaves with a personalized experience. It is not surprising then to find tremendous differences in personal beliefs while observing a shared definition and purpose. This has been the theme threaded through several recent readings.

At the end of my graduate coursework I wrote a state of the field paper that included Gretchen Murphy’s Hemispheric Imaginings: The Monroe Doctrine and Narratives of U.S. Empire. The book examines how the Monroe Doctrine evolved throughout the nineteenth century; specifically, Murphy uses discourse analysis to show that popular culture contained many of the concepts that politicians and thought leaders adopted in their policies. Monroe’s nascent framework was nearly stillborn: it grew slowly, branching into different interpretations, each waxing and waning as our nation’s agenda and influence changed. The relationship between American social identity and political forces is direct and powerful. Even contemporary assertions that the Monroe Doctrine is dead ignore its incorporation into American identity.

Of interest today is a problem that Murphy identified in the opening of her Introduction:

Even the name “America” bespeaks the crisis; conventionally used to designate the cultural identity of the United States, its implicit erasure of Latin America and Canada is now painfully apparent…

For Murphy, the solution is to coin the term “USAmericans.” Though a bit clunky at first, the term quickly proved effective for distinguishing between the various Americas of the present day. Recognizing the contemporary hubris of USAmericans in co-opting the identity of an entire hemisphere is a first step to respecting other cultures between Ellesmere and Patagonia.

Later, historian Jonathan Wilson posted to Twitter that he favors “USians,” indicating a trend; upon chatting with him, he also mentioned his awareness of the term “Statism.”

However, our view of the hemisphere is much different today than in the nineteenth century, and one could accuse Murphy of performing Whiggish history (although in fairness she is a professor of English.) Did we view ourselves as rightful claimants to the title in the early days of the republic? Was there competition for the title at that time? Or are we simply projecting present-day concerns into the past?

I grabbed my copy of The Federalist and examined the thirty-four uses of the string “American.” Like those of us in the present day, the term is used to identify not only the new nation but also those members who comprise the cultural and civic body. That may be less surprising when one considers the neighborhood at that time: Great Britain, Spain, and France controlled most of the remaining western hemisphere and maritime routes, providing imperial perspectives for the non-indigenous peoples.

What is interesting about The Federalist is how Madison and Hamilton employ the word. While the latter favors its use in describing a regime, dominion, and state, Madison is quite comfortable using it as a container for the members of the new nation. Neither has a problem with excluding other western hemisphere societies.

Consider Hamilton’s remark in Federalist no. 11:

They foresee the dangers that may threaten their American dominions from the neighborhood of States, which have all the dispositions, and would possess all the means, requisite to the creation of a powerful marine.

He not only uses the term to describe various dominions within a geographic region, but he elevates the new nation to the status of a significant player. This statement asserts the role of the United States as a legitimate force in the Atlantic world, and foretells the development of attitudes and policies in the nineteenth century.

Madison, however, clearly favors the term to describe individuals as members of a society. Already in 1787, he views his cohort as one body, an organism evolved to conquer the vast geography of the continent and repel external dangers, as witnessed in Federalist no. 14:

…the kindred blood which flows in the veins of American citizens, the mingled blood which they have shed in defense of their sacred rights, consecrate their Union, and excite horror at the idea of their becoming aliens, rivals, enemies.

The metaphor of the body is powerful, capturing the profound diversity of the nation’s citizenry, their interdependency on one another, and the transcendence beyond the sum of their parts. Although the brain and the liver are profoundly different and may work at cross-purposes from time to time — at least my brain regularly enjoys the alcohol that my liver must work to remove — they cannot exist without each other, and both clearly belong to something greater than themselves (at times, anyway.) By addressing this interdependency, Madison presages notions of being American. Whether deliberate or not, he is creating space for popular and civic culture to begin defining the boundaries of a national identity, a space that will be molded and kneaded by myriad factions throughout the history of the United States. This process not only shapes identity but also circumscribes the behavior of future actors.

Nearly 125 years later, the rebirth of the Ku Klux Klan would focus its rebranding on notions of American identity. Dr. Kelly Baker deftly examines this process in her book, Gospel According to the Klan: The KKK’s Appeal to Protestant America, 1915-1930. Although Baker — a historian of religion — warns us to maintain a boundary between the KKK and white, Protestant America, it is not irresponsible to place the KKK at a different point on the same continuum of American social identity. In essence, the Klan is looking into the same Magic Mirror as the rest of us.

Indeed, Baker draws the connection early:

The Klan gained a following because of its twin messages of nation and faith, and the fraternity progressed because of members’ commitment to its religious vision of America and her foundations.

Americans — USAmericans, USians, etcetera — have always expressed discomfort when facing down groups like the Klan. We tend to look at them as boils or abscesses on the body, and our social narrative regularly “others” such groups to preserve its pristine nature. Although identity is necessarily fictional by nature, such preservation is counterproductive to understanding identity’s true capacity, where it originates, and how it might evolve in response to certain social pressures. It may be full of pus, but it’s our pus, and more importantly it is representative of processes within our civic body.

The space carved out by Madison et al. to debate the notion of American identity is still in use today, and still contains the momentum of that early legacy. Each iteration of cultural debate and policy depends upon the previous. Both Murphy and Baker successfully trace their subject matter through to the present, and both have their foundations in the late eighteenth century. In fact, I would argue that losing sight of Madison’s body metaphor constitutes the greatest internal threat. While it may be time to encompass other cultures in what we consider to be American, we must also strive to encompass the many factions within the body of the United States. Only through an honest assessment of what we are can we shape who we will become.

I would love to hear your thoughts. You can find me on Twitter using #BeingAmerican.

Orchid Show

I took a break yesterday and visited the New York Botanical Garden, where the annual orchid exhibition just opened. Instead of writing, I thought I would just post some images.

What a lovely facility.

Finally, we wandered outside of the Haupt Conservatory and discovered this collection of sculpture, each representing one of the four seasons. Frankly, I think Spring looks a lot like the late Peter O’Toole.

What the Trouble with HealthCare.gov Really Tells Us

The recent implementation of the HealthCare.gov website garnered considerable media coverage, as poor functionality and reliability provided fodder for politicians, partisans, and pundits. One of the more common causes cited for the disappointing launch was programming incompetence, though some discussion of a broken procurement system lent a bit of variety to the coverage. Right-wing ideologues took advantage of the situation to demonize the public sector and renew their call for further privatization (see the so-called “Yellow Pages” letter), although much of the development work was actually carried out by private contractors. I think much of this misses the heart of the matter: the implementation of HealthCare.gov highlights weaknesses in the core IT competence of the United States government (public sector) at a time when technology should be a central component of our strategy for global competitiveness.

Essentially, I am making the argument that bureaucracy offers important benefits that Washington, D.C. should embrace. Despite its negative connotation, a certain level of bureaucracy is essential to any organization wishing to grow and codify its position, whether that organization is a powerful nation-state or a multinational corporation. As large organizations create rules and processes to operate efficiently and effectively, cultural knowledge develops. This cultural knowledge — vetted and evangelized throughout the organization — represents intelligence that provides competitive advantage, branding, and a language for customer interaction. It is organizational knowledge passed between generations, promoting stability and, like many forms of knowledge, conferring advantage. Despite being described as anti-democratic and despotic, a bureaucracy learns how to serve its customer (e.g., the current administration) more effectively than an outside entity that does not have the same customer relationship.

One current political assertion rarely debated is that outsourcing enables effectiveness by lowering costs through competition. The practice allows public sector departments to shrink their staffs while selecting specialists to work on specific projects. Dogma states that the private sector must be more efficient because its workers are highly motivated to excel and attain valued skills. That is possibly supportable when limiting the analysis to balance sheet performance, but the argument certainly collapses when intangibles like strategic advantage are considered. Balance sheets are important, but many truly innovative organizations grant themselves adequate leeway to explore strategic directions that fall well beyond the definition of efficiency, and we laud their behavior. The public sector must innovate, learn, and consolidate just like the private sector, and this can only happen with a distinct and healthy bureaucratic organization. In my experience as a business process analyst, I observed that outsourced partners could not leverage customer knowledge as well as internal teams. When this leverage does not exist, Information Technology is not considered integral to the organization’s mission. Thus, it cannot be strategic, and its benefits are not optimized.

As the century progresses, the level of IT mastery found in our public sector and its incorporation into strategic planning will determine the level of success experienced in areas such as finance, security, and logistics. Eschewing IT bureaucracy will ensure a sub-optimal understanding of customer requirements and exclude this critical asset from strategic planning. Already we see how countries like Taiwan employ technology to drive their health care costs below 7% of GDP, about one-third of the U.S. level. Recent efforts to disrupt the education sector with technology demand that greater care be taken when considering our public education strategy vis-à-vis the industrialized world.

The debacle of HealthCare.gov is not a failure of a single development team as much as it is an inadequate response by a depleted system. Cultural knowledge that could have informed the design, construction, and assimilation of new technology has been systematically dismantled over the past three decades. Most important is to remember that bureaucracy is not about political ideology; it is about understanding the framework in which policy is executed. That is cultural intelligence. Because this form of intelligence is not nurtured and passed from generation to generation, our expectations of success must necessarily be lower. The use of outsourced resources to fulfill work orders like HealthCare.gov sacrifices customer awareness and collaboration for characteristics like staffing flexibility. That appears attractive as a tactic, but is useless as a core competence. Policy makers need to consider whether the United States can maintain its hegemony in a technological environment without a robust IT bureaucracy.

Neoliberalism and the God Emperor

I had the television on the other day while talking heads preached the gospel of low taxes and small government inducing economic growth and innovation, and I was overcome by the image of the God Emperor of Dune, Leto Atreides II. What would prompt such an association, short of powerful hallucinogens or a vacuum of social interaction? It is the quandary that we share with Leto II (or he will share with us, since his time is still several aeons in the future.)

Leto, the son of the prophet Muad’Dib, undergoes a physical and mental transformation from man to sandworm. For 3,500 years, Leto crafts a new narrative of the human place in the Universe. He squelches independent thought and provides the tyranny that comforts so many in their banal lives. The God Emperor becomes the repository of ambition, risk, and power.

At the time of the story, Leto falls in love with Hwi Noree, the Ixian ambassador who speaks to his lost humanity. He faces two pathways: the most likely and necessary is his death and consequent reorganization of the Universe; but for a time he contemplates a life with Hwi as a human. The important point is that Leto can reverse his metamorphosis, yet he realizes that it will take another 1500 years. And here we come back to the talking heads on television…

For decades we have been sold a narrative that not only prescribes an optimal economic model, but defines our relationship to the community. Despite the fact that considerable evidence demonstrates the deficiencies of this economic model, we continue to build our identity around it and accept the bondage that it imposes. Much as humanity watched the transformation of the God Emperor and accepted his tyranny and order, we fling ourselves down the road of neoliberalism, without questioning its consequences or even our eventual destination.

This may sound trivial, but there are serious implications for the New Left, or whatever is out there in opposition to the American Right. Policy change is difficult and circumscribed without first changing identity through messaging. At present, a unified message does not exist, whether you are watching a Democratic spokesman like Chris Van Hollen on television or following the various fractured factions of the Occupy movement on social media. Until a coherent, alternative identity exists to challenge neoliberalism, catastrophes like the 2008 financial collapse will be inadequate to drive change, and leaders like President Barack Obama will operate within existing political confines. Even failures of the Right will be softened by the framework they have already built.

Too many people see themselves in a rising boat, even if it’s a dinghy. Much of the population matured in an era where no other message was communicated. The reality is that thought leaders are seeking a permanently impoverished working class to provide cheap labor for capital. Until I hear the opposition articulate that far and wide, I will remain skeptical that change is on the horizon.

Summer Reading

Finally, I’m back to reading what I want. It should be a good summer.

Adams, Brooks. The Law of Civilization and Decay: An Essay on History. New York: The Macmillan Company, 1897.

Adams wrote one of the first and certainly one of the most influential essays justifying American imperialism (TR cited him.) He and Turner set the tone for American expansionist thought.

Banks, Iain M. Consider Phlebas. New York: Orbit Books, 2008.

I have already started this one, picking it up after hearing about Banks’ cancer. Sadly, he died this week.

Bradbury, Ray. Zen in the Art of Writing. Santa Barbara: Joshua Odell Editions, 1996.

I need to nurture my creative side a bit.

Foucault, Michel. The Birth of Biopolitics: Lectures at the Collège de France, 1978-1979. Edited by Michel Senellart. Translated by Graham Burchell. New York: Palgrave Macmillan, 2008.

For my Great Famine paper. This may require copious amounts of vodka.

Martin, George R. R. A Game of Thrones: A Song of Ice and Fire: Book One. New York: Random House Digital, 2003.

I’m hooked on the series. It’s time to try the books.

Nally, David P. Human Encumbrances: Political Violence and the Great Irish Famine. Notre Dame: University of Notre Dame Press, 2011.

Nally is a geographer and Reader at Cambridge. He wrote a compelling, if sometimes problematic, article on biopolitics and the Great Famine. I’m excited to see what he does with this monograph.

Vance, Jack. Tales of the Dying Earth. New York: Tom Doherty Associates, LLC, 1998.

Jack Vance was described as the greatest science fiction author you never knew. This is a volume of four novels. You can find me at the lake wall…

Williams, William Appleman. The Contours of American History. New York: Verso, 2011.

Williams was a brilliant and outspoken diplomatic historian. Contours and The Tragedy of American Diplomacy have both been reissued for their 50th anniversaries, with new introductions by Greg Grandin and Andrew Bacevich, respectively (both writing for The American Empire Project.)

Is Redemption Only for the Powerful?

Three news stories collided in today’s expression of serendipity. As they rolled across my desk over the course of several hours, I had to ponder the nature of redemption in the United States.

Jonathan Turley reported on the case of Kaitlyn Hunt, a high school senior in Florida who was dating a younger girl. Unfortunately, Kaitlyn is eighteen, and when the girlfriend’s parents found out about the relationship they filed charges. Kaitlyn now faces felony charges and the near certainty of a lifetime on a sex offender registry. The prosecuting attorney is refusing to consider any lesser charges.

In Senate debate today David Vitter introduced an amendment to the Farm Bill that would bar certain felons from receiving SNAP (Food Stamp) benefits for life. People who got into trouble in their youth and served their time would be excluded from basic sustenance if they fell on legitimate hard times in the future. Consequently, their children and grandchildren might also suffer. Democrats did not contest the amendment.

And last, and least, Anthony Weiner launched his campaign for mayor of New York City in the dead of night, apparently hoping to miss the morning tabloid cycle. ‘Nuff said.

In the United States we are really bad at accepting the idea of redemption. We used to punish people for their crimes, accepting that crime was a part of life. The idea of rehabilitation is fairly new, younger than our country itself. Today incarceration has mutated into some grotesque beast: no punishment is too harsh, and the sentence itself is never deemed sufficient to deter possible future malicious behavior. At least, in certain cases…

Kaitlyn is a good student who has been active in her school. Because she had consensual sex with someone else in her school, the State of Florida is seeking to deprive her of liberty and the pursuit of happiness. I use those phrases because, no matter how you might define the pursuit of happiness, being placed on a sex offender registry for the rest of her life will prevent her from accumulating property and wealth. It will damage her ability to earn income. It will create six decades of hardship, all because as a high school girl she had consensual sex with another high school girl.

The Farm Bill is also a rich example because the amendment to deny SNAP was submitted by David Vitter, the Senator who cheated on his wife with a prostitute while wearing a baby diaper. What David Vitter asked for and received from his constituents (and possibly his wife) is something he wishes to deny people who don’t have power. People who might have made a mistake, but have paid their debt. People who might have been treated harshly because of their economic status or race.

Anthony Weiner wants another chance. It is unknown if the voters of New York City will oblige him, but he has already been forgiven enough to use his past position to earn hundreds of thousands of dollars consulting for large corporations doing business in Washington, D.C. So in nearly every sense, Weiner received his redemption.

I sit at my desk tonight and wonder if this is the kind of society we are all comfortable with. Because it really disturbs me.

The Legacy of Vincent de Paul

On June 16th I will graduate with an M.A. in History from DePaul University. It has been a good experience, but it was with consternation that I read an email from Rev. Holtschneider, the President of the university, pitching a plan for a new basketball stadium at McCormick Place in Chicago. The plan is an irresponsible allocation of public resources, and enables city leaders to “Christmas shop” while ignoring serious fiscal issues in the city.

The numbers seem to vary depending upon who you read, but the total cost is being estimated between $210 million and $300 million. It would include the 12,000-seat stadium, hotels, and street-level improvements for restaurants and shopping. DePaul would contribute $70 million in order to build a “first-rate college [basketball] program.”

DePaul makes this investment for several reasons. College basketball presents high-impact opportunities to promote our reputation. Alumni find the broadened name recognition a help when competing for jobs nationally. This first-class facility and its more central location will help us build on the momentum our basketball program has enjoyed in recent years from hiring first-rate coaches and staff.

Rev. Dennis H. Holtschneider, C.M.

This is extraordinarily confusing for a number of reasons. First, DePaul’s basketball program has languished since its glory days under Ray Meyer in the 1980s. Last year the team went 11-21, and its record over the past five years is barely better. Departmental budgets have been cut to the point where professors have limited ability to print and photocopy classroom materials. Academic conference budgets are non-existent.

The city of Chicago is a clusterfuck. Police, fire, public health clinics, and sanitation are all under strain. Schools are closing and Mayor Emanuel is pressing for school privatization. Property and sales taxes are very high and corporations are cutting deals to stay in the city in exchange for huge tax breaks. The neoliberal experiment is embraced with abandon but failing miserably.

The only people who think this is a good idea are those pushing it. Rev. Holtschneider “expects this project to produce 3,000 to 5,000 permanent jobs, along with 5,000 construction jobs during the building phase.” I cannot begin to imagine how that might happen, and neither can the urban planning consultants interviewed by the Chicago Sun-Times (the story is behind a paywall). City leaders are faced with the law of diminishing returns: Chicago is already a vibrant tourist and convention destination, and DePaul is already selling tickets at its current arena in Rosemont. Adding more capacity in the city is nothing more than a shell game that will add little to the overall economy (the mayor of Rosemont is already in Springfield lobbying for concessions).

While I appreciate the many conflicts that exist in the administration of a large organization like a university, I am disappointed that DePaul University is contributing to a development plan that will draw financial resources away from other necessities and make taxpayers responsible for more municipal debt. Ideally, Rev. Holtschneider should oppose the plan entirely and speak to the crisis of our public school system. At the very least, he should not contribute $70 million to a project that — like so many others in cities around the country — promises to be an under-utilized public asset for decades.

Although I am not Catholic (or even religious, for that matter) I have enjoyed learning about the life of Vincent de Paul, a seventeenth-century priest who founded a charitable order, counseled kings and queens, and seemingly lived the life of Jesus Christ through humility and good works. As a practical man, Vincent might have stood by while such a project was constructed, but he also would have counseled against depriving the population in order to achieve it.

The Future of the Footnote

I have considered the function of the footnote and the opportunities afforded by digital technology since entering graduate school. Recently, a brief exchange on Twitter with @Jason_M_Kelly and @lostinhistory prompted me to commit some ideas to “paper.” This is just a beginning, as I am sure there will be much more to add.

The footnote has evolved since the days of Edward Gibbon, when it achieved the status of high art. Today it is certainly treated with less deference, although many authors are skilled at adding tremendous depth to their work with the footnote. Its primary role is to cite source material. Linking the wider community to sources provides not only accountability but also information that leads to an exchange of ideas. Footnotes also provide space for a counter-narrative. This may be less understood outside of the academy, but authors will typically use the footnote to address arguments that fall outside of their thesis. In this manner the footnote advances debate and illustrates historiography. Finally, at least for our purposes today, the footnote acts as a social network. Before the days of inexpensive global travel and international conferences in desirable destinations like Tallahassee, Florida, footnotes provided a way to answer the concerns of colleagues. This is a critical, albeit logistically outdated, function. New technologies provide an incredible opportunity to rethink the way we approach scholarship.

Coming from an Information Technology and business consulting background, I view the progress of global networks, cheap storage, cloud computing, hardware, and bandwidth as adequate precursors for rethinking the footnote. A new paradigm can maintain academic rigor while advancing the dissemination of information throughout the community. All of this can take place in an open source environment, flattening the current publication and distribution hierarchy by placing new toolsets and communication channels in the hands of content creators.

Let us take a glimpse at how the three roles of the traditional footnote might change.

Since citation is so important, new ways of identifying source material should be envisioned. While the URL is a beginning, and electronic analogues of paper documents (like the PDF file) include them, current technology makes it possible to embed source material into the note itself. If my thesis rests upon the interpretation of an eighteenth-century document, then presenting a digital rendering of the actual document is far superior to referencing its archive location or a published transcript. Original text, three-dimensional renderings of material culture, comparisons of multiple editions, and audio/video/still photography can all be reproduced with higher quality and less expense in a digital format than on paper. This means that archives and libraries are going to have to step up their game at digitizing sources, and licensing frameworks like Creative Commons will have to be codified. However, none of this will happen if the academy does not take a firm position and champion the effort.
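To make the idea of a note that carries its evidence with it a little more concrete, here is a minimal sketch of what such a record might look like as a data structure. Everything here is hypothetical — the class names, fields, and the sample identifier are illustrative assumptions, not an existing standard or system:

```python
from dataclasses import dataclass, field

@dataclass
class EmbeddedSource:
    """A digital rendering attached directly to a note:
    a facsimile image, a 3-D model, an audio or video clip."""
    media_type: str   # e.g. "image/tiff", "model/gltf+json", "video/mp4"
    uri: str          # ideally a persistent identifier (DOI, ARK), not a bare URL
    license: str = "CC BY 4.0"  # assumed default; any open license would do

@dataclass
class DigitalFootnote:
    """A footnote that embeds its source material rather than
    merely pointing the reader to an archive shelf."""
    citation: str                                 # the traditional reference
    sources: list = field(default_factory=list)   # EmbeddedSource objects

# A hypothetical example: the citation keeps its familiar form,
# while the facsimile travels alongside it.
note = DigitalFootnote(
    citation="Letter of 14 March 1788, Smith Family Papers, Box 3, Folder 12.",
    sources=[EmbeddedSource("image/tiff", "ark:/12345/letter-1788-03-14")],
)
```

The point of the sketch is only that the citation and the rendering become one object, so a reader (or a reading application) never has to leave the note to inspect the evidence.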

Although technology is a laggard in my field of History, there are initiatives to create online communities.[1] H-Net is a good social media platform, but it is still a listserv, which is 1990s technology. The digital footnote is begging to be a platform for scholars to engage each other in an ongoing, dynamic conversation. Why should the notes from a first edition remain static? If scholarly conversation occurs after publication, as it most surely will, there is no technological impediment to reflecting that dialog in the notes. The inclusion of book reviews, antitheses, or new research provides a more robust intellectual environment to engage the audience and continue the important function of illustrating historiography.
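The notion of a note that keeps accumulating scholarship after publication can also be sketched in code. Again, this is a toy model under assumed names — the classes, the append-only design, and the sample annotation are all hypothetical illustrations of the idea, not a real platform:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Annotation:
    """One contribution to the ongoing conversation around a note."""
    author: str
    posted: date
    text: str

@dataclass
class LivingFootnote:
    """A note whose first-edition text is fixed, but whose scholarly
    conversation continues to grow after publication."""
    first_edition_text: str
    annotations: list = field(default_factory=list)

    def annotate(self, author: str, posted: date, text: str) -> None:
        # Later scholarship is appended, never overwritten, so the note
        # preserves its own historiography in chronological order.
        self.annotations.append(Annotation(author, posted, text))

# Hypothetical usage: a review published after the first edition
# becomes part of the note itself.
note = LivingFootnote("See the state-of-the-field essay cited above.")
note.annotate("A. Reviewer", date(2014, 6, 1),
              "A subsequent monograph revises this reading.")
```

An append-only design is deliberate here: the first edition stays static and citable, while the annotation list records how the debate evolved — the book becoming its own social history.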

Imagine the value of a work like The Federalist, annotated throughout the life of our nation by scholars who applied their worldview and theoretical framework, sitting on your tablet device as a data-driven archive.

I am not suggesting a mere comment thread, although that could be a separate environment in which an author moderates an ongoing conversation with students and the public. Such a casual space allows the book to become its own social history. Even if this is not desirable, the important thing to remember is that it is possible. Authors can tailor their work to extend as far beyond their academic field as they wish (for survey classes or a non-academic audience).

The footnote’s role in providing a counter-narrative also has great potential in the digital age, providing the ability for an author to draw an entire state-of-the-field into a single location. The presentation of such a dialog is no more limited than what we have already explored, and I am quite certain that others will exceed my imaginings.

To return to Jason’s point, the eBook needs to become more than an analogue of the printed book, and the footnote must also evolve. I see a web of scholarship connected through the footnote, controlled through an app and refreshed from the Cloud. The “book” itself is merely an entry point into a larger corpus of knowledge. A colleague told me that “footnotes are magical. There is nothing not to love about footnotes.” It is time to liberate the footnote from the bottom of the page and place it in a space that incorporates the historical event as well as the ongoing conversation about history.

I may stand in need of some apology for having used, without scruple, the authority of Constantine Porphyrogenitus, in all that relates to the wars and negotiations of the Chersonites. I am aware that he was a Greek of the tenth century, and that his accounts of ancient history are frequently confused and fabulous. But on this occasion his narrative is, for the most part, consistent and probable; nor is there much difficulty in conceiving that an emperor might have access to some secret archives which had escaped the diligence of meaner historians.

Edward Gibbon, The Decline and Fall of the Roman Empire


[1] My comment does not consider the tremendous amount of work done in the digital humanities, which I don’t consider germane to this conversation. In this context I am only referring to social networking. No hate mail, please!