The Trauma of Theory: A Cautionary Tale

I had my first run-in with literary theory in the spring of my freshman year. I was halfway through my first college English class and thought I knew everything; I figured that because I’d read Paradise Lost and was increasingly able to follow along when I heard graduate students talk about their work, I’d be able to listen to a faculty member I knew give a paper on a panel concerning a topic in which I was interested, and know when to smile and nod. I let some people talk me into attending this panel, and I knocked off my work-study job to stand in the back of an overflowing auditorium, full of optimism and full of myself.

And boy, was I mistaken. Not only did I not understand the poem the speaker was discussing when she passed around photocopies of it; I didn’t understand a single word she said about it. I don’t remember, today, what the title of the talk was, or what argument she might have said she was intending to make; I only remember blank incomprehension, and confusion, and shame. I remember becoming increasingly worried and upset as I failed to grasp anything, failed to understand why what the speaker was saying was important to an understanding of the poem, failed to nod or chuckle with the rest of the audience. I ducked out before the end of the panel, too ashamed of my lack of understanding to drink the coffee, pick over the fruit tray, and say hi to the people in the audience whom I knew. I went home and cried. Though surely no one in the audience even noticed me, much less knew how confused I was, I felt as if I’d been exposed as a pretentious fool, and I realized how ridiculous I’d been to think that half a semester of intro lit could have prepared me for the rigors of professional literary criticism, or indeed the realities of the professional academic world. A few English classes and theory talks later, I have learned enough to watch the people in the audience who I think are clever and nod when they nod; I have learned to stay for the fruit tray and let myself be introduced to people no doubt wondering what this awkward undergrad was doing at their talk; every so often I can grab hold of a sentence out of the paper which relates to something I’ve read or learned from a class, something which reminds me that the speaker isn’t talking in a foreign language after all.
And I have come to accept that, as an undergrad, as not even an English major, as someone of merely average intellect who hasn’t read the theorists the academics make use of in their talks, there is no reason why I should understand the strange language they speak, their inscrutable methods of making sense out of a text, which to the uninitiated sound all Greek (or perhaps all French, given the context, except that I actually do understand French, and what they say doesn’t sound like any of the French I know). Even if I can cope, now, with this incomprehension—enough to keep masochistically putting myself through the routine, in the hopes that someday I will understand—that afternoon at that first panel remains one of the most frightening and embarrassing moments of the first half of my undergraduate career. For someone like me, whose sense of self-worth is rooted nearly entirely in the degree to which she’s taken seriously by professional academics, there is nothing quite so awful as having it so matter-of-factly demonstrated what an outsider she is.

I was reminded of this episode today not only because, with twelve days to go until I’m back on campus, I can think of nothing other than the academic world, but because I read Adam Kirsch’s brief obit of Frank Kermode in Slate. Kermode is one of the people whose names have entered my sphere of awareness through the academic conversations on which I habitually eavesdrop; as with so many such names, I’ve never actually read his work, a fact which never fails to produce a distinct feeling of shame. The point, however, is that I can’t comment on Kermode’s views of the state of literary criticism today except through Kirsch’s interpretation of them, which will no doubt expose me as a charlatan far more obviously than my failure to understand theory talks does; what Kirsch says does, however, have some bearing on that very problem. According to Kirsch, Kermode expressed considerable concern about the inaccessibility and hyperspecialization of literary theory, and the modern habit among scholars of literature of keeping the public (like me) unable to understand what it is they do—due, I suppose, to their reliance on a particularly inscrutable and difficult set of secondary literature. Kirsch pays tribute to Kermode’s status as a consummate generalist and a popular critic in the London Review of Books (which he helped to found) and other publications, presenting this manner of practicing lit crit, now a dying breed, in favorable contrast to the theorists.

And, well, it’s difficult not to sympathize with this perspective. As cognizant as I am that my failure to understand theory is probably due either to my own stupidity or to my lack of initiative in studying on my own the fundamental theory texts which would help my understanding of that world, I must to some extent think that the sense of alienation I feel isn’t entirely my fault. I’ve taken a number of English classes for someone who isn’t a major, have dabbled in theory, have done my best to understand what it is my friends and my colleagues in my sister department do. And I have come to believe in the relevance of theory to understanding our world: when it’s explained simply enough for undergrads to understand, I’ve gotten excited by it; I’ve seen firsthand the transformative power of, for example, queer theory on a queer person’s understanding of hirself and the world, and that’s a good thing. But I do find myself agreeing with Kirsch (and perhaps Kermode, though as I said, I don’t have a good sense of how much Kirsch is quoting Kermode, and how much he’s offering his own take) that when the academic practice of literary criticism and theory so insulates itself from the world of people who don’t have advanced degrees in the subject, we have a serious cultural problem on our hands—one which matters a great deal.

But why does it matter so much? After all, one Ivy-League-brat-with-self-esteem-issues’ self-absorbed feelings of alienation are probably not that important in the scheme of things. Recasting the language of literary criticism such that someone who hasn’t read a single post-structuralist could still engage with the process of thinking about literature won’t help to eliminate world poverty and hunger or stop global warming or bring relief to the flood victims in Pakistan. But a citizenry which sees the practice of humanistic inquiry as worth its time could restore reason and civility to the political sphere. It could find in itself a desire to reinvest in education and the arts in the name of the next generation. It could, regardless of whether there is such a thing as narrative or such a thing as reality or such a thing as authorial intent, become interested in scrutinizing the claims of politicians and pundits who take even more fast-and-furious approaches to Truth than do literary critics. Because, see, the fact is that we need the humanities. The practice of the close study of texts makes us better citizens, better thinkers, perhaps even better people. But if that study is not just hidden in an ivory tower, but hidden behind a wall of words, it’s going to be very difficult indeed to make the case for its survival to a public which cannot understand what it is that humanists do.

Of course, it would be lovely if we lived in a world in which people said, “I do not have the knowledge or cultural capital to understand your work or the culture in which it exists, and yet I will take your word for its importance.” But, as we all know from reading daily news which attests to the systematic defunding and vocationalizing of higher education, this is not the world in which we live. We live in a world in which intellectual culture must be rigorously defended as a good in itself, and in which a discourse which can bridge the gap between the closed circle of the academic conference panel and the larger Western culture of anti-intellectualism has yet to be outlined. In order to do this, it seems to me necessary to rethink academic culture into something which is not dedicated to separating insiders from outsiders, and to rethink literary studies in particular into something which does not reward mere inscrutability, or punish and shame those who are not members of the club. This is not to say that theory has no place in the practice of understanding the world and its texts (or films, or music, or art, or culture), but rather simply to point out how difficult it will be to make a case for the humanities going forward, if the Frank Kermodes of this world really are such a dying breed. We have our work cut out for us—and I especially. Not only do I need to begin to consider what it means to belong to the next generation of humanists still in the process of learning what it means to be engaged in this project of understanding the world through its texts; I need also, I feel, to do the reading and listening necessary such that I can loiter unseen in the back of an auditorium, listen to a scholar speak, and not feel quite so hopelessly, shamefully left out of a culture in which I want so desperately to be taken seriously and to belong.
Once I feel I have moved beyond the stage of twenty-year-old charlatan, perhaps I can start to articulate a humanism I can call my own—but is it too much to ask that the theorists should meet me halfway?

QOTD (2010-08-25)

New York City Mayor Michael Bloomberg’s speech at the Gracie Mansion iftar brought tears to my eyes.

A few quotes worth highlighting:

Islam did not attack the World Trade Center. Al-Qaeda did. To implicate all Islam for the actions of a few who twisted a great religion is unfair and un-American. Today, we are not at war with Islam. We are at war with Al-Qaeda and other extremists who hate freedom.

Freedom and tolerance will always defeat tyranny and terrorism. And that’s the great lesson of the 20th century, and we must not abandon it here in the 21st.

This is a test of our commitment to American values, and we have to have the courage of our convictions. We must do what is right, not what is easy. We must put our faith in the freedoms that have sustained our great country for more than two hundred years.

There is nowhere in the five boroughs of New York City that is off-limits to any religion. And by affirming that basic idea, we will honor America’s values, and we will keep New York the most open, diverse, tolerant, and free city in the world.

This weekend is the anniversary of Martin Luther King, Jr.’s “I Have a Dream” speech at the August 28, 1963 March on Washington for Jobs and Freedom. Mayor Bloomberg’s remarks are in the best tradition of spiritually compassionate calls for tolerance, equality, and civil liberties which Dr. King epitomized in 1963. This is an American rhetorical and ideological tradition stretching back through the abolitionists, through Jefferson, and across the continent, though there is an argument to be made that it is New York City which best represents this spirit of freedom and inclusion:

Not like the brazen giant of Greek fame,
With conquering limbs astride from land to land;
Here at our sea-washed, sunset gates shall stand
A mighty woman with a torch, whose flame
Is the imprisoned lightning, and her name
Mother of Exiles. From her beacon-hand
Glows world-wide welcome; her mild eyes command
The air-bridged harbor that twin cities frame.
“Keep ancient lands, your storied pomp!” cries she
With silent lips. “Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!”

I have devoted a lot of Facebook time these past couple weeks to promoting the voices of American freedom and acceptance which seek to counter the misguided voices insisting that lower Manhattan is not the place for a Muslim community center. But you don’t have to be a politician whose words make it into a newspaper or onto YouTube to dispel intolerance—all you need to do is to remember that, from some vantage points in lower Manhattan—perhaps the top of the new Freedom Tower will be one of them—you can see the Statue of Liberty.

Daddy, What Did You Do in the Age of Late-Capitalist Decadence? or, Some Thoughts on Cultural Criticism for a New Generation of Mad Men

A few weeks ago, my former Campus Progress colleague Ned wrote a blog post which he titled, “The Left’s Poverty of Good Cultural Criticism.” I commented on the post in a state of some bemusement: after all, I waste vast quantities of time reading a lot of very good cultural criticism coming more-or-less from the left on a weekly basis. What do the New York Review of Books, the London Review of Books, the Observer, Harper’s, the New Yorker, the Atlantic, the New York Times Magazine, the Village Voice, the Paris Review and countless other publications do, after all, if not showcase the best of popular left-leaning criticism in the U.S. and Britain today?

It turned out, however, that Ned and I were arguing at cross-purposes. Ned’s post was spurred by his dissatisfaction with criticism of the TV show Mad Men, which he and others of my generation of writers/journalists find narrowly focused on the show’s historical accuracy at the expense of more literary criticism of its narrative, its character development, etc. Now, Mad Men is not a show which interests me; I stopped following it partway through the first season, and so haven’t read much of the discussion of the show which proliferates on the Internet. But Ned’s and others’ objections to that discussion, irrespective of the rest of the state of modern general-audience cultural criticism, lead me to wonder if there might be a generation gap at work here. Sean Wilentz’s forthcoming book on Bob Dylan, for example, while promising to be a masterful piece of cultural, musical, and historical writing, seems as if it will speak primarily to those who have inhabited a certain set of historical moments beyond the ken of those of us in our early twenties. Prof. Wilentz’s book will serve an educative purpose for us GenY-ers, not entirely dissimilar to what we might get out of the late Tony Judt’s series of memoirs for the NYRB, or a Paris Review author interview. I can’t speak for my colleagues, but these are the sorts of writing I read for their specifically educative purpose: they inform me about a middle-to-highbrow intellectual culture in which I am neither old enough nor educated enough nor sophisticated enough to participate, and they stand apart from, say, the Guardian’s attempt to explain Katy Perry to its readers, or, particularly, last week’s NYT Magazine article about “emerging adults”—that is, us.
Those of us who are still in college or have recently graduated from it, who don’t have the critical soapboxes our elders do, turn to the culture pages of newspapers and magazines to find our elders helpfully explaining to us the cultural world of their youth and young adulthood, or striving to explain the lifestyles and cultural touchstones of a new generation of young adults in ways which can unfortunately wind up merely alienating those very same young adults, so sure are we that those of our parents’ generation have fatally misunderstood our world.

I am inclined to think that these generational gaps are at times overstated, because I am firmly wedded to the belief that culture moves in cycles; I also believe that by striving to understand the cultural context of previous generations, we can help them to understand ours. I am also pretentious, and a bit retro, and perhaps I myself live in too much of a bubble to really engage with the cultural context of my generation and see why members of our parents’ generation might not be getting it. Nevertheless, it is not difficult to see why someone who lived through the era which Mad Men aims and claims to depict might have a very different reaction to it than someone slightly younger who grew up when identity politics and the culture wars were at their height; or why someone my age, reaching adulthood in a cultural context which aims both to synthesize and to reject entirely these two preceding milieux, might find frustrating a reading of Mad Men which focuses predominantly, say, on the show as a concretized version of memory; or on the race, class, gender, and sexuality politics of the show’s world; or even on the classical Marxist critique which I think the show most desperately demands. It’s not difficult to see, I think, that any of these frameworks might be unwelcome in the eyes of a younger generation of critics—a generation which learned historicist, Marxist, feminist, queer, etc. critical methodologies in its English or philosophy or gender studies or cultural studies classes in college, and is understandably looking for a critical stamp of its own to leave on the popular culture—the online meritocracy making this a more urgent task, since none of us needs entrée into academe in order to do this.

What, however, would such a critical stamp look like? I have to confess that I’ve no idea—perhaps I am too much of a historian-in-training, and too much of a traditionalist, to be the person to consider this. My attempts to engage with criticism of the popular culture have not really departed dramatically from the techniques I’ve learned in my classes; my own ideas for an article to submit to my Journal of Popular Gaga Studies don’t particularly deviate from the much-trodden ground of a standard queer-theory framework. I know, as I lightly said to a friend when ze told me that ze was confused about hir sexual orientation (gender-neutral pronouns to preserve confidentiality ftw!), that we’re all supposed to be post-labels nowadays. But I’m not sure what that means, actually—other than an apparent lack of interest in devoting one’s discussion of Mad Men entirely to its gender politics.

Whatever form this post-labels, forward-thinking criticism takes, however, I hope that it will only shape itself after due consideration of its predecessors, of history, and of the culture, highbrow as well as popular. I hope that it will be shaped by young critics who read, in addition to blogs and Twitter, the NYRB as well as Rolling Stone, and I hope that it will prove capable of engaging with written as well as visual media. I also, as always, hope that it will do its work both inside and outside the academy. My generation presently bears the burden of forging a new intellectual left which can grapple with the problems which plague our states, our communities, and our cultures, and it can do that neither solely from within the ivory tower nor without the ivory tower’s help at all. I hope, too, that out of the hundreds, possibly thousands, of undereducated idiots like me writing blog posts about Criticism as if we know what we are talking about (news flash: we don’t), the media circus will find it within itself to highlight views considered and informed rather than sensationalist and needlessly polemical.

Yesterday, I finished reading Tony Judt’s valedictory book, Ill Fares the Land, and so I find myself thinking about these matters of generational succession, and of the task now set before us “emerging adults” to create not just the political, social, and economic, but also the cultural and intellectual world we want to live in. Perhaps this was not at all the intended takeaway of Ned’s post about Mad Men, but I find myself thinking, this morning, that we can and should listen to the advice of our elders—about how to rebuild social democracy, or about how to watch a television show. Before too long, however, this will be our world, and so, more importantly, we must begin to build the intellectual framework which will best allow us to take our place in running it. As to how to do this? Well, I certainly hope the strategies will evolve organically, because I don’t think it’s a question any number of years of higher education, or any number of vote-with-your-mouse pageviews, could answer.

A Word From Your Friendly Neighborhood Peer Academic Advisor

As I spend more time reading the professorial blogosphere, I find myself more frequently tempted to comment on academic questions I, as a college junior, am far too underqualified to have an opinion about. I may be awfully opinionated, but you should probably listen to the professionals if you actually want to learn anything useful about life in the academy. That said, though, I was just barely (and bureaucratically) ineligible to become a peer academic adviser for freshmen in my college this year, and I have plenty of thoughts about what I did right and wrong in my first two years of university that may be worth sharing with my now-nonexistent advisees.

In the spirit of Historiann’s recent post about undergrad satisfaction and regrets, Tenured Radical’s advice to faculty academic advisors (no, I still don’t know whether “advisors” or “advisers” is correct, so I’m using both), and the multiple letters I’ve already gotten this summer from frosh and sophomores who want some advice on choosing classes; and in order to offer a more constructive tone than that of my whiny distribution requirements post of a couple weeks ago, I offer here some thoughts directed at first- and second-year university students trying to navigate a new academic world. These thoughts are probably better suited to academically serious students for whom college is more about learning and intellectual development than it is about anything else (not to say that’s what college has to be; some students feel that way and some don’t), but I don’t see why they shouldn’t apply to anyone concerned about making the right choices and learning to decipher academia.

Listen to the experts. As Tenured Radical indicated in her post, the online rumor-mill is of limited use in determining which classes to take, especially if you’re looking for good classes and not just easy or fun ones. But many’s the time I’ve ignored the advice of a professor or grad student who knew me, knew the person teaching the class, and knew that I wouldn’t find the professor or the material a good fit. Work on politely phrasing questions such that you can ask a professor not what she thinks of her colleague’s teaching, but whether her colleague’s class would be a good fit for you. And if she says it wouldn’t, pay close attention to that recommendation.

Keep an eye out for professors’ names. Often I’ll ask frosh who’s teaching a particular class, and they’ll say they’ve forgotten the professor’s name. But a class taught by a fantastic professor, even if its topic is outside your immediate area of interest, is a better use of your time than a class in your area of interest taught by an unremarkable professor, and so it’s advisable to remember those names. This is where listening to the experts comes in, as some sources on who the best professors are will be more reliable than others. If you’re torn between the professor and the subject matter, take the professor every time. And be aware that a lot of different people teach, e.g., Victorian literature, or the American history survey, or SOC 101, and you might want to wait to take the class until the best professor is teaching it.

Be honest about what you can handle. Starker than the social divide between undergrads and “sketchy” grad students is the divide between the humanities and the sciences. If you’re a humanities major like me, you probably grew up thinking that as a “humanities person,” you couldn’t possibly be any good at math or science. You may have picked your first semester’s courses thinking that since your SAT math score was on the low side, you couldn’t possibly handle a college-level quantitative class, and so you decided to sign up for the easiest quantitative class in the whole university, a computer science class whose syllabus explained quite clearly that it was going to repeat a lot of material you’d already learned in high-school computer science. (This may or may not have happened to me.) That syllabus, dear frosh, is a good indication that you’re not going to learn anything from the class, and that you should consider taking one which will teach you some new concepts.

This is not to say, however, that you need to choose the most challenging thing in all areas outside the ones in which you’re confident. To fulfill my lab science requirement, I took physical anthropology and environmental science: not taxing in the same way university-level physics or chemistry is, but nevertheless useful, interesting, and well-taught, and therefore not a waste of time. If, when you’re honest with yourself, it seems that it would take more work to pass intro physics than it would to get an A in your required departmental seminar, it’s probably best to leave yourself the time to get the A in your required departmental seminar.

Plan ahead. It’s probably just a tad neurotic to make a plan for what you’re going to take every semester for the next four years (which is not to say that I haven’t done it…), but you’ll find that it will benefit you to think farther ahead than the next semester. By the time you’re halfway through college, the number of course slots you have left will start to look increasingly finite (especially if, like me, you’re planning on a semester abroad), and you’ll find yourself having to make difficult choices between queer theory and colonial American history, or suddenly realizing that the course you’ve wanted to take since you sent in your matriculation forms is only offered in one of your four years. It might be worth looking through the course catalog, making a list of all the classes you feel as if you can’t possibly graduate without taking, and keeping an eye out for those titles every semester.

Start a new language. Obviously if you’re an engineer or premed or have three majors this is trickier, but college is really the best time in your life to start a language you missed when you were young, and I regret having only continued the ones I began in junior high and high school. You may want to think about which new language will help you most in your future areas of academic or professional interest, but studying a language for which you can’t see any possible “use” is still worth it, and “useful” for its own sake; I really regret chickening out of starting ancient Greek. Which brings me to my next point:

College doesn’t have to be vocational school. College students seem increasingly to be thinking of their bachelor’s degrees as discipline-specific professional credentials which will prepare them for specific career paths, or which just sound vocational (first-years of the world, academic economics is not the same thing as business or accounting!). There’s nothing inherently wrong with this, but you should know that there’s no reason to feel pressured to study something “useful” or something which has the same name as a profession. Not only can you have a successful professional life with an undergrad degree in any field, but studying what you love is important in and of itself. You should figure out which classes you enjoy the most and find most intellectually stimulating, and then continue to take those classes. You’ve got enough time to develop a career—right now, it’s time to learn how to think, and how to love to think.

This goes doubly for grad-school-bound kids. Just because you’re majoring in a not-usually-vocational subject doesn’t mean you can’t make it vocational by locking yourself into a path focused solely on grad school admissions and on making preparations to succeed in the professional world of academia. Your professors can advise you on what you need to do now to be prepared for grad school (and indeed whether you should apply at all), but it doesn’t hurt to distinguish undergrad from the rest of your life. Your undergraduate thesis is not a dissertation, your A- in a departmental seminar will not sabotage your chances of getting into a top program, and trying out courses across the curriculum won’t prevent you from being good at your intended field of study. You’ve got 5-10 years in grad school to become a specialist and to lose sleep over the job market; undergrad is not the right time.

Try out possible majors early. If your system is like the one at my school and you have to declare a major in the spring of your sophomore year, you’ll probably want to take introductory/survey lectures in a variety of different departments your first few semesters. In terms of figuring out what you want to learn about for the next few years and possibly longer, this kind of exploration is more important than knocking out core-curriculum requirements just for the sake of knocking out requirements. While I regret some choices I made in my requirement-juggling, pushing the philosophy and science requirements till junior and senior years in order to try out sociology and English was not one of them. By taking sociology early on, I avoided making a terrible mistake when I discovered that I actually don’t like data; by taking English early on, I found a second home which has enriched my study of history in countless ways. And, indeed, don’t just stick to intro classes: by making time in my freshman-year schedule for an upper-division history course, I came in through a back door which left me far more enthusiastic about the discipline than any subsequent intro-level course has.

However, there’s no need to take this selection process too seriously: your undergrad major does not determine the rest of your life. As per the comments about vocational education above, your undergrad major will probably have very little impact on what you do as an adult, even if you’re grad-school-bound. I know so many academics who have changed fields, it’s not funny—so study what you want to study right now, and let the rest follow.

Be skeptical about all-freshman programs. Your university is probably selling you a line about the “first-year experience,” and about how rewarding taking a freshman seminar would be, but I’ll be frank: a class entirely populated by first-years isn’t going to challenge you very much. This is not to say that just because you’re an academically serious student you’ll be better at college than everyone else in the class, but taking a lot of all-freshman classes, while less scary than being in classes with mostly older students, can limit your opportunities to seek out mentors among the older undergrads and grad students who, in my experience, will make the difference in your undergraduate education.

The bright side of special small classes for first-years, particularly if you’re in a field or at a university which doesn’t otherwise offer a lot of small seminars, is that they can get you in contact with faculty early on, which is much harder to achieve in intro lectures with hundreds of students. I became a research assistant for the professor of the freshman seminar I took my first spring. Helping him do archival research and organize his primary sources that summer not only convinced me I wanted to be a historian and, practically speaking, taught me a lot more about research skills than I’d gotten in my classes so far; it also gave me a lasting mentor on the faculty. Such opportunities are not to be sneezed at, and can be worth 12-15 weeks of not learning a whole lot from your classmates.

Only compare yourself to yourself. In my first year I wasted hours sobbing to myself about whether my comments in class discussion were as clever as my prep-school-educated classmates’, or whether I deserved to be at Princeton even though I couldn’t reference as many post-structuralists in casual conversation as some of my more pretentious classmates could. But I’ve learned not to worry: when professors evaluate your work, they’re not doing so on the basis of how frequently you can name-drop Lacan. As a first- or second-year, you cannot expect yourself to be as well-versed in disciplinary methodology or jargon as older students who have been in your department for a couple of years and have done a lot more work in the discipline. Just make sure that you’re consistently putting in the most effort and turning in the best work you can sanely manage, ignore the students who are obviously just bullshitting, and allow the ones who really know what they’re talking about to teach you how to talk the talk of a budding historian, or whatever it is you happen to be.

Have fun, but carefully. For the academically serious student, a creative non-fiction writing workshop is a good “fun” class, and a worthwhile addition to your schedule. A 450-person children’s literature lecture largely populated by jocky fraternity and sorority members who spend the entire lecture talking about their upcoming rager may be more frustrating than “fun.” (I’ve done both.) It’s not wise to take only the most challenging classes, especially if you’re taking more than the required number of courses/credits; you’ll burn out. An arts class in which you turn in a painting or a performance can be a much-needed change from a barrage of 8-10-page analytic essays. But “easy” and “fun” are very different things. You’ll regret “easy” halfway through the semester when you’re in discussion section, no one’s done enough of the reading to have a conversation, everyone’s checking Facebook on their phones, and the poor instructor has long since given up holding the entire class’s attention. You’ll find yourself wanting to check Facebook, too, and let me tell you: it’s all downhill from there. If you’re uncertain about whether a class will be “easy” or “fun,” ask for advice.

And the moral of the story is…

Talk to adults. When you start college, you’re still a kid. You think the way you were taught to think in high school; you’re unused to making decisions (whether academic or otherwise) for yourself; unless you’re an academic brat, you’re probably unfamiliar with the arcana of academic culture. Obviously, this is not your fault; it’s just the way things are, and at times academia can be a bit too impenetrable for its own good. But your next four years will be a lot more pleasant if you can crack the system, and it’s faculty and staff members, graduate students, and older undergrads who can help you make this transition both to adulthood and to an academic community. If you’re an academically serious student, regardless of whether you want to spend your life in academia, I can guarantee you that your life will be changed and your worldview will be opened if you allow your path to cross with those of older friend-mentors. Visit office hours. Accept dinner, lunch, and coffee invitations. (In my first year, I declined a coffee invitation from a grad student. I was shy and hadn’t yet figured out the social rules of meeting people for coffee, or that he was being friendly, not creepy. He could have been my friend, and I regret it to this day.) If you go to the sort of school where grown-ups eat in your dining hall and grad students and faculty members live in your residential system, sit down at their tables or knock on their doors. (If you don’t go to this kind of university, it’s certainly more difficult to meet grown-ups, but I’m given to understand it’s not impossible.) Ask them about your courses, but also talk to them about the books you’re reading, the things you’ve seen in the news, the brave new world you’re just beginning to puzzle through. Ask them about their work: you might discover a new area of interest. 
It’s not every four years that you’ll get the chance to live in a community populated by people in all different stages of life and intellectual development, and this is the most valuable thing you can get out of college. It certainly has made all the difference to my undergraduate education.

In fact, I think Tenured Radical’s academic-advising post made this point most effectively:

Needless to say, I made some spectacular errors in that first two years and had some great successes, all of which had to do with the opportunities and pitfalls of a large university. Would things have been different with a more attentive advisor? I doubt it. It wasn’t until, entirely by accident, I fell in with a group of graduate students and became invested in being regarded as — not a good student, but scholarly — that things straightened out for me.

This is actually the story of my life, so I feel qualified to endorse the strategy of seeking out mentors and not worrying too much about whether you’ve correctly distinguished one core requirement from another. Focus on having the time of your intellectual life and allowing your world to be opened and changed, and the rest will follow.

And dear readers, if you have any of your own advice for the Class of 2014, do leave it in the comments!

QOTD (2010-08-21)

The conclusion to Virginia Woolf’s “A Room of One’s Own,” which I finally read yesterday:

I told you in the course of this paper that Shakespeare had a sister; but do not look for her in Sir Sidney Lee’s life of the poet. She died young–alas, she never wrote a word. She lies buried where the omnibuses now stop, opposite the Elephant and Castle. Now my belief is that this poet who never wrote a word and was buried at the cross-roads still lives. She lives in you and me, and in many other women who are not here to-night, for they are washing up the dishes and putting the children to bed. But she lives; for great poets do not die; they are continuing presences; they need only the opportunity to walk among us in the flesh. This opportunity, as I think, it is now coming within your power to give her. For my belief is that if we live another century or so–I am talking of the common life which is the real life and not of the little separate lives which we live as individuals–and have five hundred a year each of us and rooms of our own; if we have the habit of freedom and the courage to write exactly what we think; if we escape a little from the common sitting-room and see human beings not always in their relation to each other but in relation to reality; and the sky, too, and the trees or whatever it may be in themselves; if we look past Milton’s bogey, for no human being should shut out the view; if we face the fact, for it is a fact, that there is no arm to cling to, but that we go alone and that our relation is to the world of reality and not only to the world of men and women, then the opportunity will come and the dead poet who was Shakespeare’s sister will put on the body which she has so often laid down. Drawing her life from the lives of the unknown who were her forerunners, as her brother did before her, she will be born. 
As for her coming without that preparation, without that effort on our part, without that determination that when she is born again she shall find it possible to live and write her poetry, that we cannot expect, for that would be impossible. But I maintain that she would come if we worked for her, and that so to work, even in poverty and obscurity, is worth while.

Happy birthday, 19th Amendment!

QOTD (2010-08-17), Continuity and Change Edition

In A Problem in Modern Ethics, Symonds, in his detailed discussion of German sexologist Ulrichs’ arguments for homosexual tolerance, reminds us just how little has changed in 120-odd years:

As the result of these considerations, Ulrichs concludes that there is no real ground for the persecution of Urnings [his word for men-loving men] except as may be found in the repugnance by the vast numerical majority for an insignificant minority. The majority encourages matrimony, condones seduction, sanctions prostitution, legalises divorce in the interests of its own sexual proclivities. It makes temporary or permanent unions illegal for the minority whose inversion of instinct it abhors. And this persecution, in the popular mind at any rate, is justified, like many other inequitable acts of prejudice or ignorance, by theological assumptions and the so-called mandates of revelation.

This fin-de-siècle argument is, stylistic markers aside, barely distinguishable from federal judge Vaughn Walker’s decision in Perry v. Schwarzenegger—whose unveiling the other week reignited a conversation about equality, tolerance, and the nature and role of religion and morals in a society which includes men who love men and women who love women. Since there has been a thing called “homosexuality” (or “sexual inversion,” or “Greek love,” or “Urningliebe,” or other terms more recognizable to a Symonds or an Ulrichs), those who make a life out of thinking and writing about it have tried to puzzle through its relationship to the rest of our society. And it is striking that, just as surely as a discussion of men’s love for men will before long come back to Plato (even if by a circuitous, modern, post-classics route), it seems it will come back to marriage and divorce as well.

I remember how surprised I was, a year ago at the Smithsonian, to see a 1963 issue of One magazine with the cover text “Let’s Push Homophile Marriage,” a political rallying cry which, though advanced six years before the advent of gay liberation, though using the vocabulary of a pre-“gay” era, sounded disconcertingly familiar to 21st-century ears. I am even more surprised to see marriage rear its head in the equal-rights discussions of the 19th century: has the movement really, in 120 years, not come so far as all that? Is marriage equality as a 2010 cultural touchstone really so close to the cultural touchstones of 1890 as to make the lasting accomplishments of gay liberation seem like an illusion?

Obviously the past 120 years have brought decriminalization and the eradication of sodomy laws, developments of 1980s Britain and 2000s America that were largely unthinkable in Symonds’ and Ulrichs’ day, however standard they already were in some European countries by the end of the 19th century. And yet despite shifts in public opinion and in the law, we seem to be having the same conversations, still unable to make up our collective cultural mind as to whether male homosexuality is a crime against nature or a psychological problem or just the way some people are; whether it’s fundamentally the same as or fundamentally different from heterosexuality; how the law should respect these categories and whether it should notice them at all; how we define male homosexuality as a cultural as well as a psychological category; and why, indeed, the hell it is that we get so exercised about male homosexuality while female homosexuality fades into the background! To the proverbial Martian anthropologist, our microbiological searches for the “gay gene” and endless academic psychological studies would likely seem as strange as Ulrichs’ obsessive taxonomizing of sexual behavior, Krafft-Ebing’s psychological-physiological quackery, or the touch of Freud about all their contemporaries’ early-childhood theories. We have come no closer than Symonds and his contemporaries to understanding why people are gay, and yet we seem just as wedded to an idea of the characteristic’s biological immutability as most homosexual-sympathetic sexologists of the late 19th century were. I suppose a good question to ask would be: why do we keep going around in circles? Why is it so difficult to construct a narrative of the history of homosexuality in which the arc of history bends as unremittingly towards progress as it’s supposed to?

And Christ, reader, have I really signed up to write a thesis about all this?

QOTD (2010-08-12), Thesis Research and Queer Theory Edition

We normally think of J.A. Symonds as one of the pioneers of a modern theory of male homosexuality, and my thesis presently (that is, until I change my mind again next week) hopes to discuss how Symonds’ work and life prefigured the gay identities of the 20th and 21st centuries. He was in many ways an extraordinary pioneer, and I don’t believe the importance of his work to modern queer scholarship has been fully realized. Sometimes, however, he says things which mark him as far away indeed from the mainstream of modern queer thought. For example, from Chapter 4 of his short 1891 book A Problem in Modern Ethics:

… as is always the case in the analysis of hitherto neglected phenomena, [German doctor and sexologist Casper’s] classification [of “congenital” and “acquired” sexual inversion] falls far short of the necessities of the problem. While treating of acquired sexual inversion, he only thinks of debauchees. He does not seem to have considered a deeper question—deeper in its bearing upon the way in which society will have to deal with the whole problem—the question of how far these instincts are capable of being communicated by contagion to persons in their fullest exercise of sexual vigour. Taste, fashion, preference, as factors in the dissemination of anomalous passions, he has left out of his account. It is also, but this is a minor matter, singular that he should have restricted his observations on the freemasonry among pæderasts to those in whom the instinct is acquired. That exists quite as much or even more among those in whom it is congenital.

By “freemasonry,” Symonds means the tactics “pæderasts” use to recognize each other (dress, ways of looking at each other, linguistic cues, etc.), which happen to be a major interest of mine, so that’s cool. But I’m actually far more intrigued here by Symonds’ endorsement of Casper’s inclination to divide the population of men-loving men into those in whom the trait is inborn or developed in early childhood, and those in whom it is “acquired” and to a certain extent voluntary. Such a taxonomy flies in the face of most of what we now take as standard in thinking about homosexuality, and it would be natural to dismiss it out of hand as late-Victorian wackiness. There are three reasons, however, why I think we ought to give Symonds’ raising of a “deeper question” more thought.

The first is a fairly straightforward historical-context point: it was commonly understood in 19th- and early-20th-century American and European culture that men not generally disposed to have sex with other men might do so in extraordinary circumstances when there were no women available—the best examples being sailors on long voyages, boys and young men at single-sex boarding schools and universities, and the still-common trope of the men’s prison. Hence the sense Casper and Symonds have that some men’s sex with men does not necessarily stem from any deep-seated physiological or psychological characteristic, even though–as Symonds says at the end of this chapter–“‘the majority of persons who are subject’ to sexual inversion come into the world, or issue from the cradle, with their inclination clearly marked.”

The second is a point of semantics and close-reading: at first glance the kookiest of Symonds’ suggestions in this paragraph is that “these instincts are capable of being communicated by contagion to persons in their fullest exercise of sexual vigour.” It’s a strange sentence, one which seems to embody all the worst quackery of Victorian medical “knowledge” and to bear a disconcerting resemblance to modern warnings about the “homosexual agenda.” But putting Symonds’ suggestion in a slightly different context changes its meaning: how many stories do we continue to hear day by day about adults who come out reasonably late in life? For every modern teenage boy who grows up watching Logo, attending Pride parades, and looking for porn on the Internet, there is a middle-aged man who takes half a lifetime to realize or to determine that he is gay–and sometimes, I would imagine, this is because he has a sexual experience which spurs him to connect feelings he’s had all his life to the larger concept of “homosexuality.” It seems this could be a modern way of phrasing the problem which Symonds raises: how will society taxonomize the man who, though heretofore “normal” (in 19th-century parlance), has sex with a man and determines himself an invert? How would someone’s identity and sense of self be reshaped by having a homosexual sexual experience? It does not strike me as surprising that Symonds, who is fundamentally concerned with ideas of identity and the parts of history, culture, medicine, language, etc. which compose an identity for the sexual invert, would find this question important.

The third point is larger in scope, involving broader implications of Symonds’ observations which I don’t have the queer theory to attack properly, but which I think are enormously important to understanding homosexuality both in its natal decade and today. In invoking “Taste, fashion, preference, as factors in the dissemination of anomalous passions,” I like to think of Symonds as raising a point I am fond of making: that there is a real division between physiological/psychological, immutable homosexual sexual orientation and the much more mutable entity commonly known as “gay culture.” Not all (male) homosexuals are participants in “gay (male) culture” (we can quibble about what that means, but regardless of what “gay culture” is, I think the point stands), while not all participants in “gay (male) culture” are (male) homosexuals themselves—I count myself in this group. And I think of a man from Los Angeles I met in a gay bar in Paris who seemed to be enjoying the efforts of a couple of guys to hit on him before confessing that he wasn’t gay himself. I think Symonds is right to point to “taste” and “fashion” as elements which make up an identity and a culture as much as anything immutable does, and I think this is something which all of us who consider ourselves interested in the matter of queer identities could do a little more thinking about.

I think I’ll end on the note that I personally would like to do a lot more thinking about this word “taste,” because it could represent the simple uncontrollable desire of sexual orientation, but it also quite obviously connotes a desire colored by preference. After all, homosexuality may not be a choice, but how many people are given to couching their sexual preferences (for partners of a specific race, for partners who prefer to engage in specific sexual acts, etc.) in this same language of non-choice? Sometimes orientational models of these things surface: the orientational model of dominance and submission is beginning to catch on in many circles, while we nearly always speak, both historically and today, of pedophilic desire as something uncontrollable, condemning in criminal terms only the acting-out of that desire. But I think the language of “taste” colors many discussions of sexuality beyond LGBT activists’ “being gay is not a choice” mantra, and considering it a factor “in the dissemination of anomalous passions” could help us develop still further our understandings of how queer identities are shaped. I would think that “taste” would be incredibly important to men like Symonds, who came to construct a theory of homosexual identity out of an intellectual trajectory heavily influenced by the British aesthetic movement, who joined Wilde and Pater in the study of Greats and the Renaissance, and who had much to say about art and its criticism. That Symonds’ origins in and study of this tradition related directly to his theory of homosexuality is something I hope to weigh in on in my thesis, and I hope that doing so can help us consider what bearing “taste” has on homosexuality today, removed as it seems from ancient philosophy and Renaissance art.

My Fellow Americans; or, What I Did On My Summer Vacation

Every few days, Steve Benen, of whose Washington-politics blog I am an avid reader, writes a post summarizing the achievements of the 111th Congress and of the Obama administration’s first two years. It can become tiresome, representing as it does the endless partisanship of Washington politics, the defensiveness with which those who find themselves supporting the Democrats must react to the new normal of Republican nihilism. Tired cynic that I am, I find myself just barely satisfied with a set of accomplishments meant to advertise the promise of progressivism and the suggestion that these days will—we are certain of it!—go down in history along with the New Deal and the Great Society. I possess not just a cynicism but a conservatism which hesitates to consign the new progressivism to the historical narrative of the old progressivism quite so very soon, and I resent being dragged into a reelection campaign against my will.

And yet there is something nevertheless comforting about listing your accomplishments and thinking that, after all, you’ve been more productive than you’d supposed. I had occasion to reflect on this point this morning, as I sat morosely staring into my first cup of coffee and letting my father quote Aristotle at me on the unattainability of perfection. My father is exceptionally talented at making sensible points about the nature of academic life (frequently with the aid of ancient philosophy) which I know, rationally, to be quite true. Listening to him make these points tends to cause me to alternate between frustration that I couldn’t have figured these things out for myself and resolve to reapply myself, after all, to achieving the elusive nine-hour day of sitting at a desk reading and writing. And so this morning, of course, I sat and read 75 pages of a novel and the new issue of The American Scholar, and this afternoon I made a cup of tea and am now at least sitting at the desk, albeit thinking that in order to find the motivation for nine-hour days, I need first to congratulate myself—à la reelection campaign, reelection to the post of academic apprentice for another academic year—on the accomplishments of a summer which started two-and-a-half months ago and which now has three weeks to go. Campaign season is in full swing, and there’s no more time to waste on staying up late watching BBC documentaries on the history of British art, or on reassembling Ikea furniture that I’d screwed up the first time, or on walking to the neighbors’ house to download practically gigabytes of Facebook updates and book-review RSS feeds. And so: instead of lapsing into schizophrenic GOP-style sabotage, outlining my failure to the American people (who are also, obviously, me), we might as well catalogue two-and-a-half months’ worth of stimulus, health care and financial reform, and so on. 
I should be able to concoct a checked-off checklist of which to be proud: after all, I don’t have to contend with the U.S. Senate.

****

This summer began what seems like a long time ago, with the proper vacation bit, the part where it was okay that I wasn’t putting in nine-hour days, and which was actually, in its way, the most productive of all. Not only did I get a valuable lesson in having fun, and travel north to lend a brief hand to one of the most welcoming communities and best local causes I know, but my world also got much bigger and brighter in the wake of my first trip to France, my first trip out of North America as an adult. In the process, somehow, I experienced the biggest surge of productivity I’ve had all summer: writing two good articles (one my valediction to Campus Progress, the other as yet homeless), and beginning to research grad schools and to become a bit more knowledgeable about the real world of professional history. Of course, I spent hours sitting in front of the computer, but not entirely steeped in book reviews and academic blogs—I was also wading through bibliographies and archival finding aids and Amazon and Google Books and Powells, looking for titles to add to what is now my “to read” list and will someday become the bibliography of my first substantive research project. As June stretched on, I told myself every day that when July rolled around, and I was sitting at a desk in the suburbs without the delights of Paris to distract me, I would marshal all the intellectual and physical reserves of a Princeton semester and read for my thesis.

But wow, perfection really is damnably impossible, isn’t it? Like a member of Congress who can’t quite look the American people in the eye, I seem to have read anything but thesis books, and found any number of things to fill my day other than the desk and the notebook. Even when bureaucratic incompetence caused my volunteer job to vanish into the ether, leaving me with great washes of unmanaged time, I found myself inefficiently lingering over too-detailed notes, reorganizing instead of writing, and every day updating my thesis log with a new lens through which to focus my study of the intellectual history of homosexuality, unable to settle down and research just one. My fellow Americans, it is hard to begin a substantial research project with only the two-week, fifteen-page paper as guidance—not quite as hard as generating support for a public option, perhaps; not quite as hard as withdrawing gracefully from land wars in central Asia; but hard all the same. I’ve spent six weeks dithering, skirting the edges, reading only a third of the books I’d excitedly checked out of the UCSD library in my first week of self-imposed summer term, and realizing last week that it had, stupidly, taken me over a month to notice that Eve Kosofsky Sedgwick had, in about three sentences of a book she wrote in 1985, rendered the thesis I’d thought I’d begun to research rather pointless. And so here we are: hence the dithering, hence the Ikea furniture and the book-club novel, hence my father’s Aristotelian pep talk over the morning coffee. Hence this afternoon’s decision to turn the academic life which I sometimes believe I am not qualified to undertake into a reelection campaign.

Because, you see, that was the opposition’s attack ad (“Rutherford is no historian! She hasn’t the capability for original thinking! She has never done more than naïvely parrot Sedgwick!”), and here is Rutherford Junior Year PAC’s response: By assuming that this summer was about the thesis of my thesis, my opponent misses the point. By assuming that this summer was a failure because I haven’t been gainfully employed, my opponent misses the point. My opponent has turned a blind eye to the accomplishments of this administration: from essays, articles, and memoirs written, to restorative and life-changing travel foreign and domestic, to the real work of the academic amidst all this.

You see, my fellow Americans, one of the most rewarding parts of my life this summer has been figuring out how to transmute my innate academic geekiness into something useful to other people’s lives—particularly in the links I post and the discussions which I moderate and in which I take part on Facebook. There are those in my life who have schooled me well in the possibility of taking Facebook seriously: it is where the audience is, the audience to which I hope to impart the values of civil discussion of current events and a respect for the liberal arts which the countries any of us live in would like to funding-cut into oblivion. My days this summer have been made not by the slow inching progress on my thesis, but by the friends who have written to me all summer to tell me that they read what I post on Facebook, and that it is worth reading. My gratitude for these words of gratitude is immense, because it suggests to me—as I have long suspected—that we are all starved for opportunities to read real things and to talk about real things. And it suggests to me that after too many years of being an outcast for choosing a lonely life of the mind, and failing at too many social situations to ever dream of building communities based in friendship first and the sharing of ideas second, I can actually put my talents and my predilections to use. It suggests to me that when I seek to tune out of politics so that I can spend more time making my own life into a metaphoric midterm election, I do not do so needlessly or even entirely selfishly: I do so because I like having people to whom to talk, people with whom to talk, and people by whom I can feel needed. It is these interactions, electronic as often as not, which provide for me the real proof that the life of the mind is no waste of time.

But they are not always electronic. In San Diego, I found myself becoming part of a new social circle of kids with whom I went to high school: some of its members I’ve known for a long time, but some of them I just barely knew, or had never met at all. I grew closer to all of them, though, in a few weeks of cultural excursions and late nights in coffeeshops of which I could never have dreamed when we were all still in high school, when I was a little less sure of myself and a little more apprehensive of social settings than I am now. Two or three times, in those languid summertime caffeine-fueled conversations, when I drank espresso and we made our Paris café right there in the cultural backwater of southern California, I caught myself, without realizing it, lapsing into my academic mode. Suddenly conscious that I was lecturing my friends about the history of gay identity politics, or had become the TA-like moderator of a political discussion, I would become embarrassed and step back—but not without first experiencing a little frisson of delight. Because I am never so much myself, and never so content with being so, as when I feel that I am teaching, and that my audience doesn’t object too much to being taught. And it is because of this that, despite my constant failure to achieve a nine-hour work day (I have now spent over an hour writing this post), I do not feel as if I have failed in my campaign promises of trainee scholarship.

****

When we make our cases for ourselves, we cannot admit to the world that we are—as is surely a philosophical truth—imperfect. Politicians do not schedule rallies only to tell their constituents that they will not be able to deliver on the promises they make; I think there was a part of me which believed when I left Princeton in June that I would write a book-length research project this summer. But in making the case for reelection—in calling upon all our physical and moral energy to take on the new academic year, and to make the case for fitness to do this for a lifetime—we must recast the debate. Determination of a candidate’s fitness should not be based in the partisan binary of what she has or hasn’t done, what she does or doesn’t believe, but rather in the presence of moral seriousness: the belief that she is acting not in the interests of lobbyists or in response to fleeting scholarly trends, but in pursuit of American ideals, in pursuit of knowledge, in a manner which demonstrates her caring for the next generation.

My fellow Americans, I have not written my senior thesis in the summer before my junior year. I have not, really, even started. But I have made great headway in understanding how to read the masters of my discipline and how to convey what I’ve read to those who haven’t; I have practiced in my own stumbling adolescent way the craft of public intellectualism, and I have begun to believe that research—as much as it will determine my career in the years to come—is absolutely the least important part of how my electorate (okay, fine, I) ought to judge my candidacy.

The 111th Congress passed the Affordable Care Act. I read a few books, and wrote a few essays. But the pollsters are asking the wrong questions. I would sooner they asked: are we good people striving to be better, committed to making the world a better place?

I leave it to the American people to decide.

We Must Cultivate Our Own Dancefloor; or, Thoughts While Listening to “Like a Prayer”

Look around: everywhere you turn it’s heartache
It’s everywhere that you go
You try everything you can to escape
The pain of life that you know
If all else fails and you long to be
Something better than you are today
I know a place where you can get away:
It’s called a dance floor, and here’s what it’s for…!

—Madonna, “Vogue”

I am sitting in a darkened bedroom in a big, empty house on a rural island in the Strait of Georgia, listening to Madonna’s “Like a Prayer.” I am enraptured by this song whenever I listen to it, and I have listened to it many times this summer: on a train to Rhode Island, on a plane to Paris, speeding down the highway from the San Diego suburbs. Each time I have the same response, which is that a great gush of emotion wells up inside me, and I can do nothing but raise my head to the sky, shout the lyrics (silently, if need be, as now), and form a fist which pumps in time to the beat. It’s strange how much I love this song and how few pieces of music move me more—I’ve come a long way from the civil rights anthems of the elementary-school car ride, or the Scottish folk ballads of the middle-school bus, or the Pink Floyd that got me through high school. Weirdly (I think), I’ve wound up through all of this at Madonna, despite my actual political, feminist objections to the Madonna phenomenon itself. Weirdly, there are only a few Pete Seeger songs, a few finales from Russian ballets and symphonies, which can command this much attention and passion from my soul.

It must have something to do with the single memory I associate with this song, which (unlike most of the memories I associate with songs) is not related to the first time I heard it, or the person who introduced me to it. That’s a fairly uninteresting, and fairly embarrassing story: I’d never heard Madonna before this spring, when the Glee Madonna episode prompted me to download a “Best of” album from iTunes. I had the songs in that collection on in the background through a few months of walking across campus and doing laundry and surfing the web. I learned the lyrics, and I got them stuck in my head, and then I was playing the songs again and again as, finally, I packed my things up in boxes and got ready to leave Princeton for another summer.

One of the components of leaving Princeton for the summer was attending the three-day bacchanal my university throws for its alumni, an event of which I’m far more ashamed than I was of my sudden descent into pop culture and the slippery slope in that direction which Glee seems to have set in motion. I spent three days getting no sleep, hating the 70-year-old rich white men and 20-year-old rich white football players who made my beloved quad smell like stale beer and vomit, and letting my shame at actually enjoying myself at the alumni parties subside into the retributive self-righteousness I felt at abusing their free food. And after three days of all this, of the longest and most exhausting and most emotionally up-and-down party I’ve ever paid $45 to attend (or, indeed, attended for free), I went to the last dance, a Saturday night affair sponsored by the LGBT alumni organization: a high-school prom for the very-drunk-queer-kids set, who spent a few hours grinding with their friends in a too-large multipurpose room with a DJ and a disco ball. It was cheesy and ridiculous and the most fun I’ve had in the past several months. Because it was a gay dance, the DJ played Lady Gaga and Ke$ha and even the Spice Girls, and of course various disco standards, and of course everyone knew all the songs. It was a collective experience of dancing to collectively popular music of the sort which I don’t get to experience very often, and for me it culminated with “Like a Prayer,” a song to which I remember shouting the words with one of my friends, all the energy I’d built up in months of secretly listening to Madonna in my room coming out in the realization which it took that cheesy fag-and-dyke-prom of an alumni reunions event to bring home: I love the dancefloor. And I love it because it represents unbridled joy.

In the past twelve months, I’ve been absorbed neck-deep in the personal struggles of young adulthood: sorting out desires from obligations, trying to figure out my purpose in life, striving to identify what a good person is and what she does to be good and to be better. This time a year ago, I was about to return to my family after a summer of depression, disillusionment, and cynicism in the District of Columbia, and I spent my sophomore year of college salvaging my faith in humanity by coming to love art galleries, classical music, literary criticism, and other trappings of highbrow culture; by investing my emotions in friendships instead of in elections; by making a difference through the person-to-person contact of the dining hall or the LGBT Center; and by investing myself in my scholarship and in the notion that studying now so that I can be an academic in a decade or so is as worthy a use of my time as working for a political cause. In the intervening moments, between applying all my mental will towards figuring out what a good person is and then trying to become one, I have snatched slices of transcendent happiness: on my first and then successive road trips; having madcap ideas and making them happen; basking in the company of the brilliant people I idolize who let me tag along in their far-more-interesting lives; watching the sun set from that beautiful little room with its window seat over the archway in the college quad I call home. Occasionally, going to a party. Dancing. Laughing. Going back to my room too late, still grinning, still with pop songs and their unrelenting beats running through my head.

Last October, when I put aside my political and personal prevarications and went to the annual Terrace Drag Ball, I had fun. I danced with friends and strangers, all delighting in the party and in the dancing. I came home realizing that it was wrong to deny myself simple pleasures like this, because what is the LGBT political movement fighting for, exactly, if not the right to hold drag balls? The right to ownership of a dancefloor has presented itself to me, slowly, over the course of the past year, as a fundamental right surely on par with a few others which top the front pages of the news. As I’ve read more books by Edmund White, taken an American studies class which talked (among other things) about the birth of the downtown music scene, and more importantly stepped away from gay history once or twice and gone out for myself and danced till hours I never see otherwise, I’ve realized what a powerful sense of collective identity, and collective pride, and collective joy, dancing together provides. I’ve come to understand it as a tool to banish ugliness and despair, to create resolve and strength, to assert defiance and freedom from fear. Granted, I’ve only been to a few parties, and a few dances, in the past twelve months: work always takes priority, and more often than not I recreate the magic of the dancefloor for myself, with my eyes closed in a darkened bedroom and more than enough happiness and energy to swallow up a whole nightclub (granted, being a shy young thing of twenty, I’ve also never been to an actual nightclub). For “dancing” is as much a metaphor as it is a reality, and for me it functions easily as a symbol of the process of using joy to banish ugliness, using beauty (once sought) as a peaceful weapon, a route to strengthening moral resolve to fight the next battle of the human condition. 
In the past year, in the process of learning to have fun, learning to do good by being good, and learning to accept and to appreciate myself in the meantime, “dancing” as symbol has helped me to keep myself whole, to keep me going through my days, and to create, hovering in the back of my mind, a vision of what the Platonic ideal of happiness can be. Yes, that’s what it is, an ideal: an ideal I have experienced only elusively, but an ideal to keep in mind when working to build a world in which the right to the dancefloor is inalienable—consider it the universal Stonewall, the Stonewall of the mind. Everyone deserves a liberation led by a drag-queen kickline, a hedonistic music-and-club-and-drug-and-sex scene born in the bowels of Manhattan, and a resilient spirit which can rebirth that scene into one which can confront death and impoverishment and come out fighting. Everyone—even those of us for whom the gender politics of the female divas so beloved by the gay male stereotype create problems—deserves their Madonna, whatever their actual gender or sexual orientation or personal struggles or routes to community and acceptance.

The other afternoon (back on the quiet, far-flung, isolated island) I was reading the newspaper stories and blog posts I’d downloaded from the neighbors’ internet connection (we don’t have one of our own), my headphones in and my body bouncing a bit to Donna Summer, another of my recent discoveries from the canon. As I flipped through stories about the dysfunction of our government; about the peril in which the environment finds itself; about soldiers and civilians killed in countries so far away I can’t imagine them; about economic crises or hate crimes the world over; I felt a sharp stab of guilt for dancing to disco while reading of such hurt and sorrow. But—as I steeled myself to move on to the stories about funding cuts to universities, a lack of investment in the humanities, and the end of tenure—I resolved that there is nothing shameful about seeking a slice of the dancefloor where you can find it, about trying to recreate for yourself, where and how you can, the rapture of “Like a Prayer.” I am tempted to think, as I rationalized myself into submission by thinking on that occasion, of the forces of disco and pop (allied with the forces of 19th-century portraiture and Elizabeth Bishop’s “One Art” and the Tchaikovsky symphonies and the Declaration of Independence and Oscar Wilde’s statements in the witness box and oh, at least five dozen other things) arrayed in a great cosmic battle against the forces of hate and evil and ugliness; all doing their best, whether in earnest or in camp (though those aren’t too different!), to help us to cultivate our own dancefloor.

And, well: if this is what gets us through the days and nights and helps us to keep our shit together, I am all too happy to be “putting my queer shoulder to the wheel” on this one.