14 December 2010

Prorogued

Paper writing.

In the meantime, I'm sure that all my fellow former residents of Read Residence Hall (Curry-Three 4 Life!) will appreciate the astounding attention to detail devoted to our quondam home in this Web comic.

08 December 2010

Number One for 8 December 2010

Candy-coated misery:


Carrie Underwood, "Cowboy Casanova"

07 December 2010

Number One for 7 December 2010

Hoover, he was a body remover:
Rage Against the Machine, "Wake Up"

06 December 2010

Number One for 6 December 2010

Take field trips to our favorite mall:

Jonathan Coulton, "Shop-Vac"

Shop Vac from Jarrett Heather on Vimeo.

04 December 2010

Academic Titles

What do you call a doctor of philosophy?

When I was an undergraduate, the answer was straightforward: "Doctor [lastname]" or "Professor [lastname]." These days, I generally call them "firstname," except (a) sometimes when emailing (I write several registers more formally than I speak, especially when asking for favors!) or (b) when dealing with someone particularly eminent or senior. And even (b) is a diminishing category.

But it's never crossed my mind to demand that someone in an informal or presumptively collegial setting call me "Mr." Much less have I ever dreamt about being called "Professor" on a routine basis. (I still think the first time should be pretty fun.)

A recent contretemps has shown that some holdouts on formality apparently remain, which is deeply ironic considering the presumably egalitarian norms of the professor in quo. The comments section was, as often happens, livelier than the post, particularly FLG's question: "Would you accept Dark Lord Alpheus? Or Darth Alpheus?"

So now I know what I will insist my students call me once I've received my doctorate: Darth Pedantus.

Number One for 4-5 December 2010

In America, you get food to eat:

Randy Newman, "Sail Away"

03 December 2010

Number One for 3 December 2010

But here's an FYI: You're all going to die screaming.
  • One fewer reason to see the Nats, as Dunn goes to ChiSox -- WaPo
  • Former California assemblyman Chuck DeVore is a prolific Amazon reviewer (I found this entirely by accident) -- Amazon.Com
  • Mysterious spacecraft returns to Earth -- AP
Jonathan Coulton, "Re: Your Brains"

02 December 2010

Perceptions and Misperceptions and Zombies

I went to Foggy Bottom today to see Daniel Drezner's presentation about his forthcoming book, Theories of International Politics and Zombies (Princeton University Press, February 2011). The talk was phenomenal, and the book looks like it will be a great teaching tool; the atmosphere was somewhere between a comedy club and an MC Frontalot concert.

Drezner's Foreign Policy article covers the essentials of his presentation, although if you can you should see it live. (So to speak.) The discussion was equal parts nerd-fest and theoretical disputation; Dan Nexon, the chief critic of Drezner's IR zombies approach, pointed out that Drezner suffered from a "vitalist" perspective that blinded him to the post-zombie world's problematization of the hitherto binary life/death category. (He also noted that the COW dataset will have a problem in coding armed conflicts between humans and zombies, since it requires 1,000 battle deaths.)
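
(Nexon's coding quip is worth a sketch. Here is a hypothetical rendering in Python--my own invention, not anything from the talk; the only real part is the 1,000-battle-death threshold, which is the actual Correlates of War criterion for a war:)

    # Hypothetical coder for a post-zombie COW dataset. The 1,000
    # battle-death threshold is the real COW criterion; the
    # living/undead check renders Nexon's objection literally.
    def is_cow_war(battle_deaths, belligerents_alive=True):
        """Classify a conflict under the Correlates of War threshold."""
        if not belligerents_alive:
            raise ValueError("COW assumes belligerents are alive; "
                             "recode the life/death variable first.")
        return battle_deaths >= 1000

    print(is_cow_war(50000))  # True: an ordinary interstate war
    try:
        is_cow_war(50000, belligerents_alive=False)
    except ValueError as err:
        print(err)  # the post-zombie coding problem in one line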

Drezner and Nexon have staked out the ground for productive debates. But there's still more to be said about IR and zombies. For instance:
  • Buck passing. There are two ways in which states could choose to buck-pass. The first is that small powers could simply choose to let great powers shoulder the burden of resisting the menace. This, as Drezner points out, could lead to great-power intervention--but the strain of regime change in addition to zombie-nuking might be too great. (In fact, compellence might instead take the form of warning potential buffer states that they are to be turned into cordons sanitaires.) The second is that great powers may choose to postpone their own interventions until the hegemon or some other k-group chooses to act to safeguard its own interests.
  • Zombie gerrymandering. Why not selectively exclude troublesome populations (or voting blocs) from counter-zombie efforts? A Republican president, for instance, might choose to draw the defensive lines at red-state borders instead of the national borders. (Democrats would be too wussy to do this.) Of course, the U.S. constitution is silent on how to handle the congressional seats that would be elected by voters in undead-held lands. But a dictator (a Stalin or a Kim) might be quite happy to see rebellious provinces subjugated by the zombie menace, which would allow them a twofer of both eliminating domestic opposition and having the U.S. foot the bill for cleanup.
  • Religious reformations. There would be millions of undead wandering the earth. Certainly, if religion matters in IR, this would be an event that would make it salient.
  • Post-bailout fatigue. Contra Drezner, zombie protection is not a public good; it is a classic private good. (Want to exclude someone? Just don't send the Marines.) So why should we expect to see states with varying preferences choosing to protect each other? Sure, there may be some constructivists who believe the "West" will stick together, but recent experiences in the much more thickly constructed European Union demonstrate that even "good international citizens" like Germany are unwilling to provide bailouts for, say, Portugal. Would a GOP House vote for a zombie bailout of Ghana?
  • The end of IR. How can any meaningfully constituted "international society" survive a zombie apocalypse? Both Nexon and Drezner assume that international relations will survive in some fashion after the flesh-eating undead hordes attack. But why should we continue to see anything resembling a Westphalian anarchic world--or any other arrangement that could reasonably be construed as "international"--after an existential threat that ends international trade and migration and brings the immediate extinction of homo sapiens in entire countries and continents? Nexon posits a return to empire, but I suspect that the true end state is that of a 1984-style totalitarian government. (Twelve Monkeys suggests how this could play out.) Consequently, even for the survivors, the zombie uprising will lead to the end of anything that a Western liberal would regard as "human" in anything but the biological sense.

Number One for 2 December 2010

They used to call me lightning:

Toby Keith, "Bullets in the Gun"

01 December 2010

Office hours

Two observations:
  • Every guide to college success says that students should go to office hours.
  • Hardly anyone ever goes to office hours.
I decided last week that from next semester I will no longer hold scheduled office hours. It is literally a waste of my time. In five semesters of TAing, I have had perhaps six, or maybe eight, students drop in to office hours. Many more have come to office-hours-by-appointment, and many, many more have come to the ten p.m. office hours I used to hold in the library a day or two before midterms. But to a first approximation, of the nearly three hundred students whom I have graded and taken pains to ensure at least minimally understood course material (which held wildly varying degrees of interest for me), pretty much none have come to scheduled office hours.

And a lot of them should have. It's remarkable that students who have received a C or worse (and at my institution, a B- is probably pretty bad news) refuse to come to office hours, even when I've told them to. It's trite but true that a student who gets a B+ is much more likely to pay me a visit in the hopes of angling for an A-.

(I suppose it's the same principle by which stout burghers are more likely to get a new sidewalk installed by the city government than their slum-dwelling co-urbanites. Even though the sidewalk is probably much more important to someone who has to walk everywhere, or at least to the bus, than to the homeowner who only visits the sidewalk when it rains, nevertheless it's the latter who knows that city hall will, in fact, take care of these things.)

I don't mind that students have almost no idea what office hours are supposed to be like. I never did. I went either because I was intensely interested in the subject or because I was drowning--and also, of course, to negotiate grades. Now, though, I actively don't want people interested in the subject to come and talk to me; if you really care, come by and I can point you to some things to read. I much prefer talking to people who are earnestly and sincerely grappling with the material, because in plain fact they often come to understand it better than people who are really passionate about international relations, comparative government, and other do-gooding endeavors.

It took me about a semester to realize what TAing was supposed to be, and since that first term I've watched, amused, as succeeding cohorts have repeated my early mistakes. Here are the classic rookie errors:

  1. Offering to hold multiple scheduled office hours per week. This is the biggest mistake. It's dumb. If people won't come to one office hour session, they won't come to two. And you'll be chained to some stupid table in a coffee shop playing Solitaire two hours a week, waiting for undergrads to drop in.
  2. Believing that section is a lecture. It's not. It's not about you; it's about them. Don't prepare extensive notes or handouts for them; listen to them instead, and find out what they don't understand. Generally, you should be trying to simplify, not complexify.
  3. Thinking that people will do the reading. They won't. You didn't--and if you did, you should know right off the bat that you're weird. Part of your responsibility is to help them know what they have to read. Another part is to make them read what they should be reading. Reading quizzes are great for this.
  4. Being disappointed if they don't take the course seriously. Chances are, this is an intro class, which means that at most a third of the students actually care about the course. And why should they? Frankly, partying is both more fun and likely more important to their future happiness than your course. Better a pig than Socrates.
  5. Thinking that you're paid to be a TA. You're not. You're paid to do research. If you enjoy teaching, then great--but anything over the de minimis effort should come out of the time you budget for recreation, not research.

From now on, though, I'll be providing a much less personal experience. Students who want to visit me will be able to check out my Tungle page and schedule 20-minute appointments with me. But no more fixed office hours por nada.

Number One for 1 December 2010

Stevie Wonder, Secretary of Fine Arts:

  • Chocolate City Brewery to open this spring; vanilla suburbs rejoice -- Conor Williams
  • God fails football -- Daily News
  • Which shipping company will punt your fragile shipment? -- Popular Mechanics
  • "[I]t is still striking how many pies the United States has its fingers in, and how others keep expecting us to supply the ingredients, do most of the baking, and clean up the kitchen afterwards." -- Foreign Policy
  • Do elected officials perform better than appointed ones? -- Enik Rising
  • Why Euro defaults are necessary -- Felix Salmon
Parliament, "Chocolate City"

30 November 2010

In political economy, politics comes first

Pure economic theory is about as useful to understanding economic policy as Newtonian physics is to understanding a bird in flight. The bird is subject to the same forces and constraints as, say, a stone of equivalent mass--but unlike a stone, a bird can change its own course. So with political economy. The starkest laws of economics--opportunity costs, the general propensity for diminishing marginal returns, the determination of outcomes by the mutually influencing strategies of players in games--always apply, but real-world agents often act in ways that the theory, as usually applied, does not quite predict.

Recent developments in Europe have brought this lesson home. For people of my generation, who remember the period between the fall of the Soviet empire and the fall of the Twin Towers with crystal clarity, the Washington consensus attained something of the status of divine writ. Good governance may have been difficult to establish, but the recipe was at once straightforward and durable. All a government had to do was let the rule of law take hold, establish some private property rights, and hold some elections and, presto, you'd reached the end of history.

In the past decade, each of these recommendations and a host of corollaries has been shown to be either vastly more difficult than assumed or actively counterproductive. (To take just one example, international relations theorists now argue that although democratic countries are less likely to go to war among themselves, democratizing countries are much more bellicose than the run-of-the-mill authoritarian state. Think for a second what that implies for a democratizing PRC.) One measure of this shift is what I like to call the Friedman Index, which measures the fraction of Serious People who take Tom Friedman seriously. By all indications, the Friedman Index is a trailing indicator that peaked sometime around 2005. Among NYT columnists, it is Krugman, not Friedman, who now speaks for the professional elite of the East Coast, with the prominent exception of the current occupant of the Oval Office. And Krugman, for obvious reasons, is hardly in thrall to the quantitative mystique in which economists have clad themselves over the past generation.

Which brings us back to Ireland. Conventional political economists have sought to explain the Irish example by looking at past IMF interventions and focusing on the narrow (but important) question of how such international institutions allow states to credibly bind themselves to necessary but draconian actions they would otherwise be unable to sustain. (See, for instance, The Monkey Cage.) But this analysis is badly misguided, since it ignores the rather obvious point that the states now threatened by financial contagion have options that the Asian currency crisis countries did not. European states are massively wealthier and more influential than Thailand or Indonesia; unlike Jakarta and Bangkok, Paris and Berlin can simply rewrite the rules to benefit the bulk of their citizens who are not international creditors.

Will they? I expect so. An editorial from the Irish Times suggests why:
IT MAY seem strange to some that The Irish Times would ask whether this is what the men of 1916 died for: a bailout from the German chancellor with a few shillings of sympathy from the British chancellor on the side. There is the shame of it all. Having obtained our political independence from Britain to be the masters of our own affairs, we have now surrendered our sovereignty to the European Commission, the European Central Bank, and the International Monetary Fund. ... The Irish people do not need to be told that, especially for small nations, there is no such thing as absolute sovereignty. We know very well that we have made our independence more meaningful by sharing it with our European neighbours. We are not naive enough to think that this State ever can, or ever could, take large decisions in isolation from the rest of the world. What we do expect, however, is that those decisions will still be our own. A nation’s independence is defined by the choices it can make for itself.
I would not be surprised if the Irish government at some point in the next six months breaks its most recent commitments to Brussels and M. Strauss-Kahn. I fully expect that whatever "punishment" is meted out to Ireland for that transgression will be forgotten within a decade. Equally, I expect the eurozone to do what everyone believed was impossible as recently as seven or eight months ago: shrink to the point where it is an actual optimal currency area, which is to say that it will probably be Germany, France, Benelux, and some German quasi-possessions on its borders (Austria, the Czech Republic, and Slovakia).

A social science that proceeds inductively from a limited stock of experience will miss the really important shifts. Generalizing from the international arrangements that governed the world economy over the past sixty years to the actions we can expect influential actors to take in the next decade is wrong-headed. It is time, instead, to begin preparing both scholars and policymakers to be ready for a world in which defaults are policy tools, states reassert their primacy over economics, and protectionism is put back on the table.

Number One for 30 November 2010

I love her and she loves everyone:

  • Richard Lugar, liberal maverick -- NYT
  • Grand strategy is a crock -- Drezner
  • Grand strategy is a crock -- Krasner
  • Humans are getting fatter. But so are animals that live near humans -- Stross
  • Markets say there will be a Portugal default. Didn't markets also once say that U.S. housing values could never decline? -- Salmon
  • Good teacher, bad teacher: It doesn't matter. The kids will forget what teachers say. "Why make the poor kids suffer if they won't retain what they learn anyway?" -- Caplan
  • Why measures of central tendency give a distorted view of legal earnings -- Empirical Legal Studies
  • The tragic death of this blog -- Technologizer
  • Alan Moore's never-written DC saga: "one of the things that prevents superhero stories from ever attaining the status of true modern myths or legends is that they are open ended." -- Four-Color Heroes

Reel Big Fish, "Scott's a Dork"

21 August 2010

Number One for 21-22 August 2010

Thought we were so grown up:

  • Why pick on nerds? [Overcoming Bias]
  • Why nerds are unpopular [Paul Graham]
  • Obama and gay marriage: A case study in ambition [The New Republic]
  • What's more valuable: a college or a stereo? [The New Republic]
  • When did the infovore evolve? [Mother Jones]
  • Eric Schmidt, creeper [Daring Fireball]
  • How to become an expert [Dan Drezner]
Kate Nash, "Merry Happy"

20 August 2010

Don't mention the war

WHO WINS WARS? There is the hippie answer--"No one, man." There is the Machtpolitik answer--"The strongest." And there is the thoughtful answer, which is contextual.

Consider the Second World War. Everyone knows the Allies won. But judging from the representation of the war in British memory you would find at least as many instances in which it is not altogether obvious that London was among the victors. 1984, famously, projected the material deprivations of London 1948 into the future; it was as much a work of observation as of speculation. And the weird nature of the postwar settlement made it hard for ordinary Britons to gloat about their role in winning the "good war":



By "winning" we can mean two things. The first is a question of contributions: Without X strategy or Y materiel, would the war had been won? The counterfactual then takes the form of "Had the U.S.S.R. not entered the war, would the U.K. and the U.S.A. not won?" or "Had the Western Allies made a separate peace in 1940, would the U.S.S.R. have defeated Nazi Germany?" These questions are impossible to answer for certain, but we can make a plausible case. In this instance, it becomes more plausible to argue for a solo Soviet victory, and even more for a solo Soviet near-loss, than for a joint Anglo-American victory or for a solo Anglo victory or near-loss. In this sense, then, it is meaningful to talk of the Soviet Union "winning" the war.

The more important sense is that of outcomes. Who in the international system benefited the most from the international environment at the war's end compared to their position at the war's beginning? This focus on relative power accords with most theorizing about the international system. A player who goes from having 10 percent of the distribution of power to 50 percent is strictly better off, regardless of whether the size of the pie has shrunk.

An alternative definition of winning, of course, would simply assert that the top-ranked player at the end of the period won. That is inadequate. It implies that a hegemon could engage in a costly, pointless battle, lose every engagement and waste scads of money, and yet still "win" simply because it didn't lose enough to fall to second place.

As a corollary, I should note that it is much easier to identify the losers of a war: anyone who moves down the relative or ordinal league tables. The greatest loser, however, need not be the player who fell the farthest; here, losing the top spot and falling to second place may be much worse than falling from 25th place to 100th, given a flat distribution of powers (the familiar "long tail" effect).

There are three ways to measure changes in relative power. The first is strictly relative: Who arithmetically gained the most? A player that increases from 20 percent to 40 percent has probably gained more than anyone else in the system. This leads to the second definition: the moment at which changes in relative power lead to qualitative changes in the organization of the international system. The United States entered the Second World War as the largest power in a multipolar world; it exited the conflict as the hegemon in a uni- or bipolar system. A system shift is a major consequence, and a system shift in your favor surely counts as a different category of "win."

I argue there is a third kind of victory: the relative proportion of relative gains. That is to say, in the universe where player A goes from a 20 to a 40 percent share as a consequence of war, it has doubled its share of power in the system; but if player B goes from a 2 percent share to a 10 percent share, it has quintupled its share. Assuming that the structure of the system has not changed (that is, that A is not a unipole), I contend that player B is also a winner, and in some ways even more of a winner than A, since it has produced gains more efficiently.
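
The arithmetic is simple enough to check in a toy sketch (Python, with the invented shares from the example--these are illustrations, not data):

    # Two notions of gain from war: the percentage-point change in a
    # player's share of system power versus the ratio of post-war to
    # pre-war share (the "third kind" of victory).
    def absolute_gain(before, after):
        return after - before

    def proportional_gain(before, after):
        return after / before

    # Hypothetical shares, in percent: A goes 20 -> 40, B goes 2 -> 10.
    print(absolute_gain(20, 40), proportional_gain(20, 40))  # 20 2.0
    print(absolute_gain(2, 10), proportional_gain(2, 10))    # 8 5.0

A gains more in absolute terms (twenty points against eight), but B quintuples its share while A merely doubles it: by the third measure, B is the more efficient winner.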

Consider the Korean War. Arguably, the United States won; definitely, the North Koreans lost a lot and the South Koreans lost a little (in the short run). But I contend the real winner was the Mao regime, which was left with a firmer position at home and abroad.

Here are some other Startling Propositions which, as Dean Acheson would say, are clearer than the truth:
  1. Japan was the real winner of World War I. It managed to eliminate its major security threat (the USSR), marginalize its principal offshore competitor (the UK), and make major gains in China at low cost to itself.
  2. Southern poor whites were the real losers of World War II. The real winners? Northern industrialists.
  3. The real winner of the Seven Years' War was the United States of America. The real loser was Britain.
  4. Actually, the United States really did win the Spanish-American War.
  5. The winners of the Mexican War were Southern plantation owners. This was recognized at the time but bears repeating.
So many wars, so many revisionist papers. Who really won the Napoleonic Wars? Who really lost the War of the Spanish Succession? And so on. But one final provocation is clearer than the truth: The Communist Chinese regime also won the Cold War.

Number One for 20 August 2010

I don't have a single thing planned for today:
Pizzicato Five, "Such a beautiful girl like you"

18 August 2010

Number One for 18 August 2010

I'm not a fashionista or a consumerist:

Simona Molinari, "Egocentrica":

17 August 2010

Life is an application

Even William of Orange had to apply for jobs.
For years, one of the most important file folders on my computer has been called "Application Hell." Every resume, cover letter, portfolio, and list of recommenders I have sent to grad schools and employers for most of the past decade is in there (as well as some college application essays, just for the hell of it).

Applying for things is hateful. Nearly everyone hates being a salesman, and even practiced salesmen hate selling themselves. Yet the nature of modern life is that salesmanship is an essential skill. At the moment, I am collecting emails from people interested in taking a room in the house I share, and it is clear that they are treating the process for what it really is: an application-and-interview process no less irritating or important than a job search.

Why do we hate talking about ourselves? Even my most successful friends loathe writing cover letters or resumes. More to the point, even my friends who enjoy talking about themselves in any other context will postpone writing applications until the last possible moment, and when they do produce one it will be listless and ashamed.

There are three strategies to cope with applications. The first is to be a narcissistic sociopath:



But that path has some drawbacks.

The second is to have someone else write your application. (I know at least one professor who took this route; it worked not least because the spouse who wrote the tenure packet was an academic as well.) When this works, it works great, but it requires the other person to know you well and also to know how to handle applications--which is a rare combination.

The third is to grit your teeth and do it. Grit your teeth, not gnash them, because there is no use raging, raging against the writing of the C.V. It is trivially easy to spend more time complaining about the unfairness of a universe that requires such a noisy filter for matching people with jobs, roommates, and Match.com hookups than it would take to actually write the application.

The good news is that it gets easier to write these applications with time, if for no other reason than that you have a well-stocked "Application Hell" folder of your own. The bad news is that it never feels any easier.

Number One for 17 August 2010

In a round of decades three stages stand out in a loop:
  • Sorry to miss posting yesterday. Playing with the puppy took up a bit more free time than I'd expected.
  • There are many ways to waste time if you use Gmail. [Lifehacker, Simple Productivity, AlphaGeek, Best of the Web]
  • The grad school version mentions coping with neuroses [EHow]
  • Charlie Stross asks: What's the next bubble? My answer: AI. [Charles Stross]
  • Drezner says enough about Cordoba House as a threat to America; enough about saying critics of Cordoba House are threats to America [Dan Drezner]
  • Kevin Drum says not so fast [Calpundit]

14 August 2010

Number One for 14-15 August 2010

Your lies have spoiled two confessions:

Inara George, "Genius." This blog's Official Theme Song. (Or, given the theme of this blog, I should say this blog's anthem.)

13 August 2010

Number One for 13 August 2010

I walk the line like Johnny Cash:

Plushgun, "Just Impolite"

The Excitement of the Boring

CC image by Cliff1066
AMONG THE MAJOR MUSEUMS, none has less inherent interest than the Smithsonian's Postal Museum. It is a museum about ... the Postal Service.

A friend's Facebook update reminded me about its existence. I visited once, last year, and it is fair to say that I was just shy of enthralled. First, of course, I wanted to see what they made of the topic. For those of us who have been in the museum business, no matter how slightly, there is always some professional interest in seeing how a well-funded museum spends its funds. (I once spent ten minutes at the Georgia O'Keeffe museum examining how the curators had lit their galleries. For a few minutes, at least, that was more illuminating than the paintings themselves.)

In this case, what I really wanted to know was how you could bring to life the story of something that is inherently undramatic. The answer, it turns out, was lots of props--and a few good computer simulations. As a way of conveying history, this has the advantage of being tactile with the disadvantage of being a little misleading. Okay, so the Postal Service used to sort mail on trains, and so you have a caboose. But what does this tell us about how Rural Free Delivery reshaped rural life? And what do we learn about how the Postal Department was a major source of political patronage, helping to develop the American party system on a national basis?

This leads neatly to the second reason I wanted to visit, and that is that I genuinely find the history of the Postal Service interesting. And this isn't the only boring institution I care about--far from it. Of course, I'm using "boring" to refer to how most people, most of the time, see such institutions. And me, too. I'm no anorak or trainspotter. I'm not interested in learning the details of rate cards or the breed of pony used in the Pony Express.

WHAT INTERESTS ME INSTEAD is how transformations in such institutions shape everyday life. The existence of international postal reply coupons was the cause of the original Ponzi scheme; the existence of the Sears Catalog made the Midwest habitable. And at the core of all such institutions is politics. True, economic and technical considerations matter, but their resolution is often guided by considerations of sheer power.

The Postal Service, like all other boring institutions, is in its way a product of the fundamental arrangements of power in society. And that makes it something worth paying attention to, at least a little bit, in a way that pop culture and high culture--which are intrinsically more interesting--can't compete with.

12 August 2010

Number One for 12 August 2010

I'll ace any trivia quiz you bring on:
  • Kirk or Picard? Kirk. But Sir Patrick Stewart is obviously a better actor.
  • Favorite science fiction TV show? Tough to say...but it is probably Firefly.
  • Favorite comic book series? Avengers
  • Favorite comic book issue? Fantastic Four 262, "The Trial of Reed Richards"
  • Favorite DC character? Batman.
  • Favorite Marvel character? Victor Von Doom.
  • Desired superpower? Phoenix's.
  • Favorite Cylon? Eight.
  • Favorite sf novel? The Man in the High Castle. Very, very close second: A Canticle for Leibowitz.
  • Favorite sf short story? Actually, "The Rocket Man" (Ray Bradbury, R is for Rocket). At least five of my top 10 would be Bradbury. And three of them would be from The Martian Chronicles.
  • Favorite SF movie (non-series)? Blade Runner
  • Favorite SF movie (series)? Star Trek II: The Wrath of Khan.
  • Luke or Han? Han.

Resurrecting "Poseur"

Probably not exactly fair use.
WHEN I WAS in middle school, the worst insult anyone could hurl at you was "poseur." Did you claim to like rap but not own any Snoop Dogg CDs? Did you say that you were a Cardinals fan but never watched the games? Then, clearly, you were a poseur.

I think that this is an insult whose day has come again.

The concept of the poseur is superficially related to the concept of authenticity, but without any of the troublesome ideological overtones. "Authenticity," after all, is a purely artificial concept. (As Ernest Gellner's readers know, the "authentic" Ruritanian is the Megalomanian who behaves the way actual Ruritanians would act if only they were enlightened.) But a poseur, by contrast, is a fake. He knows just enough to skate by in casual conversation or the everyday presentation of self, but he has no real claim to the status he asserts. Accordingly, anyone can be a poseur in any dimension. You can pose as a comic book fan, and even pass as one to most people, but five minutes of conversation with someone who knows what they're talking about will reveal you as a poseur.

The greatest asset of resurrecting the "poseur" tag will not be in adding a new insult to our quiver, but in deterring behavior that smacks of poseurdom. All of us feel the temptation from time to time to pose. But if we're more afraid of being caught out than we are tempted to put on a false front, then we may be able to overcome the Dunning-Kruger effect.

So let's all start acting like middle schoolers again. Sometimes, kids can be just cruel enough.

30 July 2010

Number One for 30 July 2010

A satellite recalled your voice:


Thievery Corporation, "Lebanese Blonde"

29 July 2010

Number One for 29 July 2010

The devil in my pocket turned to gold:
Bitter:Sweet, "The Mating Game"

28 July 2010

Number One for 28 July 2010

Wait in line / til your time:
Zero 7, "Waiting Line"

27 July 2010

Re-ception

Thinking about it more, I realize that the problem with Inception is just that it wasn't imaginative enough. More precisely, the nature of the plot prevented the director from fully realizing the unique logic of the dreamscape. We experience dreams as variations on reality, in which the laws of physics and of narrative can be suspended, sometimes at will and sometimes because of exogenous shocks. Inception allowed only a tiny fraction of that variation to be used. The whole point was that the plot within each level of the dreaming had to make sense, had to be designed in order to make sense, and had to proceed from one starting point to one end point. There was no room for intra-dream switching. The ice world could never shift to become a beach; the city Di Caprio and Cotillard had designed could not become a cottage; and the narrative structure within each dream allowed for no alternate solutions. By structuring the adventure around an architect's creation of a maze, the plot foreclosed what would have been much more interesting: namely, the experience of goal-driven actors in a completely stochastic environment.

And, of course, it meant that the dream could never become a nightmare.

Enough already

I am only a Level 2 Mac fanboy; the Higher Mysteries of the RDF have not been revealed to me. Like a lot of people, I switched because of the Apple design concept, and like many people I was introduced to its pleasures by the iPod. (This is the model that I thought was so brilliant, back in 2004.)

It strikes me that the essence of the iDesign philosophy is metadictatorship. Steve Jobs does not control what I do with my iDevices, but he does control how I can use or change them. Thus, I can customize, but only to a certain point. This chafes people who believe they are better computer programmers and designers than those in Apple's employ, including even a few people who actually are better computer programmers. For the rest of us, complaining that our phones aren't jailbroken is like complaining that we can't replace the engine in our Prius.

All of this is just a prelude to say that I have been spoiled by this benevolent dictatorship. I noticed this morning when I went to the New York Times, as I have been doing now for fourteen years, that the site is awful. Beyond awful. There are 1903 words on the home page.

Nineteen hundred and three words.

I copied and pasted the page into Word. It was 16 pages long.

Some of that is page formatting. But most of it is cruft. Look at your NYT app on your iPhone and then look at NYTimes.com. Design matters.
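
(If you want to reproduce the count without Word, here is a Python sketch--assuming the requests and BeautifulSoup libraries; the 1,903 figure is just what the home page served that morning, so your total will differ:)

    # Count the words in the visible text of a page, ignoring scripts
    # and styles: a back-of-the-envelope measure of front-page cruft.
    import re
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("http://www.nytimes.com/").text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # drop invisible code, keep readable text
    words = re.findall(r"\w+", soup.get_text())
    print(len(words))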

Creative Commons image by Incase.

Number One for 27 July 2010

Got a counterfeit dollar in his hand:

Stevie Wonder, "Misstra Know-It-All"

26 July 2010

The Idea of a Midwestern University

I am spending the summer in Ann Arbor. This may be the best, or at least the second-best, summer of my adult life. I am taking classes, but will not be graded. I am reading, but at my own direction. I am cloistered, but part of a community. I am, in short, having the experience that I thought that grad school proper would be like: private, intense, and liberating.

One reason for my contentment is the setting. It has been seven years since I spent time at one of the large Midwestern universities, and I realize now how much I had missed the environment. The urban campus I attend now has its brief moments of beauty, but they are pockets amidst a jumble whose architectural incoherence is testament to the poor financial planning of previous generations of administrators. It took a lot for universities to miss out on WPA funds for new construction, but somehow the old priests managed the trick. Their successors during the Cold War failed to acquire the American talent for wealth creation but learned architecture from the Soviets. At least they had the good fortune to have inherited a stately nineteenth-century quadrangle; in Dublin, at John Henry Newman's university, the new campus in the suburbs was built from the ground up by Brutalists.

Number One for 26 July 2010

In life revised you never went away:
The Gregory Brothers, "Summertime"

24 July 2010

Number One for 24-25 July 2010

Another summer's passing by:
Belle and Sebastian, "Asleep on a Sunbeam"

23 July 2010

Cyclicality, again

Yesterday's post didn't end where I thought it would. It got a bit philosophical and mopey. What I'd meant to write was a much more practical piece about how the expectation of cycles constrains and conditions planning in organizational life.

If you live and work in a world with strong cycles, then you have to account for them when planning new activities. Periods of high organizational stress, or periods when high organizational performance is needed, are bad times to focus on secondary matters. That rules out changes to standard operating procedure: budget bureaus shouldn't undertake sweeping new initiatives at the beginning of a fiscal year, any more than it's a good idea to try out a new quarterback in the postseason.

In academia, the cycles are even faster. There are at least three: the two semesters and the summer. These are layered within the broader cycle of the school year. The separate natures of these cycles combine to make innovation peculiarly difficult in an atmosphere that already makes change difficult.

I rule out summer, because I address faculty and grad students, not administrators. My hunch is that summer is the right time for redoing administrative procedures, since it is their relatively quiet season. But coordinating academics over the summer adds total impossibility to extreme difficulty.

But the semesters are hardly easier. The first and last weeks of the semester are no good, as is the middle of the semester. High-intensity projects would simply compete with more important responsibilities--and lose. That leaves four windows a year when there is even the possibility of adding new activities.

I have been thinking about this because, obviously, I'm involved with a new group (a workshop on advanced methods). There are many debates involved in founding a new institution, from questions of group behavioral norms (which can be established easily at the beginning, but which are tough to change later) to expectations about individual members' involvement to administrative worries. This last category deserves a post of its own. Drafting constitutions, sorting out financial controls, and settling issues of executive competence versus board oversight are tough, even when the group is relatively small and straightforward. One factor that has to be overcome is that academics usually privilege discussion over experimentation and deliberation over decision. Isonomy is an ideal, but it's a harsh mistress.

The more immediate question we face now is how to keep the group going. There's loads of enthusiasm, and the first semester went well, but having a vision for a group means understanding the factors that can sap that enthusiasm and gradually deflate the popular will that sustains a collectivity and reproduces its values and practices. In particular, I wonder if there's a good argument that this group should explicitly take into account the cycles of the semester and academic year in setting its schedule: having exciting but relatively low-work sessions to begin and end the year, while having the most difficult and labor-intensive sessions in November and January. (November, because it's a time when people want to procrastinate during the doldrums between midterms and finals; January, because the midpoint of the year finds most everyone in midseason form.)

Lowering ambitions a bit deflates expectations at the beginning. Adopting a more conservative attitude makes it more likely that the group can achieve the goals it wants to. The greater danger, though, is in allowing enthusiasm to outstrip capabilities and creating a gap between what is achievable and what is expected. Cyclicality encourages conservatism.


Number One for 23 July 2010

It's a godawful small affair:
David Bowie, "Life on Mars?"

22 July 2010

Cyclical time and the academy

Well, here it is, another summer and I'm back in school. There is something odd about being more excited to go to class than going to the beach, but thankfully the adult world is structured so that people who share enthusiasms can congregate.

I wonder sometimes if Americans don't have different connections to the seasons than do other cultures. I wonder this not because I want to posit some uniquely American relationship with fall or with winter, but largely because from age 5 to 18, at least, Americans experience summer as a long, unbroken string of endless days. (There's an entire, and astonishingly subversive, Disney cartoon about this phenomenon.) Other countries generally have a shorter summer break; Americans experience summer as a nice preview of life itself. The summer begins full of promise, ripens even as it sours, and ends in a haze of boredom and anticipation. The metaphor breaks down at that point, though, because the coming of fall heralds both the beginning of a new cycle and a promotion within a nicely hierarchical system. Whereas you were once a lowly second-grader, now you may know the mysteries of Third Grade.

Most people outgrow this cycle and graduate into the Real World. I think, in fact, that the linear nature of the Real World is what people have in mind when they discuss this mythical place. (That, and money.) After all, the stages of adult life are strictly sequential, and I suspect that the cumulative nature of outside relationships begins to overwhelm even the seasonality of jobs like those in retail, fashion, and tax accounting. By contrast, academics repeat the cycle until death or denial of tenure, in increasing order of terror. Each year brings a new crop of students, who are there to be taught, nurtured, tolerated, and finally cast out into the world. We grow older, and they stay the same age.

Cyclicality is probably the calendrical equivalent of folk physics. There's probably a good reason why religions structure themselves around cycles. From one perspective, human life is just the rehearsal of roles defined by forces beyond our comprehension and before our understanding. We think there is something natural and inevitable about cycles that are plainly both artificial and recent. Consider the concepts of childhood, adolescence, and young adulthood, none of which existed in recognizable form two hundred years ago, and which even a few decades ago existed for only a very few people. (I like to look at historical statistics, and I'm always stunned at how recently it was customary to leave school at 13 or 14 and begin working in what were essentially adult occupations.) The persistence of such notions in the face of obvious counter-evidence, and despite changes in roles between generations, is a good sign that we are slotting our observations about life into a preconceived template.

In fact, I can think of only one other tribe of adults who live by as cyclical a calendar as academics (into which category I will admit, for one night only, teachers): politicians. The electoral cycle is slower now than it used to be, in the 19th century, when one- and two-year terms were the norm, but it must feel more hectic than it was. The principal difference between the electoral cycle and the academic cycle is stark: the participants in one cycle are all but assured that they will be in the same jobs in the next revolution.



Number One for 22 July 2010

Let's all get up and dance to a song:

21 July 2010

Number One for 21 July 2010

Windows rolled down with the heat on high:
  • Dear Pinot Noir: It's not me, it's you. [The Gray Market Report]
  • Bill Murray thought Garfield was a Coen brothers movie [Vulture]
  • Yes: Let's end the American aristocracy. But I'm tired of these weak, Cass Sunstein "nudge"-style policy proposals. How about our progressives offer some real, sanguinary, Bolshevist ones? [Ta-Nehisi]
  • Suck it, Aaron Friedberg: America didn't become a garrison state because we're too corporate [Who is IOZ, via ZW]
  • Drastic oversimplification: Do Confucians believe in sex? [IPE @ UNC]
  • Jim Vreeland gets an uncredited guest blog [The Guest Blog]
Carrie Underwood,"Get out of this town". No, these links aren't designed to prove I have good taste ...

20 July 2010

Quote of the day

From an anonymous commenter on the PoliSciJobRumors Web site:
Stata 11 is of course going to feature the often demanded "figure this shit out" or ftso command. Simply type the command: ftso 'depvar' and it will give you the results you need in order to answer your research question! If you have time-series cross-sectional data, or if you have no clue what kind of data you have, but want it to look more sophisticated anyways, you should use xtftso.


Auto-Tune the Chart 2

Nobody but me cares, but this is fun...




Number One for 20 July 2010

Guess I'll try to go despise a blog by someone else.
  • IRV gains a new supporter. Too bad he only supports it because he lost. [Yglesias]
  • Dan Drezner gives two cheers for redundancy. He should have called the post "Department of Redundancy Department". [Drezner]
  • Bellesiles didn't fabricate, but he didn't fact-check [Chronicle of Higher Ed.]
  • Kathryn Lopez fawns over Mel Gibson [NRO, via reader AT]
  • Science is becoming exponentially more difficult. [Boston Globe, via Monkey Cage]
MC Frontalot, "I hate your blog"

19 July 2010

Number One for 19 July 2010

I think that we make a pretty good team:

  • How Conficker defeated the smartest guys in the world. [Atlantic]
  • My guess is Stochastic Democracy will eat 538's shorts. [Stochastic Democracy]
  • Today is upgrade day. I hope Stata releases aren't like the Star Trek films, where only the even-numbered ones are good. [Stata]
  • It's also the first day of classes: [1], [2], [3] [ICPSR]
  • Calibrating your gaydar. (Can you draw a ROC curve for that?) [Gelman Blog]
  • Straight talk from Tom Friedman [New York Times]
  • David Blackwell, game theory and Bayesian pioneer, died. More information here. [New York Times, Mathematicians of the African Diaspora]
  • Taiwanese news portrays Steve Jobs as Darth Vader. NB: "Apple" is "pingguo" in Mandarin; "problem" is "wenti". Count how many times you hear those words! [Via Daring Fireball]
Obi Best, "Nothing Can Come Between Us"

18 July 2010

Inception

The critics speak.

For what it's worth, I didn't see much in the film that hadn't been done better by Dark City, The Matrix, or Total Recall. The only hint we have that anyone besides Fischer is a real person is the kiss between Ellen Page and Joseph Gordon-Levitt; it's the only actual moment of human feeling in the entire piece. Marion Cotillard is radiant and rises above her lines (the incantation about the train sounds dumb once we find out what it is), but imagine if Page had been a rival to her charms. The plot "twists" were all heavily telegraphed and instantly familiar to anyone who's read Dick, Borges, the better Bradburys, or Poe. Would it have hurt to have made Saito Chinese instead and had a reference to Zhuangzi?

I think the final scene makes the whole thing obvious (remember: we don't know how Di Caprio washed up at the beach at the beginning of the film, which is a dead giveaway). That is a big disappointment, especially compared to Total Recall. Clearly, Nolan is brilliant--the film is gorgeous and visually inventive--but his talents are better deployed at adaptation than invention. In particular, Dark Knight portrayed a better understanding of ethical challenges and moral questions than Inception, which has none.


Humor Department, Bureau of "Your Matriarch" Jokes

From a loyal reader:

1: We were so poor growing up.
2: How poor were you?
1: We had to shop at the Quarter Foods store.


Number One for 18 July 2010

Modern minds can come up with three questions:
  • Don't fill much-needed holes in the literature, says Erik Voeten. [The Monkey Cage] (See also James Stimson)
  • What is a "computer"? Paging Dr. Wittgenstein. [Charlie Stross]
  • Losing $9.2bn is the result of a non-material deficiency. I'd hate to see a material one. [FT Alphaville]
  • Incidentally, FT is right that EDGAR is teh suxx0r. In fact, most federal databases are awful. Please: make documents available as txt and pdf, make all searches Boolean, tag all documents consistently, present tabular data as csv, and mathematics as TeX. Never again should I have to read a document like this one or use a database as terrible as this one.
  • McChrystal, F*** YEAH. [Atlantic]
  • Robin Hanson is beginning to understand the alienation of labor. [Overcoming Bias, via ZW]
Mr. Show presents "The Limits of Science":

Periodizing U.S.--Soviet Conflict

As my study partner and I re-read the political science literature on U.S. foreign policy, we have wondered at the number of times the United States has been proclaimed the world's only superpower, a number exceeded only by the number of times the IR community has proclaimed that the era of U.S. unipolarity is finished. Offhand, I can find citations that would bolster the claim that American unipolarity began in the 1940s and ended in the 1950s, in the 1960s, in the 1970s, and in the 1980s, as well as arguments that American hyperpuissance began in the 1980s and ended in the 1990s, began in the 1990s and ended in the 2000s, began in the 1990s and will end in the 2010s or 2020s, and never began or ended at all. Judging by Bear Braumoeller's working paper on U.S. isolationism, I could probably also make a good argument that American unipolarity was at least a possibility in the 1920s. And what else can we take away from Kindleberger but that the United States failed to exercise the global leadership to which it was so plainly entitled?

If you think that dating the potential of American hegemony to before the Second World War is hyperbole, consider the criteria by which Spain, the Netherlands, and the United Kingdom were all retrospectively crowned hegemon; certainly the United States of the 1920s exceeded in relative power the Great Britain of the later Victorian years, when London was unable to contemplate maintaining Canada and its position in the Western Hemisphere without significant rapprochement with Washington. Had the United States bothered to maintain a significant land army or invested in its air force to a greater degree, either of which it could have afforded without a problem in either the 1920s or the 1930s, its military power coupled with its economic influence and de facto imperial hold on the Latin American countries would certainly have surpassed the relative power position of Athens at its Periclean height. (I suspect that American influence in the Western Hemisphere peaked about 1940, which is when the FBI--the FBI!--ran U.S. intelligence operations throughout the region and external penetration of regimes was at its minimum.)

If periodizing U.S. unipolarity is such a problem, it is no less difficult than determining when the Cold War began and ended. The high school history textbook answer is 1946 to 1991, but over the past decade I have come to the radical position that everything we learn in high school is probably wrong. (Even the Pythagorean theorem.) A very informal survey of the IR literature leads me to conclude that the Cold War as understood at the time actually ended about 1971, +/- four years (in other words, within the period between Glassboro and Helsinki). The renewed pattern of hostile interactions between the invasion of Afghanistan and Reagan's second inauguration was widely seen by everyone except the editors of Human Events as a throwback or a reignition of a dormant conflict. Moreover, this Cold War ended at least three times: with the conclusion of major arms limitation talks in Europe, with the fall of the Berlin Wall and the dissolution of the Soviet Eastern European empire, and with the collapse of the U.S.S.R. itself in 1991. (For extra credit, pinpoint the dissolution of the U.S.S.R.: was it the August coup, the signing of the C.I.S. treaty, or the resignation of Mikhail Gorbachev?)

Politics ain't beanbag, and political science ain't physics. There is no shame in our having multiple definitions of the inauguration and conclusion of different eras. The different periods may be useful for different purposes. (I think it is clear as can be that 1973 marked the end of American economic hegemony and the beginning of meaningful multilateral governance of aspects of the international--read: first-world--economic system.) Yet the proliferation of periodizations should nevertheless prompt some epistemic humility among contemporary IR scholars, and also a re-evaluation of the way we present the "stylized facts" of 20th-century history to undergraduates. In particular, we should reject the high school narrative of the Cold War as a monolithic event that serves a useful analytical purpose and instead present the years between Roosevelt's death and Clinton's boxers as a series of more discrete and more analytically defined periods. I suggest the following:
  • The Cold War, 1947 to 1962. The Truman Doctrine and the Cuban Missile Crisis bookend the height of the Cold War. The Truman Doctrine symbolizes U.S. resolution to engage the Soviet Union and neatly outlines the doctrine of containment; the Cuban Missile Crisis symbolizes both the rise of Soviet power and the need of the United States to adapt to a world in which its strategic supremacy was no longer a given.
  • The Soviet-American condominium, 1963 to 1979. The signal fact about the 1960s and the 1970s was the strategic stability of the global order, as assured destruction and concomitant strategic talks between Moscow and Washington imposed an order on bilateral relations. The "opening" to China--a far more complex event than normally portrayed--was as much a way for the United States to maintain the global order as it was for Washington to seek an advantage versus the U.S.S.R. (In particular, a Sino-Soviet war, as seemed possible in 1968 and 1969, could have had incalculable consequences for global order generally.) The Kissingerian mantra of a "structure" of global peace fits a period in which the drumbeat of nuclear tests had been replaced by a numbing succession of test-ban treaties and SALT talks.
  • Strategic supremacy, 1979 to ?. Washington's response to the Soviet invasion of Afghanistan and the buildup of American military budgets, combined with the increasingly unsustainable Soviet economic and political structure, produced a situation in which the domestically determined collapse of the U.S.S.R. unfolded to maximum American advantage. It was Washington, not any multipolar arrangement, that dictated the fundamentals of the post-Soviet era: a unified Germany in NATO, the deference to the use of American military power in the Gulf and later in the Balkans, and the ability of the United States to project power throughout the world.
This is obviously a rough schematization of the period, and its essential elements are not original, but given that undergraduates have little historical sense, and that much of what they do know they seem to have imbibed from presidential hagiographers, it is probably a good idea to begin pushing back.