Film Review: Paths of Glory (1957)

The best movies grow on you, becoming larger in the mind than they could ever be on film. Stanley Kubrick’s Paths of Glory works in that way.

Set during the First World War, the film turns on a miscarriage of justice. A French attack on an entrenched German position goes badly wrong. To save face, the general staff orders that three men be shot for cowardice. Three innocent men are chosen, and the noble Colonel Dax (tautly played by Kirk Douglas) works to save them from the firing squad.


For Kubrick, whose quiet obsession was the place of the individual in cold, often mechanistic worlds, the First World War was the perfect setting. The French high command, quartered miles from the front in a regal chateau, bandies about casualty figures and talks of the executions as a “perfect tonic” for morale. In the movie’s lone battle scene, cinema’s best until Spielberg stormed Omaha Beach, masses of anonymous soldiers are gunned down by an enemy we never even see. In this uniquely stupid war where millions were senselessly killed, what are we to make of the plight of three innocent men?

Kubrick understood that the best argument against injustice is to depict it plainly. The film runs just under an hour and a half, with hardly a wasted frame, and is shot with the unvarnished clarity possible only in black-and-white. With each new indignity, each cruel turn of the plot, one can sense the righteous anger building behind the camera, kept in check only by Kubrick’s supreme discipline as a director. There is no deus ex machina, no Hollywood contrivance to soothe our sympathies, and hardly any music to heighten the drama.

But there is one notable exception to the latter rule—the haunting song that ends the film. Without its final scene, Paths of Glory would be a good film, but perhaps not a great one. It would be remembered for its stark photography, its strong anti-war stance, and perhaps for its place in Kubrick’s early oeuvre. But with the final scene, a statement on the injustice of abused authority turns into something at once more tragic and beautiful. Roger Ebert writes, “songs at the ends of dramas make us feel better… this song [is] a twist of Kubrick’s emotional knife.”

At the end of the film, a captured German girl is hauled before a raucous crowd of French soldiers. They have suffered much at the hands of the Germans, so they make lewd jokes and pelt her with insults. Forced to perform, she sings a little German folk tune, The Faithful Hussar. At first, jeers drown out her halting voice. But as she continues to sing the room grows still. The soldiers’ weathered faces soften, and their eyes gaze intently at the girl, but now with kindness, not malice. They do not know the words, so slowly, one by one, they hum along. Together their voices, dissonant but lovely, fill the room.

The Japanese director Akira Kurosawa once summarized his films with a single unanswered question: “Why can’t people be happier together?” In its unflinching depiction of war, and its wrenching final scene, Paths of Glory deals with the same theme. Its message is this: we share a common humanity that cannot be extinguished by any amount of cruelty. And the great tragedy of it all is not that war destroys, but that, in spite of our love for each other, we should go to war in the first place.

Belated Thoughts on The Force Awakens

By now, you’ve already seen Star Wars: The Force Awakens—perhaps more than once. Reviews seem curiously irrelevant when over $1.5 billion sits in the bank.

Still, for what it’s worth, the early reviews for The Force Awakens were glowing; Rotten Tomatoes says that 94% of reviews are positive.* J.J. Abrams can surely rest easy, having brought his second Star franchise roaringly, profitably back to life.

But with a bit of distance from Opening Day’s orgiastic outpouring of collective nostalgia (and from the real danger of provoking a horde of lightsaber-wielding fanatics), critics have quietly begun reappraising The Force Awakens. Two charges against TFA have emerged as common threads: that it is overly commercial, and that it is unoriginal.

Try telling them that Greedo shot first.

Before jumping into my own thoughts on the film, I’d like to tackle these two points. I find the first charge of commerciality absurd. Complaining that profits hold too much sway over Star Wars is like criticizing a rancor for its claws; dollars are in the series’ DNA. Gary Kurtz, a producer on the original trilogy, admits that the happy ending of Return of the Jedi was designed to maximize toy sales. Both Harrison Ford and screenwriter Lawrence Kasdan wanted to kill Han Solo off in a blaze of glory, but Lucas nixed the idea.

And if it all feels unoriginal—well, the “original” films all owed a heavy debt to Westerns, samurai pictures, and pulpy sci-fi serials. In some instances, George Lucas shamelessly copied his inspirations shot for shot:

The novelty of the original films came not from a bold vision of the future, but from the way they cheerfully borrowed from the past. In 1977, Star Wars returned viewers to the innocent world of adventure serials and comic books. For audiences in 2016, the closest analogue in terms of pure nostalgic value is—well, 1977’s Star Wars. Attacking TFA for being too derivative of Star Wars feels like a double standard when almost every blockbuster since 1977 bears its mark. Star Wars has so colonized the American mind that escape from its orbit is impossible—its cultural gravity is so great that the franchise has collapsed into a singularity of self-reference.**

Anyway, I digress. My critique of the criticism of TFA is really just a long way of saying that I think the film should be judged on its own merits. Not as Exhibit A in the case against Hollywood’s greed, or as a symbol of the immaturity of the age, but as a standalone piece of art. And with all that said, I think that The Force Awakens is not a great film—a nice movie, but not a notable one.

Enough to inspire glee in the heart of even the most hardened cynic

Everyone who cares about spoilers has already seen it, so I will discuss the plot openly. It’s obvious by now that The Force Awakens closely hews to the basic structure of A New Hope, from its opening act on a desert planet to its climactic final attack on an armored space station. And for the first thirty minutes, a dose of vintage Star Wars is exactly what the doctor ordered. In the hands of J.J. Abrams, who can evoke a sense of mood better than any other big-budget director of his generation, it’s an intravenous injection into the nostalgic system. The sight of hulking star destroyers lying wrecked in the sand is enough to inspire glee in the heart of even the most hardened cynic. Just as in the fluid, Spielbergian first act of Super 8 (probably Abrams’s most personal film), much of the first half-hour of TFA glides by in a warm, bubbling feeling of joy.

But TFA sags in the middle act. There are too many comings and goings, false partings and goodbyes: I could have done without Finn’s half-hearted departure at Maz’s tavern, and the confusion that allows Rey to be captured seems more a writer’s contortion than a natural development. When the pacing slows, one begins to see the plot’s turning cogs, and the similarities to A New Hope become plain. By the time the group arrives at the rebel base, I knew what I was in for. I’ll admit that Han Solo’s death came as a surprise (mostly because I thought the writers lacked the courage to harm the original icons), but from there the dénouement is inevitable.

“I don’t like sand. It’s coarse and rough and irritating and it gets everywhere.”

It’s not bad, mind you. The second half may be more workmanlike than the first, but on the whole TFA is still made with much more joy than the thin gruel of your annual Marvel sequel. The new cast—particularly Daisy Ridley as Rey—is generally zippy and likable, and though I found some characterizations a little thin (like Finn and his decision to turn coat), there is at least no Hayden Christensen-sized miscasting here, since all the actors have the ability to emote. The special and practical effects are as excellent as you might expect, and the latter sometimes even show flashes of the quirky charm that so illuminated the original films.

To sum up: Abrams and crew go far in painstakingly recreating the look and feel of the original films, and for the first thirty minutes, the illusion works. But as the necessities of plot and character take over, the deficiencies in both become apparent. That sublime floating feeling of nostalgia slips away, and what one is left with is a nice but flawed movie, made with much technical care but missing that rare touch that turns the everyday into the sublime. The Force Awakens may not be high art, but it is the heartfelt work of an artisan.

* Rotten Tomatoes is a rather silly measure of critical approval—but that is a subject for another time.

** Not sure if that sentence actually makes sense, but it was so much fun to write that I’m leaving it in.

There Is No Secret Ingredient: What Kung Fu Panda Means for Chinese Growth

Nuggets of economic wisdom can appear in the strangest of places. Consider this chop suey epigram from the 2008 cartoon Kung Fu Panda, which neatly encapsulates the story of modern China’s economy. Down on his luck, our hero Po turns to his noodle-selling father for advice. Po’s father sees that his son has lost his way and, knowing little about kung fu, confides in him this revelation: that there is no secret ingredient to his popular secret ingredient soup.

As it turns out, there is no secret ingredient in good noodle soup, great kung fu—and China’s meteoric economic rise. China’s recent sputtering—to the tune of $5 trillion in lost market capitalization, and financial tremors reverberating across the world—could (with some poetic license) be described as a sudden realization of this fact. For years, self-proclaimed China experts had insisted that China was different—that its blend of authoritarian capitalism or perhaps its Confucian culture meant that it didn’t follow the same rules. Many investors must have believed this as they poured into the stock market with reckless abandon.

But economists had long known that China’s economic model is no secret. If anything, it is the tried-and-true strategy for rapid growth: High rates of investment funded by even higher rates of saving, coupled with a focus on manufacturing exports. Japan and later the East Asian Tigers set the example in the mid-20th century. China is in large part copying this script.

What makes China’s story notable is its scale. Hundreds of millions have moved up from rural subsistence to urban affluence. (Chinese New Year, when urban workers return to their rural hometowns for the holidays, is the world’s largest annual human migration for precisely this reason.) High household savings rates have funded huge investment projects at home, with cities springing up seemingly overnight. And Chinese growth has been the engine of the world economy for over a decade, through recession and recovery.

Perhaps it is no surprise then that China’s economy has acquired a mystic aura of invincibility, the sense that there is some secret sauce. But economists know that this simple growth strategy has its limits.

Eventually, as a country becomes more developed, investment runs into diminishing returns: With the low-hanging fruit exhausted, new projects become less productive and growth slows. Mature economies tend to rely more on domestic consumption than investment and exports, which further lowers the potential rate of growth. Add in the complication that China’s population is aging prematurely (an unintended result of the One Child Policy), and the forecast for growth seems gloomy.
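The diminishing-returns logic in that paragraph is the textbook Solow story, and it can be sketched in a few lines. This is a minimal illustration with hypothetical parameter values, not an estimate of China’s actual economy:

```python
# Illustrative sketch of diminishing returns to investment (Solow-style).
# Output per worker is y = k**alpha; capital per worker accumulates as
# k' = k + s*y - delta*k. All parameter values here are hypothetical.

def simulate_growth(k0, s=0.45, alpha=0.3, delta=0.05, years=60):
    """Return the list of annual growth rates of output per worker."""
    k = k0
    rates = []
    y_prev = k ** alpha
    for _ in range(years):
        k = k + s * (k ** alpha) - delta * k
        y = k ** alpha
        rates.append(y / y_prev - 1)
        y_prev = y
    return rates

rates = simulate_growth(k0=1.0)
# Growth is fast while capital is scarce, then slows as the economy matures:
print(f"early growth: {rates[0]:.1%}, later growth: {rates[-1]:.1%}")
```

The same high saving rate that delivers double-digit growth at the start produces almost none once the easy investment opportunities are used up.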

To their credit, Chinese policymakers understood that a slowdown was coming: In 2011, the Central Committee revised its annual growth target down from 10% to 7%. To their detriment, they seem unwilling to accept the inevitable pains of this slowdown: Billions in government dollars have been deployed to shore up the ailing stock market.

So the short-term transition from 10% to 7% (or lower) growth has already proven shaky. But is there hope for the long term?

Once again Kung Fu Panda may provide an answer. In 2008, Panda was a smash hit at the Chinese box office, much to the consternation of Chinese filmmakers and politicians. Here was an American cartoon, about as authentically Chinese as a fortune cookie, that leapt across cultural borders and connected with mainland audiences. Why couldn’t it have been made by a Chinese company?

Producing a film like Panda is enormously complex. Behind the celluloid there is technical innovation, a well-organized artistic industry, a financial system to bankroll the project, copyright laws to protect the intellectual property. Economists lump these factors together under the catch-all label of “technology”—all the growth that cannot be explained by simple investment and population growth.
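That catch-all residual can be made concrete with a back-of-envelope growth-accounting exercise. The Cobb-Douglas production function and every number below are standard textbook assumptions, not actual Chinese data:

```python
# Growth accounting: "technology" is whatever output growth is left over
# after subtracting the contributions of capital and labor. With a
# Cobb-Douglas production function Y = A * K**alpha * L**(1 - alpha),
# growth rates satisfy  g_Y = g_A + alpha*g_K + (1 - alpha)*g_L.

def solow_residual(g_output, g_capital, g_labor, alpha=0.4):
    """Back out technology (TFP) growth g_A from observed growth rates."""
    return g_output - alpha * g_capital - (1 - alpha) * g_labor

# Hypothetical economy: 9% output growth, 12% capital growth, 1% labor growth.
g_A = solow_residual(g_output=0.09, g_capital=0.12, g_labor=0.01)
print(f"implied technology growth: {g_A:.1%}")  # 3.6%
```

Everything Kung Fu Panda represents, from animation software to copyright law, ends up bundled into that single residual term.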

Economists still don’t understand technological progress very well: where it comes from, how it can be nurtured. But if long-run growth has a secret ingredient, it means more Kung Fu Pandas.

The End of a Season


The World Series is over; the Kansas City Royals have won.

I wish the victors no ill will. They are a feisty team and, like us, a perennial underdog. Traveling to New York only to see the Mets lose was a shame, but I do not regret the trip.

I’ll admit that the loss will sting for a bit. But perhaps more daunting is the prospect of four long months until Spring Training.

For the fans who have followed from the early days of spring, this team has been a central fact of life. With such commitment, it has been impossible not to read some meaning into this season: an almost-miracle team, a Cinderella whose gown turned to rags at the ball. Cynics would say we are imposing a narrative on statistical randomness. So what? Life in its unpredictability can be indistinguishable from noise, but we organize the errors and passed balls and bad hops into stories that make sense. To overinterpret is human; to detach oneself, divine.

Friends often ask how I can be invested in such a boring, antiquated sport. A critic of baseball once quipped that the game was thirty seconds of action jammed into three hours of play. This is true—but isn’t really a criticism. Life is as much about the long stretch of quiet routine as it is about flashes of electricity. Baseball is in equal parts grand slams and chewed bubble gum, no-hitters and riding the pine. Its rhythms sync with life.

And why the Mets, in particular? I cannot improve on the words of Roger Angell:

Suddenly the Mets fans made sense to me. What we were witnessing was precisely the opposite of the kind of rooting that goes on across the river. This was the losing cheer, the gallant yell for a good try—antimatter to the sounds of Yankee Stadium. This was a new recognition that perfection is admirable but a trifle inhuman, and that a stumbling kind of semi-success can be much more warming. Most of all, perhaps, these exultant yells for the Mets were also yells for ourselves, and came from a wry, half-understood recognition that there is more Met than Yankee in every one of us. I knew for whom that foghorn blew; it blew for me.

I am grateful for these 2015 New York Mets. I am grateful for their company through the warm summer months, and (beyond all expectations) into the brisk days of autumn. By all accounts they are a happy, amiable clubhouse, and they are young and talented, with great unrealized potential. Defeat in the World Series carries no shame, and will not sour the memories of a glorious season. I am happy to have had them as part of my life.

Baseball’s 9th Inning

After a frigid winter, baseball’s return is most welcome. In the high summer months, it’s hard to have a bad time at a ballgame. Winning or losing is almost immaterial. There are 162 games in a season; no contest feels like life or death. At worst, a day at the ballpark is an overlong picnic—you return home a little burnt from the sun, your stomach heavier and your pocket lighter from overpriced junk food, but really no worse off from the experience.

Well-played baseball is a simple aesthetic pleasure: a circus of long arcing throws, and of swift line drives. It is a game of detail, where the little things—hitting the cutoff man, working the count, bunting the runner over—add up. Like a Japanese tea ceremony, it is a meticulous ritual: “Take Me Out to the Ball Game” during the seventh-inning stretch, and an arcane lexicon which would seem silly were it not so deeply embedded in the American mind.

Baseball is a beautiful game. Which makes it such a shame that it is dying.

Let’s dispel all illusions—baseball is aging as fast as the Republican Party. The median age for a viewer of a nationally broadcast baseball game is 54, higher than for football, hockey, and basketball; only 6 percent of people who watched the 2013 World Series were under 18. The result will be a whole generation that grows up not watching baseball.

Baseball apologists point to strong attendance figures, and record team valuations. But national television audiences, surely the strongest indicator of the sport’s overall popularity, are dwindling.

Much of this is a self-inflicted wound. More and more teams are walling themselves behind regional cable networks, which promise lucrative subscription revenues but limit access to all but the most diehard fans.

But I think the cause of this decline is more fundamental. The root of the problem is that baseball remains a nineteenth-century game stuck in the twenty-first century.

Baseball’s Golden Age coincided with a period of rapid urbanization in America. The ballpark was a small slice of the country’s recent rural past, transplanted into the heart of the bustling new cities. The leisurely pace, the picnic-like occasion of going to a ballgame, evoked a simpler time.

But for a younger generation, the game is far too slow. There are highlight reel plays and astonishing feats of athleticism, but these merely punctuate long stretches of routine—pitch, hit, pitch, hit, double play, infield fly. The fan sees every event as part of a larger story: Ortiz is hitless with runners on second and third; Ortiz has a grudge against this pitcher after being beaned last season. But for the non-fan, this tension is invisible. The game is merely boring.

Major League Baseball has made some recent concessions towards shortening games, like reducing dead time between breaks, and forcing batters to keep a foot in the box between pitches. But these changes are at most marginal; the sport has stubbornly stuck to tradition. And I would not have it any other way.

To abandon baseball’s peculiarities in an attempt to boost its popular appeal would be disastrous. Baseball speaks to old-time values of fair play and equality. Pitch clocks and time limits are antithetical to its spirit—there’s no running out the clock; you have to give the other team a chance to bat. The historian Jacques Barzun once said that whoever wants to know the “heart and mind” of America had better learn baseball. Barzun wrote this in the 1950s, baseball’s high watermark. Today, if TV ratings were your only guide, football’s ritualized violence would reflect America’s soul.

There is a sad beauty to a fading thing. But, for the moment, summer is here—the ballparks are still filling up, and the smell of peanuts and hot dogs beckons. It may be the 9th inning, but there’s still plenty of baseball left to play.

Generic New York Times Opinion Column

“Witty epigram commenting about man’s place in modern society/the economy/education.”

Attribution to a dead white man who has two (or more!) initials instead of a first name. Broad, sweeping generalization about the human condition.

Cherry-picked statistic illustrating disturbing trend over recent years. Out-of-context quotation from prominent businessman/politician expressing agreement.

Citation of recent trendy publication—preferably by another Times author—concerning the problem. Expression of agreement with a niggling qualification.

Anecdote about the Way America Was. Nostalgia for older era of American society/education (I’m old as dirt!). Slight qualification to show that I’m still hip, with it, a true hoopy frood (people still say that, right?).

Simplified three-point argument, clearly enumerated for the dolts who actually flip to the opinion pages.

Classical allusion to show that I’m better-educated than these dolts.

Facile technological/business-related bullet point to reveal that maybe I’m not.

Quick third point to round out the argument, preserve the symmetry of the thing, and wrap this all up in a nice ribbon to share at cocktail parties. (See what I did in that last sentence?)

One more padded quote to get me to 800 words, and another New York Times column complete—cha-ching, baby!

A Salute to an Icon

This February saw the passing of Leonard Nimoy, known to millions of “Star Trek” fans as Mr. Spock. If “Trek” is a religion, for its legions of adherents (including this author), Spock is something of a patron saint. Spock’s death at the end of “Star Trek II” inspired more anguish than any fictional death since that of Sherlock Holmes; the death of Nimoy, Spock in the flesh, has left a wound in the imaginations and hearts of millions of fans.

For Nimoy, this mythological status was at times a reluctant burden. His 1975 memoir, “I Am Not Spock”, attempted to disentangle the man from his character—and was summarily rejected by diehard Trekkers. But slowly, inevitably, Leonard Nimoy became inseparable from Mr. Spock, a fact that Nimoy first accepted then (at least publicly) celebrated. A contrite follow-up, “I Am Spock”, was published in 1995.

Perhaps it is not surprising that Nimoy, who had serious artistic ambitions outside of Trek, once wished to disassociate himself from the pointy-eared Vulcan. Most critics quickly dismiss Star Trek as camp. Often the show’s idealism became preachy and the sci-fi allegories stupidly transparent (cf. Nazi Planet and Garden of Eden Planet). On occasion, Nimoy’s costar Shatner chewed up the scenery faster than the planet-killer from “The Doomsday Machine”. And after “Star Wars” came out in 1977, its spectacular space battles made Trek’s styrofoam rocks and velour costumes look cheap.

But “Star Trek” has an earnest, unvarnished optimism—absent from the slickly commercial “Star Wars”—that defies cynicism. The profound loyalty that “Trek” commands from millions of intelligent fans suggests that there is something artistically honest and essentially true about it. And whatever that thing is, Leonard Nimoy’s thoughtful, serious portrayal of Spock certainly had a lot to do with it.

Spock could very easily have been the emotionless automaton that he is often reduced to in “Trek”’s numerous parodies. But Nimoy gave Spock life, an understated wit, and a comforting wisdom that reached beyond the bounds of cold logic. Spock was no paragon of pure reason; his half-human heritage gave him emotions that his logical Vulcan side could not hide. What critic Roger Ebert called “Star Trek”’s essential question—the line between what is human and what is not—found its central battleground in the soul of Mr. Spock.

Watch Nimoy’s performance in “This Side of Paradise”, an episode where psychedelic spores—these were the 60s, after all—induce Spock to finally let loose and fall in love. When cured of the spores’ effect and asked about his experience, Spock wistfully responds: “I have little to say about it, captain. Except that for the first time in my life, I was happy.” At that moment, years of repressed feelings subtly lurk beneath Nimoy’s gaunt face.

Or see “Amok Time”, the classic episode that introduced the Vulcan salute and the accompanying “live long and prosper” (the former was Nimoy’s own invention, based on a Jewish priestly gesture of blessing). After a rare defeat in a battle of wits, Spock wryly notes to his victor that “having is not so pleasing a thing after all as wanting. It is not logical, but it is often true.”

As the series went on, Spock grew with Nimoy, and Nimoy with Spock; in a late interview, Nimoy noted that Spock’s reason and calm influenced his own personality. Spock the character eventually learned to balance logic and his emotions, maturing from a scientist suspicious of his feelings into a sort of Stoic sage. In classic “Star Trek” fashion, the answer to conflict was not total victory (à la “Star Wars”’s fable of good and evil) but understanding.

That Spock’s catchphrases have become so ubiquitous, his mannerisms instantly recognizable, has obscured how complex a character he was. Here was a character richly drawn. Spock’s reconciliation of his human and Vulcan halves formed the heart of the Enterprise’s original voyage. Nimoy’s inspired portrayal bridged another divide—the gap between camp and art, between the mundane and the sublime.

My Objection to Moral Objections to the Ice Bucket Challenge

I’m a contrarian by nature. If something is trending on social media, my natural inclination is to figure out what’s wrong with it, and hopefully spoil some of the fun (I can be a real sourpuss). For instance, here are two goofs in everyone’s favorite movie, Frozen:

Elsa’s braid passes through her arm
Kristoff’s thumb clips into Anna’s side

So you can imagine my frustration when the ALS Ice Bucket Challenge exploded over my Facebook newsfeeds, and I was nominated (more on that later). As is my wont, I began reading up on some of the contrarian literature to figure out how to respond.

But I found myself annoyed by the arguments and the poor philosophy involved, so much so that I dumped a bucket of ice over my head. This piece by William MacAskill for Quartz, in particular, gets it seriously wrong.

MacAskill writes:

The key problem [with the Ice Bucket Challenge] is funding cannibalism. That $3 million in donations doesn’t appear out of a vacuum. Because people on average are limited in how much they’re willing to donate to good causes, if someone donates $100 to the ALS Association, he or she will likely donate less to other charities… Research from my own non-profit… has found that, for every $1 we raise, 50¢ would have been donated anyway… So, because of the $3 million that the ALS Association has received, I’d bet that much more than $1.5 million has been lost by other charities.

Sure, I buy the crowding out argument—it seems reasonable that people have a budget for charity, and increased giving to one area can mean less giving to another. But it ignores the possibility of expanding the overall pool of charitable funds. A campaign like the ALS Ice Bucket Challenge with a highly visible commitment mechanism (i.e. if you get tagged and don’t donate or pour ice over your head, you’re an asshole) seems to me like an excellent way to force people into giving when they ordinarily wouldn’t have. Implicitly, MacAskill even acknowledges this: if 50¢ on the dollar is crowding out other charities, then 50¢ of new money is going to charity. That’s $1.5 million that wouldn’t have gone to charity otherwise.
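The arithmetic behind that rebuttal, using MacAskill’s own figures, is simple enough to spell out:

```python
# MacAskill's own numbers: $3M raised, with roughly 50 cents of every
# dollar displaced from donations that would have gone to other charities.
total_raised = 3_000_000
crowd_out_rate = 0.5

displaced = total_raised * crowd_out_rate   # lost by other charities
new_giving = total_raised - displaced       # money that is genuinely new

print(f"displaced: ${displaced:,.0f}, genuinely new: ${new_giving:,.0f}")
```

The same 50¢-on-the-dollar estimate that implies $1.5 million crowded out also implies $1.5 million that would never have been given at all.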

He adds:

Almost every charity does the same thing — engaging in a race to the bottom where the benefits to the donor have to be as large as possible, and the costs as small as possible. We should be very worried about this, because competitive fundraising ultimately destroys value for the social sector as a whole. We should not reward people for minor acts of altruism, when they could have done so much more, because doing so creates a culture where the correct response to the existence of preventable death and suffering is to give some pocket change.

… There is a countervailing psychological force, called commitment effects. If in donating to charity you don’t conceive of it as “doing your bit” but instead as taking one small step towards making altruism a part of your identity, then one good deed really will beget another. This means that we should tie new altruistic commitments to serious, long-lasting behavior change. Rather than making a small donation to a charity you’ve barely heard of, you could make a commitment to find out which charities are most cost-effective, and to set up an ongoing commitment to those charities that you conclude do the most good with your donations.

I find critiques like this odd, because they’re essentially critiques of human nature. I view the “competitive fundraising” and the “minor acts of altruism” as responses by charitable organizations to the constraints of human behavior. People have limited funds to give, and there are a lot of good causes out there, so of course there’s competition. Moreover, human beings are naturally myopic, prone to recency bias, and vulnerable to sensationalism, so charities respond by trying to catch people’s attention and attract one-time donations—the “minor acts of altruism” and loose “pocket change” that MacAskill derides. In this sense, the slickly-filmed Kony 2012 campaign and the viral ALS Ice Bucket Challenge campaigns were exceptionally well-designed.

But the correct response shouldn’t be to complain (as MacAskill does) about how people don’t make long-term commitments to charitable giving, because human beings don’t seem inclined to act in that way to begin with. Given the choice of trying to alter human nature or fitting the incentive structure of charity to fit human nature (à la Kony or ALS), I would prescribe the latter.

Prescriptive philosophical arguments should deal exclusively with the world we live in, not some moral fairyland where we can assume away problems of human nature. That’s the realm of economics.

And, in large part to raise awareness for Lou Gehrig’s Disease (and in small part to stand by my argument), here is me dropping a bucket of ice over my head.

Rule, Britannia? Do Hegemons Enforce Peace? (Paper)

May at Harvard means Reading Period, and Reading Period means writing papers and studying for exams. I took an economics tutorial this semester on the Political Economy of International Conflict, which I absolutely loved–it was a chance to combine two of my favorite subjects, economics and military history.

Hopefully some of that enthusiasm rubbed off on the final paper I wrote for the class, which is on hegemonic stability theory. In a nutshell, the idea is that if one country (a hegemon) is much more powerful than the others, this country’s ability to unilaterally impose costs on aggressors will deter smaller countries from going to war. In the paper, I set up a game theoretic model to express this a little more formally, and ran some regressions to test this hypothesis empirically.
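The paper’s actual model is in the linked PDF; purely as illustration, the deterrence intuition described above can be reduced to a toy decision rule, with entirely hypothetical payoffs:

```python
# Toy sketch of hegemonic deterrence (not the paper's actual model; all
# payoffs are hypothetical). A challenger compares the status quo against
# war, where the hegemon can unilaterally impose an extra cost on aggressors.

def challenger_attacks(war_payoff, status_quo, hegemon_penalty):
    """The challenger attacks iff war, net of the hegemon's penalty, beats peace."""
    return war_payoff - hegemon_penalty > status_quo

# Without a hegemon (no penalty), a profitable war happens:
print(challenger_attacks(war_payoff=5, status_quo=3, hegemon_penalty=0))  # True
# A sufficiently powerful hegemon deters the same war:
print(challenger_attacks(war_payoff=5, status_quo=3, hegemon_penalty=4))  # False
```

The empirical question the regressions then ask is whether periods with a dominant power really do show fewer wars among smaller states.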

Here is the link: Rule, Britannia? Do Hegemons Enforce Peace?

This was the first real independent research paper I’ve ever done, so it was actually a little exciting. (That last sentence takes second prize in “nerdiest sentences uttered by me this month”, after: “It sucks that Lucasfilm is making the Expanded Universe non-canon, because that means all my Star Wars books are obsolete”.) Some personal observations after writing it:

  • Once you get the hang of it, LaTeX makes your life a lot easier–it handles citations smoothly and makes equations less of a pain. For a brief moment, when I was running Stata and LaTeX at the same time to compile some regressions, I felt like an economics god. Then I remembered that these are the preferred tools of most academic economists. And I felt less cool.
  • I was really happy to get some statistically significant results in my analysis. But I definitely did feel the pressure of producing something that was statistically significant–it certainly wouldn’t have felt as good if I’d turned in a regression table bereft of those lovely asterisks. And this was only a term paper–imagine the pressure career academics must feel! No wonder science suffers from “exaggeration [and] cherry-picking”.
  • Academic writing is a very weird and very specific genre. The goal is clarity at the expense of all else, and unfortunately “all else” happens to include readability. At some point when I have more time I’ll try and recast this as a blog post, with less jargon (what is a dyad, anyway?) and a little more wit.
  • Nevertheless, I enjoyed writing this! There’s nothing more fun and satisfying than being creative–and when it’s something you’re really interested in, all the better.

Thank you for reading. If you’re a Harvard student and read this on top of all the papers you have to cover, kudos to you. Now get back to studying!

The Worst Hyperinflation in World History

No real analysis here, but an interesting thing I picked up in my reading:

Weimar Germany is typically thought of as the worst modern example of hyperinflation. In terms of its political consequences, it certainly was disastrous. But the dubious honor of the worst inflation in world history actually goes to postwar Hungary:

When the war ended, the U.S. dollar traded at 1,320 pengös, already a severe collapse from the 5.4 pengös to the dollar of 1938… By the end of 1945 Hungarian prices had risen four hundred-fold, and the pengö had dropped to 290,000 to the dollar… By the end of July one U.S. dollar was worth five nonillion pengös (a nonillion is a one followed by thirty zeros). The government printing presses could not keep pace with the wild inflation, and at this point all of Hungary’s banknotes in circulation combined were worth one one-thousandth of an American cent (Frieden, Global Capitalism, p 273).
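A rough calculation from the figures in that passage shows just how fast the collapse was. The day count below is my own approximation, since the source gives no exact dates:

```python
# Back-of-envelope: from 1,320 pengös per dollar at the war's end to
# 5 nonillion (5e30) by the end of July 1946. I assume roughly 425 days
# between the two figures (an approximation, not from the source).

total_depreciation = 5e30 / 1_320           # ~3.8e27-fold collapse
days = 425
daily_rate = total_depreciation ** (1 / days) - 1

print(f"total depreciation factor: {total_depreciation:.1e}")
print(f"implied average daily depreciation: {daily_rate:.0%}")  # roughly 16% per day
```

Compounding at that pace, prices double every few days, which is why the printing presses could not keep up.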

Scary stuff.