INTRODUCTION
The Counterfeiters, an award-winning Austrian-German film set in 1940s Europe, opens with the concentration camp survivor Salomon Sorowitsch, played by the Austrian actor Karl Markovics, sitting fully clothed on the beach holding a suitcase full of dollars. The war has just ended, and he intends to put that currency, of dubious provenance, to work on the tables of Monte Carlo. That it is dollars rather than French francs is essential to the authenticity of the scene. In post–World War II Europe it was the dollar, the currency of the only major economy still standing, that people in all countries wanted. Dollars were the only plausible currency that a Holocaust survivor might carry into a casino in 1945.
Fast forward now 50-odd years. In City of Ghosts, a 2002 thriller set in contemporary Cambodia, the hero, a crooked insurance salesman played by Matt Dillon, uses a suitcase full of dollars to ransom his partner and mentor, played by James Caan, who has been kidnapped by business associates. More than half a century of cinematic time has passed and the location is now developing Asia rather than Europe, but the suitcase still contains dollars, not Japanese yen or Chinese renminbi. That any self-respecting kidnapper would expect the ransom to be paid in dollars is so obvious as to go unstated.
The suitcase full of dollars is by now a standard trope of mystery novels and Hollywood screenplays. But this artistic convention reflects a common truth. For more than half a century the dollar has been the world’s monetary lingua franca. When a senator from the Republic of Kalmykia is caught shaking down a Russian airline, he is apprehended with a suitcase containing $300,000 in marked U.S. bills. When Somali pirates ransom a ship, they demand that the ransom money be parachuted to them in dollars. As the Wall Street Journal has put it, “In the black market, the dollar still rules.”1 The fact that nearly three-quarters of all $100 bills circulate outside the United States attests to the dollar’s dominance of this dubious realm.
But what is true of illicit transactions is true equally of legitimate business. The dollar remains far and away the most important currency for invoicing and settling international transactions, including even imports and exports that do not touch U.S. shores. South Korea and Thailand set the prices of more than 80 percent of their trade in dollars despite the fact that only 20 percent of their exports go to American buyers. Fully 70 percent of Australia’s exports are invoiced in dollars despite the fact that fewer than 6 percent are destined for the United States. The principal commodity exchanges quote prices in dollars. Oil is priced in dollars. The dollar is used in 85 percent of all foreign exchange transactions worldwide. It accounts for nearly half of the global stock of international debt securities.2 It is the form in which central banks hold more than 60 percent of their foreign currency reserves.
This situation is more than a bit peculiar. It made sense after World War II when the United States accounted for more than half of the combined economic output of the Great Powers.3 America being far and away the largest importer and main source of trade credit, it made sense for imports and exports to be denominated in dollars. Since the United States was the leading source of foreign capital, it made sense that international financial business was transacted in dollars. And with these same considerations encouraging central banks to stabilize their currencies against the dollar, it made sense that they should hold dollars in reserve in case of a problem in foreign exchange markets.
But what made sense then makes less sense now, when both China and Germany export more than the United States. Today the U.S. share of global exports is only 13 percent. The United States is the source of less than 20 percent of foreign direct investment, down from nearly 85 percent between 1945 and 1980.4
These two changes are both manifestations of the same fact: the United States is less dominant economically than 50 years ago. This fact reflects the progress of other economies, first Europe, then Japan, and most recently emerging markets like China and India, in closing the per capita income gap. Economists refer to this narrowing as catch-up or convergence. It is entirely natural insofar as there is no intrinsic reason that U.S. incomes and levels of labor productivity should be multiples of those in the rest of the world. This process of catch-up is one of the great achievements of the late twentieth and early twenty-first centuries in that it has begun lifting out of poverty the majority of the world’s population. But it also means that the United States accounts for a smaller share of international transactions. And this fact creates an uneasy tension with the peculiar dominance of the dollar.
This dominance is something from which we Americans derive considerable benefit. An American tourist in New Delhi who can pay his cab driver in dollars is spared the inconvenience of having to change money at his hotel. The widespread international use of the dollar is similarly an advantage for American banks and firms. A German company exporting machine tools to China and receiving payment in dollars incurs the additional cost of converting those dollars into euros, the currency it uses to pay its workers and purchase its materials. Not so a U.S. exporter of machine tools. Unlike firms in other countries, the U.S. producer receives payment in the same currency, dollars, that it uses to pay its workers, suppliers, and shareholders.
Similarly, a Swiss bank accepting deposits in francs but making foreign loans in dollars, since that’s what its customers want, has to worry about the risk to its profits if the exchange rate moves.5 That risk can be managed, but doing so is an added cost of business. Our Swiss bank can protect itself by buying a forward contract that converts the receipts on its dollar loan into francs when the loan matures, at a rate agreed when the loan is made. But that additional transaction has an additional cost. American banks that make foreign loans in dollars as well as taking deposits in dollars are spared the expense of having to hedge their foreign currency positions in this way.
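To see what that extra transaction protects against, here is a minimal sketch in Python of the covered position described above. All of the rates and amounts are hypothetical illustrations, not figures from the text.

```python
# Minimal sketch of a forward-hedged ("covered") dollar loan from the
# Swiss bank's point of view. All numbers are hypothetical.

loan_usd = 1_000_000         # dollar loan funded with franc deposits
usd_rate = 0.05              # interest earned on the dollar loan
spot_chf_per_usd = 1.10      # today's spot rate, for comparison
forward_chf_per_usd = 1.08   # rate agreed today for conversion at maturity

proceeds_usd = loan_usd * (1 + usd_rate)   # dollars received at maturity

# Hedged: the forward contract fixes the conversion rate when the loan is
# made, so the franc value of the proceeds is certain from day one.
hedged_chf = proceeds_usd * forward_chf_per_usd

# Unhedged: the bank converts at whatever the future spot rate turns out
# to be; suppose the dollar has depreciated by 10 percent in the interim.
future_spot = spot_chf_per_usd * 0.90
unhedged_chf = proceeds_usd * future_spot

print(f"hedged:   {hedged_chf:,.0f} CHF (locked in, minus the hedging cost)")
print(f"unhedged: {unhedged_chf:,.0f} CHF (exposed to the exchange rate)")
```

An American bank lending and borrowing in the same currency simply skips the forward leg, and its cost.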
A more controversial benefit of the dollar’s international-currency status is the real resources that other countries provide the United States in order to obtain our dollars. It costs only a few cents for the Bureau of Engraving and Printing to produce a $100 bill, but other countries have to pony up $100 of actual goods and services in order to obtain one. (That difference between what it costs the government to print the note and a foreigner to procure it is known as “seignorage” after the right of the medieval lord, or seigneur, to coin money and keep for himself some of the precious metal from which it was made.) About $500 billion of U.S. currency circulates outside the United States, for which foreigners have had to provide the United States with $500 billion of actual goods and services.6
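The seignorage arithmetic is simple enough to spell out. The sketch below assumes a printing cost of ten cents per note as a stand-in for the text's "a few cents"; the $500 billion of currency abroad is the figure given above.

```python
# Seignorage on $100 bills held abroad. The printing cost is an assumed
# stand-in for "a few cents"; the $500 billion total is from the text.

face_value = 100.00    # goods and services a foreigner gives up per note
printing_cost = 0.10   # assumed cost to produce one $100 bill

seignorage_per_note = face_value - printing_cost
notes_abroad = 500e9 / face_value   # number of $100 bills circulating overseas

total = notes_abroad * seignorage_per_note
print(f"seignorage per note: ${seignorage_per_note:.2f}")
print(f"real resources obtained: ${total / 1e9:,.1f} billion")
```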
Even more important is that foreign firms and banks hold not just U.S. currency but bills and bonds that are convenient for international transactions and at the same time have the attraction of bearing interest. Foreign central banks hold close to $5 trillion of the bonds of the U.S. Treasury and quasi-governmental agencies like Fannie Mae and Freddie Mac. They add to them year after year.
And insofar as foreign banks and firms value the convenience of dollar securities, they are willing to pay more to obtain them. Equivalently, the interest rate they require to hold them is less. This effect is substantial: the interest that the United States must pay on its foreign liabilities is two to three percentage points less than the rate of return on its foreign investments.7 The U.S. can run an external deficit in the amount of this difference, importing more than it exports and consuming more than it produces year after year without becoming more indebted to the rest of the world. Or it can scoop up foreign companies in that amount as the result of the dollar’s singular status as the world’s currency.
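A back-of-the-envelope calculation shows how a return differential of two to three percentage points becomes a deficit the United States can run without growing more indebted. The gross positions and rates below are hypothetical round numbers chosen only to illustrate the mechanics.

```python
# Illustrative arithmetic of the return differential. Gross positions and
# rates are hypothetical; only the 2.5-point spread echoes the text.

us_assets_abroad = 20.0e12        # U.S.-owned investments abroad (assumed)
foreign_claims_on_us = 22.5e12    # foreign-owned claims on the U.S. (assumed)

return_on_assets = 0.050          # return earned on investments abroad
rate_paid_on_liabilities = 0.025  # 2.5 points less, per the text's range

net_investment_income = (us_assets_abroad * return_on_assets
                         - foreign_claims_on_us * rate_paid_on_liabilities)

# The U.S. can import this much more than it exports each year without
# its net debt to the rest of the world growing.
print(f"sustainable external deficit: "
      f"${net_investment_income / 1e9:,.0f} billion per year")
```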
This has long been a sore point for foreigners, who see themselves as supporting American living standards and subsidizing American multinationals through the operation of this asymmetric financial system. Charles de Gaulle made the issue a cause célèbre in a series of presidential press conferences in the 1960s. His finance minister, Valéry Giscard d’Estaing, referred to it as America’s “exorbitant privilege.”
Not that this high-flown rhetoric led to changes in the actual existing system. In international finance as in politics, incumbency is an advantage. With other countries doing the bulk of their transactions in dollars, it was impossible for any individual country, even one as critical of America’s exorbitant privilege as France, to move away from the currency. And what was true in the 1960s remained true for the balance of the twentieth century.
But today, in the wake of the most serious financial crisis in 80 years, a crisis born and bred in the United States, there is again widespread criticism of America’s exorbitant privilege. Other countries question whether the United States should have been permitted to run current account deficits approaching 6 percent of GDP in the run-up to the crisis. Emerging markets complain that as their economies expanded and their central banks felt compelled to augment their dollar reserves, they were obliged to provide cheap finance for the U.S. external deficit, like it or not. With cheap foreign finance keeping U.S. interest rates low and enabling American households to live beyond their means, poor households in the developing world ended up subsidizing rich ones in the United States. The cheap finance that other countries provided the U.S. in order to obtain the dollars needed to back an expanding volume of international transactions underwrote the practices that culminated in the crisis. The United States lit the fire, but foreigners were forced by the perverse structure of the system to provide the fuel.
If this was not injustice enough, there is the fact that America’s international financial position was actually strengthened by the crisis. In the course of 2007 the dollar weakened by about 8 percent on the foreign exchange market.8 But since our debts are denominated in our own currency, there was no impact on their dollar value. In contrast, our foreign investments, whether in bonds or factories, became more valuable as the dollar fell.9 The interest and dividends they threw off were worth more when converted back into dollars.
The dollar’s depreciation thereby improved the U.S. external position by almost $450 billion.10 This largely offset the increase in U.S. indebtedness to the rest of the world that would have otherwise resulted from our $660 billion current account deficit. It was almost enough to keep our debts to other countries stable, despite our consuming 6 percent more than we produced. Then in 2008, in the throes of the most serious financial crisis in 80 years, the federal government was able to borrow vast sums at low interest rates because foreigners figured that the dollar was the safest currency to be in at a time of great turmoil. And again in the spring of 2010, when financial volatility spiked, investors fled into the most liquid market, that for U.S. Treasury bonds, pushing down the cost of borrowing for the U.S. government and, along with it, the mortgage interest rates available to American households. This is what exorbitant privilege is all about.
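The valuation effect behind the "almost $450 billion" figure can be reproduced to a first approximation. The asset position below is a hypothetical number, chosen so that the text's 8 percent depreciation yields roughly the reported improvement; the calculation assumes the foreign assets are denominated in foreign currency while the liabilities are all in dollars.

```python
# First-order sketch of the valuation effect: a weaker dollar raises the
# dollar value of foreign-currency assets, while dollar-denominated
# liabilities are unaffected. The asset figure is hypothetical.

foreign_assets_usd = 5.6e12   # pre-depreciation dollar value of U.S. foreign
                              # investments, assumed all in foreign currency
depreciation = 0.08           # dollar fell ~8% in 2007, per the text

# To a first approximation, each foreign-currency asset is worth ~8% more
# in dollars after the fall, while debts fixed in dollars do not move.
valuation_gain = foreign_assets_usd * depreciation

print(f"improvement in U.S. net position: ${valuation_gain / 1e9:,.0f} billion")
```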
But now, as a result of the financial mismanagement that spawned the crisis and growing dissatisfaction with the operation of the international monetary system, the dollar’s singular status is in doubt. The U.S. government has not been a worthy steward of an international currency, its critics complain. It looked the other way while the private sector produced the mother of all financial crises. It ran enormous budget deficits and incurred a gigantic debt. Foreigners have lost faith in the almighty dollar. They are moving away from it as a unit in which to invoice and settle trade, denominate commodity prices, and conduct international financial transactions. The dollar is at risk of losing its exorbitant privilege to the euro, the renminbi, or the bookkeeping claims issued by the International Monetary Fund known as Special Drawing Rights (SDRs).
Or so it is said. It is said by no less an authority than Sarah Palin, who warned on her Facebook page in October 2009 that talk that the Gulf countries might shift to pricing oil in a basket of currencies “weakens the dollar and renews fears about its continued viability as an international reserve currency.”11
That this issue has flashed across the radar screens of politicians who are not exactly renowned for their financial expertise reflects the belief that larger things are at stake. It is thought that widespread international use of a currency confers on its issuer geopolitical and strategic leverage. Because the country’s financial position is stronger, its foreign policy is stronger. Because it pays less on its debts, it is better able to finance foreign operations and exert strategic influence. It does not depend on other people’s money. Instead, it has leverage over other countries that depend on its currency. Compare the nineteenth century, it is said, when Britannia ruled the waves and the pound dominated international financial markets, with the post–World War II period, when sterling lost its dominance and the United States, not Britain, called the foreign-policy shots.
Were all this right, there would have been no reason for me to write this book or for you to read it. In fact, however, much of what passes for conventional wisdom on this subject is wrong. To start, it has cause and effect backward. There may be an association between the economic and military power of a country and the use of its currency by others, but it is a country’s position as a great power that results in the international status of its currency. A currency is attractive because the country issuing it is large, rich, and growing. It is attractive because the country standing behind it is powerful and secure. For both reasons, the economic health of the country issuing the currency is critical for its acquisition and retention of an international role.
But whether its currency is used internationally has at best limited implications for a country’s economic performance and prospects. Seignorage is nice, but it is about number 23 on the list of factors, ranked in descending order of importance, determining the place of the United States in the world. That said, how the country does economically, and whether it avoids policy blunders as serious as those that led to the financial crisis, will determine the dollar’s fate. Sterling lost its position as an international currency because Britain lost its great-power status, not the other way around. And Britain lost its great-power status as a result of homegrown economic problems.
The conventional wisdom about the historical processes resulting in the current state of affairs—that incumbency is an overwhelming advantage in the competition for reserve currency status—is similarly wrong. It is asserted that the pound remained the dominant international currency until after World War II, long after the United States had overtaken Britain as the leading economy, reflecting those self-same advantages of incumbency. In fact, the dollar already rivaled sterling as an international currency in the mid-1920s, only 10 short years after the establishment of the Federal Reserve System. It did so as a result of some very concrete actions by the Fed to promote the dollar’s international role. This fact has very different implications than the conventional wisdom for how and when the Chinese renminbi might come to rival the dollar. It suggests that the challenge may come sooner rather than later.
Finally, the idea that the dollar is now doomed to lose its international currency status is equally wrong. The dollar has its problems, but so do its rivals. The euro is a currency without a state. When the euro area experiences economic and financial problems, as in 2010, there is no powerful executive branch with the power to solve them, only a collection of national governments more inclined to pander to their domestic constituencies. The only euro-area institution capable of quick action is the European Central Bank. And if quick action means printing money to monetize government debts, then this is hardly something that will inspire confidence in and international use of the euro. The renminbi, for its part, is a currency with too much state. Access to China’s financial markets and international use of its currency are limited by strict government controls. The SDR is funny money. It is not, in fact, a currency. It is not used to invoice and settle trade or in private financial transactions. As a result, it is not particularly attractive for use by governments in their own transactions.
The United States, whatever its other failings, is still the largest economy in the world. It has the largest financial markets of any country. Its demographics imply relatively favorable growth prospects.
But the fundamental fallacy behind the notion that the dollar is engaged in a death race with its rivals is the belief that there is room for only one international currency. History suggests otherwise. Aside from the very peculiar second half of the twentieth century, there has always been more than one international currency. There is no reason that a few years from now countries on China’s border could not use the renminbi in their international transactions, while countries in Europe’s neighborhood use the euro, and countries doing business with the United States use the dollar. There is no reason that only one country can have financial markets deep and broad enough to make international use of its currency attractive. There may have been only one country with sufficiently deep financial markets in the second half of the twentieth century, but not because this exclusivity is an intrinsic feature of the global financial system.
The world for which we need to prepare is thus one in which several international currencies coexist. It was with this world in mind that the euro was created. A world of several international currencies is similarly what China is after. China has no interest in “dethroning” the dollar. To the contrary, it has too much invested in the greenback. But preserving its investment in the dollar is entirely compatible with creating a more consequential international role for its own currency. And where the renminbi leads, other emerging market currencies, such as the Indian rupee and Brazilian real, could eventually follow.
Serious economic and financial mismanagement by the United States is the one thing that could precipitate flight from the dollar. And serious mismanagement, recent events remind us, is not something that can be ruled out. We may yet suffer a dollar crash, but only if we bring it on ourselves. The Chinese are not going to do it to us.
But this is to get ahead of the story.


DEBUT
When in 1620 a landing party of English religious dissidents led by William Bradford and Myles Standish came ashore near what is today Provincetown, Massachusetts, they brought with them English money and a custom of expressing values in pounds, shillings, and pence. The colonists were not a wealthy band, and it was not many years before they had expended their English money on supplies from the Old World. Finding a substitute was not easy in a colony without a mint or the permission to establish one, and with England prohibiting the export of coin (the English monarchs husbanding all the precious metal they possessed for fighting expensive wars).
Commodity currency was the obvious alternative. Every schoolchild learns about the colonists’ use of wampum. Native Americans valued the purple and white quahog and whelk shells strung in the form of necklaces and ornamental belts and were willing to part with furs, skins, and other commodities in order to obtain them.1 The colonists with their tools were efficient producers of necklaces and belts. From trade with the natives the use of wampum spread to transactions among the colonists themselves. In 1637 wampum was made legal tender, officially recognized money for paying debts, in the Massachusetts Bay Colony at a rate of six white beads or three purple beads per penny.
But there were only so many snail and clam shells to go around. So the colonists turned to other commodities for use in their barter transactions: corn, codfish, and beaver in the north, tobacco and rice in the south. These items were used in transactions because they were the dominant products of the region. Local governments allowed residents to use corn or tobacco to discharge their tax obligations.2 The next step was to declare that the commodity in question should be accepted not just in public payments but by private parties. Massachusetts made corn legal tender. Connecticut did the same for wheat, Virginia for tobacco.3
Making these commodities legal tender had some awkward consequences. When Virginia gave tobacco legal-tender status, there was an incentive to increase production, of the lowest grades in particular. With more tobacco chasing the same goods, the purchasing power of tobacco declined. Farmers complained of low prices. The House of Burgesses, the assembly representing Virginia’s agricultural districts, considered measures to restrict tobacco cultivation but could not agree. In 1682, farmers angry over low crop prices took matters into their own hands, rampaging through the fields and destroying their neighbors’ tobacco plants. The government mustered the militia. The rioters carried out their work under cover of darkness. Order was restored only after several months of police action.
Farm products like tobacco had further disadvantages as media of exchange, stores of value, and means of payment—most obviously bulk, lack of uniformity, and spoilage. There understandably developed a preference for coin. Since the English authorities did not permit the colonies to operate a mint, such coin as circulated had to be imported.4 English coin could be obtained by exporting other merchandise, although London with its prohibitions did not make this easy. Closer at hand, coin could be obtained in the West Indies, silver coins being abundant there as a result of Spain’s prolific Mexican and Peruvian mines. The North American colonists sold dried fish, whale oil, pickled beef, and grain in return for coin. Exportation of many of these products to destinations other than other English colonies and the mother country being prohibited, much of this was smuggled. Coin earned by exporting merchandise was supplemented by that acquired through piracy, an important industry for the seventeenth-century colonists in the established English tradition.5 The pirates spent much of their booty, which included Spanish coin, while on shore leave in the northern colonies.
The most popular coins, weighing 423.7 grains of silver, were known as “pesos.” Valued at eight Spanish reals, they were referred to as “pieces of eight.”6 Dealers in foreign exchange in London referred to them as “dollars” or “Spanish dollars,” the Bohemian town of Joachimsthal having produced a coin of similar size and content known as the Joachimsthaler, or, as anglicized, the “Joachimsdollar.” In addition, gold johannes were imported from Portugal, louis d’or from France, sequins from Venice. But on the eve of the colonies’ war of independence, Spanish silver coins were the dominant part of the coinage.
Coin was supplemented with bills of credit—paper money issued via public loans.7 It was issued, that is, when the colonists’ English overseers permitted, Parliament prohibiting the practice starting in 1751. This ban was among the economic grievances setting the stage for the American Revolution. No less a figure than Benjamin Franklin objected to it in testimony to the British Parliament in 1766.
