Archive for the ‘America’ Category

Countering China’s Military Modernization

February 12, 2009
Author: Jayshree Bajoria, Staff Writer
February 4, 2009


Implications for the Region

China has been steadily building up its strategic and conventional capabilities since the 1990s. Eighteen years ago, experts say, China had a “bare-bones” military: basic capabilities, but nothing sophisticated or top-of-the-line. But two decades of double-digit spending increases have changed that picture. China says its 2008 defense budget is $61 billion, though the Pentagon has historically challenged Beijing’s reported figures. In its annual report to Congress, the U.S. Defense Department estimated China’s total military-related spending for 2007 to be between $97 billion and $139 billion, as compared to $52 billion reported by China. All that spending has gone to building a sophisticated, modern military: a large, increasingly capable submarine fleet, an air force stocked with Russian warplanes, and technical strides which have improved China’s ballistic missile arsenal, as well as satellite surveillance, radar, and interception capabilities.

China continues to stress that its military modernization is in line with its peaceful rise in the world. Its latest White Paper on national defense emphasizes it will never seek hegemony or engage in military expansionism. However, this has not alleviated concerns among its neighbors and regional rivals, say experts. A CFR Independent Task Force report (PDF) in 2007 on U.S.-China relations noted that many of China’s neighbors and potential adversaries were making adjustments to their own defense plans and expenditures to balance China’s growing military capabilities.

Japan: Japan and China compete over a host of issues, from regional security to international trade to access to energy. The two countries have a centuries-old history of conflict, including two Sino-Japanese wars that began in 1894 and 1931, and a bloody Japanese occupation of China during World War II. As this Backgrounder points out, these animosities surface in recurring cycles, often involving Chinese anger over Japan’s perceived lack of contrition for wartime crimes. But concrete territorial and economic issues also aggravate the relationship, including Japan’s close alliance with the United States, trade frictions, and ongoing disputes over ownership of various islands in the East China Sea. In 2007, China and Japan ranked third and fifth, respectively, in national defense expenditures (PBS), both spending only a small fraction of the U.S. budget even after adjusting for gross underreporting by Beijing. China’s military modernization fuels Japanese fears that China will use its growing economic leverage and military prowess to throw its weight around and dominate the region. Tanaka Akihiko of the University of Tokyo, speaking at a December 2008 CFR symposium on U.S.-Japan relations, said China’s growing military forces might change the balance of power in East Asia, which “would necessitate for Japan and the United States to readjust its force structure and other military management.”

Japan has significantly upgraded its capabilities over the past fifteen years, deploying the Aegis radar and accompanying missile systems for its navy and warplanes armed with advanced air-to-air missiles for its air force. Since 1998, when a North Korean missile test violated Japanese airspace, Tokyo has been working in partnership with the United States to develop theater missile defenses, which have obvious application in the event of any conflict with China. Over the past decade the U.S.-Japanese security alliance has been strengthened through revised defense guidelines, which expand Japan’s noncombatant role in a regional contingency, allow for the deployment of an X-Band radar system in Japan as part of a missile defense system, expand bilateral cooperation in training and intelligence sharing, and allow the basing of a nuclear-powered U.S. aircraft carrier at the Yokosuka Naval Base. In September 2007 Japan joined a multinational naval exercise with the United States, Australia, Singapore, and India in the area west of the Malacca Straits. The exercise reinforced the U.S.-led campaign of strengthening security ties among its democratic allies and “the strategic countering (PDF) of Chinese military power,” argues a December 2008 U.S. Congressional Research Service report.

In 2005, a joint statement by the U.S.-Japan Security Consultative Committee for the first time included Taiwan as a common strategic objective, with both parties agreeing to “encourage the peaceful resolution of issues” through dialogue. Though Japan’s foreign affairs ministry said this did not change the country’s position on Taiwan, many experts believe the shift indicates that Japan is increasingly concerned with China’s growing military capabilities.

Taiwan: It is the main driver of China’s military modernization and the biggest concern for the United States, as this Backgrounder points out. Taiwan is also pursuing modernization goals of its own, including the procurement of army attack helicopters, utility helicopters, PAC-3 missile defense systems, fighter jets, and diesel-electric submarines, as well as a broader transformation of its military. Relations between China and Taiwan have improved dramatically under the administration of President Ma Ying-jeou, although U.S. arms sales to Taiwan remain contentious. In its white paper on national defense, China says the United States continues to sell arms to Taiwan, “causing serious harm to Sino-US relations as well as peace and stability across the Taiwan Strait.” In October 2008, Beijing suspended military contacts with the United States in protest of the U.S. decision to sell $6.4 billion in defense equipment and services to Taiwan.

South Korea: It has undertaken a major modernization drive, replacing antiquated fighter aircraft, frigates, tanks, and artillery pieces with advanced systems, many of them purchased from the United States or developed in partnership with U.S. defense industries, notes the 2007 CFR report (PDF). However, most experts say South Korea’s military initiatives are driven more by the possibility of conflict with a nuclear North Korea than by concerns about China. A thorough look at the status of South Korea’s military is contained in CFR’s Crisis Guide: The Korean Peninsula.

Russia: The country is China’s largest supplier of advanced military hardware as well as a potential great-power rival. Moscow experienced a significant decline in its overall military capabilities during the 1990s, but buoyed by strong oil revenues over the past decade, it has been increasing its defense expenditure, in what most experts see as a bid to counter U.S. influence in the region. The 2009 defense budget is expected to be $50 billion, a 25.7 percent increase from the previous year. Fedor Lukyanov, chief editor of Russia in Global Affairs, told a January 2009 CFR meeting that there are limits to Russian military cooperation and arms sales to China. “We sold everything we could without making damage to Russian security,” he said. CFR’s Senior Fellow for Russian and Eurasian Studies, Stephen Sestanovich, says Russia’s relationship with China is based on the interests of certain elites–those in the energy sector, the nuclear power sector, and arms exports–who see China as an important market.

China and Russia also formed a security grouping to resolve border disputes, which has grown into an important regional organization, the Shanghai Cooperation Organization (SCO), and now includes the Central Asian countries Kazakhstan, Kyrgyzstan, Tajikistan, and Uzbekistan. China and Russia have held a number of joint military exercises under the SCO. In October 2007, the SCO also signed a memorandum of understanding with the Collective Security Treaty Organization, a military alliance of several former Soviet states. Some analysts see the SCO as a vehicle for Russia and China to curb U.S. access to the region’s vast energy supplies. But others say Russia and China have very different objectives in Central Asia: Russia wants to reassert its regional leadership there, while China seeks energy ties, note some analysts. In 2008, China and Russia resolved (BBC) their last remaining border dispute, involving islands in the eastern part of the border that had seen armed clashes between the two sides during the Cold War.

Southeast Asia: Experts say Southeast Asian countries, including Indonesia, the Philippines, and Thailand, are currently calculating whether the political and economic benefits of closer ties with a strong China outweigh the military risks. Bilateral trade between China and the ten countries of the Association of Southeast Asian Nations (ASEAN) was expected to exceed $200 billion in 2008, up from $190 billion in 2007. The region is now China’s fourth-largest trading partner. Despite the economic windfall, ASEAN countries want the United States to pay more attention to new security trends in the region, experts say. While these countries are not very vocal about their fears, experts say they are nervously looking over their shoulders at China’s military buildup and wondering where it is headed.

Border disputes with some countries also complicate China’s relations with its Southeast Asian neighbors. Vietnam and China each assert claims to the Spratly and Paracel Islands, archipelagos in a potentially oil-rich area of the South China Sea. Malaysia, the Philippines, and Taiwan also claim all or part of the South China Sea. China’s assertion of “indisputable sovereignty” over the Spratly Islands and the entire South China Sea has elicited concern from Vietnam and its Southeast Asian neighbors, according to the U.S. State Department. Vietnam has been pursuing closer military relations with the United States through joint military exercises and the sharing of intelligence on terrorism, drugs, and other transnational threats. Vietnam has also hosted U.S. warships at its ports.

India: It has a long-standing rivalry with China over disputed borders, and the two Asian giants fought a war in 1962. New Delhi watches Beijing’s military buildup closely and has undertaken military reforms of its own. It is currently building a nuclear submarine and is seeking two more aircraft carriers in addition to the one it possesses. It also launched its first unmanned moon mission in October 2008, another step in what many analysts see as a race with China in space. India has also been expanding its military cooperation with the United States: in June 2005, New Delhi and Washington agreed on a new framework for their defense relationship, including increases in defense trade, technology transfers, and joint exercises over the next ten years.

China’s close military cooperation with Pakistan also concerns New Delhi. Each country helps the other to check India’s power, say experts. China remains a key supplier of arms to Pakistan and in 2008, agreed to help Pakistan build two nuclear power plants. China also supplies Pakistan with nuclear technology and assistance, including what many experts suspect was the blueprint for Pakistan’s nuclear bomb. Pakistan’s army has both short- and medium-range ballistic missiles that experts say came from China. According to Thomas C. Reed, a former U.S. Air Force secretary, China probably helped Pakistan test a nuclear weapon (Physics Today) inside China in 1990. Reed adds that this weapon was most likely based on a Chinese design.

Central Asia: Several Central Asian countries–Kazakhstan, Kyrgyzstan, Tajikistan, and Uzbekistan–have forged closer relations with China as part of the Shanghai Cooperation Organization. Experts say tension in China’s western province of Xinjiang is one of the major reasons behind Beijing’s eagerness to improve relations with the Central Asian states. Xinjiang’s population is largely made up of Uighurs, a predominantly Muslim community with ties to Central Asia, and China has been concerned that the Central Asian states might back a separatist movement there. Beijing also seeks energy ties in the region. Under the SCO, the countries conduct joint military exercises. Problems with their own Muslim fundamentalist groups have led these countries to assist China in its struggle against separatists from Xinjiang, say experts. However, some experts say the Central Asian states still view China and Russia as possible hegemonic powers.

Implications for the United States

China clearly complicates U.S. defense planning in Asia, says CFR’s Senior Fellow for China Studies, Adam Segal. The Pentagon’s 2008 report to Congress states: “Current trends in China’s military capabilities are a major factor in changing East Asian military balances, and could provide China with a force capable of prosecuting a range of military operations in Asia–well beyond Taiwan.” Most countries in the region maintain some degree of caution in their relationship with China, says James Mulvenon, director of the Washington-based Center for Intelligence Research and Analysis. However, none of them, he adds, wants to engage in any form of containment policy with the United States. Meanwhile, though China is wary of the U.S. military presence close to its border, its troubles with Uighur separatists have led it to support U.S. military actions inside Afghanistan, say experts.

The best way for the United States to ensure that its security interests in the region are not compromised by China’s growing military capabilities is to strengthen security alliances with China’s neighbors, notes the 2007 Council Task Force report. The report says the United States should better coordinate U.S.-South Korea-Japan security planning, give greater attention to ASEAN, work with ASEAN members to help draw China into constructive security relationships, and pursue a deeper military relationship with India.
Esther Pan contributed to this Backgrounder.

China’s Military Power

February 12, 2009
Author: Jayshree Bajoria, Staff Writer
February 4, 2009
Scope of the Threat
China’s Modernization Agenda
U.S. Policy Response

Scope of the Threat

Since the 1990s, China has dramatically improved its military capabilities on land and sea, in the air, and in space. Recently, China has begun to project its military power beyond the Pacific Ocean, deploying a small flotilla of warships to the Gulf of Aden in December 2008 to aid international efforts to fight Somali piracy. Historically, the United States has been most concerned about the possibility of a conflict between China and Taiwan, though tensions between the two have lessened since 2008. But looking decades ahead, U.S. military planners clearly see the potential for China to develop as a “peer competitor.” The U.S. Defense Department’s 2008 report on China’s military power says “much uncertainty surrounds China’s future course, in particular in the area of its expanding military power and how that power might be used.”

But experts say China is still decades away from challenging the U.S. military’s preeminence. Its ground forces field 1980s-vintage armor and suffer from significant shortcomings in command and control, air defense, logistics, and communications. Its air force, too, lags behind those of Western powers, though China flies about one hundred top-end Russian Su-27 warplanes and has contracted to purchase newer Su-33s, which are capable of carrier-based operations. China plans to build aircraft carriers domestically but currently has none under construction.

None of this, however, adds up to an arms race. James Mulvenon, director of the Washington-based Center for Intelligence Research and Analysis, says China’s military modernization “makes perfect sense to me as a natural evolution commensurate with China’s rise as a great power.” The concerns expressed by Western military experts focus on longer-term motives. Kerry Dumbaugh, a specialist in Asian affairs at the U.S. Congressional Research Service, sums up these security concerns (PDF) in a 2008 report, citing China’s lack of transparency in military funding and operations; recurring instances of espionage directed at obtaining U.S. military secrets; evidence of China’s improving military and technological prowess; and Beijing’s military and technological assistance to states like Zimbabwe, Myanmar, and others viewed as repressive or international pariahs. Many of these issues may become less contentious through a better military-to-military relationship and improved trust between the two powers, say experts. For instance, the increasing economic interdependence of the United States and China should provide a solid basis for avoiding conflict. But accidents between the two militaries, such as the midair collision between a U.S. spy plane and a Chinese fighter in 2001, or the accidental missile strike on China’s embassy in Belgrade in 1999, could still spark a conflict neither side desires.

China’s Modernization Agenda

China says it pursues a national defense policy aimed solely at protecting its territory and people, in keeping with its concept of “peaceful development.” The government’s latest white paper on national defense says it will “by and large reach the goal of modernization of national defense and armed forces by the mid-21st century.” The paper stresses that China hopes to create a more technologically advanced, capable military that will allow it to conduct and sustain operations at a greater distance from its borders, and says the country will make much progress toward that goal by 2020.

As part of this modernization agenda, China is acquiring advanced weapons systems from foreign suppliers as well as trying to develop its own. U.S. Defense Secretary Robert M. Gates, in a statement to the U.S. Senate Armed Services Committee in January 2009, said the areas of greatest concern are cyber- and anti-satellite warfare, anti-air and anti-ship weaponry, submarines, and ballistic missiles. “Modernization in these areas could threaten America’s primary means of projecting power and helping allies in the Pacific: our bases, air and sea assets, and the networks that support them,” he said.

China caused an international uproar in January 2007 when it launched a ballistic missile and destroyed one of its own satellites. The anti-satellite test displayed the growing prowess of China’s space program and raised questions about China’s intentions and civil-military relations within the country. In February 2008, the U.S. destruction of a crippled American spy satellite demonstrated that space may emerge as the new contested domain between the great powers. Yet the United States’ relative space advantage will probably shrink as China strengthens its space capabilities over the next ten to twenty years, writes Bruce W. MacDonald in a September 2008 Council Special Report. The United States should champion an approach to space that emphasizes deterrence, he suggests, as well as consider new diplomatic initiatives aimed at preventing space from becoming a potential conflict zone.

The United States has also accused China of hacking into government computer networks at the U.S. Departments of State, Commerce, and Defense. Chinese electronic espionage has been alleged against British companies, as well as government agencies in France, Germany, South Korea, and Taiwan. In its November 2008 report to Congress, the U.S.-China Economic and Security Review Commission noted that cyberspace is a critical vulnerability of the U.S. government and economy. The report warns: “China is aggressively pursuing cyber warfare capabilities that may provide it with an asymmetric advantage against the United States. In a conflict situation, this advantage would reduce current U.S. conventional military dominance.”

China has been modernizing its nuclear weapons systems while continuing to emphasize its “no first use” policy on nuclear weapons. However, nonproliferation expert Henry Sokolski, in May 2008 testimony (PDF) before the U.S.-China Economic and Security Review Commission, said that if China were to increase its nuclear weapons deployment, it could prompt its immediate neighbors–South Korea, Japan, and Taiwan–to initiate nuclear weapons programs. Mulvenon says it is a concern for Washington that China has now deployed the DF-31, a solid-fueled, nuclear-tipped intercontinental ballistic missile intended to replace its aging liquid-fueled missiles. The DF-31 provides China with a credible and secure second-strike capability: the ability to respond to a nuclear attack with nuclear retaliation.

Beyond specific areas of concern, analysts worry about discrepancies in China’s defense budgets. China says its defense expenditure for 2007 was around $52 billion and that its 2008 defense budget is $61 billion. However, the Pentagon says these figures are grossly underreported; in its annual report to Congress, it estimated China’s total military-related spending for 2007 to be between $97 billion and $139 billion. China counters that its military budget was only 1.38 percent of its gross domestic product (GDP) in 2007, while U.S. defense expenditure was 4.5 percent of GDP. Experts also point to the absolute size of the U.S. defense budget to underscore the asymmetry: the 2008 U.S. defense budget was $481.4 billion, plus $141.7 billion for the “Global War on Terror.”
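For readers who want to sanity-check these comparisons, here is a minimal back-of-the-envelope sketch in Python. It uses only the figures quoted in the paragraph above; the implied GDP it derives is a rough approximation, not an official statistic.

```python
# Rough consistency check of the defense-spending figures cited above.
# All dollar figures are in billions and come from the paragraph itself.

china_reported_2007 = 52.0                  # China's stated 2007 defense spending
pentagon_low, pentagon_high = 97.0, 139.0   # Pentagon's 2007 estimate range
china_share_of_gdp = 0.0138                 # 1.38 percent of GDP, per Beijing
us_budget_2008 = 481.4 + 141.7              # base budget plus war funding

# GDP implied by Beijing's own figures: spending divided by share of GDP
implied_china_gdp = china_reported_2007 / china_share_of_gdp

print(f"Implied Chinese GDP (2007): ~${implied_china_gdp:,.0f} billion")
print(f"Pentagon estimate is {pentagon_low / china_reported_2007:.1f}x to "
      f"{pentagon_high / china_reported_2007:.1f}x the reported figure")
print(f"U.S. 2008 defense spending: ~${us_budget_2008:,.1f} billion, "
      f"about {us_budget_2008 / china_reported_2007:.0f}x China's reported 2007 budget")
```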

U.S. Policy Response

While economic and trade relations between the United States and China have been growing, military-to-military relations remain relatively underdeveloped. Military conflict between the two is highly unlikely but “not impossible,” according to CFR Senior Fellow Adam Segal. Misperception or misunderstanding over an incident in the Taiwan Strait, or the sudden collapse of North Korea, could spark a conflict neither side wants, he says.

The April 2001 “spy plane” debacle is an example of such a misunderstanding. Experts worry there could be more incidents as the Chinese Navy and Air Force conduct exercises and patrols farther away from China and come into more frequent contact with the United States, Taiwan, and Japan.

But some experts see little prospect of a closer military relationship between the two countries in the near future. Admiral Timothy J. Keating, commander of U.S. Pacific Command, told CFR.org it would be a “giant leap of faith” to believe the United States and China could develop a close military partnership any time soon. To improve the relationship, Keating says, will require “more transparency, a better understanding of intention on our part of the Chinese, and to get there we would need more active cooperation with the Chinese.”

The United States has followed a two-pronged strategy in the military-to-military relationship, says Mulvenon. It engages with the Chinese military through senior-level exchanges, and it has sought to modernize and reform its own forces. A 2007 CFR Independent Task Force report (PDF), which was co-chaired by Dennis Blair, the Obama administration’s director of national intelligence, recommended a sustained high-level military-strategic dialogue to complement the “Senior Dialogue” started by the deputy secretary of state in 2005 and the “Strategic Economic Dialogue” launched by the treasury secretary in 2006. It also recommended that Washington strengthen its security partnerships with China’s neighbors. As this Backgrounder notes, the United States has been forging closer relations with countries in the region, including India, another regional power. It concluded a groundbreaking nuclear deal with New Delhi in 2008, lifting a three-decade U.S. moratorium on nuclear trade with India. The United States has also been upgrading forward-deployed naval and air forces in Guam.

How U.S. defense planners will respond to China’s military buildup going forward is also dependent on the ongoing debate over the biggest threats to U.S. national security in the twenty-first century. Writing in the latest Foreign Affairs, Defense Secretary Robert M. Gates says the country now faces both conventional adversaries and irregular conflicts from insurgents and non-state actors, and it should “seek a better balance in the portfolio of capabilities it has–the types of units fielded, the weapons bought, the training done.” However, in a situation of finite Department of Defense resources, some analysts argue that “striking the correct planning balance” (PDF) between operations in Iraq and Afghanistan, the war on terror, and China’s military modernization will be a key defense planning challenge.

You Can’t Sell News by the Slice

February 10, 2009

SOMEWHERE at Microsoft, there is a closet packed with leftover Slate umbrellas — a monument to the folly of asking people to pay for what they read on the Internet. These umbrellas — a $20 value! — were the premium we offered to people who would pay $19 for a year’s subscription to Slate, the Microsoft-owned online magazine (later purchased by The Washington Post). We were quite self-righteous about the alleged principle that “content” should not be free. The word itself was an insult — as if we were just making Jell-O salad in order to sell Tupperware.

The experiment lasted about a year. Still, every so often the dream of getting people to pay recurs. It’s recurring now because of the newspaper crisis: papers have been hemorrhaging subscribers and advertisers for their print editions, even as they give away their contents online. In the current Time, its former managing editor, Walter Isaacson, urges a solution: “micropayments.”

Micropayments are systems that make it easy to pay small amounts of money. (Your subway card is an example.) You could pay a nickel to read an article, or a dime for a whole day’s newspaper.

Well, maybe. But it would be a first. Newspaper readers have never paid for the content (words and photos). What they have paid for is the paper that content is printed on. A week of The Washington Post weighs about eight pounds and costs $1.81 for new subscribers, home-delivered. With newsprint (that’s the paper, not the ink) costing around $750 a metric ton, or 34 cents a pound, Post subscribers are getting almost a dollar’s worth of paper free every week — not to mention the ink, the delivery, etc. The Times is more svelte and more expensive. It might even have a viable business model if it could sell the paper with nothing written on it. A more promising idea is the opposite: give away the content without the paper. In theory, a reader who stops paying for the physical paper but continues to read the content online is doing the publisher a favor.
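A quick back-of-the-envelope check, using only the approximate figures in the paragraph above, bears out the point that the subscription price does not even cover the newsprint:

```python
# Back-of-the-envelope check of the newsprint arithmetic above.
# The inputs are the article's approximate figures, not precise market data.

POUNDS_PER_METRIC_TON = 2204.6

newsprint_per_ton = 750.0        # dollars per metric ton of newsprint
newsprint_per_pound = newsprint_per_ton / POUNDS_PER_METRIC_TON   # ~ $0.34

paper_pounds_per_week = 8.0      # weight of a week of The Washington Post
price_per_week = 1.81            # introductory home-delivery price per week

paper_cost = paper_pounds_per_week * newsprint_per_pound
print(f"Newsprint alone: ~${paper_cost:.2f} per week")
print(f"Subscriber pays:  ${price_per_week:.2f} per week")
print(f"'Free' paper:    ~${paper_cost - price_per_week:.2f} per week")
```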

If the only effect of the Internet on newspapers was a drastic reduction in their distribution costs, publishers could probably keep a bit of that savings, rather than passing all of it and more on to the readers. But the Internet has also increased competition — not just from new media but among newspapers as well. Or rather, it has introduced competition into an industry legendary for its monopoly power.

Just a few years ago, there was no sweeter perch in American capitalism than ownership of the only newspaper in town. Now, every English-language newspaper is in direct competition with every other. Millions of Americans get their news online from The Guardian, which is published in London. This competition, and not some kind of petulance or laziness or addled philosophy, is what keeps readers from shelling out for news.

Micropayment advocates imagine extracting as much as $2 a month from readers. The Times sells just over a million daily papers. If every one of those million buyers went online and paid $2 a month, that would be $24 million a year. Even with the economic crisis, paper and digital advertising in The Times brought in about $1 billion last year. Circulation brought in $668 million. Two bucks per reader per month is not going to save newspapers.
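Spelled out, the arithmetic behind that claim looks like this, in a minimal sketch that uses only the revenue figures quoted above:

```python
# The micropayment math from the paragraph above, made explicit.

readers = 1_000_000               # roughly the Times's daily print circulation
monthly_fee = 2.0                 # dollars per reader per month
micropayment_revenue = readers * monthly_fee * 12     # $24 million a year

advertising_revenue = 1_000_000_000   # ~$1 billion in print and digital ads
circulation_revenue = 668_000_000     # ~$668 million from circulation
existing_revenue = advertising_revenue + circulation_revenue

print(f"Micropayments: ${micropayment_revenue / 1e6:.0f} million per year")
print(f"As a share of existing revenue: {micropayment_revenue / existing_revenue:.1%}")
```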

And the harsh truth is that the typical American newspaper is an anachronism. It is an artifact from a time when chopping down trees was essential to telling the news, and when you couldn’t get The New York Times or The Washington Post closer to your bed than the front door, where the local paper lies, sopping wet.

The Times, The Post and a few others probably will survive. When the recession ends, advertising will come back, with fewer places to go. There will be a couple of surprises — local papers that execute their transfer to the Web so brilliantly that they will earn a national readership (like the old Manchester Guardian in England). Or some Web site might mutate into a real Web newspaper.

With even half a dozen papers, the American newspaper industry will be more competitive than it was when there were hundreds. Competition will keep the Baghdad bureaus open and the investigative units stoked with dudgeon. Competition is growing as well among Web sites that think there is money to be made performing the local paper’s local functions. One or two of these will turn out to be right. And then, who will pay even a nickel for the hometown rag?

Michael Kinsley is the founding editor of Slate magazine.

Intelligent Design: How To Change America

February 10, 2009


Nation-Building in America


Insanity is doing the same thing over and over and expecting different results.—Albert Einstein

When experts and pundits are asked what the next President should do, they typically respond with a list of policy suggestions, often mixed with stylistic and political pointers aimed at improving the chances that these suggestions will be well received. Few are likely to focus their answers on institutional change, which is all too easy to conflate with the yawn-inducing phrase “governmental reorganization.” Yet the neglect of institutional design is always a mistake, and never more so than in times of change and crisis. With few exceptions, it is all but impossible to devise and implement genuinely new policies without concomitant organizational design innovation.

Our own history makes this clear enough. A long list of profound challenges has summoned bursts of institutional creativity, the effects of which linger far longer than the occasions that evoked them. The inadequacies of the Articles of Confederation set the stage for the Philadelphia convention and a new Constitution. The electoral crisis of 1800 produced the 12th Amendment, the first significant change in the structures the men of Philadelphia had produced. In the aftermath of the Civil War, Congress and the American people ratified three amendments that resolved—at least in principle—our founding ambivalence between the people and the states as the source of national authority, between the states and the nation as the locus of citizenship, and between slavery and the equality proclaimed and promised by the Declaration of Independence.

We did not stop there. Recurrent financial panics in the last decade of the 19th century and the first decade of the 20th seeded the creation in 1913 of the Federal Reserve Board. The Great Depression produced a flurry of new Executive Branch and independent agencies, as well as the Bretton Woods international economic institutions in the wake of World War II. The onset of the Cold War spawned the Department of Defense, the National Security Council and the CIA. The growing monopoly of fiscal competence and power in the Executive Branch led the legislature to counter by creating the Congressional Budget Office. The terrorist attacks of September 11, 2001 led to the creation of the Department of Homeland Security and the reorganization of the U.S. intelligence system.

Genuine policy innovation does not always require the creation of new structures and processes. Some institutions do find ways to transform themselves from within. Perhaps the best recent example is the U.S. Army, whose established structure and modus operandi experienced not one but two rude shocks in Iraq. Not only did the Army need to develop new capacities to fight counterinsurgency campaigns; it also had to pivot from winning the war to a “nation-building” focus for securing the peace. Each required profound changes in doctrine, organization, personnel and training.

The Army’s capacity for self-reflection and innovative renewal should be a model for the rest of the government. The Army uses after-action reviews to assess, as honestly and bluntly as possible, what went right and wrong. It would be a quiet revolution if other U.S. government agencies did the same.

But they generally don’t. As bureaucratic as military organizations and the Department of Defense itself can be, these institutions are necessarily focused on outcomes, while many other government institutions seem focused on process. It is harder for process-habituated institutions to respond quickly and effectively to external challenges without the guiding hand of an institutional re-designer, but we desperately need such a guiding hand now, for signs abound that we are entering a new cycle of challenge and response.

The global economic meltdown offers the most obvious example. Take the Wall Street Journal of October 7, 2008, where readers found an article lamenting the European Union’s failure to arrive at a coordinated response to the crisis, and tracing that failure to the lack of institutions that could do for Europe what the Fed does for the United States. On the same page appeared a summary of a speech by World Bank President Robert Zoellick decrying the failure of the G-7 and calling for a more globally representative organization with greater capacity—including a new “steering group”—to forge coordinated policies. Elsewhere that same day, reports appeared of President Bush’s willingness to join with European leaders to forge a “new Bretton Woods.” Also prominently featured were scathing criticisms of the array of Rube Goldberg-style regulatory institutions that had proved unable to foresee and ward off the financial crisis, along with calls for radical institutional change.

The moral is clear: In challenging times, political leaders are drawn to institutional reform, not because they want to do it, but because they must. Our own era will be no exception. The question is not whether we will create new institutions in the next four, eight or ten years, but how, to what purpose, and with what consequences for the nation and the world.

A classic reason to create new institutions is to elevate the visibility and importance of particular topics. Whether well designed for its purpose or not (the jury is still out), the Department of Homeland Security reflected the new salience of a challenge many Americans had previously downplayed or entirely ignored. While pressure from teachers’ unions played a role in the Carter Administration’s push to elevate the Office of Education to departmental status, the move also coincided with rising concern about the performance of U.S. public schools. Not only did Jimmy Carter’s successor not fulfill his campaign pledge to abolish the Department of Education; it was Ronald Reagan’s Secretary of Education who pushed to publish A Nation at Risk, a report that captured public attention and defined the agenda of education reform for years to come.

This sequence of events illustrates another important feature of institutions: Once created, they amplify the voices of those who care about their specific missions. They do so by enabling agency leaders to sit at relevant Executive Branch tables, by providing them an opportunity to deal directly with the legislature, and by giving outside advocates—interest groups, non-profit organizations and policy experts, among others—an address to which they can take their concerns and advice.

From the standpoint of democratic theory and practice, there is much to be said for using institutional design to multiply the number and variety of voices participating in the policy process. But, as with most things in life, tradeoffs arise: The more independence institutional voices enjoy, the more difficult but important it is to create a new layer of institutions to bring them together to make decisions. This can take the form of high-level coordinating mechanisms, such as the National Security Council, the National Economic Council and the Domestic Policy Council. It can also take the form of super-agencies that formally bring together previously separated entities (as the Department of Defense did in 1947 and the Department of Homeland Security did in 2002).

At the same time, constitutional democracies often use institutional design to insulate certain activities from direct political pressure. The U.S. Constitution, for example, mandates that Federal judges, once confirmed, hold office during “good behavior” and receive salaries that Congress may not reduce during their term of service. (By contrast, many state judges are subject to regular election and possible recall.) The Federal Reserve Board is an even more striking example. Members of its governing board are appointed to 14-year non-renewable terms, limiting the ability of the Executive Branch to change its membership rapidly and removing governors’ incentives to trim their policy sails in hopes of reappointment. Moreover, no Executive Branch entity needs to ratify the Fed’s decisions, and Fed chairmen have been known to take steps that displease the occupant of the Oval Office.

Any strategy designed to insulate an institution is bound to be controversial in a mass democracy. Officials with populist leanings often argue that fundamental decisions affecting the economy should be made through transparent democratic processes. The counter-argument is that experience dating back to the founding of the republic suggests that, for example, when interest rates and the money supply are set at the whim of transient majorities, economic growth and stability suffer; it is necessary, therefore, to pursue democratic ends with less than fully democratic means. It will be recalled, too, that the Founders vouchsafed the ultimate guarantee of American democracy in the least democratic of the three branches, the Supreme Court.

CHANGE WE NEED

Institutions matter. They are neither sideshows nor side effects of more fundamental social realities: They are fundamental social realities. Institutions ensure effective cooperation by mediating and controlling differences. They constitute systems of formal constraints on both the ends and means of collective action. They shape decisions by how the process of deciding takes place as well as by the intended content of decisions. Institutions are also focal points for information, expertise and conversation, and they serve as sites of memory—of events that have shaped the past and of policies tried without success—a particularly valuable function in a political system with what seems to be a steadily decreasing attention span.

Because of the critical functions institutions serve—in essence, they are the collective cognitive infrastructure of society—they must keep up with the external environment, the management of which is their basic purpose. When institutions fall badly behind, they do not produce only random or episodic failure but generative and systemic failure. In short, when they are at far less than their best, they can cause all hell to break loose.

That, precisely, may well be the risk we are now running. For most of the past twenty years, we have witnessed a flowering of private initiative, economic innovation and cultural creativity in America. We have as a society simply outgrown or transcended many of our apparent problems from a generation ago. But we have not outgrown and transcended them all. Indeed, in sharp contrast to what has been going on in American society, government at all levels has become increasingly sclerotic and ever more misaligned with reality. Wherever we look today in the Federal government, we find mediocre, and in many cases failing, institutional structures across a broad array of functions. Through a program of bold reform, the new Obama Administration could get more bang for our government’s increasingly scarce bucks, and also send a credible signal to the people that the future won’t be business as usual. That program could be designed along three basic rubrics: fiscal redesign, deconsolidation and reduced polarization.

Fiscal Redesign. The meltdown of the U.S. and global financial system offers the clearest example of the institutional malfunction that Congress and the President must end. Piecemeal tinkering will not work. Institutions that regulate lending, insurance and solvency must be reinvented as they were during the 1930s. And to secure vital international cooperation, President Obama will have to work with other world leaders to do for the first half of the 21st century what the Bretton Woods institutions did for the last half of the 20th.

Whatever the outcome of these discussions, one thing is clear: Unfreezing the credit markets and stabilizing the financial system will be an enormously costly undertaking. The deficit for the current fiscal year alone may well exceed $1 trillion, and the national debt is on course to top $11 trillion, double what it was just eight years ago. In these circumstances, controlling Federal spending will be more important than ever. Restoring pay-as-you-go budgeting and putting some teeth in it will be a start, but that is not nearly enough. We need more radical changes in rules and procedures.

One option, recently proposed by a bipartisan group that includes three former Directors of the Congressional Budget Office, would change the status of entitlement programs such as Social Security, Medicare and Medicaid. The new rules would require a review every five years to determine whether projected revenues and outlays are in balance. If they are found to be out of balance, Congress would be required to restore balance through revenue increases dedicated to these programs, benefits cuts or some combination of the two. After a severe financial crisis in the early 1990s, Sweden introduced a variant of this plan, and it has worked reasonably well.

Another option draws on the experience of the Base Realignment and Closure Commission, which has enabled the military to surmount NIMBY (i.e., “not in my backyard”) politics and shut down unneeded bases. The basic idea is straightforward: Once the independent commission settles on a list of proposed closures, Congress can vote it up or down but cannot micromanage the package through amendments. A similar logic undergirds presidential “fast-track” authority to negotiate trade treaties, which Congress can reject but cannot modify.

Suitably adapted, this concept could help break long-standing logjams in the fiscal arena. Here is one way it might work. Independent commissions with members drawn from both political parties could submit proposals in designated areas of fiscal policy. To increase the odds of bipartisan appeal, each proposal would require a three-fifths majority of the commission to be adopted. After the proposal reaches the floor, the majority and the minority would each have the opportunity to offer a single amendment, after which there would be a vote on final passage. This strategy of “empowered commissions” would change the incentive structure of members of Congress and reduce the opportunity for negative logrolling to undermine the prospects of proposals that would otherwise gain majority support.

Deconsolidation. Other institutional changes President Obama could adopt are less momentous but could still significantly improve government performance at modest cost. Reversing unsuccessful consolidations would be a good place to start.

When the Department of Homeland Security (DHS) was created, Congress and the Bush Administration cast a wide net. Indeed, they cast it too wide—much wider than suggested by the initial expert proposals for such a Department. Including the Federal Emergency Management Agency in DHS, for example, made more sense in bureaucratic flow charts than in the real world. To be sure, FEMA has raised its game since the Hurricane Katrina fiasco. But as long as this agency remains buried in a department whose principal mission is fighting terrorism, it will not command top-quality management or adequate resources. As the Kennedy School’s Elaine Kamarck has argued, the best and simplest remedy is to restore the status quo ante—namely, an independent FEMA, an arrangement that performed well during the Clinton Administration.

Our nation’s public diplomacy institutions mark another instance of a failed consolidation. Secretary of Defense Robert Gates argued last year that, in waging and winning the Cold War, institutions mattered as much as people and policies. In the wake of the Cold War, we weakened not only our military and intelligence capabilities, but also the institutions of “soft power” that enabled us to communicate effectively with other parts of the world. The fault was bipartisan. During the 1990s, arch-conservative Senator Jesse Helms (R-NC) applied relentless pressure to abolish the U.S. Information Agency. He found partners in two successive Secretaries of State interested in bringing public diplomacy under State Department control, and in a White House eager to demonstrate that it was reinventing government. By 1999, as Secretary Gates put it, “the U.S. Information Agency was abolished as an independent entity, split into pieces, and many of its capabilities folded into a small corner of the State Department.”1

Today, the United States finds itself engaged in new ideological struggles against authoritarian populism and Islamist radicalism. Opinion surveys show that we are on the defensive and losing ground. While inherently unpopular policies are largely responsible, our failure to build bridges to other peoples has made matters worse. What we are doing is not working. Simply recreating the institutions of the past is unlikely to make matters better, but neither will simply adding people and money to today’s flawed structures.

Our public diplomacy will remain weak as long as it is subordinated to the State Department’s traditional concerns—a fact that leading conservatives are readier to acknowledge than are most liberals. To give public diplomacy the clout it needs, we should create a new cabinet-level Department of Global Information and Communications (DGIC) and back it with the resources it would need to succeed. Its functions would include: training public diplomacy officers for embassies and consulates around the world; offering media and communications training for State Department officials and those other agencies that represent the United States abroad; jump-starting the study of critical foreign languages through a new National Security Languages Initiative; bolstering area studies, including creating new interdisciplinary centers and convening high-level conferences; sponsoring and conducting ongoing survey research on public opinion abroad; assuming responsibility for international broadcasting beyond pure news; expanding vital people-to-people programs, including exchanges of scholars, students and cultural institutions, expert American speakers abroad and English-language teaching; and launching a massive new translation program, the “American Knowledge Library Initiative”, designed to make the best of our thought and culture available abroad in critical foreign languages.

The DGIC would assume a significant role within the U.S. foreign policy bureaucracy. For example, its representative would be at the table during interagency deputies’ meetings, articulating the consequences of policy options for our standing in the world. As the Advisory Group on Public Diplomacy for the Muslim and Arab World argued in 2003,

while the United States cannot and should not simply change its policies to suit public opinion abroad, we must use the tools of public diplomacy to assess the likely effectiveness of particular policies. Without such assessment, our policies could produce unintended consequences that do not serve our interests.

In addition, a senior official of the DGIC would chair an interagency Policy Coordination Council on public diplomacy, with representation from the State Department, USAID, Defense and Commerce, among others.

This is not to say that consolidation of related functions in a single agency is always (or even usually) mistaken. There are many currently disjointed governmental activities that should be brought together, food safety being a clear example. Responsibility for the safety of the nation’s food supply is now spread among the Departments of Treasury, Commerce and Agriculture, as well as the Environmental Protection Agency, the Food and Drug Administration and the Centers for Disease Control and Prevention. Coordination among these entities, which possess varying powers and enforce diverse standards, is hard to achieve. In addition, the USDA is cross-pressured by producers who do not want the kind of inspections that would slow production and raise costs. The globalization of the supply chain has introduced a new level of complexity, as recent problems with salmonella-tainted Mexican vegetables and melamine-laced Chinese food products have illustrated.

This situation cries out for what many advocates have long urged—namely, a single, unified agency whose principal mission is to ensure the safety of the nation’s food supply. It is hard to avoid the conclusion that if the challenge of food safety had been elevated out of the muck of special interest influence and governed solely by the criterion of serving the public good, such an agency would long since have come into existence.

Reduced polarization. The rise of political polarization in recent decades has made it much more difficult for the U.S. government to work effectively. A multi-year collaboration between the Brookings and Hoover Institutions has mapped the scope of the phenomenon and has shown that while political elites are more sharply divided than the public, citizens are more likely to place themselves at the ends rather than in the middle of the ideological spectrum than they were as recently as the 1980s. With a smaller political center to work with, even leaders committed to bipartisanship have found compromise harder to come by.2 The fate of President Bush’s 2005 Social Security proposal illustrates the difficulty of addressing tough issues under these circumstances.

It might seem that the only cure for polarization is a shift of public sentiment back toward moderation. The Brookings-Hoover project found, however, that changes in institutional design could help mute the consequences of polarization and might over time lower the partisan temperature. Here are three ideas, culled from a much longer list.3

First, the judicial confirmation process has become poisonously adversarial. One response is to increase the use of bipartisan commissions to generate slates of nominees from which an administration would have to choose. This would reduce the president’s opportunity to fire up his base with strongly liberal or conservative picks and would limit his ability to transform the ideological makeup of the Federal judiciary.

True, on the face of it, this would be an unappetizing prospect for most presidents. One way to render commissions more attractive to an otherwise unreceptive White House would be to link them to a fast-track procedure: Judicial nominees selected on a bipartisan basis would enjoy expedited Senate Judiciary Committee hearings and be assured a prompt up-or-down vote on the floor. The use of Senate “holds” and filibusters would be ruled inadmissible. This would reduce the time, attention and political capital the White House would have to expend on the confirmation process, freeing up resources for tough legislative battles.

Congressional redistricting offers another opportunity for depolarizing reform. While population flows account for much of the growth in safe seats dominated by strong partisans, gerrymanders have probably accounted for 10–36 percent of the reduction in competitive congressional districts since 1982—not a trivial effect.

Few Western democracies draw up their parliamentary districts in so patently politicized a fashion as do U.S. state legislatures. Parliamentary electoral commissions, operating independently and charged with making reasonably objective determinations, are the preferred model abroad. To be sure, foreign commissions typically permit something that the Supreme Court’s “one-man, one-vote” standard has effectively outlawed: districts of varying population size. If that flexibility existed, it would facilitate the weighing of additional criteria such as considerations of contiguity and conformity with existing community or jurisdictional boundaries. Still, this does not mean that nothing can be done to improve a dysfunctionally politicized process.

Given the Supreme Court’s reluctance to enter the thicket of redistricting controversies, any changes will be up to the state governments. In recent years, voter initiatives and referenda in four states—Washington, Idaho, Alaska and Arizona—have established nonpartisan or bipartisan redistricting commissions. These local efforts have struggled to solve a complicated riddle: how to enhance competitiveness while respecting other parameters—such as geographical compactness, jurisdictional boundaries and the natural desire to represent “communities of interest.” Iowa’s approach, where a nonpartisan legislative staff has the last word, is often cited as a model, but it may be hard to export to states with more demographic diversity and complex political cultures. Arizona has managed to fashion workable, empirically based standards that are yielding more heterogeneous districts and more competitive elections.

A third depolarizing reform would promote the participation of less ideologically committed voters in the electoral process. Some observers do not view the asymmetric power of passionate partisans in U.S. elections as any cause for concern. Why shouldn’t political decisions be made by the citizens who care most about them? Aren’t those who care also better informed, and doesn’t their intensive involvement indicate that the outcome of the election affects their interests more than it does those of non-voters? While this argument has a surface plausibility, it is less than compelling. Although passionate partisanship infuses the system with energy, it has built-in disadvantages, including the roadblocks it erects against problem-solving. Many committed partisans prefer gridlock to compromise, not a formula for effective governance.

To broaden the political participation of less partisan citizens, who tend to be more weakly connected to the political system, a number of major democracies have made voting mandatory. Australia has done so, using small fines for non-voting that escalate for repeat offenders, with remarkable results. The turnout rate in Australia now tops 95 percent, and more than ever citizens regard voting as a civic obligation.

Near-universal voting under such circumstances does raise the possibility that a bulge of casual voters, with little understanding of the issues and candidates, can muddy the waters. Nonetheless, the inevitable presence of some “donkey voters”, as they are called in Australia, is more than offset by the civic benefits of higher turnouts. Candidates for the Australian house have gained an added incentive to appeal beyond their partisan bases. One wonders whether members of Congress in the United States, if subjected to wider suffrage, might also spend less time transfixed by symbolic “culture war” issues that are primarily objects of partisan fascination, and more time coming to terms with the nation’s real problems. At least campaigns continually tossing red meat to the party faithful might become a little less pervasive.

The United States is not Australia, of course. Although both are federal systems, the U.S. Constitution confers on state governments much more extensive control over voting procedures. While it might not be flatly unconstitutional to mandate voting nationwide, it would surely chafe with American custom and provoke opposition. But American-style federalism has some compensating advantages, including its tradition of using states as “laboratories of democracy” that test reform proposals before they are elevated for consideration at the national level. If a few states experimented with mandatory voting and demonstrated its democratic potential, they might smooth the way to consideration of the idea on the national level.

The American people know that everything made by human beings—including their political institutions—is imperfect. They can accept imperfection. What infuriates them is the typical pattern of denial that anything is wrong, followed eventually by an epidemic of finger-pointing that thwarts a sober assessment of what is needed to put things right. The American people can accept some delay, too, for delay can be less dangerous than haste. What they cannot, or at any rate should not, accept is delusion—the highly improbable idea that genuinely new and better policies and outcomes can emerge from institutions that are unchanged, unexamined and unreformed.



1. Gates, Landon Lecture, Kansas State University, November 26, 2007.
2. See Pietro S. Nivola and David W. Brady, eds., Red and Blue Nation? Volume One: Characteristics and Causes of America’s Polarized Politics (Brookings Institution Press, 2006); and Nivola and Brady, eds., Red and Blue Nation? Volume Two: Consequences and Correction of America’s Polarized Politics (Brookings Institution Press, 2008).
3. These proposals draw on Pietro Nivola and William A. Galston, “Toward Depolarization,” in Red and Blue Nation? Volume Two.

Economists Agree Time Is of the Essence for Stimulus

February 8, 2009

By Steven Mufson and Lori Montgomery
Washington Post Staff Writers
Sunday, February 8, 2009; A01

With Congress moving closer to adopting an $820 billion stimulus package and the Obama administration poised to unveil a new bank bailout plan, economists say that the federal government is taking its biggest role in the economy in a generation.

States that once aspired to blaze trails independent from Washington are turning to it for money; banks and businesses that once decried regulation now are seeking federal capital, grants, or tax cuts; and individuals are looking for tax relief.

“This is a seismic shift in the role of government in our society,” said Allen Sinai, chief global economist for Decision Economics. “Those who believe the government can be an effective, positive instrument for good will have another chance to try it,” said Sinai, a political independent.

While economists remain divided on the role of government generally, an overwhelming number from both parties are saying that a government stimulus package — even a flawed one — is urgently needed to help prevent a steeper slide in the economy.

Many economists say the precise size and shape of the package developing in Congress matter less than the timing, and that any delay is damaging.

“Most of the things in the package, the big dollar amounts, are things that are pretty quick stimulus and need to be done,” said Alice Rivlin, who was former president Bill Clinton’s budget director and who criticized aspects of the proposed stimulus in congressional testimony two weeks ago. “Is it a perfect package? Of course not. But we’re past that. Let’s just do it.”

Economists who initially rejected the need for fiscal stimulus have warmed to the idea, too. Several months ago, Alan Viard, a Bush administration economist now at the American Enterprise Institute, thought the right size for a government spending bill was “probably zero.” He favored reliance on the Federal Reserve to slash interest rates and existing unemployment benefits to bolster the jobless.

Now Viard shares the view that a stimulus package is needed, although he would prefer one limited primarily to tax cuts and direct benefits for victims of the recession, such as increased unemployment benefits.

“Things have gotten so bad so quickly,” Viard said. “We have now lost 3.6 million jobs, a stunning loss. But what’s more horrifying is that half that loss has occurred in the last three months. This is a severe recession. There’s no doubt about it.”

With the deal cut late Friday in the Senate, both chambers of Congress have settled on stimulus packages with about $820 billion of tax cuts and spending increases. Both packages place a heavy emphasis on spending, with federal money for states and the unemployed as well as a range of targets including “smart meters” for electricity, expanded broadband access, Pell grants for education and pothole repair.

The hodgepodge of tax cuts and spending programs won’t solve the country’s basic problem of rot at the heart of the banking system and excessive borrowing by large numbers of people and corporations, economists say, but it might blunt some of the effects by putting cash in the hands of hard-pressed individuals and state budget planners.

President Obama yesterday welcomed the Senate compromise on a stimulus plan and exhorted Congress to hurry to finish work on the legislation that he had originally hoped to sign on his first day in office.

Despite Obama’s plea, this week promises more haggling over the package. The Senate is expected to pass its version Tuesday, but then leaders must reconcile sharp differences with the House on a number of issues. The Senate proposal protects millions of taxpayers from the bite of the alternative minimum tax while slashing the amount of aid to ailing states for education spending and school construction that the House included in its plan.

Lobbyists and lawmakers were gearing up for a fight this week over remaining differences. That could make it difficult for House and Senate negotiators to meet Obama’s revised goal of having a measure in place for his signature by the Presidents’ Day break, which begins this weekend.

“If this is a harbinger of the future, God save us,” said Robert Reischauer, president of the Urban Institute and former director of the Congressional Budget Office. “Here we are shoveling out the goodies and we can’t agree on that. What happens when you have to shift the car in reverse, or deal with something like health reform or energy policy.”

Obama sounded the theme of urgency in his radio address yesterday. “Legislation of such magnitude deserves the scrutiny that it’s received over the last month, and it will receive more in the days to come,” he said. “The scale and scope of this plan is right. And the time for action is now.”

The president singled out the 16,000 estimated job gains that would result in Maine, whose two Republican senators, Susan Collins and Olympia J. Snowe, broke ranks to cut a compromise deal with Democrats. Sen. Arlen Specter (R-Penn.) also pledged to support the bill, and Democrats hope to pick up a few other Republicans when the final vote comes.

Despite a growing sense of urgency, economists across the political spectrum continue to criticize the congressional stimulus plans. Most economists agree that the Senate alterations in the plan would undermine stimulus aims. Taxpayers who fall under the AMT are generally well-off enough to be able to save some of the tax cuts they receive, delaying any positive effect on the economy. By comparison, school aid to states would probably be spent immediately to prevent layoffs of teachers.

N. Gregory Mankiw, a Harvard University economics professor who was chairman of former president George W. Bush’s Council of Economic Advisers, supports cuts in payroll taxes partially offset by gradual increases in gasoline taxes. He says more time should be taken to craft spending programs that would not be wasteful.

Joseph Stiglitz, a Nobel Prize-winning economist at Columbia University and former chief economist at the World Bank, said that the stimulus package was “probably too little, especially given that it is badly designed [and] we haven’t yet fixed the mortgage problem so the financial sector is likely to continue bleeding.”

Stiglitz said that most households would save rather than spend the money from tax cuts and that the business tax cuts were not closely enough linked to new investments. He said that while plans for infrastructure spending were flawed, it was “unlikely to be wasted as badly as the private financial market has wasted resources in the last five years.”

Whatever its flaws, the stimulus package could create or save as many as 4 million jobs by the end of next year, helping to offset the 3.6 million jobs lost since the nation slid into recession in December 2007, according to an analysis by Sinai. Many of those jobs will be created in state and local government, with fewer coming in private sectors such as education and health, he said.

As a result, Sinai said, the eventual recovery will be driven by government spending rather than tax cuts — the first time that’s happened in the United States since President Ronald Reagan won election by vowing to get government off people’s backs.

Sinai said his models indicated an equal mix of tax cuts and government spending would be most beneficial to the economy. If spending gets into the economy quickly, it has a sharper impact on growth, he said. And while people typically save rather than spend a portion of tax cuts, Sinai argues that that “goes to repair their financial positions . . . so that a year or two from now, they’ll be in better shape to spend more.” With household finances in their worst condition in years, Sinai said, it may make sense to give people an opportunity to pay down debt and rebuild savings.

But the difference in stimulus approach is not worth the cost of delay, Sinai agrees. “The greater good is to pass quickly and without confrontational disagreements a large stimulus program,” he said.

In Hawaii on Friday, Janet Yellen, president of the Federal Reserve Bank of San Francisco, added her voice to the supporters of quick action on a stimulus measure.

“In ordinary circumstances, there are good reasons why monetary, rather than fiscal, policy should be used to stabilize the economy,” she said, citing lags in adopting and implementing government spending programs. “The result is that fiscal stimulus sometimes kicks in only after the need has passed. However, the current situation is extraordinary, making the case for fiscal action very strong.”

Yellen said, “There is — and there should be — vigorous debate about the form it should take and about the likely effectiveness of particular fiscal strategies. However, it is critical that decisions on these matters be made on a timely basis so that the economy’s downward spiral is not allowed to deepen.”

There is No Peace Dividend

January 31, 2009

Reflections on empire, inequality, & “Brand Obama”

Street’s ZSpace page
January 2009

Here are 53 words that might help us situate President-elect Barack Obama in the world of power as it really is, not as many of us wish it to be: “As we understand it, Obama has been advised and agrees that there is no peace dividend…. In addition, we believe, based on discussions with industry sources, that Obama has agreed not to cut the defense budget at least until the first 18 months of his term as the national security situation becomes better understood.” These two sentences come from a report issued by the leading Wall Street investment firm Morgan Stanley one day after the November 2008 elections.

The company probably understates matters. We should not anticipate significant Pentagon expenditure reductions at any point under the new Administration, unless they are forced by popular pressure that the American power elite expects Obama to preempt. “The Democrats,” Morgan Stanley’s researchers note, “are sensitive about appearing weak on defense and we don’t expect strong cuts.”

“Defense” is an interesting label for a giant military budget that pays for two occupations (in Iraq and Afghanistan) and 770 military bases located in more than 130 countries. The United States accounts for nearly half (48 percent) of the military spending on the planet. Coming in at $1 trillion (by the measure of the U.S. Office of Management and Budget’s National Income and Product Accounts) in 2007, American “defense” spending outweighs domestic U.S. federal expenditure on education by more than 8 to 1; income security by more than 4.5 to 1; nutrition by more than 11 to 1; housing by 14 to 1; and job training by 32 to 1. The military accounts for more than half of all discretionary federal spending.
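To make those ratios concrete, here is a minimal back-of-envelope sketch in Python. It takes the article’s rough $1 trillion figure as given and simply divides by each stated ratio; the resulting domestic amounts are implied upper bounds for illustration only, not official budget numbers.

    # Back-of-envelope reading of the ratios above. The rough $1 trillion
    # "defense" figure comes from the article; the domestic amounts below are
    # simply what each stated ratio implies, not official budget data.
    defense = 1_000_000_000_000  # ~$1 trillion in 2007, per the article

    ratios = {  # "defense outspends X by more than N to 1"
        "education": 8,
        "income security": 4.5,
        "nutrition": 11,
        "housing": 14,
        "job training": 32,
    }

    for program, ratio in ratios.items():
        implied_max = defense / ratio  # upper bound implied by the ratio
        print(f"{program:<16} no more than about ${implied_max / 1e9:.0f} billion")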

The “peace dividend” refers to the notion of reversing these “perverted national priorities” (Dr. Martin Luther King’s phrase) by taking money spent on war and the preparation for war and using it to address human problems like poverty, ecological crisis, crumbling infrastructure, joblessness, and inadequate education, health care, and housing.

The idea of a “peace dividend” received some attention in the U.S. around the end of the Cold War, when many progressives hoped that the collapse of the Soviet Union would encourage a shift in public resources from militarism to social health. For nearly half a century, the alleged (mythical) threat posed by Russian “communism” provided the core propagandistic justification for the expansion and use of U.S. military power. With the Soviet specter eliminated, progressives dreamed the U.S. could now be realistically pressured to transfer significant public resources towards meeting social needs and away from the maintenance of the most spectacular and deadly military-imperial system in history.

The dream was ended by George Bush I’s two wars of invasion (Panama, 1989 and Iraq, 1990-91), Bill Clinton’s air war on Serbia (1999), and the dominant U.S. media throughout. The “military-industrial-media triangle” (John Bellamy Foster, Hannah Holleman, and Robert W. McChesney’s term) and its many enablers and allies in church, school, academia, and other wings of so-called “civil society” rapidly substituted new rationalizations and pretexts for the persistence of a permanently militarized U.S. economy and culture: purported protection and advance of “free markets” and “democracy” (falsely conflated), the U.S. right of “humanitarian intervention,” and the grave dangers posed by terrorists, drug-traffickers, and “weapons of mass destruction.”

Still, a 2004 poll by the Chicago Council on Foreign Relations found that just 29 percent of Americans support the expansion of government spending on “defense.” By contrast, 79 percent support increased spending on health care, 69 percent support increased spending on education, and 69 percent support increased spending on Social Security.


“The American Moment…Must Be Seized Anew”

Obama’s National Security cabinet picks as of mid-December 2008 are consistent with Morgan Stanley’s judgment on the president-elect’s likely policy direction. As New York Times political analyst David Sanger noted on his paper’s front page in December, those appointments include “two veteran Cold Warriors [National Security Adviser and former NATO commander James L. Jones and current and future Defense Secretary Robert Gates] and a political rival [Secretary of State Hillary Clinton] whose records are all more hawkish than the new president.”

Sanger’s commentary left out other “hawkish” appointments, including Vice President Joe Biden (a major facilitator of George W. Bush’s pre-Iraq invasion propaganda in the U.S. Senate), United Nations Ambassador Susan Rice (an eager promoter of the myth that Saddam Hussein possessed “weapons of mass destruction” and “need[ed] to be dealt with forcefully”), and White House Chief of Staff Rahm Emanuel (a fierce opponent of antiwar sentiment inside the Democratic Party and a leading supporter of U.S.-sponsored Israeli militarism, occupation, and apartheid).

Sanger also deleted the critical fact that Obama’s real foreign policy record is much more “hawkish” than the dovish imagery his marketers crafted for liberal and progressive voters during the presidential campaign. Beyond his longstanding stealth support for the occupation of Iraq, Obama has repeatedly announced his fierce devotion to the broader underlying American empire project in numerous statements meant to demonstrate his safety to the U.S. foreign policy establishment. Declaring that “we can be [Kennedy’s] America again” in a 2007 article (titled “Renewing American Leadership”) in the Council on Foreign Relations journal Foreign Affairs, Obama essentially accused the Bush administration of dropping the ball of American world supremacy. “The American moment is not over, but it must be seized anew,” Obama proclaimed, adding that “we must lead the world by deed and by example” and “must not rule out using military force” in pursuit of “our vital interests.” The last three words are a code phrase for other nations’ oil, located primarily in the Middle East.

“A strong military,” Obama wrote, “is, more than anything, necessary to sustain peace.” We must “revitalize our military” to foster “peace,” Obama added, partly by adding 65,000 soldiers to the Army and 27,000 to the Marines. “We must also become better prepared to put boots on the ground…on a global scale.” Reassuring top militarists that he would not be hamstrung by international law and civilized norms when “our vital interests” are “at stake,” Obama added that “I will not hesitate to use force unilaterally, if necessary…. We must also consider using military force in circumstances beyond self-defense in order to provide for the common security that underpins global stability.”

As leading neoconservative foreign policy advisor Robert Kagan crowed after reading a typically imperial Obama speech to the Chicago Council on Global Affairs last year: “Obama talks about…maintaining ‘a strong nuclear deterrent.’ He talks about how we need to ‘seize’ the ‘American moment.’ We must ‘begin the world anew.’ This is realism? This is left-liberal foreign policy? Ask Noam Chomsky next time you see him” (Washington Post, April 29, 2007).


“Spiritual Death” Lives

According to Dr. Martin Luther King, Jr. in 1967, “A nation that continues year after year to spend more money on military defense than on programs of social uplift, is approaching spiritual death.” Forty-one years later, as the openly imperial Obama ascends to power draped (for many) in the rebel’s clothing of Dr. King, the top 1 percent in the U.S. owns nearly 40 percent of the nation’s wealth—a natural outcome of the so-called “free market capitalism” Obama repeatedly embraced during the presidential campaign. The privileged American “overclass” enjoys astonishing opulence while more than 37 million Americans live beneath the federal government’s notoriously inadequate poverty level, even before the onset of full-blown economic crisis in the fall and winter of 2008-09. A shocking 43 percent (nearly 16 million) of those officially poor Americans live in what researchers call “deep poverty,” at less than half that notoriously inadequate level.

Deep poverty has been on the rise in the U.S. over recent years and decades, thanks in part to Bill Clinton and Newt Gingrich’s elimination (“reform”) of poor families’ entitlement to federal family cash assistance. The “deep poverty” rate, currently at its highest level since 1975, and other grim socioeconomic indicators will only worsen as the U.S. heads into its worst recession since the 1970s and, perhaps, into a depression.

Even without recessions or depressions, the U.S. is the industrialized world’s most unequal and wealth-top-heavy society by far. Wealthy Americans have the benefit of the finest health care in history while 45 million Americans lacked health insurance even before the latest recession. Expect that number to hit 50 million any day if it hasn’t already. African Americans are afflicted with a national median-household-wealth gap of seven black cents on the white dollar. Numerous interrelated forms of institutional racism continue to saddle black America with a heavily disproportionate burden of poverty, injury, sickness, homelessness, unemployment, incarceration, and criminal marking even as the nation celebrates Obama’s election as a symbol of its transcendence of racial bigotry.

Meanwhile, American corporations and wealthy elites get regular public assistance (corporate welfare) that belies the privileged few’s ritual proclamations of faith in the mythical notion of free market capitalism. They profit from numerous powerful state-capitalist protections and subsidies while U.S. social programs are minimal compared to those of Western Europe and Canada.

The extravagant “defense” (empire) budget that Morgan Stanley reasonably expects Obama to keep funded at fantastic levels, even while deep poverty spreads at home and abroad, is itself a giant public subsidy to leading high-tech and energy corporations like Boeing, Raytheon, General Dynamics, Lockheed Martin, and the oil majors, who have enjoyed record profits during the recent and continuing “wartime” period.

The need for a peace dividend is more urgent than ever in what John Bellamy Foster rightly calls “a period of imperialist development that is potentially the most dangerous in all of history”—one where “life on the planet as we know it can be destroyed either instantaneously through global nuclear holocaust, or in a matter of a few generations by climate change and other manifestations of environmental destruction.” As Foster, Hannah Holleman, and Robert W. McChesney recently argued in Monthly Review: “A society that supports its global position and social order through $1 trillion a year in military spending, most likely far exceeding that of all the other countries in the world put together—unleashing untold destruction on the world, while faced with intractable problems of inequality, economic stagnation, financial crisis, poverty, waste, and environmental decline at home—is a society that is ripe for change.”

Put less delicately, it is a society overdue for what the democratic socialist Dr. King called the “real issue to be faced” beyond “superficial” problems: “radical reconstruction of society itself.”


“Eagerness to Accommodate Existing Institutions”

Obama would never have been permitted to make a serious presidential run if the U.S. ruling class believed he shared King’s hopes for “radical reconstruction.” The corporate and imperial gatekeepers of U.S. power are not in the business of handing over the world’s most potent office to progressive opponents of empire and inequality. They began determining five years ago (as Obama was picked to run for the U.S. Senate) that Obama the politician was a privilege-friendly person of the state-capitalist, neoliberal center. As Larissa MacFarquhar noted last year in a carefully researched New Yorker essay: “In his view of history, in his respect for tradition, in his skepticism that the world can be changed any way but very, very slowly, Obama is deeply conservative…. It’s not just that he thinks revolutions are unlikely,” MacFarquhar added. It’s also, she wrote, that, “He values continuity and stability for their own sake, sometimes even more than he values change for the good” (“The Conciliator,” May 7, 2007).

According to liberal journalist Ryan Lizza in the New Yorker last summer: “Perhaps the greatest misconception about Barack Obama is that he is some sort of anti-establishment revolutionary. Rather, every stage of his political career has been marked by an eagerness to accommodate himself to existing institutions rather than tear them down or replace them” (“Making It,” July 21, 2008).

The left black political scientist Adolph Reed, Jr., who once lived in Obama’s Illinois state-legislative district, described Obama in the Village Voice in 1996, at the literal beginning of Obama’s political career, as follows: “In Chicago, for instance, we’ve gotten a foretaste of the new breed of foundation-hatched black communitarian voices: one of them, a smooth Harvard lawyer with impeccable credentials and vacuous-to-repressive neoliberal politics, has won a state senate seat on a base mainly in the liberal foundation and development worlds. His fundamentally bootstrap line was softened by a patina of the rhetoric of authentic community, talk about meeting in kitchens, small-scale solutions to social problems, and the predictable elevation of process over program—the point where identity politics converges with old-fashioned middle class reform in favoring form over substance. I suspect that his ilk is the wave of the future in U.S. black politics here, as in Haiti and wherever the International Monetary Fund has sway.”

Consistent with these reflections, Obama’s guaranteed choice health care plan amounts to a probably unworkable “half-way solution” that falls well short of the public’s longstanding desire for universal national health insurance. It preserves the power and profits of the institutions most responsible for the health care crisis—private insurance and pharmaceutical corporations. “Despite Barack Obama’s avowed hopes for change,” Roger Bybee notes, the new president’s health reform, “manacled to private insurers, may ultimately deepen public cynicism of the possibility of meaningful reform” (Z Magazine, December 2008).

In a similar vein, “Obamanomics” falls short of the bold progressive initiatives and challenges to financial and corporate power required to spark equitable domestic development. As adjusted in response to the banking crisis and deepening recession, moreover, Obama’s economic program may well amount to “something akin to a national austerity program….” Instead of forward movement on jobs, education, retirement, and health care, Jack Rasmus finds, “what we may well get is ‘Let’s all tighten our belts to get through this crisis’” (Z Magazine, December 2008).

But like most of the nation’s elected office-holders, Obama supports a massive taxpayer-funded bailout of leading Wall Street financial and insurance firms deemed “too big [and powerful] to fail”—a curious government payout for parasitic enterprises that have driven the national and global economy into the ground. Morgan Stanley alone is slated to receive tens of billions of federal dollars, a giant state capital dividend approved by Obama even as the firm’s analysts observe Obama’s conventional establishment wisdom holding that “there is no peace dividend.”

Obama’s “economic team,” including Lawrence Summers (top economic advisor) and Treasury Secretary Timothy Geithner, is full of reigning neoliberals calling for the socialization of the market economy’s losses and the upward privatization of its gains.


Holding Domestic Constituencies in Check

Obama’s business-friendly centrism helped him garner record-setting corporate campaign contributions during the last election cycle. He received more than $33 million from “FIRE” (the finance, insurance, and real estate sector), including $824,202 from Goldman Sachs. He has been consistently backed by the biggest and most powerful Wall Street firms. At the same time, and by more than mere coincidence, candidate Obama enjoyed a remarkable windfall of favorable corporate media coverage—the key to his success in winning votes and small donations from middle-class and non-affluent people.

As many of his elite sponsors certainly understand, Obama’s outwardly progressive persona is perfectly calibrated to divert, capture, control, and contain popular rebellions. He is uniquely qualified to simultaneously surf, de-fang, and “manage” the U.S. and world citizenry’s rising hopes for democratic transformation in the wake of the long national Bush-Cheney nightmare. As John Pilger noted last May, “By offering a ‘new,’ young and apparently progressive face of the Democratic Party—with the bonus of being a member of the black elite—he can blunt and divert real opposition,” bringing “intense pressure on the U.S. antiwar and social justice movements to accept a Democratic administration for all its faults.”

Sadly enough, Obama’s race is part of what makes him so well matched to the tasks of mass pacification and popular “expectation management” (former Obama advisor Samantha Power’s revealing phrase). As Aurora Levins Morales noted in Z Magazine last April, “This election is about finding a CEO capable of holding domestic constituencies in check as they are further disenfranchised and…[about] mak[ing] them feel that they have a stake in the military aggressiveness that the ruling class believes is necessary. Having a black man and a white woman run helps to obscure the fact that…decline of empire is driving the political elite to the right…. Part of the cleverness of having such candidates is the fact that they will be attacked in ways that make oppressed people feel compelled to protect them.”


Imperial “Re-branding”

The logic works at the global level. A considerable segment of the U.S. foreign policy establishment thinks that Obama’s race, name (technically Islamic), experience living in (Muslim Indonesia, as a child) and visiting (chiefly his father’s homeland, Kenya) poor nations, and his nominally “anti-war” history will help them repackage U.S. world power in more politically correct wrapping. John F. Kerry, who ran for the presidency four years earlier largely on the claim that he would be a more effective imperial manager than George W. Bush, was thinking of these critical “soft power” assets when he praised Obama as someone who could “reinvent America’s image abroad.” So was Obama when he said to reporters aboard his campaign plane in the fall of 2007: “If I am the face of American foreign policy and American power…you can tell people, ‘We have a president in the White House who still has a grandmother living in a hut on the shores of Lake Victoria and has a sister who’s half-Indonesian married to a Chinese-Canadian,’ then they’re going to think that ‘he may have a better sense of what’s going on in our lives and country.'”

Obama’s global biography, ethno-cultural nomenclature, and charisma are great attractions to a predominantly Caucasian U.S. foreign policy elite hoping to restore American legitimacy in a majority non-white world that has been provoked and disgusted by U.S. behavior in the post-9/11 era (and truthfully before). He is in many ways an ideal symbol of imperial “re-branding.” According to centrist New York Times columnist Nicholas Kristof just before the election, the ascendancy of a black president “could change global perceptions of the United States, redefining the American ‘brand’ to be less about Guantanamo and more about equality.”

The leading advertising trade journal Advertising Age agreed. Last October it hailed Obama as “Marketer of the Year” and praised him for producing “An Instant Overhaul for Tainted Brand America.” The journal quoted David Brain, CEO of the global public relations firm Edelman Europe, Middle East and Africa, on how “the election and nomination process is the brand relaunch of the year. Brand USA. It’s just fantastic.”

According to Nick Ragone, the senior VP of client development at the global advertising firm Omnicom Group’s Ketchum, “We’ve put a new face on [America] and that face happens to be African American.” Ragone told Advertising Age that “it takes a lot of the hubris and arrogance of the last eight years and starts to put it in the rearview mirror for us.”

Harvard Business School professor and former WPP Group (a global advertising firm) board member John Quelch (co-author of Greater Good: How Good Marketing Makes for Better Democracy) told Advertising Age that, “The election result zero-bases the image of the United States worldwide. We have a clean slate with which to work.”

Carolyn Carter, the London-based president and CEO of Grey Group Europe, Middle East and Africa (creator of the teeth-rotting “Coke Zero” ad campaign in Northern Europe) agreed, adding, “The last eight years broke faith in Brand America, and people want that faith restored.”

Enter Obama, who is “almost like Che Guevara, in a good way,” according to Foreign Policy magazine’s web editor Blake Hounshell. “He has icon status,” Hounshell explains, “with all the art around the world of his face.” The difference is that Che roused independent left and Third World challenges to the American Empire while Obama inspires captivation with the corporate-imperial U.S. According to Scott Kronick, global marketing firm Ogilvy PR’s Beijing-based president, Obama’s triumph “sends a strong message to the world that despite what many people believe and feel…America can be very open, democratic, and progressive.”

Call it the identity politics of foreign policy. The old empire wants new, faux-progressive clothes and Obama is just the person to wear them. According to the former Clinton administration official and Kissinger Associates director David J. Rothkopf, Obama’s cabinet picks epitomize “the violin model: hold power with the left hand and play the music with your right” (New York Times, November 21, 2008).

Meanwhile the poverty population’s ranks are rising and the poor are getting poorer at home and abroad. Vast swaths of suffering humanity are increasingly desperate for true progressive change and a peace dividend beneath and beyond quadrennial corporate-crafted and candidate-centered U.S. “electoral extravaganzas” (Noam Chomsky’s phrase). Obama will deliver only as much change as he can be compelled to channel by popular resistance and rebellion.

It is wonderful and historic that the American electorate put a black family in the White House. Still, Barack Obama is no magical exception to the rule observed last spring by Howard Zinn: “Let’s remember that even when there is a ‘better’ candidate (yes, better Roosevelt than Hoover, better anyone than George Bush), that difference will not mean anything unless the power of the people asserts itself in ways that the occupant of the White House will find it dangerous to ignore…. The Democratic Party has broken with its historic conservatism, its pandering to the rich, its predilection for war, only when it has encountered rebellion from below, as in the Thirties and the Sixties.”

Bailouts & Sellouts

January 31, 2009

By Edward S. Herman
Herman’s ZSpace page

It is amazing to see sums now running into the trillions being allocated almost entirely to substantial market players, many of whom were heavily involved in producing the financial debacle we are now suffering. The priorities of the U.S. establishment, including both the leading Democrats as well as the Republicans and mainstream media, are clear—at the top for funds and solicitude are the military-industrial complex and those involved in the imperial projection of power, along with the country’s overlapping business and financial elite. There is a big gap then before we get to the middle class; and much of the lower middle class and poor are treated as disposable and even threats.

What follows is, first, that the military and imperial-expansion budget, now greater than a trillion dollars, larded with incredible waste and fraud as well as overkill, and provoking a global arms race while funding the destabilization of the world, is for all practical purposes outside the orbit of discussion about how we deal with a financial crisis and serious shortage of resources for the civil society.

In the case of the financial crisis and bailout, it is remarkable how much money can be mobilized very quickly to rush to the aid of the big boys—at the expense of the taxpayer—in contrast with the invariable struggle and pain to provide relatively small sums for the benefit of ordinary citizens and even more struggle and elite anger to provide resources for the disposables and poor. Of course this is a crisis that threatens systemic disaster. But there are many crises involving millions that are chronic and serious—such as the condition of the urban poor, of the New Orleans refugees, even a wide swathe of the middle class, and the fantastic growth of imprisonment—that don’t produce an aura of crisis or financial resources to cope (except with police and prisons).

One must also be struck by the fact that management of the crisis was left in the hands of people who urged policies that led to the unfortunate result, and failed to see it even as it gathered steam. There is also the matter of gross conflict-of-interest in centering management of the financial crisis in the hands of Henry Paulson, straight from one of the crisis producers, Goldman Sachs. Mister Conflict-Of-Interest even doled out public money to his own firm as well as to that firm’s clients; Mister Steady Bungler also made policy misjudgments and shifts on a weekly basis, without threat to his authority. This is a manifestation of power, displayed throughout this business society, where, for example, labor funding or sponsoring of a PBS program can be ruled out on conflict-of-interest rules, but massive and regular business funding is not questioned. Business conflict-of-interest is normalized across the board.

Paulson can dole out these huge funds without advance notice to any independent parties, or full disclosure of recipients or terms; and where terms are disclosed, they are usually vague, not taxpayer friendly, and fail to give the government powers commensurate with the taxpayers’ investment and risk. The notion that this is “Bush socialism” is nonsensical, on a par with making Bush a socialist in paying out taxpayer money to Halliburton and Blackwater for their work in Iraq. The Bush cabal has been throwing vast sums into private hands via war contracting and privatization for years. This is looting, not socialism.

It is also notable how bipartisan the looting has been. Bipartisanship is a virtue of the Democratic Party, and the mainstream media are pleased to see that Party so bipartisan and “pragmatic,” meaning that they avoid populism and are willing, even eager, to compromise with the Republicans even when their voting constituency wants them to do otherwise. Obama seems to be clearly in that great tradition, being openly commended by Karl Rove, Joe Lieberman, Max Boot, John McCain, and David Brooks for his appointments (see Jeremy Scahill, “‘Better Than Cats!,’ Neocons, Republicans, War Criminals, Rave About Obama’s ‘Team of Rivals’,” Huffington Post, November 30, 2008), and mentioned explicitly in the New York Times for his bipartisanship and courting of the Republicans for policy inputs (Jeff Zeleny, “Initial Steps by Obama Suggest a Bipartisan Flair,” NYT, November 24, 2008). The contrast with the Republicans’ treatment of Bill Clinton, and George W. Bush’s appointments of Republican hardliners nearly across the board, could hardly be more dramatic.

The media don’t press the Republicans toward bipartisanship and pragmatism, only the Democrats, and the Democrats oblige. They have voted to fund the looting of taxpayers (and Iraqis) throughout the invasion-occupation of Iraq. Of course, it is important that they kept doing this even after the voters gave them a congressional majority in 2006, quite evidently based on the public’s rejection of the Iraq policy and desire to get out. (Democratic leaders rationalized this on the grounds that they didn’t have the 60 votes needed to override a veto; but they had enough votes to refuse funding till conditions were met that forced withdrawal, a power which they failed to use.) In the bailout looting, Nancy Pelosi made a vigorous effort to get the Democrats to go along with the initial $700 billion blank check to Paulson and she has been bipartisan all the way on this subject, just as she, Harry Reid, and Rahm Emanuel were bipartisan with Bush-Cheney on funding the Iraq occupation-pacification.

We should also note that Pelosi has shown her pragmatism and bipartisan qualities in now promising to go after New York Democratic congressperson Charles Rangel, “after a wide-ranging House investigation into ethical questions” (Raymond Hernandez, “Pelosi says Inquiry Into Ethics Questions Concerning Rangel Will Move Swiftly,” NYT, November 27, 2008), while maintaining her “off the table” stance for Bush impeachment for serial violations of the U.S. Constitution, the UN Charter, and the Geneva Agreements. Of course, Pelosi going after the Bush-Cheney cabal for violation of the Geneva Agreements would have run into the problem that while on the House Intelligence Committee she was told about the use of water-boarding in 2002 and did not object. Prosecution of Bush and Cheney for such legal violations would have been awkward not only on water-boarding, but based on Democratic votes on war funding, the PATRIOT Act, and the Military Commissions Act. Illegal war, constitution busting, and torture have been bipartisan.

One of the most depressing features of the 2008 election was that “liberal” San Francisco Democrats voted Pelosi back to the House, first in a crushing primary victory and then in an easy general-election win over antiwar challenger Cindy Sheehan. Pelosi should actually be impeached for her failure to pursue a Bush-Cheney impeachment, given her obligation to uphold the Constitution. Pelosi epitomizes Democratic pragmatism, bipartisanship, and the sellout of ordinary citizens, but she thrives—is easily reelected, was lauded by Joan Claybrook of Public Citizen (“Pelosi Leads Democrats With Progressive Politics,” Public Citizen, September-October 2007), and was even invited to speak at a gathering of liberal media activists at the Kos-organized Netroots Nation conference in Austin, Texas, in July 2008.

It is interesting that for a target country, like Serbia, or with the Khmer Rouge in Cambodia (Kampuchea), “moving forward” somehow requires that villains be tried and punished for past crimes so that victims can be satisfied with some kind of “justice” and potential villains can learn that crime doesn’t pay. But the idea that this might apply to the United States is confined to a fringe left and a human rights body like Amnesty International, which calls for precisely what the United States and its toadies call for in the case of targets—”criminal investigations…reject impunity for crimes under international law…ensure that victims of human rights violations…will have meaningful access to redress and remedy” (AI, “USA: Counter Terror with Justice: A checklist for the next US President,” November 5, 2008).

Bush, Cheney, and company are not only not removed from office before their term is up, or threatened with prosecution later, they can continue to enact rules and orders benefiting their cronies, at taxpayer expense, up to their last moment in office and without impediment. The real surprise is that they haven’t started another war, though they may still do that (this is written on December 4, 2008). But they have proved that in this country and in this global environment, U.S. leaders can get away with huge crimes, endless lying, facilitating massive crony capitalist theft, and constitution busting. There seems to be no impunity limit for U.S. leaders that serve the military-industrial complex and business and financial elite so well, even if crudely, incompetently, and perhaps even to elite disadvantage.

Although George Bush I was exceedingly vulnerable to an investigation of his relations with Saddam Hussein (weapons supply and loans), the Clinton administration declined to pursue this, in a familiar (Democratic) act of bipartisanship. Clinton was rewarded by steady Republican attacks, regular non-cooperation, even a temporary government shutdown, and eventually impeachment. The likelihood that President Obama will support the investigation and prosecution of the Bush-Cheney administration’s far more extensive and serious crimes is close to zero. Two Obama advisers have reportedly stated, “There’s little, if any, chance” of the Obama administration prosecuting those who authorized or implemented torture (Lara Jakes Jordan, “Obama advisers: No charges likely against workers who authorized harsh interrogation methods,” AP, November 18, 2008): it would violate the principles of bipartisanship so dear to the Democrats and encouraged by the media. And the Democrats are not only badly compromised themselves, they too have internalized the belief that international law does not apply to this country. Hence, “justice” for “them,” “impunity” for “us.”

We seem to be entering another era of Democratic “moves to the center” and, instead of “creating a new reality,” adjustment to the existing reality of power.

Edward S. Herman is an author, economist, political columnist, and media critic.

The Economy: Fannie Mae’s Last Stand

January 31, 2009

Fannie Mae’s headquarters, in Washington, D.C. From left: former Fannie C.E.O. Jim Johnson, Congressman Barney Frank, former OFHEO director Armando Falcon, former Fannie C.E.O. Daniel Mudd, President Bush, Treasury Secretary Henry Paulson, former Fannie C.E.O. Franklin Raines, and Alan Greenspan. Photograph by Cameron Davidson.

Many believe the government-backed mortgage giants known as Fannie Mae and Freddie Mac were major culprits in the economic meltdown. But, for decades, Fannie Mae had been under siege from powerful enemies, who resented its privileged status, its hard-driving C.E.O.’s, and its huge profits. Surveying Fannie’s deeply dysfunctional relationships with Congress, the White House, and Wall Street, the author tells of the long, vicious war—involving most of Washington’s top players—that helped propel one of the world’s most successful companies off a cliff.

by Bethany McLean February 2009

“The chairman of the universe.”

“Washington, D.C.’s Medici.”

“The face of the Washington national establishment.”

“One of the most powerful men in the United States.”

All those phrases were used to describe a man you may never have heard of: Jim Johnson, the C.E.O. of mortgage giant Fannie Mae in the 1990s. Fannie was then one of the largest, most profitable companies in the world, with a stock-market value of more than $70 billion and more earnings per employee than any other company in America. (By comparison, G.M. at its peak, in 2000, was worth only $56 billion.) On one level, Johnson, now 65 years old, was just another businessman with a lot of money and multi-million-dollar houses in desirable locations from D.C. to Sun Valley, Idaho, to Palm Desert, California. Chairman of D.C.’s premier arts venue, the Kennedy Center, and one of its top think tanks, the Brookings Institution, Johnson was out “wearing white-tie and black-tie every night,” says Bill Maloni, Fannie’s former chief lobbyist. “Everyone wanted a little bit of Jim.”

But Johnson was also a political force, because the company he ran had a public mission—literally. It had been chartered by Congress to help homeownership. Johnson liked to paraphrase the old motto about General Motors: “What’s good for American housing is good for Fannie Mae,” he’d say. Accordingly, he built Fannie into what former congressman Jim Leach, a Republican from Iowa and longtime Fannie gadfly, calls “the greatest, most sophisticated lobbying operation in the modern history of finance.”

He may be right. John McCain was embarrassed last summer by revelations that his campaign manager, Rick Davis, had served as the president of the Homeownership Alliance, an advocacy group for Fannie and Freddie Mac, Fannie’s smaller brother. The “revolving door,” as people call it, between the Hill and Fannie and Freddie spun so quickly that it’s actually more surprising when someone isn’t on the list than when they are. Rahm Emanuel served on Freddie’s board! Right-wing godfather Grover Norquist lobbied for Fannie! Newt Gingrich was a consultant for Freddie, and Ralph Reed was a consultant for Fannie!

The Princeton-educated son of a Minnesota state legislator, Johnson has silver hair and round tortoiseshell glasses, which give him a warm appearance that is belied by the hard planes of his face. Indeed, he was “very warm, very nice,” in the words of one former Fannie executive, but also “very hard-ass.” In 1996, Richard Baker, a Republican representative from Louisiana, complained that the preface to a Treasury Department report on Fannie had been watered down to make it friendlier to the company. Rumors flew that this had been accomplished after Johnson, or someone else high up in the company, had simply made a call to Treasury Secretary Robert Rubin or President Bill Clinton, both of whom were Johnson’s personal friends. (Johnson and Clinton had met at a 1969 gathering on Martha’s Vineyard.) Johnson has denied calling either man, and has said that he and Rubin had a policy while Rubin was Treasury secretary that they would not discuss business. But, under Johnson, Fannie Mae had a reputation for never losing a fight. “The old political reality was that we always won, we took no prisoners, and we faced little organized political opposition” is how Daniel Mudd, son of journalist Roger Mudd and Fannie’s last real C.E.O., later described Fannie’s golden years.

On December 15, 1998, Jim Johnson’s retirement dinner was held at the National Museum of Women in the Arts. That seems to have been the second choice—according to The Washington Post, the gala was supposed to have been in the U.S. State Department’s Benjamin Franklin Room, which could be used by outsiders only if a government official requested it. The Post began making calls after it got hold of an invitation, at which point State Department lawyers pulled the plug. But the dinner was grand in any event. Rubin spoke, as did comedian Bill Cosby and Fannie board member Bill Daley, the brother of Chicago’s current mayor.

The press reported Johnson’s compensation in his final year as around $7 million, but an internal Fannie Mae analysis (which assumed a high stock price) said that the real number was closer to $21 million. Plus, he got perks that could add up to half a million a year: a consulting agreement, two support-staff employees paid for by Fannie Mae, a car, and partial payment for a driver.

There were rumors that Johnson was angling to become Treasury secretary. Instead, in 1999, he became one of the first outside directors of the investment bank Goldman Sachs, where Rubin had been C.E.O., and where current Treasury secretary Henry Paulson presided at the time. Johnson became the head of the compensation committee, making him the closest thing Hank Paulson had to a boss.

Flash forward to just 10 years later. On Friday, September 5, 2008, Treasury Secretary Paulson sat in a conference room at an obscure government agency known as the Federal Housing Finance Agency (F.H.F.A.), which had been charged with regulating Fannie and Freddie. Next to Paulson sat Jim Lockhart, the director of F.H.F.A. On Lockhart’s other side was Ben Bernanke, chairman of the Federal Reserve. Across the table sat Dan Mudd, who had become Fannie’s C.E.O. in late 2004. By the summer of 2008, Fannie and Freddie owned or guaranteed $5.2 trillion of American mortgages, roughly half the $12 trillion total. Just six weeks before the September 5 meeting, Lockhart had said publicly that Fannie Mae’s capital was “well in excess” of what it needed to survive the mortgage storm that was engulfing the nation. But at the meeting he announced that the company’s capital was, in fact, insufficient. The government officials told Mudd that his company had to give its consent to something called conservatorship, which meant that the government would take it over, pretty much wiping out shareholders—not because Fannie needed capital at that moment, but because they believed Fannie would need it in the future. Mudd was out. The message to Fannie executives, says one person who was in the room, was crystal clear: “If you oppose us, we will fight publicly and fight hard, and do not think that your share price will do well with all of the forces of the government arrayed against you.”

There was also a threat that F.H.F.A. would make life very unpleasant for both the board and management if they didn’t agree to the government’s terms, says another Fannie executive. “That’s really not true—there were no threats,” claims Lockhart, although he adds, “We were very firm.” Steve Ashley, former chairman of Fannie’s board, asked what the government wanted Fannie to do that it wasn’t already doing. The Fannie team didn’t feel that any good answers were given.

A lot of people assume they already know the story of Fannie’s fall from grace. In a narrative that has been repeated incessantly in op-eds and on cable TV, there were good guys and bad guys. The good guys were the Republicans, who had tried to rein in Fannie and Freddie (which was also put into conservatorship that same day), and the bad guys were the Democrats, who wanted to put people into houses they couldn’t afford with subprime mortgages, and Fannie itself, which took advantage of its supposed mission to enrich its executives at the expense of taxpayers. Some even argue that Fannie and Freddie—“the toxic twins,” former Connecticut Republican representative Chris Shays called them—are to blame for the entire economic meltdown. They were “the match that started this forest fire,” according to John McCain. On October 21, a group of House Republicans wrote to Attorney General Michael Mukasey, requesting that the Justice Department appoint a special counsel to investigate Fannie and Freddie executives. (The F.B.I. is investigating both Fannie and Freddie.) Jim Johnson, by last June the vetter of vice-presidential candidates for Barack Obama, had to resign the post due to allegations that he had gotten more than $7 million of loans—some at favorable rates—from scandal-ridden Countrywide Financial, a major Fannie Mae customer whose former C.E.O., Angelo Mozilo, was a friend of Johnson’s. (Johnson said at the time that he received no special favors.)

But there’s a very different—albeit equally radical—version of reality. In this version, which is told by former Fannie executives and shareholders, Fannie was shot, not because it had to be, but because it could be. “The weekend massacre” is how one former Freddie lobbyist describes the events of the September 5 weekend. “My view is [the Bush] administration said we’ve got four months to remove this thorn in our side,” says Tim Howard, who was Fannie’s chief financial officer from 1990 to 2004. “There will never be another time. We’ve got to do it now.”

It is worth noting that thus far the government has put not a dime into Fannie and only $13.8 billion into Freddie—which is a drop in the bucket compared to the taxpayer dollars that have gone to some other firms, such as the $45 billion Hank Paulson has handed Citigroup. In this alternative narrative, it was Paulson’s rash action of taking over Fannie and Freddie that helped cause the financial meltdown. As famous money manager Bill Miller, the chief investment officer of Legg Mason Capital Management, wrote in a recent letter to his investors: “When the government pre-emptively seized [Fannie and Freddie] not because they needed capital and could not get it, but because the government believed they would run out in the future, then shareholders of every other institution that needed or was perceived to need capital did the only rational thing they could do—sell, in case the government decided to pre-emptively wipe them out as well.”

In truth, Fannie was a company with extraordinarily powerful enemies. They spanned the decades, the two parties, and the ideological spectrum, from Reagan budget director David Stockman to Clinton Treasury secretary Larry Summers to President George W. Bush, and from Ralph Nader to former Federal Reserve chairman Alan Greenspan. These enemies, who detested the privileges Fannie got from its congressional charter, had long wanted to drastically curtail the company—or kill it outright. Johnson called the battle a “philosophical dispute with deep roots and many, many branches,” and it was, but it was also a personal dispute based on rivalries and jealousies. “The War of the Roses” is how a former Fannie executive describes it.

As in most wars, there is fault on both sides. Although in 2000 the Department of Housing and Urban Development (HUD), under Andrew Cuomo, increased the requirements that Fannie and Freddie buy loans made to lower-income people, a dramatic increase came in 2004—under the Bush administration. Some people believe it did so merely in order to pressure the companies into agreeing to new regulation. But Fannie itself isn’t a hapless victim, either. In the end, it was Fannie executives who made a business decision to stake their future on risky mortgages that had nothing to do with helping people own homes. The company used its political power to stymie effective regulation, and its extreme aggressiveness and arrogance gave its enemies license to do things they never would have done to a normal company. And, oh, did they ever.

The Vampire Issue

Gary Gensler, the Treasury undersecretary for domestic finance in the final years of the Clinton administration, likes to tell a story about the deal Alexander Hamilton cut with Thomas Jefferson and James Madison back in 1790. Jefferson and Madison agreed that the nation would assume the debt of the states; Hamilton agreed that the capital of the country would not be in New York, but rather on the Potomac. “This was a very wise move,” says Gensler, “because for about two centuries it separated the nation’s financial capital from its political capital.” Then he chuckles a little. “It worked until Fannie Mae and Freddie Mac came along.”

The Federal National Mortgage Association (Fannie Mae) was founded in 1938, a creature of F.D.R.’s New Deal. The Federal Home Loan Mortgage Corp. (Freddie Mac) came along in 1970, when the thrift industry decided that Fannie needed a competitor. Referred to as Government Sponsored Enterprises, or G.S.E.’s, they were created to help homeownership, but they have never done so by lending money directly to homeowners. Instead, Fannie and Freddie bought mortgages from the local institutions—banks, thrifts, and mortgage originators—that had made them, which relieved those mortgage-makers of both the credit risk (the risk that the homeowner wouldn’t pay) and the interest-rate risk (the risk that the bank would earn less on the mortgage than it paid on its debt). This enabled the mortgage-makers to go out and make more loans.

In addition to having a congressional mandate to aid homeownership, Fannie and Freddie also had shareholders who wanted to see profits, just like Citigroup or General Electric or any publicly traded company. That’s because in 1968 President Lyndon Johnson, who needed money to pay for the Vietnam War, decided to remove Fannie from the government’s balance sheet by having it sell shares to the public. Freddie followed suit in 1989.

And yet, Fannie and Freddie weren’t just like Citigroup or General Electric, or any normal company, because they kept an array of special perks that came with their congressional charters. Among those perks: an exemption from state and local income taxes, presidential appointees on their boards of directors, and a line of credit with the U.S. Treasury. This last was by far the most important, because the line of credit—eventually $2.25 billion for each company—implied to many investors that the full faith and credit of the U.S. government stood behind Fannie and Freddie. Officially, everyone denied that that was the case, but this “double game”—as Rick Carnell, the Treasury undersecretary for domestic finance in the 1990s, called it—enabled the companies to raise money at a cost that was just a smidgen higher than that of the government itself, thereby providing them with an enormous competitive advantage over ordinary financial institutions.

As the mortgage market evolved, and finance grew more sophisticated, Fannie and Freddie came to make their money in two ways. One was supposedly conservative: they were paid a small fee by the mortgage-makers to guarantee that the homeowner wouldn’t default. And for most of their history, they wouldn’t buy just any loans, but rather loans that conformed to certain size limits (thereby excluding so-called jumbo loans, more than $417,000) and fairly strict credit standards. Then they repackaged these loans into what are known as mortgage-backed securities, and sold them to other investors. The new investors were willing to take the interest-rate risk, but didn’t have to worry about evaluating each and every homeowner’s ability to pay—a task of enormous proportions—because Fannie and Freddie guaranteed that. Today, this is the $3.7 trillion in mortgages Fannie and Freddie guarantee.

The other way Fannie and Freddie made money was when they began to repurchase their own mortgage-backed securities, and to buy similar securities that were created by Wall Street without the G.S.E. guarantee, and hold them in a portfolio. Then Fannie and Freddie pocketed the difference—what Greenspan called “the big fat gap”—between what the mortgages yielded and the companies’ own cost of borrowing funds. This was an immensely profitable business: Wall Street analysts estimated that it provided up to three-fourths of Fannie’s and Freddie’s earnings, and today the portfolio business comprises most of the $1.5 trillion in mortgages that Fannie and Freddie own.
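Both revenue streams described above come down to simple arithmetic, and a toy calculation makes clear why the portfolio business was so tempting. The sketch below is purely illustrative: the book sizes, fee rate, yield, and funding cost are hypothetical numbers chosen only to show the mechanics, not the companies’ actual figures, and the real split between the two businesses depended on the actual books and spreads.

```python
# Illustrative sketch of the two G.S.E. revenue streams described above.
# Every number here is hypothetical, chosen only to show the mechanics.

# 1. The guarantee business: a small annual fee on a very large book of
#    mortgage-backed securities sold on to other investors.
guaranteed_book = 3.0e12   # hypothetical guaranteed mortgages, in dollars
guarantee_fee = 0.0020     # hypothetical fee of 20 basis points per year
fee_income = guaranteed_book * guarantee_fee

# 2. The portfolio business: pocketing the "big fat gap" between what the
#    retained mortgages yield and what the G.S.E.'s own debt costs, a
#    spread made possible by the implied government backing.
portfolio = 700e9          # hypothetical retained portfolio, in dollars
mortgage_yield = 0.060     # hypothetical yield on the mortgages held
funding_cost = 0.048       # hypothetical cost of the G.S.E.'s own debt
spread_income = portfolio * (mortgage_yield - funding_cost)

print(f"Guarantee-fee income:    ${fee_income / 1e9:.1f} billion per year")
print(f"Portfolio spread income: ${spread_income / 1e9:.1f} billion per year")
```

With numbers in that range, the portfolio side throws off more income than the guarantee side, which is consistent with the analysts’ estimate that it supplied the bulk of earnings.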

This second way of making money became the source of great controversy. Critics, most notably Alan Greenspan, argued that the portfolio wasn’t worth the risk it posed, because it did nothing to put people in homes and existed only to make money for the companies’ executives and shareholders. He and other critics didn’t want just to modify Fannie’s and Freddie’s business. They wanted to drastically curtail it—or, better yet, wipe out the two G.S.E.’s altogether. And so it was only human nature that Fannie and Freddie fought back—hard. Or, as former Fannie chief lobbyist Bill Maloni, whose Friday-night poker games for Washington power players were the stuff of legend, wrote on a blog, “One fact of the GSE world is that you will be slaughtered either for being a sheep or a wolf, and I’d much rather meet my fate as a predator than as a lamb chop provider.”

Maloni and his bosses felt that they couldn’t lose any battle, no matter how small. “You punch my brother in the face, I’ll burn down your house” was one Fannie Mae saying. Another was “It’s better to throw one brick too many than one brick too few.”

But Fannie, no matter how aggressive it was, could never stop the criticism. Within Fannie, people called the desire to shrink them or kill them the “vampire issue”—because Fannie could never make it go away.

Jim Johnson had come to Fannie in 1990. His predecessor, David Maxwell, had been on the tennis team at Yale and was “the kind of man who sends only handwritten notes,” recalls Maloni. But Maxwell was also a tough cookie who knew how to get what he wanted. When he left, in 1991—with a $19.5 million retirement package—a humorous going-away video showed corporate cars leaving Fannie’s offices with body bags in the trunks.

Maxwell had met Johnson at a small Washington dinner party in 1985. Johnson was a partner at Shearson Lehman, where he’d landed after he and Richard Holbrooke (who would go on to become ambassador to the U.N. under President Clinton) sold a consulting firm they’d founded to the investment bank. Johnson’s world encompassed both business and politics. He had worked on the campaigns of Eugene McCarthy and George McGovern, and then served in the Carter administration as Walter Mondale’s executive assistant (and later the chair of his presidential campaign), during which time he married Mondale’s press secretary, Maxine Isaacs, now a lecturer at the Harvard Kennedy School. When Maxwell retired, he chose Johnson as his successor over protests from President George H. W. Bush’s people, who claimed Johnson was a partisan Democrat.

It was Johnson who “took the seeds that David Maxwell sowed and [grew] them far beyond what David Maxwell dreamed,” as Countrywide chief Angelo Mozilo later told a reporter. Like Maxwell, Johnson cut a charming, suave figure in society, but under his Minnesota-nice exterior was the heart of a born fighter. “In daily life, he’d say things like ‘We’re going to cut them off at the knees,’ ” says a former Fannie executive.

A key test for Johnson came early in his tenure, when Congress began work on how best to regulate Fannie and Freddie. The resulting 1992 legislation, which allowed the G.S.E.’s to hold lower amounts of capital than other financial institutions, was what one analyst later called Johnson’s “finest moment.” Fannie lobbied relentlessly, using a letter from former Fed chairman Paul Volcker, who said that if Fannie met its proposed capital standards it would be able to maintain its solvency.

Fannie’s allies in Congress also made sure that the new regulator—which was known as the Office of Federal Housing Enterprise Oversight (ofheo) until its name was changed to the F.H.F.A. in the summer of 2008—was placed inside hud, which had no experience regulating a financial-services company, and that ofheo, unlike any other regulator, would be subject to the appropriations process, meaning its funding was at the mercy of politicians—politicians who often took their cues from Fannie.

Not surprisingly, ofheo was a notoriously weak regulator. For almost three years, from February 1997 to September 1999, the agency didn’t even have a director. “The goal of [Fannie’s] senior management was straightforward: to force ofheo to rely on [Fannie itself] for information and expertise to such a degree that Fannie Mae would essentially be regulated only by itself,” wrote ofheo in a report years later.

Johnson also addressed Fannie’s other big problem, which was that homeowners and politicians never really understood what it did. “There’s nothing in the homeowner’s life called Fannie Mae,” he’d say. So he had to show homeowners that the company was indispensable. The cornerstones of his strategy were the Fannie Mae Foundation and the Partnership Offices. In 1994, Fannie began opening offices in congressional districts around the country. They issued thousands of press releases, which usually featured a local politician prominently assisting Fannie in some good housing-related deed.

In 1995, Johnson seeded the Fannie Mae Foundation with $350 million in Fannie stock. In the ensuing years, the foundation gave away millions of dollars to organizations ranging from the Cold Climate Housing Research Center, in Fairbanks, Alaska, to the Congressional Hispanic Caucus Institute. All of this, along with the alliances Johnson built with others in housing, including homebuilders and real-estate agents, helps explain all the outcry today about Fannie’s and Freddie’s lobbying dollars—$170 million over the past decade, or just a little less than what the American Medical Association spent, according to the Associated Press. But that misses the point. It’s like counting only one arm on a giant octopus. “They ran a battle plan that would make Patton proud. It was 24-7 and never anything left to chance,” says former congressman and Fannie antagonist Richard Baker today.

Despite what right-wing critics now charge, however, Fannie and Freddie weren’t big risk-takers, even after the 1992 legislation in which Congress also mandated that they buy a certain number of mortgages made to people with lower incomes. Critics now point to that mandate as the moment Fannie began to engage in risky lending. But, in reality, Fannie was extremely careful about the credit risks it took. Johnson was a master at announcing plans that sounded very grand—such as the trillion-dollar initiative, in which Fannie would buy a trillion dollars’ worth of mortgages to help housing—but didn’t really cost much. “About 98 percent were done at market rates [i.e., mortgages they would have bought anyway],” says a former employee. “We were giving away a little at the edge of the big machine.” Or, as Maloni puts it, Johnson could say to a member of Congress, “ ‘Have you seen our initiative for the handicapped?’ It might have only been for a few dozen loans, but our intent mattered.” Johnson would tell people that “the [congressional] housing goals had no teeth.” Indeed, during those years, Fannie and Freddie faced harsh criticism that they did less—not more—to support affordable housing than private lenders did.

This wasn’t because Fannie people were cynical about affordable housing. Quite the contrary: many referred to themselves as “housers,” which is slang for those who believe that better housing is the cure to all of society’s ills. But the company’s leaders knew that they couldn’t afford to make many unsafe loans, because any sign of financial weakness would be grist for their critics.

If the 1990s were a golden time for Fannie’s political power, they were for its financial power as well. Fannie’s market valuation grew from $10.5 billion at the beginning of the decade to more than $70 billion by the end. On Wall Street, Fannie and Freddie were big business—all those mortgage-backed securities and all that debt to fund their growth were sold through Wall Street firms—and “people dealt with them as if they were sovereign credits,” says one former banker. There was even talk, in those days of no federal deficit, that Fannie and Freddie debt would become the substitute for U.S. Treasuries.

The G.S.E.’s also became the place for ex-politicians to work. The Washington Monthly once declared that after he left the White House, Bill Clinton should go to Fannie because “scoring an executive post at Fannie Mae is recognized around establishment Washington as the equivalent of winning the lottery.” After all, where else could you make Wall Street–type money with no financial skills? And where else could you make so much money as a lobbyist?

In retrospect, this was a balancing act that was almost destined to fail. As Fannie and Freddie got bigger and more powerful, they struck even more fear into the hearts of those who resented their size and power. “We became dominant so quickly that we scared people,” says former Fannie chief financial officer Tim Howard today. And as a former top lobbyist for Fannie says, “A company like Fannie Mae, which has defined itself in Washington through its public mission, but which also has very well-paid executives, will have a hard time staying in the sweet spot.”

Profits of Doom

Every winter, Fannie Mae held a conference for Wall Street analysts and major investors. One year, right before Johnson retired, the theme song was a customized version of the song “The Best Is Yet to Come,” popularized by Frank Sinatra. Fannie Mae executives dressed up in top hats and tails to perform it. Frank Raines, who took over from Johnson as C.E.O., told investors that “the future’s so bright that I’m willing to set as a goal that our earnings per share will double over the next five years.” A report by the research firm Sanford Bernstein noted that the combined assets of Fannie Mae and Freddie Mac exceeded, in dollar terms, the G.D.P. of any nation except the U.S., Japan, and Germany.

When Franklin Delano Raines was named Johnson’s successor, he became the first African-American C.E.O. of a Fortune 500 company. Born in 1949 in Seattle to blue-collar parents—his mother cleaned offices at Boeing and his father was a custodian at the Seattle Parks Department—Raines went to Harvard, where he joined both the Young Democrats and the Young Republicans, and was named a Rhodes scholar. He interned in the Nixon White House and then served in the Carter administration, before leaving government to become a partner at the investment bank Lazard Frères. After 11 years at Lazard, Raines was spending four days a week on the road. He left, without his next move planned, in order to spend more time with his three young children. In 1991, when Johnson offered him the vice-chairmanship of Fannie Mae, Raines said yes—Fannie’s offices were just a mile and a half from Raines’s seven-bedroom Colonial home in Virginia.

In 1996, President Clinton lured Raines away from Fannie by appointing him the director of the Office of Management and Budget. Raines asked Clinton how long the job would last, and Clinton replied, “Until you balance the budget.” Within two years Raines produced the first balanced budget the U.S. had seen in 30 years. Later, he would be amazed to find himself painted as a partisan Democrat, because, during his time at O.M.B., Democrats had been angered by what they saw as his support for Republican fiscal policies. In 1995, Raines was appointed to the board of Boeing, where his mother had scrubbed floors. In 1998 he returned to Fannie Mae. At the time, there was talk that one day he would become the first black president of the United States.

There is no one who says that Franklin Raines isn’t incredibly smart. But praise for Raines’s intelligence is often accompanied by criticism of his interpersonal skills. “He’s very introverted,” says one former executive. “He cannot lower himself to make nice to people he considers intellectually inferior.” “Frank hurt himself,” says another. “He lacks a certain understanding of how to best position the other person so that you get what you want.”

Inside Fannie, there was also skepticism about the promise Raines made to Wall Street to double Fannie’s earnings from $3.23 per share in 1998 to $6.46 per share in 2003. “All the V.P.’s in the company looked at each other and said, ‘How is that going to happen?’ ” says a former executive. The promise, combined with the lure of financial rewards, created an unhealthy pressure throughout the company. In 2000 the head of Fannie’s office of auditing gave a speech to the company’s internal auditors. “By now, every one of you must have 6.46 branded in your brains,” he said. “You must be able to say it in your sleep, you must be able to recite it forwards and backwards, you must have a raging fire in your belly that burns away all doubts, you must live, breathe, and dream 6.46. After all, thanks to Frank, we all have a lot of money riding on it.”

Almost immediately after Raines took over, the criticism of the G.S.E.’s took on a new ferocity. One of the first salvos was fired by the Clinton Treasury Department under Larry Summers, who had replaced Robert Rubin in the summer of 1999. Treasury workers knew that taking on Fannie was akin to political suicide, but “everyone jumped off together,” in the words of one former appointee, because they were all so convinced that Fannie and Freddie would eventually fall on top of taxpayers with a crushing thud. One of Summers’s goals was to weaken the perceived ties between the G.S.E.’s and the U.S. government, ties that were enabling the G.S.E.’s to take on too much risk. Most notably, on March 22, 2000, in congressional testimony, Gary Gensler said that the U.S. Treasury should consider cutting the lines of credit that Fannie and Freddie had with the government.

The response from Fannie Mae was immediate and furious. Tim Howard called Gensler’s comments “inept” and “irresponsible.” Fannie even tried to get the White House to distance itself from the Treasury, according to one person.

What has never been disclosed before is that, even before Gensler’s comments, and through the summer of 2000, Treasury held a series of meetings—some in a room just down the hall from Summers’s office—with Fannie’s top executives, in which Fannie tried to get Treasury to sign off publicly on a set of initiatives Fannie had devised in the hopes of appeasing its critics. Both sides describe Fannie’s strategy in the same way: keep your friends close and your enemies closer. But the negotiations came to nothing. One explanation is that the chemistry between Summers and Raines was “horrible,” in the words of one former executive. “The two of them were so alike,” says this person. “They were both arrogant, stubborn sons of bitches, and they both viewed themselves as the smartest guy in the room.”

Another explanation is that Summers realized that if Treasury supported Fannie in any public way it would only strengthen its apparent ties to the U.S. government, so he backed off. “Treasury was too smart,” Howard says now. “Larry wouldn’t bite.”

But perhaps the best explanation is that there just wasn’t a deal to be cut. Treasury officials simply didn’t believe Fannie’s arguments. As for Fannie, “you have to have some level of trust that they’re not trying to do you in, and there wasn’t that level of trust,” says another former Fannie executive.

As the critics became more vehement, Fannie’s responses became ever tougher. Its customers, including major banks, who were terrified of its rapid growth and Raines’s grand plans, set up a group called FM Watch, which began its own anti-G.S.E. lobbying effort. Fannie responded by comparing FM Watch to Slobodan Milošević, the Serb dictator who was charged with crimes against humanity for his role in the Balkan wars. “I think Frank was scared that he couldn’t be as tough as Jim, and so he overcompensated,” says a former executive.

Operation “Noriega”

When George W. Bush ran for president, part of the Republican Party platform was that “homeownership is central to the American Dream.” Those words were manna for Fannie and Freddie. And Bush appointed people to their boards, including Yale classmate Victor Ashe and campaign donor Manuel Justiz. (“It is a great honor to be appointed by the President to serve on the board of a company with such an important housing mission,” wrote the Bush appointees in a letter to ofheo in late 2001.) In 2002, Karl Rove invited Raines to Bush’s economic summit in Waco. Raines still keeps a “Doonesbury” cartoon on his wall that features an admiring Bush saying, “Franklin can tell you … ”

Perhaps most notably, after a 2002 event in Atlanta in which Bush announced his efforts to help 5.5 million black and Hispanic families buy homes before the end of the decade, both Raines and Freddie C.E.O. Leland Brendsel flew back with him on Air Force One.

Then the Bush administration’s attitude changed dramatically. Both sides point to the same catalyst: Enron. “It was as if someone flipped a switch,” Raines says today. A former Bush-administration official says that the last thing the president wanted was to be at the center of another corporate scandal, and if you were looking for likely candidates, how could you miss Fannie and Freddie, with their longtime critics and thin capitalization? Raines, for his part, thought that the administration wanted to deflect the criticism it got for its ties to Enron by pointing at what it could claim was a Democratic scandal in the making. In 2003, Bush’s chief of staff Andrew Card was put in charge of a policy-review group. Soon thereafter, Bush pulled his presidential appointees from the G.S.E.’s boards.

And then there was Fed chairman Alan Greenspan. He was friendly with Raines, had regular lunches with him, and came to the grand Christmas parties at Raines’s home—but he never got past his deep suspicion of the G.S.E.’s. To wit: the portfolio business was a ticking time bomb, and who needed Fannie and Freddie anyway? Big banks, which were supposedly subject to the discipline of the market, were better holders of mortgage risks than the G.S.E.’s.

Although it didn’t happen immediately, Greenspan’s thinking on the G.S.E.’s soon came to dominate the Bush administration’s thinking on them. “[Greenspan and the Bush administration] weren’t interested in having a strong regulator,” says Howard today. “They were interested in constraining Fannie and Freddie. Obviously, Fannie and Freddie weren’t going to agree to that.” So someone had to win, and someone had to lose.

The fight became both nasty and personal in early 2004, when Raines sent what one person calls a “fuck you” letter to Andrew Card. This came about after the homebuilders complained to Raines that Card had told them Raines had agreed to regulatory compromises the homebuilders didn’t want. After that the White House took on Fannie and Freddie in an organized, orchestrated way that was akin to how Fannie itself had long operated. Some of those on the inside jokingly referred to their assault as “Noriega”—as in Manuel Noriega, the former Panamanian dictator and drug kingpin whom the U.S. military blasted with loud, incessant rock music during its attempt to get him to leave a Vatican compound and surrender.

Congressman Barney Frank, a Massachusetts Democrat and longtime supporter of the G.S.E.’s, told a Wall Street analyst that “the [Bush] administration is engaged in a strategy of political attacks on the G.S.E.’s, designed to pressure them into accepting the administration’s regulatory-reform bill by depressing their stock prices.”

But the best weapon the G.S.E.’s opponents could have had was handed to them by Freddie Mac itself. On June 9, 2003, Freddie’s entire top management team was ousted after the company confessed to needing to re-state its earnings for the past three years. They had understated—not overstated, but understated—earnings in order to produce the smoothly growing earnings that investors most valued.

Just days before Freddie announced its accounting error, ofheo had signed off on Freddie’s management and internal controls. This very public mistake was a huge black eye—a “humiliating experience,” in the words of Steve Blumenthal, then ofheo’s deputy director—for an agency that was already smarting from years of perceived and real condescension.

Not that anyone would have guessed that ofheo’s director at the time, Armando Falcon, was a guy to take on the G.S.E. machine. A Texas Democrat who was appointed by Clinton to head ofheo in 1999, Falcon had been raised near San Antonio by a father who was an aircraft mechanic. On the surface, he seemed like a shy, gentle soul—but he was far more politically savvy and ambitious than anyone would have expected. And he had Steve Blumenthal, a longtime Republican Hill staffer who viewed himself as a warrior, by his side. Even ofheo’s supporters say that Falcon and Blumenthal were emotionally invested in getting Fannie. And even though Falcon is a Democrat, he and the White House both wanted the same thing. In early 2004, over Fannie’s protests that “there should be no question about our accounting” in the wake of Freddie’s problems, ofheo launched a review of Fannie’s finances.

Fannie fought back in classic Fannie fashion. They tried to get Falcon and Blumenthal fired. Via a staffer who was a longtime friend and poker buddy of Maloni’s, Fannie got Republican senator Kit Bond of Missouri to launch a counter-investigation into ofheo. The resulting 2004 report from the hud inspector general (I.G.) came to some startling conclusions that couldn’t be dismissed as politics as usual, however. It claimed that Falcon and Blumenthal’s campaign against the G.S.E.’s was both ugly and relentless. It accused ofheo of taking what one source within ofheo called a “publicity-driven approach to oversight” with a “very strong intent to embarrass Fannie Mae.” A witness recalled that Blumenthal was “almost gleeful” when Fannie’s stock went down. (Blumenthal denied being gleeful, but did say, “You can’t hurt them enough to matter.”) Even more troubling was that both ofheo’s chief accountant, Wanda DeLeo, and its chief examiner, Scott Calhoun, complained that Falcon and Blumenthal were overstating Fannie’s problems or prematurely reporting some of ofheo’s findings, partly for political purposes. Another witness, who wanted to remain anonymous, had this way of explaining ofheo’s strategy: “Everybody runs for cover if somebody’s accusing a company of some impropriety in terms of their accounting. All of a sudden, they don’t have any friends anymore.”

An online exchange that took place in December 2007 shows how bitter emotions were, and still are, between Fannie publicist Bill Maloni and Blumenthal.

Maloni: “A HUD IG in a GOP Administration—with no Dems involved in the process—revealed the game you and your friends were playing.… Why would a regulator turn to guerilla tactics and try and financially injure one of its regulated institutions?”

Blumenthal, who didn’t directly answer the question, responded: “It has been my privelege [sic] to fight people like you all my life. Corrupt, fundamentally dishonest, cowards.… The HUD IG didn’t intimidate me, and a Chevy Chase wanna-be thug doesn’t either.” (Maloni now lives in Chevy Chase.)

Falcon and Blumenthal accused Fannie’s management of seeking to “misapply and ignore accounting principles” in order to meet Wall Street’s earning expectations. Although much of this was due to the implementation of a complicated new accounting rule for derivatives—one that caused hundreds of other companies to re-state their results as well—Falcon also accused Fannie of improperly deferring $200 million of expenses in 1998 to the following year in order to meet earnings targets and pay management’s bonuses. Both the Department of Justice and the S.E.C. opened investigations into possible accounting fraud at Fannie Mae.

At a congressional hearing on October 6, 2004, Raines and Falcon faced off. Falcon defended his work; Raines defended himself and his company. At the end of the hearing, longtime G.S.E. antagonist Richard Baker, the Republican from Louisiana, threw a curveball. More than a year earlier, he had requested information from ofheo on the compensation of Fannie’s top executives. He hadn’t released it, because Fannie had hired Ken Starr—the former special prosecutor who investigated Bill Clinton—to represent it, and had gone so far as to threaten “criminal proceedings” against anyone who supposedly violated privacy laws to disclose the information. Now Baker had found his moment. He put up a chart showing that 20 of Fannie’s top executives—including three lobbyists—had earned more than $1 million in 2002, and 9 had made more than $3 million. Today, Baker says that when he brought the chart out “the whole room blew up. It was the most animated room I’d ever seen in a hearing.”

If Fannie had any hope of prevailing, that was demolished on December 15, 2004, when the S.E.C. sided with ofheo and said that Fannie would have to re-state years of earnings, wiping out as much as $9 billion in profits. Under pressure from the board, which was itself under pressure from ofheo, Raines retired and Howard resigned.

One and a half years later, on May 23, 2006, ofheo issued its final report on Fannie Mae. The agency claimed that Fannie’s executives “deliberately and systematically” created earnings “illusions” to hit Fannie’s earnings-per-share targets from 1998 through 2004. Fannie agreed to pay the government $400 million. Christopher Cox, chairman of the S.E.C., promised to “vigorously pursue” the people responsible for this “extensive financial fraud.”

At the end of that year, ofheo sued Frank Raines, along with Tim Howard and controller Leanne Spencer, demanding the payment of $100 million in civil fines and returned bonuses that could exceed $115 million. ofheo said that Raines, in particular, had gotten $90 million in total compensation from 1998 to 2003, of which more than $52 million was directly tied to achieving earnings-per-share targets.

But astonishingly, given the extreme rhetoric from ofheo—Falcon even called Fannie a “government-sponsored Enron”—no criminal charges were filed against Fannie Mae or any of its executives. And despite Cox’s promises, the S.E.C. never filed civil charges against any Fannie Mae executive, either. This past spring, ofheo trumpeted the news that the former Fannie executives had paid $31.4 million to settle the charges against them, with Raines agreeing to forgo cash, stock, and other benefits of $24.7 million. But the headline number was an illusion. In Raines’s case, the bulk of his settlement consisted of stock options that were so out of the money they would never be worth anything, along with $5.3 million that ofheo called “other benefits,” but which Raines says was a “totally made up number.” Nor did Raines agree to keep his mouth shut. In fact, he wanted to respond to the settlement by saying that “the process against me began with lies and ended with lies,” but was persuaded by his lawyers to say instead that “the process invoked against me by ofheo was fundamentally unfair.”

Raines “had to settle because something was wrong,” says current F.H.F.A. director Jim Lockhart. He adds, “It was not one of my happier days. Over 20 percent of the agency’s budget was legal expenses. We were just being eaten up, and [Raines] knew it.”

To this day, Raines insists that he was sabotaged by his enemies. He tells friends that he told Clinton, “They spent more time and money investigating me than you!” In 2007, in a civil suit that is still proceeding against Fannie and its former executives, Raines subpoenaed the White House for what his lawyers called “evidence that officials in the most powerful office in the country were part of a plan to influence the political debate about Fannie Mae.” (Of course, Raines can afford to be aggressive, because as part of his “retirement,” Fannie Mae is paying his legal bills.)

“Frank Raines continues to try to re-write history to protect his reputation, but the history is clear,” counters White House deputy press secretary Tony Fratto.

But, in fact, the history isn’t perfectly clear. There is no question that Fannie Mae’s accounting problems were real, and that under Raines the company had an unhealthy focus on earnings growth, but, still, one has to wonder about the solidity of the charge that Raines led an Enron-like enterprise. While some believe the lack of prosecution merely reflects the Justice Department’s unwillingness to take on a deeply complicated accounting case, others argue that it didn’t have a case, because there is a line between aggressive accounting and intentional fraud. And, in fact, an internal Fannie Mae investigation led by former senator Warren Rudman found no evidence that Raines knew the company’s accounting policies departed significantly from generally accepted principles. For Fannie Mae, the distinction didn’t much matter, because its reputation was tarnished beyond repair.

Despite that, no new legislation for regulation of the G.S.E.’s made it through Congress. While it is true that votes often broke along partisan lines, with the Democrats siding with Fannie and Freddie, it’s also true that Republicans often broke ranks. In one instance, Senator Bob Bennett, a Republican from Utah, sabotaged a bill by adding an amendment that favored the G.S.E.’s. (Bennett’s son worked for Fannie’s partnership office in Utah.) Congressman Mike Oxley, an Ohio Republican and a recipient of much campaign cash from the G.S.E.’s, also introduced bills that the administration thought were too weak. “I think the administration, for whatever reason, wants to do a lot more than is possible,” said Oxley. Says former congressman Richard Baker today, “There were Democrats and Republicans who had reservations.… It was not a partisan thing.”

Maybe the truth is that, as one person puts it, “everyone was still scared of Fannie Mae and Freddie Mac.” Or maybe the truth is that everyone—not just Democrats, and not just Republicans—was terrified that hurting Fannie and Freddie would, as the G.S.E.’s always said, hurt the housing market. “Everybody had a fear of the unknown,” says consultant Bert Ely, another longtime G.S.E. critic.

The End of the Holy War

When Raines was dethroned, the board called Dan Mudd, then the company’s chief operating officer, at seven a.m., just as Mudd was getting dressed for work, and asked him to step in. Mudd couldn’t be more different from Raines and Johnson. He’s “not a rock star,” as one former Fannie employee puts it; he’s not a Democrat; and he’s all business. (He ran G.E. Capital Japan before joining Fannie, in 2000.) A self-deprecating ex-Marine, he was not close to Raines, and he had thought about leaving the company because he didn’t like what he called the “arrogant, defiant, my-way Fannie Mae.” But he stayed because, as he later said, “I’m not a quitter.”

Mudd immediately embarked on a strategy of conciliation with ofheo. He visited members of its staff and Congressman Baker, and he even gave ofheo examiners their own badges so they didn’t need a Fannie Mae escort when they were at the company’s offices. “I thought for a very long time that it was our fault, because we were heavy-handed, because we had a propaganda machine,” he says now. “I thought the only way to solve it was to make it Fannie’s problem. It’s like having an argument with your spouse. There’s no use in being right. You have to find the way forward.”

There are some who think Mudd had no choice, and some who are far more critical. Says Maloni, “Dan’s attitude is great if you don’t live in a jungle where all the other animals are trying to eat you.” Even Mudd himself says today, “I thought there were things we could do to be a normal company. I did some, and it turned out they didn’t make a difference.” It was perhaps telling that during his years as C.E.O., he says, no one in the White House would ever take his calls.

By mid-2006 there was a new actor in this long-running drama: Hank Paulson, the former Goldman Sachs C.E.O. who had just become Treasury secretary. Unlike the advisers who surrounded Bush, Paulson did not believe that the G.S.E.’s were the bogeymen of the financial system. After all, they had been major clients of his for years, and the ties between Goldman and Fannie ran deep. Nor did Paulson want any part of what he called “the closest thing I’ve witnessed to a Holy War.”

Paulson quickly began to move away from what one observer calls the “extreme rigidity” of the administration’s position. Then, on the Tuesday night before the 2006 Thanksgiving weekend, he “threw down the gauntlet to change course on where the administration was going,” says someone familiar with the events. He “aggressively argued that the White House should soften its position” and cut a deal for new regulation—which Paulson strongly believed was necessary—with Barney Frank, who had just been named the chairman of the House Financial Services Committee. Bush, who had granted Paulson an unprecedented degree of independence in exchange for his taking the job, soon gave him the authority to change existing policy, according to one inside source.

“I was aghast,” says a longtime G.S.E. foe, expressing a common attitude. “Here we were fighting trench warfare with Fannie and Freddie, and Paulson says, ‘Let’s cut a deal and say we won.’ Some of us really did believe they were a house of cards.”

That fall, Barney Frank told The Washington Post that Paulson had told him he wasn’t going to use the Treasury’s authority to limit Fannie’s and Freddie’s ability to raise money by issuing new bonds. The Bush administration had won that right in 2004, and other Treasury officials had been saying the government would use it. When Paulson backed away from that position, the White House lost one of its major clubs against the G.S.E.’s.

At the same time, a critical change was occurring in Fannie’s and Freddie’s businesses. By the mid-2000s, the mortgage market was radically different than it had been in Fannie’s and Freddie’s golden years. What we now all know as the subprime business had taken off, and a whole new breed of opportunistic lenders, such as IndyMac and Washington Mutual, were selling their mortgages to Wall Street, which churned out its own mortgage-backed securities. These were often referred to as private-label securities, or P.L.S.’s, because they bypassed Fannie and Freddie and didn’t have the G.S.E. imprimatur. As a result, Fannie and Freddie, which had always been selective as to which mortgages met their criteria for purchase, saw their market share plunge. Shareholders and customers were begging them to dive into this new, highly profitable world.

Although both companies resisted at first, worried about the riskiness of the new products, senior executives eventually disregarded internal warnings, because the lure of big profits was too great. “We’re rushing to get back into the game,” Mudd told analysts in the fall of 2006. “We will be there.” Both companies did two major things. For their portfolios, they bought Wall Street’s P.L.S.’s. They also began to guarantee so-called Alt-A mortgages—loans made to people who had better credit scores than a subprime customer’s, but who might lack a standard job and pay stub. (These mortgages came to be known as “liar loans,” because either the customers or the brokers, or both, were often just making up the information on the applications.) The P.L.S.’s they bought were rated Triple A by the rating agencies, and they considered their Alt-A product conservative, but they bought in bulk. By the spring of 2008, the companies owned a combined $780 billion of the riskiest mortgages, according to the Congressional Budget Office.

The Green Team

For a brief time, in the summer and fall of 2007, it did look as if the G.S.E.’s would be the saviors of the mortgage market. That is, if you didn’t look too closely and instead just listened to what congressional Democrats were pushing hard for, and to what the powers that were, including Paulson, Fed chairman Ben Bernanke, Lockhart, and, yes, even President Bush, started saying. As the banks that everyone had said could handle mortgage risk better than the G.S.E.’s deserted the market, stumbling under the weight of billions of dollars in losses on subprime mortgages, there wasn’t anyone else to turn to. Even so, “is there anything dumber than the suggestion that the institutions to rescue the U.S. mortgage market are institutions that are leveraged 60 to 1 and only own U.S. mortgages?” asks one G.S.E. opponent.

Not surprisingly, Fannie and Freddie—egged on by Democrats—seized the opportunity to prove how critical they were to the market. By the first quarter of 2008, they were buying 80 percent of all U.S. mortgages, roughly double their market share from two years earlier.

To some observers, the most remarkable moment came on March 19, 2008, when ofheo held a press conference to announce a deal that Bob Steel—a former Goldman Sachs partner who had joined the Treasury Department shortly after Paulson—had brokered with Fannie and Freddie. The deal was that the G.S.E.’s, which had already sold a combined $14 billion in preferred stock in late 2007, would raise as much as another $10 billion in capital. In return, new ofheo director Jim Lockhart agreed to lower the amount of capital the G.S.E.’s were required to hold, enabling them to acquire another $200 billion in mortgages. Several people who were involved with the discussions say that the theme was that they were all in this together. They say Steel would use the line “We want to come out of this with everyone on the green team.” (Possibly Mudd used the phrase first.) Says Lockhart today, “We could not afford them not being able to provide funding to the housing market.”

ofheo (with Treasury’s support) cut this deal despite the fact that losses from mortgages going bad were already escalating at the G.S.E.’s. By the spring of 2008, the two had reported combined losses of $9.5 billion over the previous year. And they had just $81 billion in capital, which was roughly 1.5 percent of the $5.2 trillion in mortgages they owned or guaranteed. In other words, if they had to make good on their promises, they had very little money with which to do so. (Skeptics on the Street believed that ofheo’s calculation of Fannie’s and Freddie’s capital was deeply flawed and made the G.S.E.’s look healthier than they were.)
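A quick back-of-the-envelope check, using only the figures quoted in the paragraph above, shows just how thin that cushion was; the implied leverage lands in the same neighborhood as the “60 to 1” figure the G.S.E. opponent cited earlier.

```python
# Restating the article's spring-2008 figures to show how thin the
# combined capital cushion was relative to the promises outstanding.

capital = 81e9    # combined Fannie and Freddie capital, per the article
book = 5.2e12     # mortgages they owned or guaranteed, per the article

cushion = capital / book    # capital backing each dollar of mortgages
leverage = book / capital   # the same ratio, expressed as leverage

print(f"Capital cushion: {cushion:.2%}")               # prints 1.56%
print(f"Implied leverage: about {leverage:.0f} to 1")  # prints about 64 to 1
```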

Fannie’s $300 billion Alt-A portfolio accounted for roughly 50 percent of its credit losses. At Freddie, the numbers were similar. Although both companies justified their purchases of risky loans based on their need to meet hud’s affordable-housing goals, former Fannie employees say that, while the P.L.S. purchases did aid in meeting the goals (which, given the abusiveness of these loans, is an abomination), the Alt-A loans did not. In other words, Fannie dove into Alt-A not because of its mission but because of its bottom line—and because its executives feared that Fannie would become irrelevant if it continued to say no to this brave new world.

By the summer of 2008, the market was going from bad to worse, and Fannie’s and Freddie’s stocks were plunging. International banks, which held big chunks of both companies’ debt, were panicking, and asking if the U.S. government stood behind the debt. On July 13, Paulson announced a plan under which Treasury would backstop all of the G.S.E.’s debt and buy equity if needed. “If you’ve got a bazooka, and people know you’ve got it, you may not have to take it out,” Paulson told lawmakers.

Said President Bush about Fannie and Freddie, “We must ensure that they can continue providing access to mortgage credit during this time of financial stress.”

Said Lockhart, “At a very difficult time in the market, the enterprises have the flexibility and sound operations needed to support their mission.” That was when he also said that their capital levels were “well in excess” of federal requirements.

Paulson’s plan was signed into law as part of legislation that—finally!—created a new G.S.E. regulator: the F.H.F.A. This legislation was based on a deal Paulson had cut with Barney Frank. (Despite the criticism of Paulson and Steel, they did succeed where their predecessors had failed, and helped create a far tougher regulator—although by then it was too late.) Slipped in was a provision that exempted Fannie’s and Freddie’s boards from shareholder lawsuits—which was an enormous threat—if they agreed to conservatorship in time of crisis. Fannie didn’t fight this provision, because Mudd thought that conservatorship would require a negotiation. “It’s like the president has the right to fire a nuclear weapon, but it’s unlikely he’ll do so,” as Mudd put it. And maybe there was also a little hubris at work. “I used to say that if two accounting scandals [and] a Republican Congress and White House couldn’t kill us, how could you kill us ever?” says a former executive.

Others knew better. “Fannie Mae figured they could give the government an enormous loaded gun and they’d never fire it,” says Tim Howard.

Paulson’s bazooka to help Fannie and Freddie failed. It failed for a mixture of reasons. Investors were unsure what their eventual losses would be. Both companies announced terrible 2008 second-quarter results, with Fannie losing $2.3 billion and Freddie losing $821 million. But investors were also unsure what the new legislation meant. No one wanted to risk putting money into the G.S.E.’s, only to have the government radically raise capital requirements—or step in and wipe the shareholders out. And so, as if the second-quarter results hadn’t caused enough alarm on their own, the legislation had the perverse effect of ensuring the companies would be unable to raise new capital, even as everyone began to say that they had to do so.

Maybe Fannie’s executives should have anticipated what happened next, but they didn’t. After taking the red-eye back from a short family trip over Labor Day weekend—a trip he’d had to reschedule four times—Mudd got a letter from Lockhart that abruptly changed the tone. It “condemned everything we’d ever done,” says one person familiar with the letter’s contents. (Lockhart agrees it was a “severe letter” but says he had given “verbal warnings” about what his agency saw as a “significant deterioration” in their financial position.) On Friday morning, Mudd was summoned to a meeting at the F.H.F.A. at three p.m. that day. When the Fannie contingent arrived, nothing had been prepared for the meeting, so they were wandering around the lobby when Bernanke came in the front door. The Fannie people also spotted a Wall Street Journal reporter, who had been given advance notice of the meeting, lurking outside the door. It was “almost comical if it weren’t tragic,” Mudd has since joked.

In a conference room off his office, Lockhart told Fannie, he says today, that “pending losses … were going to make it such that [Fannie and Freddie] could not function and fulfill their mission” of supporting the housing market. Then government officials told Fannie that the company had to give its consent to conservatorship.

As for the terms, they were fairly straightforward, with one exception. The government would acquire $1 billion of preferred shares, giving it 80 percent of the company and pretty much wiping out the existing shareholders. Although the government would provide no upfront cash, it would put in money up to a combined $200 billion for Fannie and Freddie if needed. Both Mudd and Freddie C.E.O. Richard Syron were out, and in short order they were told to forfeit their “golden parachutes.” The exception was an odd detail: Fannie and Freddie would be allowed to grow their portfolios through 2009 in order to help the mortgage market, but then would have to shrink them to $250 billion each. To Fannie people, that provision seemed like a clear indication that their adversaries had had a hand in the battle that ended the war.

Although Freddie agreed to conservatorship at a separate meeting that same day, the Fannie contingent headed to Sullivan & Cromwell’s law offices and called all of their board members to fly to Washington on Saturday for a deliberation. The board came to the conclusion that they had no choice. They could not “single-handedly declare war on the federal government!” Mudd said. Added board chairman Steve Ashley, according to people who were present: “We’ve closed the book on 70 years of housing policy in this country.”

There are a lot of conflicting views on why Paulson abruptly stopped supporting the G.S.E.’s. The best explanation is probably that he was convinced they needed large amounts of capital, and there was no way, given what a Treasury official calls the polarizing quality of the G.S.E.’s in Washington, that he could simply cut them a check without punishing their shareholders and executives.

When a CNBC host asked Paulson what he thought the losses would be, he said, “We didn’t sit there and figure this out with a calculator.” In truth, there’s no way to know, because the ultimate number will depend on what happens with the housing market, and on what activities Fannie and Freddie undertake at the direction of their new owner: you! Estimates, which depend on whether you talk to a G.S.E. friend or foe, range from as low as $30 billion for Fannie to well over the $100 billion the government has allocated to each G.S.E.

But a few things are clear. One is that the argument that Fannie and Freddie caused our entire economic calamity is absurd. Yes, the volume of bad mortgages that Fannie and Freddie bought may have blown the bubble bigger than it otherwise would have been. But to put the blame entirely on Fannie and Freddie is to exempt all the other players, including the mortgage originators who sold subprime mortgages and Wall Street, which packaged up the bad mortgages and sold them to investors around the globe.

Another thing that’s clear is that the critics were both right and very wrong about Fannie and Freddie. Yes, their executives and shareholders made fortunes in the glory years, and, yes, taxpayers are now bearing the brunt of whatever losses there are. Just as critics always warned, it’s “the privatization of profits and the socialization of risks.” But what the critics missed is that that wasn’t unique to Fannie and Freddie. It turns out our entire financial sector was operating under that same premise—and to a far greater degree than Fannie and Freddie.

The last thing is that what happened on September 7 didn’t solve anything. In fact, quite the opposite. “It is a hodgepodge of nothing,” says one Wall Streeter. One key idea was, as the Treasury put it, that Fannie and Freddie would “work to increase the availability of mortgage finance.” In other words, the government takeover would reduce the cost of Fannie’s and Freddie’s funds, thereby enabling them to raise money at cheap rates and pump that money into the mortgage market. In a great irony, almost everyone, even some longtime critics, now agree that’s necessary. As Larry Summers recently said, “They have to be used to keep the flow of capital going to the housing market.”

But the terms of the conservatorship are confusing, because the government backing lasts only through 2009, and government officials refuse to confirm that the U.S. actually guarantees Fannie’s and Freddie’s debt. Instead, they say there is an “effective guarantee”—which means nothing in a market as untrusting as this one. And so, Fannie’s and Freddie’s cost of funds has shot higher, making it economically unfeasible for them to buy up a slew of mortgages. (Lockhart continues to defend the conservatorship. “If we hadn’t done it, there would probably have been a run on the bank,” he says, adding, “My view is that conservatorship is working at this point. We prevented a downward spiral.”)

In other words, in the greatest irony of all, the G.S.E.’s critics have finally gotten what they wanted—Fannie’s and Freddie’s perceived ties to the government have been weakened—just when no one wants that anymore. “There is culpability somewhere,” says a former Fannie executive. “Whether it is a conspiracy or incompetence, I don’t know.” And in some ways, that sums up the entire story—on both sides.

Niall Ferguson: America Needs to Cancel Its Debt

January 31, 2009


As dedicated V.F. readers already know, Niall Ferguson “gets” the economic collapse. Now, the historian and bestselling author is sharing his insights with a new book, The Ascent of Money, and an accompanying TV special (which means regular people might actually absorb some of what he has to say).

And what he has to say is rather terrifying, with profound implications for an Obama presidency and, beyond that, the future of the United States as a superpower. I tried putting my most basic questions about the economy to Ferguson, and here’s how it went:

Michael Hogan: First of all, this whole financial collapse is great timing for your book. Are you psyched?

Niall Ferguson: Well, I can say with a degree of self-satisfaction that it wasn’t luck. Two and a half years ago I decided to write this book, because I was sure that this financial crisis was going to happen, and the reason I was sure was because people kept coming up to me—whether it was investment bankers or hedge fund managers—telling me that volatility was dead, that there would never be another recession. I just thought, “These people have completely disconnected from reality, and financial history is going to come back and bite them in the ass.”

In the book, you identify the five stages of a bubble. What stage are we at now?

We’re pretty much at the last stage, which is the Panic stage.

If you remember roughly how it goes, you begin with some kind of Displacement or shift that changes the economic environment. I would say in this case the displacement was really caused by the wall of Asian savings coming into the U.S. and keeping interest rates lower than they normally would have been at that point in the cycle.

Then you get the Euphoria, which is when people say, “God, now prices can only go up, we should buy more. We should borrow more, because this is a one-way bet.” And then more and more people enter the market, first-time buyers, and that’s the classic run-up phase of the bubble. Then there’s this sort of irrational-exuberance Mania—which came at the end of ’06, when we still had property prices rising at an annualized rate of 20 percent. But then you get Distress. That’s when the insiders, the smart people, start to look at one another and say, “This is nuts, we should get out.” That’s when the John Paulsons start to short the market.

And then you get the shift into downward movement of prices, which ultimately culminates in Panic, when everybody heads for the exit together. In this bubble, it happened in a strange kind of slow motion, because the game was really up in August 2007, but it wasn’t really a full-fledged panic—at least across the board—until Lehman Brothers, September 15, 2008, more than a year later. Wile E. Coyote ran off the cliff in August of ’07, but he didn’t really look down until over a year later.

What do you think happens next to the stock market, to the real estate market, and to the banks?

Well, they have further to fall, without doubt, because we’re going to get almost a third phase of the crisis. The third phase of the crisis is when rising unemployment starts to impact the real estate market and consumer spending generally. So, we go another leg down. Unemployment is rising at a very rapid rate. It could go as high as 10 percent, and it’s just going to keep going up in the next two quarters, maybe throughout the year, because this is bad. This is worse than the early 80s. This is as bad as it’s been since the 30s.

And in those conditions, there’s going to be further negative movement of real estate prices, and further negative movement of stock prices. People are going to get horrible earnings reports, and when the earnings reports turn out to be worse than anybody expected, the prices of most corporations are going to head south. So, it’s certainly not over. Best case, the rate of decline begins to slow, so we’re not falling vertiginously. We start to fall more gradually.

Is a $700 billion stimulus, like the one Obama is talking about, better than nothing?

Well, it is better than nothing.

Why?

Well, I think we have to realize that nothing would be the Great Depression. So it will be a “success” if output only contracts by five or seven percent. It will be a “success” if unemployment only reaches 11 percent, because in the Great Depression output contracted 30 percent, and unemployment went to 25 percent.

These measures that we’re taking at the moment are preventative measures. They’re really designed to prevent a complete implosion of the economy. That’s why I call this, the “Great Repression,” with an “R,” because we are repressing this problem. But, that’s not the same as a cure. And what we’re going to see will look very disappointing, because we’ll be comparing it to the recovery of the sort that we used to see. In a traditional post-war recession, there would be a shock; the Fed would cut rates; there would be some kind of fiscal stimulus; and the economy would quite quickly recover.

The reason that won’t work this time, and this is the key point, is that the whole U.S. economy became excessively leveraged in the last ten years. The debt burden, as a proportion of G.D.P., is in the region of 355 percent. So, debt is three and a half times the output of the economy. That’s some kind of historic maximum, and those debts aren’t going away.

So we’ve all been bingeing on money that we didn’t have.

That we borrowed. And we borrowed it from abroad, ultimately. This has been financed by borrowing from petrol exporters, and borrowing from Asian central banks, and sovereign wealth funds. But yeah, whether it was the people who refinanced their mortgages and spent the money that they pocketed, or banks that juiced their returns by piling on the leverage, the whole system became excessively indebted. And notice: what is the policy response? You guessed it, more debt. And, now it’s federal debt.

So you end up in a situation where you’re curing a debt problem with more debt. Is that going to bring about a sustained recovery? I find that hard to believe.

So I guess the unanswerable question is, what could you do to solve this problem?

Well I’ll tell you what you have to do—you actually have to cancel the debt. There are historical precedents for this.

Excessive debt burdens in the past tended to be public sector debts. What we’ve got now is an exceptional level of private debt. There’s never been an economy in history that’s had so much private debt. Britain and America today lead the world in the indebtedness of the household sector and the banking sector and the corporate sector. But debt is debt; it doesn’t even matter if it’s household debt or government debt. Once it gets to a certain level, there is a problem.

In the past, when excessive debt burdens were accumulated by government, they tended to do one of two things: either they defaulted—this is the Argentine solution—where you say, “Ah, I’m sorry, I’m afraid we’re not going to be able to meet the interest payments this month, and never again will we make the interest payments.”

The other scenario is inflation, where the real debt burden is eroded because the money that it’s denominated in loses value.

I don’t think we’re really going to be out of the woods here until something of that sort happens to the huge debt burdens of the U.S. economy. Either these debts will have to be fundamentally written off in some way, or inflation will have to reduce the real burden.

Don’t either of those scenarios spell the end of America as the world’s unrivalled superpower?

Well, it certainly will be extremely painful. And that is why we have to look very closely at the attitude of the foreign creditors, because the U.S. owes the rest of the world a lot of money. Half the federal debt is held by foreigners. And if the U.S. either defaults on debt or allows the dollar to depreciate, the rest of the world is going to say, “Wait a second, you just screwed us.” And that’s, I think, the moment at which the United States experiences the British experience—when, in the dark days of the 60s and 70s, Britain fundamentally lost its credibility and ceased to be a financial great power. The I.M.F. had to come in, and the pound plunged to unheard-of depths.

And George Soros became a billionaire, right?

George Soros and others made some serious money off the back of it, certainly. I mean, somebody can make an awful lot of money off a massive dollar sell-off this year.

How badly could the Chinese screw us if they wanted to?

Well, they would have a difficulty in that they would kind of be screwing themselves. This is their dilemma. There’s a sort of “death embrace” quality to this; I think someone has talked about mutually assured financial destruction. The Chinese have got, we know, reserves in the region of $1.9 trillion, and 70 percent [of it is] dollar denominated, probably. That’s a huge pile of treasury bonds, not to mention Fannie and Freddie debt that they’ve accumulated over the last decade, when they’ve been intervening to keep their currency weak, and earning these vast amounts of foreign currency by running these trade surpluses. Now, politically, it might be quite tempting for the Chinese to phone up and say, “We really disagree with you about, let’s say, Taiwan and Japan and North Korea. You’d better listen to us, because otherwise the People’s Bank of China starts selling ten-year treasuries, and then you guys are dead.”

But then their investments become worthless.

Then you lose about five percent of China’s GDP, and that’s a hard sell—even for an authoritarian regime. So, they have a dilemma, and they are discovering the ancient truth that, when the debt is big enough, it’s the debtor who has the power, not the creditor.

But, then again, these things aren’t always the result of calculated policy decisions. There’s a sense in which a catalyst elsewhere could force the hand of the People’s Bank of China. It doesn’t need to be the Chinese who start the run on the dollar. It could be Middle Eastern investors.

In which case the Chinese might just follow and cut their losses.

Well, they might have no alternative. They might be facing the decision that, “If we hold on, you know, we’re left really holding the hot potato.” So, that is a big worry of theirs. I know it’s a big worry of theirs. They’re thinking, “Can we somehow sneak out of some of these decisions without anybody noticing?” That’s why they’re so secretive.

One of the great problems for anybody trying to make a decision about currency is, where else do you go? Short-term, it seems to me that everybody is kind of stuck trying to avoid this dollar crisis because it would be so expensive for those people who are invested in the U.S. But you shouldn’t assume—you can’t assume—that this is a stable state of affairs. It’s anything but that. It’s very, very precarious.

Your book is about moments in history where there were innovations—the creation of money, the creation of credit, the creation of bonds, the stock market, and so on. And the people who were at the wheel during the run-up to the bubble seemed convinced that they had overseen an innovation on this level. Now we’re seeing that maybe they didn’t. Fifty or 500 years from now, when someone writes a book like this one, do you think they’ll look back and see something valuable that came out of this?

I’m sure they will. They’ll look back and they’ll say, “What an extraordinary proliferation of new financial instruments and business models there was between 1980 and 2006. And then the crisis came along, and it was like one of those events in natural history: asteroid hits the Earth, environment becomes a lot colder, only the strong survive.”

Some species will go extinct: investment banks are already extinct, and many hedge funds will go extinct in the next six months. But they won’t all disappear, and the strong and well managed—and lucky!—will survive. The derivatives market will contract, but it won’t disappear, because those are useful things. They are simply insurance policies. Too many of them were sold at bad prices. It was clear that the models which were being used to price, say, credit default swaps were fundamentally unrealistic about the probability of defaults. That doesn’t mean that the underlying idea of being able to buy protection against default is a bad one.

And that’s characteristic of financial history. If you go back to, say, the banking innovations of the 17th and 18th centuries, when new banks proliferated all over the English-speaking world, from Scotland to Massachusetts and beyond, then along would come a financial crisis, and large numbers of them would go bust. But the ones that survived generally ended up being better banks, and I think that’s the cheerful news. This is an evolutionary system; there is an element of Darwinism, of the survival of the fittest, and although crises seem to be an integral part of the system, no crisis has been completely fatal to it.

The Big Fix

January 30, 2009


Published: January 27, 2009

I. WHITHER GROWTH?

The economy will recover. It won’t recover anytime soon. It is likely to get significantly worse over the course of 2009, no matter what President Obama and Congress do. And resolving the financial crisis will require both aggressiveness and creativity. In fact, the main lesson from other crises of the past century is that governments tend to err on the side of too much caution — of taking the punch bowl away before the party has truly started up again. “The mistake the United States made during the Depression and the Japanese made during the ’90s was too much start-stop in their policies,” said Timothy Geithner, Obama’s choice for Treasury secretary, when I went to visit him in his transition office a few weeks ago. Japan announced stimulus measures even as it was cutting other government spending. Franklin Roosevelt flirted with fiscal discipline midway through the New Deal, and the country slipped back into decline.

Geithner arguably made a similar miscalculation himself last year as a top Federal Reserve official who was part of a team that allowed Lehman Brothers to fail. But he insisted that the Obama administration had learned history’s lesson. “We’re just not going to make that mistake,” Geithner said. “We’re not going to do that. We’ll keep at it until it’s done, whatever it takes.”

Once governments finally decide to use the enormous resources at their disposal, they have typically been able to shock an economy back to life. They can put to work the people, money and equipment sitting idle, until the private sector is willing to begin using them again. The prescription developed almost a century ago by John Maynard Keynes does appear to work.

But while Washington has been preoccupied with stimulus and bailouts, another, equally important issue has received far less attention — and the resolution of it is far more uncertain. What will happen once the paddles have been applied and the economy’s heart starts beating again? How should the new American economy be remade? Above all, how fast will it grow?

That last question may sound abstract, even technical, compared with the current crisis. Yet the consequences of a country’s growth rate are not abstract at all. Slow growth makes almost all problems worse. Fast growth helps solve them. As Paul Romer, an economist at Stanford University, has said, the choices that determine a country’s growth rate “dwarf all other economic-policy concerns.”

Growth is the only way for a government to pay off its debts in a relatively quick and painless fashion, allowing tax revenues to increase without tax rates having to rise. That is essentially what happened in the years after World War II. When the war ended, the federal government’s debt equaled 120 percent of the gross domestic product (more than twice as high as its likely level by the end of next year). The rapid economic growth of the 1950s and ’60s — more than 4 percent a year, compared with 2.5 percent in this decade — quickly whittled that debt away. Over the coming 25 years, if growth could be lifted by just one-tenth of a percentage point a year, the extra tax revenue would completely pay for an $800 billion stimulus package.
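That arithmetic is easy to sanity-check. The short sketch below is only a back-of-the-envelope illustration under assumed inputs, not a calculation from the article: it takes a starting GDP of roughly $14 trillion, federal revenue of about 18 percent of GDP and a baseline growth rate of 2.5 percent, then totals the extra revenue that a 2.6 percent growth path would generate over 25 years.

```python
# Back-of-the-envelope check of the growth-pays-for-the-stimulus claim.
# Every input here is an assumption chosen for illustration, not a figure
# taken from the article:
#   gdp0      -- starting GDP, roughly $14 trillion (a late-2008 ballpark)
#   tax_share -- federal revenue as a share of GDP, roughly 18 percent
#   baseline  -- 2.5 percent annual growth (the decade average cited above)
#   boosted   -- the baseline plus one-tenth of a percentage point

gdp0 = 14.0e12       # dollars
tax_share = 0.18
baseline = 0.025
boosted = 0.026
years = 25

extra_revenue = 0.0
for t in range(1, years + 1):
    gdp_base = gdp0 * (1 + baseline) ** t
    gdp_boost = gdp0 * (1 + boosted) ** t
    extra_revenue += tax_share * (gdp_boost - gdp_base)

print(f"Extra revenue over {years} years: ${extra_revenue / 1e12:.2f} trillion")
# With these inputs the total comes to roughly $1.2 trillion, comfortably
# more than an $800 billion package, even before counting interest savings.
```

Under those assumed inputs, the cumulative extra revenue comes to roughly $1.2 trillion, which is consistent with the claim that a tenth of a point of growth could cover an $800 billion stimulus.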

Yet there are real concerns that the United States’ economy won’t grow enough to pay off its debts easily and ensure rising living standards, as happened in the postwar decades. The fraternity of growth experts in the economics profession predicts that the economy, on its current path, will grow more slowly in the next couple of decades than over the past couple. They are concerned in part because two of the economy’s most powerful recent engines have been exposed as a mirage: the explosion in consumer debt and spending, which lifted short-term growth at the expense of future growth, and the great Wall Street boom, which depended partly on activities that had very little real value.

Richard Freeman, a Harvard economist, argues that our bubble economy had something in common with the old Soviet economy. The Soviet Union’s growth was artificially raised by massive industrial output that ended up having little use. Ours was artificially raised by mortgage-backed securities, collateralized debt obligations and even the occasional Ponzi scheme.

Where will new, real sources of growth come from? Wall Street is not likely to cure the nation’s economic problems. Neither, obviously, is Detroit. Nor is Silicon Valley, at least not by itself. Well before the housing bubble burst, the big productivity gains brought about by the 1990s technology boom seemed to be petering out, which suggests that the Internet may not be able to fuel decades of economic growth in the way that the industrial inventions of the early 20th century did. Annual economic growth in the current decade, even excluding the dismal contributions that 2008 and 2009 will make to the average, has been the slowest of any decade since the 1930s.

So for the first time in more than 70 years, the epicenter of the American economy can be placed outside of California or New York or the industrial Midwest. It can be placed in Washington. Washington won’t merely be given the task of pulling the economy out of the immediate crisis. It will also have to figure out how to put the American economy on a more sustainable path — to help it achieve fast, broadly shared growth and do so without the benefit of a bubble. Obama said as much in his inauguration speech when he pledged to overhaul Washington’s approach to education, health care, science and infrastructure, all in an effort to “lay a new foundation for growth.”

For centuries, people have worried that economic growth had limits — that the only way for one group to prosper was at the expense of another. The pessimists, from Malthus and the Luddites on, have been proved wrong again and again. Growth is not finite. But it is also not inevitable. It requires a strategy.

II. THE UPSIDE OF A DOWNTURN

TWO WEEKS AFTER THE ELECTION, Rahm Emanuel, Obama’s chief of staff, appeared before an audience of business executives and laid out an idea that Lawrence H. Summers, Obama’s top economic adviser, later described to me as Rahm’s Doctrine. “You never want a serious crisis to go to waste,” Emanuel said. “What I mean by that is that it’s an opportunity to do things you could not do before.”

In part, the idea is standard political maneuvering. Obama had an ambitious agenda — on health care, energy and taxes — before the economy took a turn for the worse in the fall, and he has an interest in connecting the financial crisis to his pre-existing plans. “Things we had postponed for too long, that were long term, are now immediate and must be dealt with,” Emanuel said in November. Of course, the existence of the crisis doesn’t force the Obama administration to deal with education or health care. But the fact that the economy appears to be mired in its worst recession in a generation may well allow the administration to confront problems that have festered for years. That’s the crux of the doctrine.

The counterargument is hardly trivial — namely, that the financial crisis is so serious that the administration shouldn’t distract itself with other matters. That is a risk, as is the additional piling on of debt for investments that might not bear fruit for a long while. But Obama may not have the luxury of trying to deal with the problems separately. This crisis may be his one chance to begin transforming the economy and avoid future crises.

In the early 1980s, an economist named Mancur Olson developed a theory that could fairly be called the academic version of Rahm’s Doctrine. Olson, a University of Maryland professor who died in 1998, is one of those academics little known to the public but famous among his peers. His seminal work, “The Rise and Decline of Nations,” published in 1982, helped explain how stable, affluent societies tend to get in trouble. The book turns out to be a surprisingly useful guide to the current crisis.

In Olson’s telling, successful countries give rise to interest groups that accumulate more and more influence over time. Eventually, the groups become powerful enough to win government favors, in the form of new laws or friendly regulators. These favors allow the groups to benefit at the expense of everyone else; not only do they end up with a larger piece of the economy’s pie, but they do so in a way that keeps the pie from growing as much as it otherwise would. Trade barriers and tariffs are the classic example. They help the domestic manufacturer of a product at the expense of millions of consumers, who must pay high prices and choose from a limited selection of goods.

Olson’s book was short but sprawling, touching on everything from the Great Depression to the caste system in India. His primary case study was Great Britain in the decades after World War II. As an economic and military giant for more than two centuries, it had accumulated one of history’s great collections of interest groups — miners, financial traders and farmers, among others. These interest groups had so shackled Great Britain’s economy by the 1970s that its high unemployment and slow growth came to be known as “British disease.”

Germany and Japan, on the other hand, were forced to rebuild their economies and political systems after the war. Their interest groups were wiped away by the defeat. “In a crisis, there is an opportunity to rearrange things, because the status quo is blown up,” Frank Levy, an M.I.T. economist and an Olson admirer, told me recently. If a country slowly glides down toward irrelevance, he said, the constituency for reform won’t take shape. Olson’s insight was that the defeated countries of World War II didn’t rise in spite of crisis. They rose because of it.

The parallels to the modern-day United States, though not exact, are plain enough. This country’s long period of economic pre-eminence has produced a set of interest groups that, in Olson’s words, “reduce efficiency and aggregate income.” Home builders and real estate agents pushed for housing subsidies, which made many of them rich but made the real estate bubble possible. Doctors, drug makers and other medical companies persuaded the federal government to pay for expensive treatments that have scant evidence of being effective. Those treatments are the primary reason this country spends so much more than any other on medicine. In these cases, and in others, interest groups successfully lobbied for actions that benefited them and hurt the larger economy.

Surely no interest group fits Olson’s thesis as well as Wall Street. It used an enormous amount of leverage — debt — to grow to unprecedented size. At times Wall Street seemed ubiquitous. Eight Major League ballparks are named for financial-services companies, as are the theater for the Alvin Ailey dance company, a top children’s hospital in New York and even a planned entrance of the St. Louis Zoo. At Princeton, the financial-engineering program, meant to educate future titans of finance, enrolled more undergraduates than any of the traditional engineering programs. Before the stock market crashed last year, finance companies earned 27 percent of the nation’s corporate profits, up from about 15 percent in the 1970s and ’80s. These profits bought political influence. Congress taxed the income of hedge-fund managers at a lower rate than most everyone else’s. Regulators didn’t ask too many hard questions and then often moved on to a Wall Street job of their own.

In good times — or good-enough times — the political will to beat back such policies simply doesn’t exist. Their costs are too diffuse, and their benefits too concentrated. A crisis changes the dynamic. It’s an opportunity to do things you could not do before.

England’s crisis was the Winter of Discontent, in 1978-79, when strikes paralyzed the country and many public services shut down. The resulting furor helped elect Margaret Thatcher as prime minister and allowed her to sweep away some of the old economic order. Her laissez-faire reforms were flawed in some important ways — taken to an extreme, they helped create the current financial crisis — and they weren’t the only reason for England’s turnaround. But they made a difference. In the 30 years since her election, England has grown faster than Germany or Japan.

III. THE INVESTMENT GAP

ONE GOOD WAY TO UNDERSTAND the current growth slowdown is to think of the debt-fueled consumer-spending spree of the past 20 years as a symbol of an even larger problem. As a country we have been spending too much on the present and not enough on the future. We have been consuming rather than investing. We’re suffering from investment-deficit disorder.

You can find examples of this disorder in just about any realm of American life. Walk into a doctor’s office and you will be asked to fill out a long form with the most basic kinds of information that you have provided dozens of times before. Walk into a doctor’s office in many other rich countries and that information — as well as your medical history — will be stored in computers. These electronic records not only reduce hassle; they also reduce medical errors. Americans cannot avail themselves of this innovation despite the fact that the United States spends far more on health care, per person, than any other country. We are spending our money to consume medical treatments, many of which have only marginal health benefits, rather than to invest it in ways that would eventually have far broader benefits.

Along similar lines, Americans are indefatigable buyers of consumer electronics, yet a smaller share of households in the United States has broadband Internet service than in Canada, Japan, Britain, South Korea and about a dozen other countries. Then there’s education: this country once led the world in educational attainment by a wide margin. It no longer does. And transportation: a trip from Boston to Washington, on the fastest train in this country, takes six-and-a-half hours. A trip from Paris to Marseilles, roughly the same distance, takes three hours — a result of the French government’s commitment to infrastructure.

These are only a few examples. Tucked away in the many statistical tables at the Commerce Department are numbers on how much the government and the private sector spend on investment and research — on highways, software, medical research and other things likely to yield future benefits. Spending by the private sector hasn’t changed much over time. It was equal to 17 percent of G.D.P. 50 years ago, and it is about 17 percent now. But spending by the government — federal, state and local — has changed. It has dropped from about 7 percent of G.D.P. in the 1950s to about 4 percent now.

Governments have a unique role to play in making investments for two main reasons. Some activities, like mass transportation and pollution reduction, have societal benefits but not necessarily financial ones, and the private sector simply won’t undertake them. And while many other kinds of investments do bring big financial returns, only a fraction of those returns go to the original investor. This makes the private sector reluctant to jump in. As a result, economists say that the private sector tends to spend less on research and investment than is economically ideal.

Historically, the government has stepped into the void. It helped create new industries with its investments. Economic growth has many causes, including demographics and some forces that economists admit they don’t understand. But government investment seems to have one of the best track records of lifting growth. In the 1950s and ’60s, the G.I. Bill created a generation of college graduates, while the Interstate System of highways made the entire economy more productive. Later, the Defense Department developed the Internet, which spawned AOL, Google and the rest. The late ’90s Internet boom was the only sustained period in the last 35 years when the economy grew at 4 percent a year. It was also the only time in the past 35 years when the incomes of the poor and the middle class rose at a healthy pace. Growth doesn’t ensure rising living standards for everyone, but it sure helps.

Even so, the idea that the government would be playing a much larger role in promoting economic growth would have sounded radical, even among Democrats, until just a few months ago. After all, the European countries that have tried guiding huge swaths of their economies — that have kept their arms around the “commanding heights,” in Lenin’s enduring phrase — have grown even more slowly than this country in recent years. But the credit crunch and the deepening recession have changed the discussion here. The federal government seems to have been doing too little to take advantage of the American economy’s enormous assets: its size, its openness and its mobile, risk-taking work force. The government is also one of the few large entities today able to borrow at a low interest rate. It alone can raise the capital that could transform the economy in the kind of fundamental ways that Olson described.

“This recession is a critical economic problem — it is a crisis,” Summers told me recently. “But a moment when there are millions of people who are unemployed, when the federal government can borrow money over the long term at under 3 percent and when we face long-run fiscal problems is also a moment of great opportunity to make investments in the future of the country that have lagged for a long time.”

He then told a story that John F. Kennedy liked to tell, about an early-20th-century French marshal named Hubert Lyautey. “The guy says to his gardener, ‘Could you plant a tree?’ ” Summers said. “The gardener says, ‘Come on, it’s going to take 50 years before you see anything out of that tree.’ The guy says, ‘It’s going to take 50 years? Really? Then plant it this morning.’ ”

IV. STIMULUS VS. TRANSFORMATION

THE OBAMA ADMINISTRATION’S FIRST CHANCE to build a new economy — an investment economy — is the stimulus package that has been dominating policy discussions in Washington. Obama has repeatedly said he wants it to be a down payment on solving bigger problems. The twin goals, he said recently, are to “immediately jump-start job creation and long-term growth.” But it is not easy to balance those goals.

For the bill to provide effective stimulus, it simply has to spend money — quickly. Employing people to dig ditches and fill them up again would qualify. So would any of the “shovel ready” projects that have made it onto the list of stimulus possibilities. Even the construction of a mob museum in Las Vegas, a project that was crossed off the list after Republicans mocked it, would work to stimulate the economy, so long as ground was broken soon. Pork and stimulus aren’t mutually exclusive. But pork won’t transform an economy. Neither will the tax cuts that are likely to be in the plan.

Sometimes a project can give an economy a lift and also lead to transformation, but sometimes the goals are at odds, at least in the short term. Nothing demonstrates this quandary quite so well as green jobs, which are often cited as the single best hope for driving the post-bubble economy. Obama himself makes this case. Consumer spending has been the economic engine of the past two decades, he has said. Alternative energy will supposedly be the engine of the future — a way to save the planet, reduce the amount of money flowing to hostile oil-producing countries and revive the American economy, all at once. Put in these terms, green jobs sounds like a free lunch.

Green jobs can certainly provide stimulus. Obama’s proposal includes subsidies for companies that make wind turbines, solar power and other alternative energy sources, and these subsidies will create some jobs. But the subsidies will not be nearly enough to eliminate the gap between the cost of dirty, carbon-based energy and clean energy. Dirty-energy sources — oil, gas and coal — are cheap. That’s why we have become so dependent on them.

The only way to create huge numbers of clean-energy jobs would be to raise the cost of dirty-energy sources, as Obama’s proposed cap-and-trade carbon-reduction program would do, to make them more expensive than clean energy. This is where the green-jobs dream gets complicated.

For starters, of the $700 billion we spend each year on energy, more than half stays inside this country. It goes to coal companies or utilities here, not to Iran or Russia. If we begin to use less electricity, those utilities will cut jobs. Just as important, the current, relatively low price of energy allows other companies — manufacturers, retailers, even white-collar enterprises — to sell all sorts of things at a profit. Raising that cost would raise the cost of almost everything that businesses do. Some projects that would have been profitable to Boeing, Kroger or Microsoft in the current economy no longer will be. Jobs that would otherwise have been created won’t be. As Rob Stavins, a leading environmental economist, says, “Green jobs will, to some degree, displace other jobs.” Just think about what happened when gas prices began soaring last spring: sales of some hybrids increased, but vehicle sales fell overall.

None of this means that Obama’s climate policy is a mistake. Raising the price of carbon makes urgent sense, for the well-being of the planet and of the human race. And the economic costs of a serious climate policy are unlikely to be nearly as big as the alarmists — lobbyists and members of Congress trying to protect old-line energy industries — suggest. Various analyses of Obama’s cap-and-trade plan, including one by Stavins, suggest that after it is fully implemented, it would cost less than 1 percent of gross domestic product a year, or about $100 billion in today’s terms. That cost is entirely manageable. But it’s still a cost.

Or perhaps we should think of it as an investment. Like so much in the economy, our energy policy has been geared toward the short term. Inexpensive energy made daily life easier and less expensive for all of us. Building a green economy, on the other hand, will require some sacrifice. In the end, that sacrifice should pay a handsome return in the form of icecaps that don’t melt and droughts that don’t happen — events with costs of their own. Over time, the direct economic costs of a new energy policy may also fall. A cap-and-trade program will create incentives for the private sector to invest in alternative energy, which will lead to innovations and lower prices. Some of the new clean-energy spending, meanwhile, really will replace money now flowing overseas and create jobs here.

But all those benefits will come later. The costs will come sooner, which is a big reason we do not already have a green economy — or an investment economy.

V. CURING INEFFICIENCIES

WASHINGTON’S CHALLENGE on energy policy is to rewrite the rules so that the private sector can start building one of tomorrow’s big industries. On health care, the challenge is keeping one of tomorrow’s industries from growing too large.

For almost two decades, spending on health care grew rapidly, no matter what the rest of the economy was doing. Some of this is only natural. As a society gets richer and the basic comforts of life become commonplace, people will choose to spend more of their money on health and longevity instead of a third car or a fourth television.

Much of the increase in health care spending, however, is a result of government rules that have made the sector fabulously — some say uniquely — inefficient. These inefficiencies have left the United States spending far more than other countries on medicine and, by many measures, getting worse results. The costs of health care are now so large that it has become one problem that cannot be solved by growth alone. It’s qualitatively different from the other budget problems facing the government, like the Wall Street bailout, the stimulus, the war in Iraq or Social Security.

You can see that by looking at various costs as a share of one year of economic output — that is, gross domestic product. Surprisingly, the debt that the federal government has already accumulated doesn’t present much of a problem. It is equal to about $6 trillion, or 40 percent of G.D.P., a level that is slightly lower than the average of the past six decades. The bailout, the stimulus and the rest of the deficits over the next two years will probably add about 15 percent of G.D.P. to the debt. That will take debt to almost 60 percent, which is above its long-term average but well below the levels of the 1950s. But the unfinanced parts of Medicare, the spending that the government has promised over and above the taxes it will collect in the coming decades, require another decimal place. They are equal to more than 200 percent of current G.D.P.
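For a sense of scale, the sketch below converts those shares of GDP into dollar figures. The GDP level it assumes is approximate, chosen only so that the $6 trillion debt works out to roughly the 40 percent cited above; the other numbers come straight from the paragraph.

```python
# Converting the paragraph's shares of GDP back into dollars.
# The GDP figure is an assumption (about $14.5 trillion); the debt figure and
# the percentages are the ones given in the paragraph above.

gdp = 14.5e12                  # assumed annual output, in dollars

current_debt = 6.0e12          # debt already accumulated, per the article
new_borrowing_share = 0.15     # bailout + stimulus + near-term deficits
medicare_unfunded_share = 2.0  # "more than 200 percent of current G.D.P."

print(f"Current debt:        {current_debt / gdp:.0%} of GDP")
print(f"After new borrowing: {current_debt / gdp + new_borrowing_share:.0%} of GDP")
print(f"Unfinanced Medicare: over ${medicare_unfunded_share * gdp / 1e12:.0f} trillion")
```

The point is the gulf between the two kinds of numbers: the crisis-related borrowing moves the debt ratio by tens of percentage points, while the unfinanced Medicare promises are measured in hundreds.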

During the campaign, Obama talked about the need to control medical costs and mentioned a few ideas for doing so, but he rarely lingered on the topic. He spent more time talking about expanding health-insurance coverage, which would raise the government’s bill. After the election, however, when time came to name a budget director, Obama sent a different message. He appointed Peter Orszag, who over the last two years has become one of the country’s leading experts on the looming budget mess that is health care.

Orszag is a tall, 40-year-old Massachusetts native, made taller by his preference for cowboy boots, who has risen through the Democratic policy ranks over the last 15 years. He received a Ph.D. from the London School of Economics, later joined the Clinton White House and, from 2007, was the director of the Congressional Budget Office. While there, he devoted himself to studying health care, believing that it was far more important to the future of the budget than any other issue in front of Congress. He nearly doubled the number of health care analysts in the office, to 50. Obama highlighted this work when he announced Orszag’s appointment in November.

In Orszag’s final months on Capitol Hill, he specifically argued that health care reform should not wait until the financial system has been fixed. “One of the blessings in the current environment is that we have significant capacity to expand and sell Treasury debt,” he told me recently. “If we didn’t have that, and if the financial markets didn’t have confidence that we would repay that debt, we would be in even more dire straits than we are.” Absent a health care overhaul, the federal government’s lenders around the world may eventually grow nervous about its ability to repay its debts. That, in turn, will cause them to demand higher interest rates to cover their risk when lending to the United States. Facing higher interest rates, the government won’t be able to afford the kind of loans needed to respond to a future crisis, be it financial or military. The higher rates will also depress economic growth, aggravating every other problem.

So what should be done? Orszag was technically prohibited from advocating policies in his old job. But it wasn’t very hard to read between the lines. In a series of speeches around the country, in testimony to Congress and in a blog that he started (“Director’s Blog”), he laid out a fairly clear agenda.

Orszag would begin his talks by explaining that the problem is not one of demographics but one of medicine. “It’s not primarily that we’re going to have more 85-year-olds,” he said during a September speech in California. “It’s primarily that each 85-year-old in the future will cost us a lot more than they cost us today.” The medical system will keep coming up with expensive new treatments, and Medicare will keep reimbursing them, even if they bring little benefit.

After this introduction, Orszag would typically pause and advise his audience not to get too depressed. He would put a map of the United States on the screen behind him, showing Medicare spending by region. The higher-spending regions were shaded darker than the lower-spending regions. Orszag would then explain that the variation cannot be explained by the health of the local population or the quality of care it receives. Darker areas didn’t necessarily have sicker residents than lighter areas, nor did those residents necessarily receive better care. So, Orszag suggested, the goal of reform doesn’t need to be remaking the American health care system in the image of, say, the Dutch system. The goal seems more attainable than that. It is remaking the system of a high-spending place, like southern New Jersey or Texas, in the image of a low-spending place, like Minnesota, New Mexico or Virginia.

To that end, Orszag has become intrigued by the work of Mitchell Seltzer, a hospital consultant in central New Jersey. Seltzer has collected large amounts of data from his clients on how various doctors treat patients, and his numbers present a very similar picture to the regional data. Seltzer told me that big-spending doctors typically explain their treatment by insisting they have sicker patients than their colleagues. In response he has made charts breaking down the costs of care into thin diagnostic categories, like “respiratory-system diagnosis with ventilator support, severity: 4,” in order to compare doctors who were treating the same ailment. The charts make the point clearly. Doctors who spent more — on extra tests or high-tech treatments, for instance — didn’t get better results than their more conservative colleagues. In many cases, patients of the aggressive doctors stay sicker longer and die sooner because of the risks that come with invasive care.
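The kind of comparison Seltzer is described as making is easy to picture in code. The sketch below is a hypothetical stand-in, not his database or his method: it groups invented patient records into a single thin diagnostic category and compares the doctors treating that same ailment on average cost and a crude outcome measure.

```python
# A hypothetical illustration (not Seltzer's actual database or methods) of the
# comparison described above: within one narrow diagnostic category, line up
# the doctors treating the same ailment by average cost and a crude outcome
# measure. The records and column names below are invented.
import pandas as pd

records = pd.DataFrame(
    [
        # doctor, diagnostic category, cost ($), days in hospital
        ("Dr. A", "respiratory w/ ventilator support, severity 4", 61000, 12),
        ("Dr. A", "respiratory w/ ventilator support, severity 4", 58000, 11),
        ("Dr. B", "respiratory w/ ventilator support, severity 4", 40000, 9),
        ("Dr. B", "respiratory w/ ventilator support, severity 4", 43000, 10),
    ],
    columns=["doctor", "category", "cost", "days_in_hospital"],
)

# Average cost and length of stay per doctor, within the same category.
summary = (
    records.groupby(["category", "doctor"])
    .agg(avg_cost=("cost", "mean"),
         avg_days=("days_in_hospital", "mean"),
         patients=("cost", "size"))
    .sort_values("avg_cost", ascending=False)
)
print(summary)
```

In this invented sample, the higher-spending doctor also keeps patients in the hospital longer, which is the pattern the real charts are described as showing.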

The first step toward turning “less efficient” doctors, in Seltzer’s euphemism, into “efficient” doctors would be relatively uncontroversial. The government would have to create a national version of his database and, to do so, would need doctors and hospitals to have electronic medical records. The Obama administration plans to use the stimulus bill to help pay for the installation of such systems. It is then likely to mandate that, within five years, any doctor or hospital receiving Medicare payment must be using electronic records.

The next steps will be harder. Based on what the data show, Medicare will have to stop reimbursing some expensive treatments that don’t do much good. Private insurers would likely follow Medicare’s lead, as they have on other issues in the past. Doctors, many of whom make good money from extra treatments, are sure to object, just as Mancur Olson would have predicted. They will claim that, whatever the data show, the treatments are benefiting their patients. In a few cases — though, by definition, not most — they may be right. Even when they are not, their patients, desperate for hope, may fight for the treatment.

The most pessimistic point that Orszag routinely made during his time on Capitol Hill was that the political system didn’t deal well with simmering, long-term problems. It often waited until those problems became a crisis, he would say. That may be a kind of corollary to Rahm’s Doctrine, but it does highlight the task before the Obama administration. It will need to figure out how it can use one crisis as an excuse to prevent several more.

VI. GRADUATES EQUAL GROWTH

A GREAT APPEAL of green jobs — or, for that matter, of a growing and efficient health care sector — is that they make it possible to imagine what tomorrow’s economy might look like. They are concrete. When somebody wonders, What will replace Wall Street? What will replace housing? they can be given an answer.

As answers go, green jobs and health care are fine. But they probably aren’t the best answers. The best one is less concrete. It also has a lot more historical evidence on its side.

Last year, two labor economists, Claudia Goldin and Lawrence Katz, published a book called “The Race Between Education and Technology.” It is as much a work of history — the history of education — as it is a work of economics. Goldin and Katz set out to answer the question of how much an education really matters. They are themselves products of public schools, she of New York and he of Los Angeles, and they have been a couple for two decades. They are liberals (Katz served as the chief economist under Robert Reich in Bill Clinton’s Labor Department), but their book has been praised by both the right and the left. “I read the Katz and Goldin book,” Matthew Slaughter, an associate dean of Dartmouth’s business school who was an economic adviser to George W. Bush, recently told me, “and there’s part of me that can’t fathom that half the presidential debates weren’t about a couple of facts in that book.” Summers wrote a blurb for the book, calling it “the definitive treatment” of income inequality.

The book’s central fact is that the United States has lost its once-wide lead in educational attainment. South Korea and Denmark graduate a larger share of their population from college — and Australia, Japan and the United Kingdom are close on our heels.

Goldin and Katz explain that the original purpose of American education was political, to educate the citizens of a democracy. By the start of the 20th century, though, the purpose had become blatantly economic. As parents saw that high-school graduates were getting most of the good jobs, they started a grass-roots movement, known as the high-school movement, to demand free, public high schools in their communities. “Middletown,” the classic 1929 sociological study of life in Indiana, reported that education “evokes the fervor of a religion, a means of salvation, among a large section of the population.”

At the time, some European intellectuals dismissed the new American high schools as wasteful. Instead of offering narrowly tailored apprentice programs, the United States was accused of overeducating its masses (or at least its white masses). But Goldin and Katz, digging into old population surveys, show that the American system paid huge dividends. High-school graduates filled the ranks of companies like General Electric and John Deere and used their broad base of skills to help their employers become global powers. And these new white-collar workers weren’t the only ones to benefit. A high-school education also paid off for blue-collar workers. Those with a diploma were far more likely to enter newer, better-paying, more technologically advanced industries. They became plumbers, jewelers, electricians, auto mechanics and railroad engineers.

Not only did mass education increase the size of the nation’s economic pie; it also evened out the distribution. The spread of high schools — by 1940, half of teenagers were getting a diploma — meant that graduates were no longer an elite group. In economic terms, their supply had increased, which meant that the wage premium that came with a diploma was now spread among a larger group of workers. Sure enough, inequality fell rapidly in the middle decades of the 20th century.

But then the great education boom petered out, starting in the late 1960s. The country’s worst high schools never got their graduation rates close to 100 percent, while many of the fast-growing community colleges and public colleges, which were educating middle-class and poorer students, had low graduation rates. Between the early 1950s and early ’80s, the share of young adults receiving a bachelor’s degree jumped to 24 percent, from 7 percent. In the 30 years since, the share has only risen to 32 percent. Nearly all of the recent gains have come among women. For the first time on record, young men in the last couple of decades haven’t been much more educated than their fathers were.

Goldin and Katz are careful to say that economic growth is not simply a matter of investing in education. And we can all name exceptions to the general rule. Bill Gates dropped out of college (though, as Malcolm Gladwell explains in his recent book, “Outliers,” Gates received a fabulously intense computer-programming education while in high school). Some college graduates struggle to make a good living, and many will lose their jobs in this recession. But these are exceptions. Goldin’s and Katz’s thesis is that the 20th century was the American century in large part because this country led the world in education. The last 30 years, when educational gains slowed markedly, have been years of slower growth and rising inequality.

Their argument happens to be supported by a rich body of economic literature that didn’t even make it into the book. More-educated people are healthier, live longer and, of course, make more money. Countries that educate more of their citizens tend to grow faster than similar countries that do not. The same is true of states and regions within this country. Crucially, the income gains tend to come after the education gains. What distinguishes thriving Boston from the other struggling cities of New England? Part of the answer is the relative share of children who graduate from college. The two most affluent immigrant groups in modern America — Asian-Americans and Jews — are also the most educated. In recent decades, as the educational attainment of men has stagnated, so have their wages. The median male worker is roughly as educated as he was 30 years ago and makes roughly the same in hourly pay. The median female worker is far more educated than she was 30 years ago and makes 30 percent more than she did then.

There really is no mystery about why education would be the lifeblood of economic growth. On the most basic level, education helps people figure out how to make objects and accomplish tasks more efficiently. It allows companies to make complex products that the rest of the world wants to buy and thus creates high-wage jobs. Education may not be as tangible as green jobs. But it helps a society leverage every other investment it makes, be it in medicine, transportation or alternative energy. Education — educating more people and educating them better — appears to be the best single bet that a society can make.

Fortunately, we know much more than we did even a decade ago about how education works and doesn’t work. In his book, “Whatever It Takes” (and in this magazine, where he is an editor), Paul Tough has described some of the most successful schools for poor and minority students. These schools tend to set rigorous standards, keep the students in school longer and create a disciplined, can-do culture. Many of the schools, like several middle schools run by an organization called KIPP, have had terrific results. Students enter with test scores below the national average. They leave on a path to college.

The lessons of KIPP — some of the lessons, at least — also apply to schools that are not so poor. Last year, the Gates Foundation hired an economist named Thomas Kane to oversee a big new push to prepare students for college. Kane is one of the researchers whose work shows that teachers may matter more than anything else. Good teachers tend to receive high marks from parents, colleagues and principals, and they tend to teach their students much more than average teachers. Bad teachers tend to do poorly on all these metrics. The differences are usually apparent after just a couple of years on the job. Yet in a typical school system, both groups receive tenure.

The Obama administration has suggested that education reform is an important goal. The education secretary is Arne Duncan, the former school superintendent in Chicago, who pushed for education changes there based on empirical data. Obama advisers say that the administration plans to use the education money in the stimulus package as leverage. States that reward good teaching and use uniform testing standards — rather than the choose-your-own-yardstick approach of the No Child Left Behind law — may get more money.

But it is still unclear just how much of a push the administration will make. With the financial crisis looming so large, something as sprawling and perennially plagued as education can seem like a sideshow. Given everything else on its agenda, the Obama administration could end up financing a few promising pilot programs without actually changing much. States, for their part, will be cutting education spending to balance their budgets.

A few weeks ago, I drove to Shepherd University in West Virginia to get a glimpse of both the good and bad news for education. Shepherd is the kind of public college that will need to be at the center of any effort to improve higher education. Located in a small town in the Shenandoah Valley, it attracts mostly middle-class students — from the actual middle class, not the upper middle class — and it has a graduation rate of about 35 percent.

Several years ago, the state of West Virginia started a scholarship program, called Promise, in part to lift the graduation rate at places like Shepherd. The program is modeled after those in several Southern states, in which any high-school student with a certain minimum grade-point average (often 3.0) and certain SAT scores gets a hefty scholarship to any state school. When West Virginia officials were designing their program, though, they noticed a flaw with the other programs. The students weren’t required to take a course load that was big enough to let them graduate in four years. In some cases they were required to keep a minimum grade-point average, which encouraged them, perversely, to take fewer courses. Many students drifted along for a few years and then dropped out.

So West Virginia changed the rules. It offered a bigger carrot — free tuition at any public college — but also a stick. Students had to take enough courses each semester so that they could graduate in four years. Judith Scott-Clayton, a young economist who analyzed the program, concluded that it had raised the on-time graduation rate by almost 7 percentage points in a state where many colleges have a graduation rate below 50 percent.

Given those results, the Promise scholarship might seem like an ideal public policy in a deep recession. It pays for school at a time when many families are struggling. It keeps students busy when jobs are hard to come by. It also has the potential to do some long-term good. But nearly everyone I interviewed in West Virginia — the students, the president of Shepherd and other education officials — worried that financing would be reduced soon. The program is expensive, and state revenue is declining. Something has to give.

VII. A MATTER OF NORMS

WHAT STRUCK ME ABOUT the Shepherd students I met was that they didn’t seem to spend much time thinking about the credit requirement. It had become part of their reality. Many college students today assume they will not graduate in four years. Some even refer to themselves as second- or third-years, instead of sophomores or juniors. “It’s just normal all around not to be done in four years,” Chelsea Carter, a Shepherd student, told me. “People don’t push you.” Carter, in fact, introduced herself to me as a third-year. But she is also a Promise scholar, and she said she expected to graduate in four years. Her younger sister, now in her first year in the program at Shepherd, also plans to graduate in four years. For many Promise scholars, graduating on time has become the norm.

Economists don’t talk much about cultural norms. They prefer to emphasize prices, taxes and other incentives. And the transformation of the American economy will depend very much on such incentives: financial aid, Medicare reimbursements, energy prices and marginal tax rates. But it will also depend on forces that aren’t quite so easy to quantify.

Orszag, on his barnstorming tour to talk about the health care system, argued that his fellow economists were making a mistake by paying so little attention to norms. After all, doctors in Minnesota don’t work under a different Medicare system than doctors in New Jersey. But they do act differently.

The norms of the last two decades or so — consume before invest; worry about the short term, not the long term — have been more than just a reflection of the economy. They have also affected the economy. Chief executives have fought for paychecks that their predecessors would have considered obscenely large. Technocrats inside Washington’s regulatory agencies, after listening to their bosses talk endlessly about the dangers of overregulation, made quite sure that they weren’t regulating too much. Financial engineering became a more appealing career track than actual engineering or science. In one of the small gems in their book, Goldin and Katz write that towns and cities with a large elderly population once devoted a higher-than-average share of their taxes to schools. Apparently, age made them see the benefits of education. In recent decades, though, the relationship switched. Older towns spent less than average on schools. You can imagine voters in these places asking themselves, “What’s in it for me?”

By any standard, the Obama administration faces an imposing economic to-do list. It will try to end the financial crisis and recession as quickly as possible, even as it starts work on an agenda that will inspire opposition from a murderers’ row of interest groups: Wall Street, Big Oil, Big Coal, the American Medical Association and teachers’ unions. Some items on the agenda will fail.

But the same was true of the New Deal and the decades after World War II, the period that is obviously the model for the Obama years. Roosevelt and Truman both failed to pass universal health insurance or even a program like Medicare. Yet the successes of those years — Social Security, the highway system, the G.I. Bill, the National Science Foundation, the National Labor Relations Board — had a huge effect on the culture.

The American economy didn’t simply grow rapidly in the late 1940s, 1950s and 1960s. It grew rapidly and gave an increasing share of its bounty to the vast middle class. Middle-class incomes soared during those years, while income growth at the very top of the ladder, which had been so great in the 1920s, slowed down. The effects were too great to be explained by a neat package of policies, just as the last few decades can’t be explained only by education, investment and the like.

When Washington sets out to rewrite the rules for the economy, it can pass new laws and shift money from one program to another. But the effects of those changes are not likely to be merely the obvious ones. The changes can also send signals. They can influence millions of individual decisions — about the schools people attend, the jobs they choose, the medical care they request — and, in the process, reshape the economy.

David Leonhardt is an economics columnist for The Times and a staff writer for the magazine.