For much of the last century, the United States led the world in technological innovation—a position it owed in part to well-designed procurement programs at the Defense Department and NASA. During the 1940s, for example, the Pentagon funded the construction of the first general-purpose computer, designed initially to calculate artillery-firing tables for the U.S. Army. Two decades later, it developed the data communications network known as the ARPANET, a precursor to the Internet. Yet not since the 1980s have government contracts helped generate any major new technologies, despite large increases in funding for defense-related R & D. One major culprit has been a shift toward procurement efforts that benefit traditional defense contractors while shutting out start-ups.
Bad procurement policy is just one reason the United States has begun to lose its technological edge. Indeed, the multibillion-dollar valuations in Silicon Valley have obscured underlying problems in the way the United States develops and adopts technology. An increase in patent litigation, for example, has reduced venture capital financing and R & D investment for small firms, and strict employment regulations have strengthened large employers and prevented the spread of knowledge and skills across the industry. Although the United States remains innovative, government policies have, across the board, increasingly favored powerful interest groups at the expense of promising young start-ups, stifling technological innovation.