The Integrated Circuit at Fifty

Fairchild Semiconductor's Gordon Moore, namesake of Moore's law. (Fairchild Semiconductor)

Isaac Asimov once hailed the invention of the integrated circuit as “the most important moment since man emerged as a life form.” Today, 50 years after its invention, the integrated circuit—the basic building block of digital electronics—continues to shape societies around the globe. It is opening lines of communication, transforming work and economies, and accelerating changes in science and technology.

The integrated circuit first emerged as researchers responded to the Cold War demands of the U.S. military during the 1950s. With the increased use of electronic systems, especially in high-stakes aerospace efforts such as strategic bombers and intercontinental ballistic missiles, the military demanded improvements in the reliability and size of electronic components. American semiconductor researchers responded, exploring diverse avenues for miniaturizing and enhancing the performance of components like transistors and diodes. With silicon occupying a growing share of the semiconductor industry’s attention, many researchers grappled with ways to produce silicon components that were more reliable, smaller, and cheaper.

By the late 1950s such organizations as Bell Labs, RCA Laboratories, the Diamond Ordnance Fuze Laboratories, Sprague Electric, Texas Instruments, and Fairchild Semiconductor began to take seriously the possibility of realizing a long-held ideal: creating an entire circuit in a single piece of semiconductor material. If it could be done, building a complete circuit within a single semiconductor block would be far better than wiring together many individual parts. An integrated circuit would offer greater reliability and miniaturization and would meet the existing needs of the military market.

Researchers submitted a flurry of patent applications that contained the keys to the integrated-circuit puzzle. The patent application of Texas Instruments’ Jack Kilby was based on his work demonstrating that the various circuit elements could be created in a single piece of semiconductor material. At Sprague Electric, Kurt Lehovec’s application described a simple method for electrically isolating such parts from one another. Fairchild Semiconductor’s Jean Hoerni submitted two applications embodying the “planar process,” his novel adaptation of the silicon manufacturing processes first developed at Bell Labs. Hoerni’s planar process made possible semiconductor devices that were more reliable, smaller, and cheaper. Finally, the application of Fairchild Semiconductor’s Robert Noyce showed how, by exploiting Hoerni’s planar process, the components of an integrated circuit could be electrically interconnected.

While these key concepts for the planar integrated circuit found expression in the patent applications of the spring of 1959, it was not until the summer of 1960 that they were first realized as working devices. After months of work, a team at Fairchild Semiconductor led by Jay Last succeeded in making the first planar integrated circuits. These circuits have been continually developed, modified, and transformed over the last five decades, leading to today’s chips. Thousands upon thousands of researchers and technologists have pushed silicon technology forward, making integrated circuits exponentially cheaper and more powerful every few years. This exponential development path, known as Moore’s law after Fairchild Semiconductor’s Gordon Moore, underlies the transformation of digital electronics.

David C. Brock is a senior research fellow with CHF’s Center for Contemporary History and Policy and the editor of Understanding Moore’s Law: Four Decades of Innovation (CHF, 2006).