When Was The First Computer Chip Developed?

fact-finder | (Level 3) Valedictorian

The computer chip, or integrated circuit, was developed in the late 1950s by two researchers working independently of one another. Jack Kilby (1923– ) built the first chip at Texas Instruments in 1958, and Robert Noyce (1927–1990) invented a similar device at Fairchild Semiconductor in 1959.

A computer chip is an electronic device built on a very small piece of silicon (a nonmetallic element that, after oxygen, is the most abundant element in the Earth's crust), usually less than one-quarter inch (under a centimeter) square. A microprocessor is a single chip that contains all of a computer's logic and arithmetic circuitry; it interprets and executes the instructions given by a computer program (software) and can be thought of as the brain of the computer.

Today a chip typically holds hundreds of thousands of miniature transistors (tiny devices that switch and amplify electrical signals) and other circuit components, all interconnected. Since the chip's development in the late 1950s, the number of components that fit on a single chip has risen steadily, and because chips carry out a computer's control, logic, and memory functions, this growth has driven a corresponding improvement in computer performance.
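To picture what "interpreting and executing instructions" means, here is a minimal sketch in C of the fetch-decode-execute cycle that a microprocessor carries out in hardware. The tiny instruction set (LOAD, ADD, PRINT, HALT) and the sample program are invented purely for illustration; real processors perform the same cycle with far larger instruction sets and many more registers.

    #include <stdio.h>

    /* A toy instruction set, invented for illustration only. */
    enum { HALT = 0, LOAD = 1, ADD = 2, PRINT = 3 };

    int main(void) {
        /* A tiny "program" in memory: LOAD 5, ADD 7, PRINT, HALT */
        int memory[] = { LOAD, 5, ADD, 7, PRINT, HALT };
        int pc = 0;          /* program counter: index of the next instruction */
        int accumulator = 0; /* a single working register */

        for (;;) {
            int opcode = memory[pc++];      /* fetch */
            switch (opcode) {               /* decode and execute */
                case LOAD:  accumulator = memory[pc++];  break;
                case ADD:   accumulator += memory[pc++]; break;
                case PRINT: printf("result: %d\n", accumulator); break;
                case HALT:  return 0;
            }
        }
    }

Running this sketch prints "result: 12"; an actual microprocessor does essentially the same thing, but the loop is built out of transistors and runs billions of times per second.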

Besides computers, many other consumer electronic devices rely on computer chips, including microwave ovens, videocassette recorders (VCRs), and calculators.

Further Information: "Integrated Circuit." MSN Encarta. [Online] Available http://www.encarta.msn.com/find/Concise.asp?z=1&pg=2&ti=761570221, November 8, 2000; Ross, Frank Xavier. The Magic Chip: Exploring Microelectronics. New York: J. Messner, 1984; The Making of a Silicon Chip. [Online] Available http://www.intel.com/OneDigitalDay/explore/chip/, November 8, 2000.
