reading drag and drop

Assessment

Presentation

World Languages

University

Hard

2 Slides • 1 Question

1

Moore's Law

The use of computers has permeated every sphere of life: business, education, medicine, entertainment, and home. The explosive growth of the computer industry was first predicted by the man who wrote Moore's Law in 1965. The industry has been booming with no sign of slowing down, and his law still holds true over forty years later. The question remains: how long will the trend last?

Gordon Moore, cofounder of Intel and originator of Moore's Law, was an engineer in the technology industry. He was in a position to study technological trends and came up with a projection about the future of technology. Moore's Law states that the number of transistors, or basic electronic switches, on a computer chip will double every two years. If this proved true, chips would increase in speed and capability progressively every few years. He made the prediction in 1965, before many technological advances had been made and before such growth had even been considered. Yet this principle, merely an idea at the beginning, drove the chip industry to great heights by increasing competition and setting a goal for those in the computer industry to strive toward. The transistors that Moore spoke about, when interconnected on a chip, make up an integrated circuit. His law also addressed these circuits, stating that the use of integrated circuits would keep the costs of electronics down, which it has done through the years. What began as a prediction proved to be the standard that many companies strove to meet.
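
As a rough illustration of the doubling described above (not stated in the passage itself), the sketch below projects transistor counts under the two-year doubling rule; the starting count of 2,300 transistors and the starting year of 1971 are assumptions chosen only for this example.

```python
# Sketch of the passage's doubling rule: transistor counts doubling every
# two years. The starting count (2,300) and starting year (1971) are
# illustrative assumptions, not figures taken from the passage.

def projected_transistors(start_count: int, start_year: int, year: int) -> int:
    """Project a transistor count, assuming one doubling every two years."""
    doublings = (year - start_year) // 2
    return start_count * (2 ** doublings)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011):
        print(year, projected_transistors(2_300, 1971, year))
    # 1971 -> 2,300 ... 1991 -> 2,355,200 ... 2011 -> 2,411,724,800
```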

Moore was the first to publish his observations about the direction technology was heading. Each computer chip requires a certain number of transistors, and the more it has, the more it can do. If technology were not as advanced as it is today, mobile phones, digital cameras, navigation systems, and other digital electronics would not be nearly as sophisticated. It has been said that "practically everything digital has depended critically on the swift improvement of chip density." Businesses realize the necessity of keeping up with trends in technology because they suffer a performance disadvantage and a cost increase when they fall behind their competitors.

Chips are made of silicon, and the greater the silicon density, the more transistors there are per chip. However, there is a limit to how dense a chip can be, and researchers are reaching the realm of quantum physics, which deals with particles as small as atoms. Once the industry maximizes its current capabilities, it will have to consider other scientific approaches to advancing chips. Another real issue as technology advances is updating interfaces and software on computers to match chip capabilities; otherwise, the advanced chips will not be of any use to users. Moore realized that one day technology would meet its limits. He recently predicted that Moore's Law will reach its end in ten to twenty years, at which point innovative technology experts will be put to work to bridge the gap between technology's limitations and its future.

2

Multiple Select

Create a summary of the passage by choosing THREE options.

1

With the rapid growth in computer chips, industries have seen Moore's Law at work.

2

The computer chip's silicon density is the link to future technology.

3

The standard used by technology companies was to double the number of transistors per chip every two years.

4

Gordon Moore's statement in 1965 was a prediction that turned out to be completely accurate.

5

Electronics containing newer chips have more functions and capabilities.
