Why the laws of physics will eventually end Moore's Law

In summary, the thread discusses how silicon transistors below a certain size are expected to stop operating reliably and end up overheating. That limit was once predicted to arrive within a few decades, yet structures only a few atoms across have already been fabricated. Meanwhile the benefits of miniaturization, such as lighter, cheaper, and faster products, continue to outweigh the drawbacks of the heat and switching-speed constraints.
  • #1
jaydnul
I was reading an article and it said that eventually, when you get silicon transistors to a certain size, they won't be able to operate anymore and will end up melting. I have always wondered the following: what is the point of trying to make transistors smaller when we can just make the actual chip bigger? I mean, I wouldn't mind a slightly larger laptop if it had double the processing power by just adding a second microprocessor, thus doubling the number of transistors while keeping the same transistor size. What is wrong with my assumption? Obviously it would have been done by now if it worked.
 
  • #2
Back in the 1990s it was expected that transistors would not scale down to just a few atoms, but so far we've been able to shrink some structures that small. Current chip sizes are driven by money: when you spend 5 billion dollars on a fab and equipment, getting the most from each wafer is the primary goal, and the way you do that is to make each chip smaller (while maintaining yield), because the cost of processing a whole wafer is about the same whether it carries 200 or 1000 die.
http://spectrum.ieee.org/semiconductors/design/the-highk-solution
http://spectrum.ieee.org/semiconductors/nanotechnology/ohms-law-survives-at-the-atomic-scale
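To see why smaller die win economically, here is a toy cost-per-die sketch; the wafer size, processed-wafer cost, and yield below are illustrative assumptions, not industry figures.

```python
# Toy cost-per-die model: a wafer costs roughly the same to process no matter how
# many die are printed on it, so shrinking the die cuts the cost per chip.
# All figures are illustrative assumptions.
import math

WAFER_DIAMETER_MM = 300.0     # assumed wafer size
WAFER_COST_USD = 10_000.0     # assumed cost to process one wafer
YIELD_FRACTION = 0.8          # assumed fraction of die that come out working

wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2   # mm^2

for die_area in (400, 200, 100):                      # candidate die sizes in mm^2
    gross_die = wafer_area // die_area                # ignores edge loss
    cost_per_good_die = WAFER_COST_USD / (gross_die * YIELD_FRACTION)
    print(f"{die_area} mm^2 die: ~{gross_die:.0f} per wafer, ~${cost_per_good_die:.0f} each")
```

In practice yield itself falls as die area grows, since a fixed defect density ruins a larger fraction of big die, which makes the advantage of small die even stronger.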
 
  • #3
If you add a processor, you either have to use a bigger battery in your laptop or accept reduced time between charges. You also add more heat, which must be dissipated in some manner. Although you have theoretically doubled your processing power, your software must be rewritten to run on two processors simultaneously (if this can be done at all).

Miniaturization of electronic components is not just a technical exercise; the benefits of the process result in products which are lighter, cheaper, and faster.
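On the software point, here is a minimal sketch of Amdahl's law, which bounds how much a second processor can help when only part of the workload parallelizes; the parallel fractions used are made-up examples.

```python
# Amdahl's law: speedup on n processors when a fraction p of the work can run in parallel.
# The values of p below are hypothetical, chosen only to illustrate the bound.

def amdahl_speedup(p, n):
    """Upper bound on speedup with n processors when fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.9, 0.99):   # hypothetical parallel fractions
    print(f"parallel fraction {p:.0%}: ~{amdahl_speedup(p, 2):.2f}x on 2 processors")
```

Even with 99% of the work parallelized, two processors give roughly a 1.98x speedup, and the return shrinks quickly as the serial fraction grows.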
 
  • #4
OK, if we don't count the power wasted as heat, then doubling the transistors (and thus the processing power) in a single chip, versus splitting them across two chips, wouldn't make that much of a difference in power consumption; the difference is mostly about heat.
Beyond that, you don't get the so-called "free lunch" just because you squeeze the whole "transistor kindergarten" under one roof.
And I think that's pretty much what happened when mobile phones with shiny screens and many functions came out some years ago: they drained the battery far faster than the "stone age" first Nokias. I think the bright colorful screen and the larger processor were the main contributors.
 
  • #5
It does work. It also makes a chip twice as expensive when you double up that way.
 
  • #6
lundyjb said:
I was reading an article and it said that eventually, when you get silicon transistors to a certain size, they won't be able to operate anymore and will end up melting.
The voltage used for the smaller transistors can be reduced to eliminate the heat problem, but then the switching speed is also reduced. In order to maintain fast switching speeds, the ratio of voltage to transistor size needs to be kept relatively high, but then heat becomes an issue, especially if the transistor density per unit area on the chip is high, which is why clock speeds on most air-cooled consumer processors have been limited to about 4 GHz for almost a decade. By reducing transistor density (larger chips), there's more cooling surface per transistor, but this isn't cost effective and is only used on special high-performance chips. The other alternative is a better cooling method, such as liquid cooling.
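To make the voltage/frequency/heat trade-off concrete, here is a back-of-the-envelope sketch using the standard CMOS dynamic-power relation P = alpha * C * V^2 * f; the activity factor, switched capacitance, and operating points below are assumptions for illustration, not measurements of a real processor.

```python
# CMOS dynamic switching power: P = alpha * C * V^2 * f.
# All numbers are illustrative assumptions, not data for any real chip.

def dynamic_power(alpha, c_switched, v_dd, f_hz):
    """Dynamic switching power in watts: alpha * C * V^2 * f."""
    return alpha * c_switched * v_dd ** 2 * f_hz

ALPHA = 0.2          # assumed activity factor
C_SWITCHED = 1e-7    # assumed total switched capacitance in farads (whole chip)

for v_dd, f_ghz in [(1.2, 4.0), (1.2, 5.0), (0.9, 4.0)]:
    watts = dynamic_power(ALPHA, C_SWITCHED, v_dd, f_ghz * 1e9)
    print(f"V = {v_dd} V, f = {f_ghz} GHz -> ~{watts:.0f} W")
```

Power grows linearly with frequency but quadratically with voltage, and since higher clocks usually require higher voltage, pushing frequency costs more than the linear term alone suggests, which is the squeeze described above.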
 
  • #7
Moore's law may become almost irrelevant with the introduction of cloud computing and (genuine) broadband data connections. Individual processors may only be needed to 'supervise' the machine/human interface.
Moore's Law doesn't limit its prediction to silicon technology, either. Quantum computing could be a serious gear change in speed (although the software would need to be re-jigged a bit to make it usable).
 
  • #9
I always thought the speed of light created an upper limit on the clock speeds, regardless of temperature. That's why even liquid nitrogen/helium cooled processors still only reach max speeds of about 7 GHz, because beyond that speed even huge voltages aren't enough to switch from 0 to 1 in time.

Maybe that's all wrong?!
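For a sense of scale on the speed-of-light argument, the sketch below computes how far a signal could travel in one clock period assuming ideal propagation at c; real on-chip signals move well below c, so the actual constraint is tighter.

```python
# Distance covered in one clock period at the speed of light: d = c / f.
# Illustrative only; on-chip signals propagate well below c.

C_LIGHT = 2.998e8  # speed of light in vacuum, m/s

for f_ghz in (1, 4, 7, 8):
    d_cm = C_LIGHT / (f_ghz * 1e9) * 100   # distance per clock period, in cm
    print(f"{f_ghz} GHz: ~{d_cm:.1f} cm per clock cycle")
```

At 7 GHz that distance is only about 4 cm, comparable to the size of a processor die plus its package, which is why timing across the chip becomes difficult at these frequencies regardless of cooling.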
 
  • #10
MikeyW said:
I always thought the speed of light created an upper limit on the clock speeds, regardless of temperature. That's why even liquid nitrogen/helium cooled processors still only reach max speeds of about 7 GHz, because beyond that speed even huge voltages aren't enough to switch from 0 to 1 in time.

Maybe that's all wrong?!

We're at 8 GHz+ already: http://hothardware.com/News/AMD-Breaks-Frequency-Record-with-Upcoming-FX-Processor/
 
  • #11
Note that Moore's original law is about transistor count per chip, not transistor density or speed, and he only stated that it would last at least 10 years (from 1965 to 1975+). Since chip sizes haven't increased much in the last few years, it has evolved into being related mostly to chip density.

http://en.wikipedia.org/wiki/Moore's_law
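For reference, Moore's observation can be written as a simple doubling rule, N(t) = N0 * 2^((t - t0)/T). The sketch below evaluates it with the two-year doubling period of the 1975 revision, starting from the often-quoted figure of roughly 2,300 transistors on the Intel 4004 in 1971; it is an order-of-magnitude illustration, not a fit to real shipment data.

```python
# Moore's-law style projection: N(t) = N0 * 2 ** ((t - t0) / T_DOUBLE).
# Start: ~2,300 transistors on the Intel 4004 (1971), doubling every 2 years.
# Purely illustrative; real chips do not follow a clean doubling schedule.

N0, T0, T_DOUBLE = 2300, 1971, 2.0

def transistor_count(year):
    return N0 * 2 ** ((year - T0) / T_DOUBLE)

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{transistor_count(year):,.0f} transistors per chip")
```

The resulting numbers (thousands in the 1970s, millions by the early 1990s, billions by the 2010s) roughly track flagship processors of those decades.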
 
  • #12
lundyjb said:
I mean, I wouldn't mind a slightly larger laptop if it had double the processing power by just adding a second microprocessor, thus doubling the number of transistors while keeping the same transistor size. What is wrong with my assumption?

What's wrong with this assumption is that you're looking only a year or two back. Maybe your laptop (processor) would only be 2x larger...but if you look back 10 years, your laptop processor would be 100x larger. Look back another 10 years, your laptop (just the processor) would be 100000x larger. Except it wouldn't, because it couldn't work.
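As a rough sanity check on those factors (purely illustrative), the growth over a decade or two depends strongly on the assumed doubling period: Moore's 1965 paper suggested roughly one year, the 1975 revision about two.

```python
# Growth factor after a given number of years for different doubling periods.
# Illustrative only; the candidate doubling periods are assumptions.

def growth_factor(years, doubling_period):
    return 2 ** (years / doubling_period)

for t_double in (1.0, 1.5, 2.0):     # candidate doubling periods in years
    f10 = growth_factor(10, t_double)
    f20 = growth_factor(20, t_double)
    print(f"doubling every {t_double} yr: ~{f10:,.0f}x per decade, ~{f20:,.0f}x over 20 years")
```

The 100x-per-decade and 100000x-per-20-years figures above correspond to doubling times between roughly one year and eighteen months, closer to Moore's original estimate than to the later two-year version.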

Also we're talking about a laptop here. How about cell phones? How about the myriad of other gadgets that just wouldn't be possible without miniaturization...and another myriad which we can't even think of today, which won't be possible without even further miniaturization.

Of course, as others have mentioned, the more you can squeeze into each chip the more money you make. When printing these transistors, you are essentially printing money. That's what's driving companies to squeeze more and more into each chip. If they don't, somebody else will. I actually work for the company that makes these "money printers", so you could say my job is to keep Moore's law going!
 
  • #13
The spirit of Moore's law, I think, is that computational power per unit of mass or per unit of volume will increase exponentially.

I don't think we're anywhere close to any sort of theoretical maximum yet. I base this on the fact that the human brain is still orders of magnitude better than any man-made computer of similar size in terms of what it can process. I also suspect the human brain is far from the theoretical maximum.
 
  • #14
The human brain is a very 'approximate' calculating machine. It works so differently from any conventional man-made computer that you can't compare performances reliably. Some inspired human chess playing can still flummox even the best computers, when time is limited.
 
  • #15
sophiecentaur said:
The human brain is a very 'approximate' calculating machine. It works so differently from any conventional man-made computer that you can't compare performances reliably. Some inspired human chess playing can still flummox even the best computers, when time is limited.

Really? I thought that computerized chess programs were already at a level where no human can beat the best of them anymore.
 

Related to Why the laws of physics will eventually end Moore's Law

1. Why is Moore's Law important in the study of physics?

Moore's Law is important in the study of physics because it describes the trend that the number of transistors on a microchip will double about every two years, leading to exponential growth in computing power.

2. What is meant by the "decay" of Moore's Law?

The "decay" of Moore's Law refers to the fact that the trend of doubling the number of transistors on a microchip every two years is not sustainable in the long run due to technological and physical limitations.

3. What factors contribute to the eventual decay of Moore's Law?

There are several factors that contribute to the eventual decay of Moore's Law, including the limitations of current technology, the increasing complexity and cost of manufacturing smaller transistors, and the physical limitations of the materials used in chip manufacturing.

4. How will the decay of Moore's Law affect the advancement of technology?

The decay of Moore's Law will likely result in a slower rate of advancement in technology, as the exponential growth in computing power will no longer be sustainable. This may lead to a shift in focus towards finding new technologies and methods to continue the advancement of technology.

5. Is there a possibility for Moore's Law to continue indefinitely?

It is highly unlikely that Moore's Law will continue indefinitely, as it is limited by physical and technological constraints. However, scientists and researchers are constantly exploring new technologies and materials that may prolong the trend for a little longer.
