
Thread: When will Moore's Law hit a big hard atomic wall?

  1. #1
    If you can't hack it, you don't own it! Oneslowz28's Avatar
    Join Date
    Mar 2007
    Location
    Aiken, Sc
    Posts
    5,084

    Default When will Moore's Law hit a big hard atomic wall?

    I was reading an article in a past issue of EE Times, and they were speculating that Moore's Law may hit an atomic wall by 2020. That would mean our electronics could no longer shrink. So does that mean that, to get faster, we will have to go back to increasing the size of our devices?

    So when do you think it will happen?

  2. #2
    Will YOU be ready when the zombies rise? x88x's Avatar
    Join Date
    Oct 2008
    Location
    MD, USA
    Posts
    6,334

    Default Re: When will Moore's Law hit a big hard atomic wall?

    Well, strictly speaking, Moore's Law doesn't deal with shrinking things, just with them becoming more powerful.

    As for the issue though, personally I think one of two things is gonna happen:

    1) Computers get so powerful that it doesn't really matter if they keep doubling. If we're only using 10% of the available power, what good does it do to double and only use 5%?

    2) The cloud takes off, and we move all our processing to central locations, where it doesn't matter if they get bigger.

    Looking at stuff lately, I think for a large portion of the population it's gonna be #2. With the rise of mobile devices that are always connected to the internet with increasingly fast wireless connections, it won't be long until working off the cloud won't feel any different than working locally.

    I think the only real holdouts will be gamers. Until some sort of remote gaming setup gets perfected, I don't see PC gamers moving away from running locally on as powerful a machine as possible.
    That we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours, and this we should do freely and generously.
    --Benjamin Franklin
    TBCS 5TB Club :: coilgun :: bench PSU :: mightyMite :: Zeus :: E15 Magna EV

  3. #3
    Wait, What? knowledgegranted's Avatar
    Join Date
    Feb 2009
    Location
    USA
    Posts
    569

    Default Re: When will Moore's Law hit a big hard atomic wall?

    You guys are missing the point of Moore's Law here.

    Moore's Law was based on electricity and electrical components. Biocomputers are the next big thing. They can be faster and better than our own human brains.

    EDIT:

    There's also gonna be another programming revolution, to fit more functionality into smaller spaces.
    It's like JFK announcing the moon mission. He had no expertise in space travel, and no way of knowing if it would work. He just announced "we're going to the moon" and then they made it happen because everyone was on the same page and working towards the same goal. If he had said "well, let's get some people in space, and we'll see how far out we can get, and if I find someone to make a rocket strong enough, we could possibly approach the moon's orbit and maybe land" it wouldn't have happened.

  4. #4
    Resident EE mtekk's Avatar
    Join Date
    Dec 2007
    Location
    Minnesota
    Posts
    469

    Default Re: When will Moore's Law hit a big hard atomic wall?

    Quote Originally Posted by x88x View Post
    Well, strictly speaking, Moore's Law doesn't deal with shrinking things, just with them becoming more powerful.
    Moore's Law specifically deals with the doubling of transistor counts in commercial chips every two years or so (due to physics, we realize this via feature-size shrinks).
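    As a rough illustration of what that doubling cadence means, here's a minimal Python sketch; the one-billion-transistor 2010 baseline and the clean two-year period are assumptions for the example, not figures from this thread.
    Code:
    # Minimal sketch of Moore's Law as a simple doubling rule.
    # Assumed for illustration: ~1 billion transistors on a chip in 2010,
    # doubling every two years.

    def projected_transistors(base_count, base_year, target_year, period_years=2.0):
        """Project a transistor count forward, doubling every period_years."""
        doublings = (target_year - base_year) / period_years
        return base_count * 2 ** doublings

    if __name__ == "__main__":
        for year in range(2010, 2022, 2):
            count = projected_transistors(1e9, 2010, year)
            print(f"{year}: ~{count:.1e} transistors")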

    Quote Originally Posted by knowledgegranted
    You guys are missing the point of Moore's Law here.

    Moore's Law was based on electricity and electrical components. Biocomputers are the next big thing. They can be faster and better than our own human brains.
    Well, if you want to talk about biocomputers, let's talk about how they will be larger than silicon-based systems (physics pretty much dictates this). And don't forget the strict environmental control they require (they can't be too hot, cold, wet, or dry, otherwise they die). IMHO, the only interesting thing about biological computers is that they self-assemble/self-program.

    A circa 1990s CPU can do math faster than a human, but that doesn't make it better. The brain is a marvelous thing; its ability to reorganize and process information is amazing. This is what we can learn from. We will have a programming revolution (or renaissance, if you will), as many of today's programs are lazy and bloated. When a hardware upgrade is not available, optimizing software will be the norm, as it was before the mid 1990s.
    Quote Originally Posted by xRyokenx View Post
    ...I'm getting tired of not being able to figure this crap out because it's apparently made for computer-illiterate people by computer-illiterate people. lol

  5. #5
    Will YOU be ready when the zombies rise? x88x's Avatar
    Join Date
    Oct 2008
    Location
    MD, USA
    Posts
    6,334

    Default Re: When will Moore's Law hit a big hard atomic wall?

    Quote Originally Posted by mtekk View Post
    (due to physics, we realize this via feature-size shrinks).
    I read his comment about devices growing as meaning the size of the entire chip growing once we inevitably hit the limit of how small we can shrink silicon semiconductors.

    Along those lines, I remember reading somewhere recently that a group was having great success making circuits using, I think, a graphite/silicon blend? They were making circuits smaller than would have been possible with straight silicon. ...I wish I could remember where I saw that...
    That we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours, and this we should do freely and generously.
    --Benjamin Franklin
    TBCS 5TB Club :: coilgun :: bench PSU :: mightyMite :: Zeus :: E15 Magna EV

  6. #6
    Water Cooled silverdemon's Avatar
    Join Date
    Jun 2006
    Location
    Delft, Netherlands
    Posts
    520

    Default Re: When will Moore's Law hit a big hard atomic wall?

    That would be 'graphene' (don't know if it's spelled right; in Dutch it is 'grafeen'), which is (I believe) a sheet of graphite (or carbon) a single atom thick.

    This stuff makes for higher clock speeds and possibly somewhat smaller feature sizes.
    However, to get back to the topic question: it doesn't matter what material you use; protons, neutrons, and electrons stay the same size, so that's probably a physical barrier we will encounter...

    ...unless we are able to craft machines from smaller parts, like quarks or something, but I don't see that happening very soon.

    Quote Originally Posted by x88x View Post
    1) Computers get so powerful that it doesn't really matter if they keep doubling. If we're only using 10% of the available power, what good does it do to double and only use 5%?
    That might be possible, but I think we will hit this physical barrier (someone said 2020) before our computers are 'too powerful'. And for scientific use, a computer can't be too powerful anyway...

  7. #7
    AARGH dr.walrus's Avatar
    Join Date
    Mar 2008
    Location
    Ho Chi Minh City
    Posts
    993

    Default Re: When will Moore's Law hit a big hard atomic wall?

    Quote Originally Posted by Oneslowz28 View Post
    I was reading an article in a past issue of EE Times, and they were speculating that Moore's Law may hit an atomic wall by 2020. That would mean our electronics could no longer shrink. So does that mean that, to get faster, we will have to go back to increasing the size of our devices?

    So when do you think it will happen?
    Funnily enough, I had this question in an exam a few months ago.

    People have been saying that Moore's Law will cease to hold true since the 80s, and it simply hasn't happened. It's also worth noting that an increasing number of transistors doesn't directly equate to higher speed.

    Increasing the size of processors causes the major problem of increased power consumption and heat. We can make processors much more effective by using more appropriate architectures. A question not asked often enough is 'What alternatives do we have to x86 in the desktop market?' The x86 architecture is a sort of bodge that just DOES everything we want it to do, but often with frightening inefficiency.

    And it's unfair to place that onus solely on the hardware industry. Software code is incredibly bloated and unoptimised, and in some cases it could be argued that there is an implied level of collusion between software and hardware manufacturers. Want to use new software? Buy a new computer, which comes with more software... There are much better engineering solutions than 'add more transistors', as we have been doing. Our move to parallel processing avoids the power dissipation cost of improving individual functional units, but what it has actually done is reduce the amount of everyday performance we get per transistor. It seems strange, but software is presenting a major barrier in itself.
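    As a rough way to put numbers on that per-transistor point, here's a minimal Python sketch of Amdahl's law (not named in the post, but it captures why extra parallel hardware yields diminishing everyday returns); the 90% parallel fraction is an assumed example value.
    Code:
    # Minimal sketch of Amdahl's law: ideal speedup from n cores when only a
    # fraction p of the work can actually run in parallel.
    # Assumed example value: p = 0.90 (not a figure from the post).

    def amdahl_speedup(p, n_cores):
        """Ideal speedup with n_cores when fraction p of the work is parallel."""
        return 1.0 / ((1.0 - p) + p / n_cores)

    if __name__ == "__main__":
        p = 0.90
        for cores in (1, 2, 4, 8, 16, 64):
            s = amdahl_speedup(p, cores)
            print(f"{cores:3d} cores -> {s:5.2f}x speedup ({s / cores:.2f}x per core)")
    Even with 90% of the work parallelised, 64 cores give under a 9x speedup, which is the kind of per-transistor slide being described here.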

    Back to the question: in terms of ever-shrinking lithography feature sizes, it's pretty much agreed that we've got about ten years or so left, because of the effect of quantum tunnelling. By then, we're gonna need a new technology to manufacture ICs, or whatever their replacements will be. And even then, we're still a long way from the maximum possible density of computing power.

    The following from Wikipedia sums up the accepted technical wisdom about the theoretical potential of computing density quite nicely:

    Quote Originally Posted by wikipedia
    Futurists such as Ray Kurzweil, Bruce Sterling, and Vernor Vinge believe that the exponential improvement described by Moore's law will ultimately lead to a technological singularity: a period where progress in technology occurs almost instantly.[49]
    Although Kurzweil agrees that by 2019 the current strategy of ever-finer photolithography will have run its course, he speculates that this does not mean the end of Moore's law:
    Moore's law of Integrated Circuits was not the first, but the fifth paradigm to forecast accelerating price-performance ratios. Computing devices have been consistently multiplying in power (per unit of time) from the mechanical calculating devices used in the 1890 U.S. Census, to [Newman's] relay-based "[Heath] Robinson" machine that cracked the Lorenz cipher, to the CBS vacuum tube computer that predicted the election of Eisenhower, to the transistor-based machines used in the first space launches, to the integrated-circuit-based personal computer.[50]
    Kurzweil speculates that it is likely that some new type of technology (possibly optical or quantum computers) will replace current integrated-circuit technology, and that Moore's Law will hold true long after 2020.
    Lloyd shows how the potential computing capacity of a kilogram of matter equals pi times energy divided by Planck's constant. Since the energy is such a large number and Planck's constant is so small, this equation generates an extremely large number: about 5.0 * 10^50 operations per second.[49]
    He believes that the exponential growth of Moore's law will continue beyond the use of integrated circuits into technologies that will lead to the technological singularity. The Law of Accelerating Returns described by Ray Kurzweil has in many ways altered the public's perception of Moore's Law. It is a common (but mistaken) belief that Moore's Law makes predictions regarding all forms of technology, when it has only actually been demonstrated clearly for semiconductor circuits. However, many people, including Richard Dawkins, have observed that Moore's law will apply - at least by inference - to any problem that can be attacked by digital computers and is in its essence also a digital problem. Therefore progress in genetics, where the coding is digital (the genetic coding of GATC), may also advance at a Moore's law rate. Many futurists still use the term "Moore's law" in this broader sense to describe ideas like those put forth by Kurzweil, but do not fully understand the difference between linear problems and digital problems.
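    As a sanity check on the ~5.0 * 10^50 figure quoted above, here's a minimal Python sketch of the arithmetic for one kilogram of matter. It assumes Lloyd's bound in the form 2E/(pi*hbar); the quote's looser wording (pi times energy over Planck's constant) gives a smaller number of the same order.
    Code:
    # Rough arithmetic check of Lloyd's ultimate-computation bound for 1 kg of matter.
    # The 2*E/(pi*hbar) form follows Lloyd's paper; pi*E/h matches the quote's wording.

    import math

    c = 2.998e8        # speed of light, m/s
    hbar = 1.055e-34   # reduced Planck constant, J*s
    h = 6.626e-34      # Planck constant, J*s
    mass = 1.0         # kg

    energy = mass * c ** 2                    # E = m*c^2, in joules
    ops_lloyd = 2 * energy / (math.pi * hbar) # ~5.4e50 operations per second
    ops_quote = math.pi * energy / h          # ~4.3e50 operations per second

    print(f"2E/(pi*hbar): {ops_lloyd:.1e} ops/s")
    print(f"pi*E/h:       {ops_quote:.1e} ops/s")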

  8. #8
    Resident EE mtekk's Avatar
    Join Date
    Dec 2007
    Location
    Minnesota
    Posts
    469

    Default Re: When will Moore's Law hit a big hard atomic wall?

    Quote Originally Posted by x88x View Post
    I read his comment about devices growing as meaning the size of the entire chip growing once we inevitably hit the limit of how small we can shrink silicon semiconductors.

    Along those lines, I remember reading somewhere recently that a group was having great success making circuits using, I think, a graphite/silicon blend? They were making circuits smaller than would have been possible with straight silicon. ...I wish I could remember where I saw that...
    Well, increasing the physical size causes manufacturability issues; hell, just ask Nvidia how easy large dies are to make (they can't make a 512-shader GF100 due to defects, as it's over a 500 mm^2 die). Not to mention it sucks down gobs of power.

    People have said the next thing is to move to different materials, but if they have to jump from material to material for each drop in size, it will be very expensive (some say GaAs is the future; my semiconductors professor said it will always be the future). Others have said optical processors are the next big thing. Unfortunately, they are right that they will be big (physically), but they do have some benefits. Some are backing spintronics (storing states in the "up" and "down" spin of electrons), which is really cool, but who knows when that will be viable. Others believe we will hit a wall, but that it won't matter by then, as we'll have moved on to focusing on creating organic semiconductors that cost much less.
    Last edited by mtekk; 07-09-2010 at 11:59 AM. Reason: stupid autocorrector made spintronics into electronics
    Quote Originally Posted by xRyokenx View Post
    ...I'm getting tired of not being able to figure this crap out because it's apparently made for computer-illiterate people by computer-illiterate people. lol

  9. #9
    Mentally Underclocked mDust's Avatar
    Join Date
    Aug 2009
    Location
    Michigan
    Posts
    1,639

    Default Re: When will Moore's Law hit a big hard atomic wall?

    I believe photons will replace electrons altogether. Physicists have been working on how to manipulate the speed of photons for decades. In 1999 they slowed light to a crawl, a couple of years later they brought it to a complete stop, and in 2007 they worked out how to trap the entire spectrum and bring it to a halt. A few years from now they'll be able to manipulate the photons efficiently, and a few years after that they will demonstrate how the process can store and manipulate data. I would expect several large computer manufacturers to rabidly snatch up this tech and bring it to market a few years later...which would be around 2020. In another decade we will literally be computing at the speed of light!
    http://www.livescience.com/strangene...d-rainbow.html

    I was reading about this elsewhere (no link, sorry), and they talked about how they were developing a logic system on the nano scale. It was pretty impressive! The various wavelengths can all be processed simultaneously, while individual photons within a wavelength can be slowed to let more time-sensitive data through. Unfortunately, they weren't imaginative enough to reinvent how the logic gates or the physical parts of an IC work, so they ended up with an analog of current tech that simply computed with a different particle...but much, much faster.
    I'll procrastinate tomorrow.

  10. #10
    Yuk it up Monkey Boy! Airbozo's Avatar
    Join Date
    Jun 2006
    Location
    In the Redwoods
    Posts
    5,272

    Default Re: When will Moore's Law hit a big hard atomic wall?

    Quote Originally Posted by mtekk View Post
    ...
    A circa 1990s CPU can do math faster than a human, but that doesn't make it better. ....
    One thing people forget is that even though the calculations _may_ be faster than most human brains, there is still the input step, which is done by a human. This is where humans can be faster than a computer. I know people who can work out large calculations faster than you can type the problem into a computer.
    "...Dumb all over, A little ugly on the side... "...Frank Zappa...
