# Michio Kaku's Dark Prediction For the End of Moore's Law



## AquaNekoMobile (Feb 26, 2010)

http://www.salon.com/technology/com...wt/feature/2011/03/19/moores_law_ends_excerpt



> "An excerpt from Michio Kaku's new book appears at salon.com, in which he sees a dark economic future within the next 20 years as Moore's law is brought to an end when single-atom transistors give way to quantum states. Kaku predicts: 'Since chips are placed in a wide variety of products, this could have disastrous effects on the entire economy. As entire industries grind to a halt, millions could lose their jobs, and the economy could be thrown into turmoil.'"


----------



## peterpd99 (Oct 18, 2010)

Very interesting... thanks for sharing. Nice site, by the way.


----------



## solarz (Aug 31, 2010)

The solution is, of course, quantum computing. That is the next great revolution in computation, and would make our modern computers look like abacuses.


----------



## Zebrapl3co (Mar 29, 2006)

Meh, if he only discovered that in the last few years, then he's obviously 12 years late.
Moore's law was never a law in the first place. A more correct term would be Moore's hypothesis. That's what it's been all these years. It's the laymen and the media, who have zero grasp of what it takes to turn a hypothesis into a law, who keep hyping it up and calling it Moore's "law".
Moore's "law" was doomed to fail the minute it was stated. Just because someone threw a rock into the sky and it looks like it's heading up really fast does not mean it will reach synchronous orbit around the Earth. That is all I can say about Moore's "law".
But as solarz has said, there's always quantum computing. Then there's organic computing, and then there's three-dimensional computing (we're still in the infancy of that concept).



----------



## solarz (Aug 31, 2010)

Zebrapl3co said:


> Meh, if he only discovered that in the last few years, then he's obviously 12 years late.
> Moore's law was never a law in the first place. A more correct term would be Moore's hypothesis. That's what it's been all these years. It's the laymen and the media, who have zero grasp of what it takes to turn a hypothesis into a law, who keep hyping it up and calling it Moore's "law".
> Moore's "law" was doomed to fail the minute it was stated. Just because someone threw a rock into the sky and it looks like it's heading up really fast does not mean it will reach synchronous orbit around the Earth. That is all I can say about Moore's "law".


Well, the issue isn't really whether Moore's "law" is a law, but rather what would happen if computation stopped following this exponential growth.

I think it's a rather "sky is falling" article. Sure, CPU speed will stop doubling every 18 months, but so what? There are always ways for smart companies to make money. Just look at Apple and its iPhone iterations. The article makes a VERY sketchy jump from "CPU speeds stop increasing" to "Nobody will buy new computers anymore".
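The 18-month doubling mentioned above compounds dramatically, which is why its end feels so significant. A toy projection (the ~2,300-transistor count for the 1971 Intel 4004 is historical; the rest is just compounding, not real chip data):

```python
def transistor_count(years, initial=2_300, period_months=18):
    """Project a transistor count that doubles every `period_months`."""
    doublings = years * 12 / period_months
    return initial * 2 ** doublings

# After 1.5 years: one doubling.
print(transistor_count(1.5))   # 4600.0
# After 40 years of 18-month doublings, the count grows by a factor
# of 2^26.7 -- roughly a hundred million times the starting point.
print(f"{transistor_count(40):,.0f}")
```

If that curve flattens, the yearly improvement consumers are used to disappears, which is the scenario the excerpt is worried about.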

Another interesting question is: apart from hard-core gamers, who exactly NEEDS to keep up with CPU advancement? I mean, have the average consumer's computer usage habits changed in, say, the last 10 years?

The only difference I can see is that they've moved from IE 5 and Netscape to IE 8, Firefox, and Chrome. They've moved from Word 2003 to (maybe) Word 2011. (I'm using Office 2003 Lite.) Why do those things even need exponential computation growth? Apart from dealing with the bloat created by Microsoft itself, that is.


----------

