
Stupid Moore's Law


>HyperlogiK<

Member
Joined: Nov 10, 2004
Location: Sword Base
Why is it that so many people tout 'Moore's Law' as if it is some kind of cosmological necessity?

People treat it like some kind of physical law, but unlike, say, the conservation of energy, it only worked from 1965 to 1975, and then he had to change it because it wasn't accurate any more. It has more or less held for bulk-manufactured semiconductor devices since then, but normally we expect 'Laws' to be good for more than ten years at a time.
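For anyone who hasn't actually run the numbers, here's a rough back-of-the-envelope sketch (my own ballpark figures, starting from the roughly 2,300-transistor 4004 of 1971) of what the original yearly doubling versus the revised two-year doubling actually predict:

# Back-of-the-envelope comparison of the two versions of Moore's observation.
# Starting point: Intel 4004, 1971, roughly 2,300 transistors. Illustrative only;
# real process generations don't land anywhere near this neatly.

def projected_transistors(start_count, start_year, year, doubling_period_years):
    """Transistor count projected by doubling every `doubling_period_years` years."""
    doublings = (year - start_year) / doubling_period_years
    return start_count * 2 ** doublings

for year in (1975, 1985, 1995, 2005):
    yearly = projected_transistors(2300, 1971, year, 1)      # original 1965 form
    two_yearly = projected_transistors(2300, 1971, year, 2)  # 1975 revision
    print(f"{year}: yearly doubling ~{yearly:,.0f}, two-year doubling ~{two_yearly:,.0f}")

The yearly version runs away to tens of trillions of transistors by the mid-2000s, while the two-year version lands in the hundreds of millions, which is roughly where real chips ended up. That gap is exactly why the 1975 revision was needed.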

The other problem I have is with the articles claiming that it is somehow responsible for advances in the semiconductor industry, e.g. 'Moore's law drives progress in the digital world' (University of Washington Comp Sci department). Would the engineers work a little less hard if they were only doing it for millions of dollars rather than the holy grail of fulfilling Moore's Law?
 
Why is it that so many people tout 'Moore's Law' as if it is some kind of cosmological necessity?

My guess? Science fiction movies, and the perception that Moore's Law will be a way to get to them in our lifetime.

I could be entirely wrong, but that is my guess.
 
It is more a business mantra than anything. If companies are keeping to Moore's Law, then they deem themselves up to speed. And if they exceed it, they are doing even better.
 
Yeah, Moore's Law is not a law in the sense that this is how the world works. It's a law in the sense that it's what Moore determined his company needed to achieve in order to survive.
 
The other problem I have is with the articles claiming that it is somehow responsible for advances in the semiconductor industry, e.g. 'Moore's law drives progress in the digital world' (University of Washington Comp Sci department). Would the engineers work a little less hard if they were only doing it for millions of dollars rather than the holy grail of fulfilling Moore's Law?

You're misinterpreting the articles. They mean that the rapid growth in processing speed (i.e., the results of Moore's Law) drives innovation in other fields. Access to faster processors and more memory most certainly allows for newer uses.
 
You're misinterpreting the articles. They mean that the rapid growth in processing speed (i.e., the results of Moore's Law) drives innovation in other fields. Access to faster processors and more memory most certainly allows for newer uses.

If I misinterpret them then it is only because of their ambiguous writing. If tech journalists were properly literate then there wouldn't be any confusion. Having said that, I think the sentence 'Moore's law drives progress in the digital world' is quite clear and quite wrong.

You know, the actual Moore's Law has to do with nothing more than the increasing number of transistors that can fit on an integrated circuit.

I am aware of that, but it doesn't make it any more of a law than if it had to do with performance or anything else. It's still just an observation that started to become inaccurate after a few years, was subsequently changed, and has in its revised form more or less held since then [1975].
 
By "digital world" do you mean microchip manufacture?

Could any such law really drive progress in the "digital world"? One answer would be "progress" itself, but that is simply too basic to qualify. At the moment it could be attributed to consumer demand plus competition, as the "digital world" is still in its infancy, though this doesn't take military applications and the like into account.
 
You're misinterpreting the articles. They mean that the rapid growth in processing speed (i.e., the results of Moore's Law) drives innovation in other fields. Access to faster processors and more memory most certainly allows for newer uses.

I definitely agree with this. More powerful computers have made so much possible at a lower cost, all those Pixar films for one :)
 
My main gripe with Moore's Law is that it makes the computers I build obsolete too quickly for my liking.
 
Gordon Moore himself has stated on numerous occasions that the law was initially made in the form of an observation and forecast and nothing more; it was taken and run with by others, not by Moore himself.
Gordon Moore has also previously stated in interviews that the law cannot be sustained indefinitely.

Anyway, it means nothing in real performance terms any more; with clustering and shared-resource environments you will be able to scale as quickly as you require in the future. This is also just an observation and forecast, though, so if you wish we could refer to it as Menace Law :rolleyes:
 
It will definitely be interesting to see how it pans out. Seems like yesterday I was droolin' over an advertisement for a 100 MHz Pentium, and soon after that scoffing at the mention of 1 GHz machines while reading Popular Science. HEHE
 
It's an impressive and simple term, used to attract attention to the industry from the general public.
 
If I misinterpret them then it is only because of their ambiguous writing. If tech journalists were properly literate then there wouldn't be any confusion. Having said that, I think the sentence 'Moore's law drives progress in the digital world' is quite clear and quite wrong.

I guess I'm just used to deciphering the meandering logic of today's journalists... but seriously, it's depressing to have to guess at what they mean. The sentence is clearly wrong, but from the perspective of most ignorant tech journalists out there, I would say the intent is the one I mentioned previously (because they don't have time to look up what Moore's Law actually is).

Anyway, it means nothing in real performance terms any more; with clustering and shared-resource environments you will be able to scale as quickly as you require in the future. This is also just an observation and forecast, though, so if you wish we could refer to it as Menace Law :rolleyes:

Ha, all that does is shift the burden from the hardware to the software. There needs to be a Moore's Law for cores: every six months, by finding new and ingenious ways to parallelize tasks, the number of cores that can be fully utilized doubles.

Now all we need to do is get started...
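For the embarrassingly parallel cases the "ingenious ways" already exist; here's a minimal sketch (just standard Python multiprocessing, nothing clever) of farming independent work items out across however many cores you have:

import multiprocessing as mp

def expensive(n):
    # Stand-in for some CPU-bound work on one item.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [200000 + i for i in range(32)]
    # Split the items across all available cores.
    with mp.Pool(processes=mp.cpu_count()) as pool:
        results = pool.map(expensive, inputs)
    print(len(results), "results computed using", mp.cpu_count(), "cores")

The hard part is that most real workloads aren't a flat list of independent items, so the doubling of "cores that can be fully utilized" won't come for free.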
 
The hardware advances are outstripping the software's ability to use that extra power. Distributed computing applications and multiple server VMs seem to be the exception. Of course, Microsoft and several other vendors of so-called feature-rich applications are doing the best job they can to suck up all that extra bandwidth :p
 
The concept of Moore's Law is true; however, the materials we are crafting chips from also have a little thing called "absolute physical limits". This is why we are seeing cores being slapped together: the processing core, you could say, is nearly maxed out. Frequencies are topped out at 5 GHz or thereabouts, and when information travels that fast over a chip with so many freggin' transistors, you get crosstalk, saturation, corruption, whatever you want to call it.

All that's left is to make the chip die smaller and place two together, then three, then four and so forth... and soon that technique will be exhausted. Until some new material is found, or a method which totally opens up a new way of computing, our current-day Moore's Law will bottom out with this technology.

Moore's Law is true in the way that a pie is infinitely divisible: in your mind, mathematically, you can always cut a smaller piece, always divide in half for a smaller portion. But in the physical world there are limits.
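Just to put rough numbers on those "absolute physical limits" (ballpark figures of mine, nothing official):

import math

# How many halvings of feature size are left before we hit atomic dimensions?
# Ballpark figures: a 65 nm process node versus a silicon atom of roughly 0.2 nm.
feature_nm = 65
silicon_atom_nm = 0.2

halvings = math.log2(feature_nm / silicon_atom_nm)
print(f"About {halvings:.1f} halvings from {feature_nm} nm to one atom wide")
# Prints roughly 8.3, i.e. only a handful of process generations, even ignoring
# the problems (leakage, crosstalk, lithography) that show up long before that.

So even on the most optimistic reading there are only a few doublings left in plain shrinking; after that it's new materials or a new way of computing, as you say.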
 